London (City of London), South East England, United Kingdom
Capgemini
value of technology and build a more sustainable, more inclusive world. Your Role: Capgemini Financial Services is seeking a Data Engineer with deep expertise in DBT (Data Build Tool), Snowflake, and PL/SQL to join our growing data team. The successful candidate will be responsible for designing, developing, and maintaining robust data transformation pipelines that support business intelligence, analytics, and … PL/SQL code for complex data processing and transformation tasks. Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements and deliver high-quality solutions. Optimize Snowflake performance through query tuning, clustering, and resource management. Ensure data quality, integrity, and governance through testing, documentation, and monitoring. Participate in code reviews, architecture discussions, and continuous improvement initiatives. … DBT projects. Required Qualifications: 5+ years of experience in data engineering or a related field. Strong hands-on experience with DBT (modular SQL development, testing, documentation). Proficiency in Snowflake (data warehousing, performance tuning, security). Advanced knowledge of PL/SQL and experience with stored procedures, functions, and packages. Solid understanding of data modeling concepts (star/snowflake schemas).
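To give a flavour of the modular DBT work this role describes, here is a minimal, hypothetical sketch of an incremental dbt model; the model, source, and column names (stg_transactions, fct_daily_balances, account_id) are illustrative assumptions, not taken from the listing.

```sql
-- models/marts/fct_daily_balances.sql  (hypothetical dbt model)
-- Incremental materialisation: only days newer than the current max are rebuilt.
{{ config(materialized='incremental', unique_key=['account_id', 'balance_date']) }}

select
    account_id,
    balance_date,
    sum(amount) as daily_balance
from {{ ref('stg_transactions') }}
{% if is_incremental() %}
  -- restrict the scan to new data on incremental runs
  where balance_date > (select max(balance_date) from {{ this }})
{% endif %}
group by account_id, balance_date
```

In a real project, a schema.yml or singular test would assert uniqueness of the (account_id, balance_date) grain, and dbt's generated documentation would cover the "testing, documentation" part of the qualification.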
Role: Snowflake Data Architect Location: Hove, UK Type: Permanent Work Mode: Hybrid Role & Responsibilities: Define and implement the end-to-end architecture of the data warehouse on Snowflake. Create and maintain conceptual, logical, and physical data models in Snowflake. Design data pipelines and ingestion frameworks using Snowflake native tools. Collaborate with Data Governance teams to establish data lineage …
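As one possible illustration of "ingestion frameworks using Snowflake native tools", the sketch below lands JSON files from an external stage into a raw table via Snowpipe. The bucket, database, and object names are assumptions, and a storage integration plus cloud event notifications would also be needed before auto-ingest actually fires.

```sql
-- Hypothetical landing zone: stage -> raw table -> Snowpipe auto-ingest.
create stage if not exists raw_db.ingest.landing_stage
  url = 's3://example-bucket/events/'          -- illustrative bucket
  file_format = (type = json);

create table if not exists raw_db.ingest.customer_events (
  loaded_at timestamp_ntz default current_timestamp(),
  payload   variant                            -- raw semi-structured document
);

create pipe if not exists raw_db.ingest.customer_events_pipe
  auto_ingest = true                           -- requires cloud event notifications
as
  copy into raw_db.ingest.customer_events (payload)
  from (select $1 from @raw_db.ingest.landing_stage);
```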
Leeds, England, United Kingdom Hybrid / WFH Options
Better Placed Ltd - A Sunday Times Top 10 Employer!
and maintaining the company's data infrastructure and pipelines. This is very much a greenfield environment: the business is currently operating on a serverless setup with plans to implement Snowflake as its data warehouse and lake. You'll play a key role in designing and deploying the data architecture, integrating data sources, and helping the business get more value … independently and guiding the early stages of data platform implementation. Key Responsibilities: Design, build, and manage scalable data pipelines and integrations between core business systems. Lead the implementation of Snowflake as the central data warehouse/lake. Work with third-party providers and internal teams to ingest, transform, and structure data effectively. Replace and modernise existing integrations (e.g. Celigo) … environment. Skills & Experience: Proven experience as a Data Engineer within a modern data or cloud environment. Strong knowledge of ETL/ELT pipeline design and data modelling. Experience with Snowflake (or similar cloud-based data warehouse solutions). Familiarity with integration and automation tools such as Boomi, Celigo, or equivalent. Experience working in a serverless or modern cloud environment.
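To make the "ingest, transform, and structure data" responsibility concrete, here is a hedged sketch of an incremental upsert from a staged source-system extract into a warehouse dimension. The databases, tables, and columns (staging.erp.customer_extract, analytics.core.dim_customer) are invented for illustration only.

```sql
-- Illustrative ELT upsert: staged source extract -> conformed dimension.
merge into analytics.core.dim_customer tgt
using staging.erp.customer_extract src
  on tgt.customer_id = src.customer_id
when matched and src.updated_at > tgt.updated_at then update set
  tgt.customer_name = src.customer_name,
  tgt.segment       = src.segment,
  tgt.updated_at    = src.updated_at
when not matched then insert
  (customer_id, customer_name, segment, updated_at)
  values (src.customer_id, src.customer_name, src.segment, src.updated_at);
```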
Proficiency in data modelling, both logical and physical; familiarity with data normalization and denormalization techniques is essential. Strong understanding of cloud-based data solutions and experience with platforms like AWS, Azure, Google Cloud, and Snowflake. Knowledge of advanced analytics tools, including creating dashboards, reports, and analytics that provide actionable insights. Knowledge of data governance, data quality management, and data security best practices, with … emphasis on adhering to regulatory standards specific to financial services. About the candidate: Strong understanding of cloud-based data solutions and experience with platforms like AWS, Azure, Google Cloud, and Snowflake. Proficiency in data modelling, both logical and physical; familiarity with data normalization and denormalization techniques is essential. Demonstrate a strong understanding of the Financial Services Industry and how a …
Columbia, Maryland, United States Hybrid / WFH Options
Redsun Solutions LLC
Data Engineer (Snowflake) Columbia, MD (Hybrid) We're looking for an experienced Data Engineer to design and build scalable data pipelines and solutions in Snowflake for a leading client. Responsibilities: Develop and optimize Snowflake data models, pipelines, and performance. Build and maintain CI/CD processes and Snowflake features (Streams, Tasks, Views). Collaborate with data architects and analysts to … with Snowflake. Strong SQL, Python (Snowpark), and data modeling experience. Familiar with CI/CD (Jenkins), dbt/Kafka integration. Excellent communication and problem-solving skills. Nice to Have: Snowflake certification, healthcare domain experience. Location: Columbia, MD (Hybrid, 2 days/month onsite). Type: 12-month Contract-to-Hire.
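The Streams and Tasks features named here are commonly combined into a simple change-data-capture pipeline; the sketch below is one hypothetical shape of that pattern, with all object and warehouse names assumed.

```sql
-- Capture inserts on a raw table and apply them on a schedule.
create or replace stream raw.orders_stream on table raw.orders;

create or replace task transform.load_orders
  warehouse = transform_wh
  schedule  = '5 MINUTE'
  when system$stream_has_data('raw.orders_stream')   -- skip runs with no new data
as
  insert into analytics.fct_orders (order_id, customer_id, amount, order_ts)
  select order_id, customer_id, amount, order_ts
  from raw.orders_stream
  where metadata$action = 'INSERT';

alter task transform.load_orders resume;  -- tasks are created suspended
```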
Sacramento, California, United States Hybrid / WFH Options
KK Tech LLC
Snowflake Developer (SQL Specialist) Location: 100% Remote Looking for: Mandarin-speaking candidates Client: Virtuoso Job Summary: We are looking for a Snowflake Developer with strong SQL and data engineering experience. The ideal candidate will design, develop, and maintain Snowflake data solutions, ensuring performance, scalability, and data integrity. Key Responsibilities: Design and implement Snowflake data models, pipelines, and transformations. … Develop, optimize, and troubleshoot complex SQL queries. Integrate Snowflake with other data sources and ETL tools. Ensure data accuracy, performance tuning, and security best practices. Collaborate with BI, analytics, and data science teams to support reporting and insights. Work with cross-functional teams in both English and Mandarin. Required Skills & Qualifications: Bachelor's degree in Computer Science, Data Engineering … or related field. 8+ years of experience in Snowflake development. Strong hands-on experience with SQL, ETL processes, and data warehousing concepts. Knowledge of Snowpipe, Tasks, Streams, and Time Travel features. Experience with Python or ETL tools (e.g., Airflow, Talend, Informatica) is a plus. Fluent in Mandarin Chinese (spoken and written) and proficient in English.
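Time Travel, one of the features this listing asks for, lets you query or restore historical table state. Below is a hedged example; the table names are made up and the query ID is a placeholder, not a real statement ID.

```sql
-- Query the table as it looked one hour ago (Time Travel by offset, in seconds).
select count(*) from sales.orders at (offset => -3600);

-- Recover the pre-load state as a zero-copy clone, using the bad load's query ID.
create or replace table sales.orders_restored
  clone sales.orders before (statement => '01b2c3d4-0000-1234-0000-000000000001');

-- Rows present now but missing from the historical snapshot.
select order_id from sales.orders
minus
select order_id from sales.orders_restored;
```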
Warrington, Cheshire, North West England, United Kingdom
PRIMUS Global Solutions (PRIMUS UK & Europe)
Very good working knowledge in data models, viz. Dimensional Data Model, ER Data Model, and Data Vault. Very good working knowledge in writing SQL queries. Very good working knowledge in Snowflake architecture. Very good working knowledge in Snowflake internals such as Snowflake roles, dynamic tables, streams and tasks, policies, etc. Very good working experience in data-related projects or … both. Good working knowledge in investment banking and finance. Good working knowledge in Statistics. Good working knowledge in Power BI. Ability to work in multiple projects. Skills: Mandatory Skills: Snowflake, ANSI-SQL, Dimensional Data Modeling, Snowpark Container Services, Snowflake-Data Science.
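Two of the Snowflake internals this role names, dynamic tables and policies, are sketched below under invented object names; treat it as an illustration under assumed schemas rather than a reference implementation.

```sql
-- Declarative transformation: Snowflake keeps this table within the target lag.
create or replace dynamic table analytics.position_summary
  target_lag = '15 minutes'
  warehouse  = transform_wh
as
  select book, instrument, sum(quantity) as net_position
  from raw.trades
  group by book, instrument;

-- Column masking policy: only a privileged role sees the raw counterparty value.
create or replace masking policy governance.mask_counterparty as (val string)
  returns string ->
  case when current_role() in ('FINANCE_ANALYST') then val else '***MASKED***' end;

alter table raw.trades
  modify column counterparty set masking policy governance.mask_counterparty;
```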
London (City of London), South East England, United Kingdom
Infosys
Role – Technology Lead Technology – Snowflake, SQL Server, PostgreSQL, Data Warehousing, Data Lake, Medallion, Data Vault, Data Fabric, Semantic modelling Location – UK Business Unit – DNAINS Compensation – Competitive (including bonus) Job Description: We are seeking an experienced Data Modeler who can analyse and understand the existing customer data and analytics models on SQL Server, and design efficient data models for Snowflake. … data models to support scalable, high-performance analytics and reporting solutions. You will analyse existing data models built on SQL Server and guide the development of optimized structures on Snowflake, ensuring alignment with business requirements and enterprise standards. You will collaborate with Business Analysts and cross-functional teams to translate complex reporting and analytics needs into efficient, well-governed … data requirements. Ensure data models align with industry standards and regulatory guidelines. Required: 5+ years of experience in data modeling and database design. Strong expertise in SQL Server and Snowflake data modeling. Proficiency in SQL for data analysis and validation. Experience in data architecture principles and data governance. Solid understanding of Insurance domain concepts (Specialized Insurance, London Market, Regulatory …).
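As a hedged illustration of the dimensional modelling this role centres on, here is a minimal star-schema fragment in Snowflake DDL. The insurance entities and columns are assumptions, and note that Snowflake primary/foreign keys are informational rather than enforced.

```sql
-- Illustrative star schema: one dimension with SCD2 history, one fact table.
create table if not exists mart.dim_policy (
  policy_key     number identity primary key,   -- surrogate key
  policy_number  varchar not null,              -- business key
  product_line   varchar,
  inception_date date,
  valid_from     timestamp_ntz,                 -- SCD2 effective range
  valid_to       timestamp_ntz
);

create table if not exists mart.fct_claim (
  claim_key    number identity primary key,
  policy_key   number references mart.dim_policy (policy_key),
  claim_date   date,
  claim_amount number(18,2),
  claim_status varchar
);
```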
expertise, innovation, and collaboration. Responsibilities: Design, develop, and maintain scalable data pipelines and transformation processes using modern tools and frameworks. Implement and optimise data workflows and ETL procedures within Snowflake. Create robust data models to support advanced analytics and machine learning initiatives. Collaborate with cross-functional stakeholders to understand business data requirements and deliver effective solutions. Establish and enforce … models and interpret results to inform business decisions. Candidate Profile: Strong background in building enterprise data solutions. Extensive hands-on experience with Python and data transformation techniques. Expertise in the Snowflake cloud data platform and ETL process optimisation. Familiarity with machine learning tools such as TensorFlow or scikit-learn. Strong communication skills, capable of translating complex technical concepts to non-technical stakeholders.
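One way to picture "ETL procedures within Snowflake" feeding machine-learning work is a feature-refresh procedure that downstream models read from. The sketch below uses Snowflake Scripting with invented table and column names; the resulting table could then be pulled into scikit-learn or TensorFlow via Snowpark or the Python connector.

```sql
-- Hypothetical feature-engineering step, rebuilt on demand or from a scheduled task.
create or replace procedure analytics.refresh_customer_features()
returns string
language sql
as
$$
begin
  create or replace table analytics.customer_features as
  select
    customer_id,
    count(*)                                        as order_count_90d,
    sum(amount)                                     as spend_90d,
    datediff('day', max(order_ts), current_date())  as days_since_last_order
  from analytics.fct_orders
  where order_ts >= dateadd('day', -90, current_date())
  group by customer_id;

  return 'customer_features refreshed';
end;
$$;

call analytics.refresh_customer_features();
```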
Reigate, England, United Kingdom Hybrid / WFH Options
esure Group
initiatives. What we'd love you to bring: Proven, hands-on expertise in data modelling, with a strong track record of designing and implementing complex dimensional models, star and snowflake schemas, and enterprise-wide canonical data models. Proficiency in converting intricate insurance business processes into scalable and user-friendly data structures that drive analytics, reporting, and scenarios powered by … Delta Live Tables. Strong background in building high-performance, scalable data models that support self-service BI and regulatory reporting requirements. Direct exposure to cloud-native data infrastructures (Databricks, Snowflake), especially in AWS environments, is a plus. Experience in building and maintaining batch and streaming data pipelines using Kafka, Airflow, or Spark. Familiarity with governance frameworks and access controls (RBAC) …
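Since this listing mentions governance frameworks and RBAC, here is a hedged sketch of the usual Snowflake grant pattern for a read-only reporting role; the role, warehouse, and schema names are placeholders rather than anything from the posting.

```sql
-- Read-only access role for a reporting mart.
create role if not exists bi_reader;

grant usage  on warehouse reporting_wh                  to role bi_reader;
grant usage  on database  analytics                     to role bi_reader;
grant usage  on schema    analytics.mart                to role bi_reader;
grant select on all tables in schema analytics.mart     to role bi_reader;
-- Future grants keep newly created tables readable without re-granting.
grant select on future tables in schema analytics.mart  to role bi_reader;

-- Hang the role under SYSADMIN so it is centrally manageable.
grant role bi_reader to role sysadmin;
```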