London (City of London), South East England, United Kingdom
Capgemini
value of technology and build a more sustainable, more inclusive world. Your Role: Capgemini Financial Services is seeking a Data Engineer with deep expertise in DBT (Data Build Tool), Snowflake, and PL/SQL to join our growing data team. The successful candidate will be responsible for designing, developing, and maintaining robust data transformation pipelines that support business intelligence, analytics, and …/SQL code for complex data processing and transformation tasks. Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements and deliver high-quality solutions. Optimize Snowflake performance through query tuning, clustering, and resource management. Ensure data quality, integrity, and governance through testing, documentation, and monitoring. Participate in code reviews, architecture discussions, and continuous improvement initiatives. … DBT projects. Required Qualifications: 5+ years of experience in data engineering or a related field. Strong hands-on experience with DBT (modular SQL development, testing, documentation). Proficiency in Snowflake (data warehousing, performance tuning, security). Advanced knowledge of PL/SQL and experience with stored procedures, functions, and packages. Solid understanding of data modeling concepts (star/snowflake schemas) …
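For context, a minimal sketch of the kind of modular DBT development the role describes, assuming a hypothetical raw orders source; the source, model, and column names are illustrative only, not Capgemini's actual project. Tests and documentation would normally sit alongside this model in a schema.yml file.

```sql
-- models/staging/stg_orders.sql
-- Hypothetical dbt staging model: renames and types raw columns from a source table.
-- {{ source(...) }} is the standard dbt Jinja function for referencing raw tables.
with source as (

    select * from {{ source('raw', 'orders') }}

),

renamed as (

    select
        order_id,
        customer_id,
        cast(order_ts as timestamp)      as ordered_at,
        cast(amount   as number(18, 2))  as order_amount
    from source

)

select * from renamed
```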
Implementation partner: TCS Location: New York-2, NY (Day One Onsite) Job Description/Skills: Data Analyst. Strong hands-on SQL experience. Strong Python coding skills. Knowledgeable in Snowflake and DBT models. Familiar with Azure Data Factory. Familiar with Azure Databricks.
Role: Snowflake Data Architect Location: Hove, UK Type: Permanent Work Mode: Hybrid Role & Responsibilities: Define and implement the end-to-end architecture of the data warehouse on Snowflake. Create and maintain conceptual, logical, and physical data models in Snowflake. Design data pipelines and ingestion frameworks using Snowflake-native tools. Collaborate with Data Governance teams to establish data lineage …
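As an illustration of ingestion built on Snowflake-native objects (file format, external stage, COPY INTO, and a scheduled task), here is a rough sketch; the bucket, schema, warehouse, and integration names are hypothetical assumptions, not part of this listing.

```sql
-- Hypothetical ingestion sketch using Snowflake-native objects.
create or replace file format landing.csv_fmt
  type = csv
  field_optionally_enclosed_by = '"'
  skip_header = 1;

create or replace stage landing.orders_stage
  url = 's3://example-bucket/orders/'                  -- hypothetical bucket
  storage_integration = s3_int                         -- assumes an existing integration
  file_format = (format_name = 'landing.csv_fmt');

-- One-off load of staged files into a landing table.
copy into landing.orders_raw
  from @landing.orders_stage;

-- Schedule incremental loads with a task (must be resumed before it runs).
create or replace task landing.load_orders_hourly
  warehouse = ingest_wh
  schedule = '60 minute'
as
  copy into landing.orders_raw from @landing.orders_stage;

alter task landing.load_orders_hourly resume;
```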
Edinburgh, Midlothian, United Kingdom Hybrid / WFH Options
Aberdeen
maintain conceptual, logical, and physical data models. Develop and implement Data Vault 2.0 models for scalable and auditable data warehousing. Collaborate with engineering teams to implement models using DBT, Snowflake, Azure Data Factory, and Microsoft Fabric. Translate business requirements into robust data models. Ensure alignment with data governance, quality, and security standards. Provide technical guidance on data modelling best … Strong expertise in Data Vault modelling (preferably DV 2.0). Experience with semantic modelling for BI tools (e.g. Power BI, Fabric). Hands-on experience with Azure Data Factory, Snowflake, DBT, and Microsoft Fabric. Understanding of Data Mesh principles and domain-oriented data product design. Familiarity with data governance, metadata management, and data quality frameworks. Experience in financial services …
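By way of illustration, a minimal Data Vault 2.0 pattern of the kind this role would model: a hub carrying the business key and a satellite carrying descriptive attributes with full history. All table and column names are hypothetical, not the employer's actual model.

```sql
-- Hypothetical Data Vault 2.0 structures in Snowflake.
create table if not exists dv.hub_customer (
    customer_hk      binary(32)    not null,   -- hash key derived from the business key
    customer_id      varchar       not null,   -- business key
    load_ts          timestamp_ntz not null,
    record_source    varchar       not null,
    constraint pk_hub_customer primary key (customer_hk)
);

create table if not exists dv.sat_customer_details (
    customer_hk      binary(32)    not null,   -- references hub_customer
    load_ts          timestamp_ntz not null,
    hash_diff        binary(32)    not null,   -- used for change detection
    customer_name    varchar,
    customer_segment varchar,
    record_source    varchar       not null,
    constraint pk_sat_customer_details primary key (customer_hk, load_ts)
);
```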
of 10 years of commercial experience working in a data-centric environment with a proven track record in financial services. Understanding of data modelling principles (e.g. relationships, normalisation, star/snowflake schemas). You will have experience with enterprise data modelling tools (e.g. ER/Studio, Erwin). Deep knowledge of capital markets and/or investment banking products (bonds, repos …
Edinburgh, Midlothian, United Kingdom Hybrid / WFH Options
Aberdeen
Senior Engineer to join our Data & Analytics team. This role is instrumental in delivering clean, modern, and efficient data solutions across cloud-native platforms. Key Responsibilities: Develop solutions across Snowflake, Azure, and DBT platforms. Lead migration and optimisation of applications using Azure cloud-native services. Write clean, testable, and maintainable code following industry standards. Implement CI/CD pipelines … deliver user-centric solutions. About the Candidate: The ideal candidate will possess the following: Strong understanding of data warehousing, ELT/ETL processes, and data modelling. Proficiency in Azure, Snowflake, and DBT. Experience in application modernisation and migration. Ability to produce clean, testable, maintainable code. CI/CD pipeline implementation and test automation. Familiarity with AI-powered development tools …
Snowflake Data Modeller Location: Remote (mostly remote with occasional travel to Hemel Hempstead) Contract: Outside IR35 Day rate: Up to £550 per day Duration: 6 months Start date: ASAP Key skills: Snowflake, DBT, SQL, Python, AWS and Kimball. The client is in the process of migrating to Snowflake and therefore requires extra support. As a result … opportunity to be at the cutting edge of data engineering. YOUR SKILLS AND EXPERIENCE: A successful Senior Data Engineer here will have experience in the following: - Advanced SQL knowledge - Snowflake (ideally certified) - Python development - AWS cloud experience essential, relating to data tooling and development - Working knowledge of Data Build Tool (DBT): develop staging, intermediate and mart models in …
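As a rough sketch of the staging → intermediate → marts layering mentioned in that last bullet, a mart model typically just selects from intermediate models via ref() and can be materialised incrementally. The model and column names below are hypothetical, assumed purely for illustration.

```sql
-- models/marts/fct_orders.sql
-- Hypothetical dbt mart model built on top of intermediate models.
{{ config(materialized='incremental', unique_key='order_id') }}

select
    o.order_id,
    o.customer_id,
    o.ordered_at,
    o.order_amount,
    c.customer_segment
from {{ ref('int_orders_enriched') }} o
left join {{ ref('int_customers') }} c
  on o.customer_id = c.customer_id
{% if is_incremental() %}
  -- on incremental runs, only process rows newer than what is already loaded
  where o.ordered_at > (select max(ordered_at) from {{ this }})
{% endif %}
```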
London, South East, England, United Kingdom Hybrid / WFH Options
Robert Half
with data scientists, analysts, and engineers to deliver clean, structured, and reliable data. Develop robust data transformations in Python and SQL, ensuring performance and accuracy. Work hands-on with Snowflake to model, optimise, and manage data flows. Continuously improve data engineering practices - from automation to observability. Bring ideas to the table: help shape how data is collected, processed, and … years of experience in data engineering or a similar role. Strong Python skills for data processing and pipeline development. Proven experience writing complex and efficient SQL. Hands-on Snowflake experience (data modelling, performance tuning, pipelines). Familiarity with orchestration tools (e.g. Airflow, dbt) is a plus. A solid understanding of data best practices, version control (Git), and CI/CD …
Role: Snowflake Data Architect Location: Hove, UK Type: Permanent Work Mode: Hybrid Role & Responsibilities: Define and implement the end-to-end architecture of the data warehouse on Snowflake. Create and maintain conceptual, logical, and physical data models in Snowflake. Design data pipelines and ingestion frameworks using Snowflake-native tools. Work with Data Governance teams to establish data lineage, data quality …
Role/Job Title: AWS Solution Architect Work Location: London, 250 Bishopsgate (Onsite) The Role: As an AWS Solution Architect, you will work closely with the technical team on development and implementation journeys that transform traditional banking infrastructure and …
Senior Data Engineer Location: London Time type: Full time Posted on: Posted Today Job requisition ID: JR0149 What you will deliver: Hands-on development on the Nexus Platform. Responsible for code quality and simplicity in the system. Working …
/hour C2C Location: San Francisco, CA Duration: 12+ months/long-term Interview Criteria: Telephonic + Skype Direct Client Requirement Must haves: BI, Tableau, SQL, DWH, star schema, data lake/Snowflake, and at least one of the following: Qlik, Cognos, Business Objects, or MicroStrategy. The client is looking for between 9 and 15 years of experience. … with large-scale data warehouses and strong SQL skills. The individual should understand the extraction and transformation process for reporting needs. Strong understanding of data modeling concepts, including star schema, fact and dimension tables, and SCDs (slowly changing dimensions). The individual will collaborate with data modelers to design efficient data models and optimize reporting structures. Experience working in nearshore and offshore teams …
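For reference, a minimal star-schema query of the kind these reporting structures imply: a fact table joined to dimensions, with a Type 2 slowly changing dimension resolved as of the time of each sale. All table and column names are hypothetical and assumed for illustration only.

```sql
-- Hypothetical star-schema reporting query with an SCD Type 2 dimension.
select
    d.calendar_month,
    c.customer_segment,
    sum(f.sales_amount) as total_sales
from fct_sales f
join dim_date d
  on f.date_key = d.date_key
join dim_customer c
  on f.customer_key = c.customer_key
 and f.sale_ts >= c.valid_from
 and f.sale_ts <  c.valid_to           -- pick the dimension row valid at time of sale
group by
    d.calendar_month,
    c.customer_segment;
```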