Data Engineer (Snowflake) Position Description If you're looking for a challenge that stretches your talents and want to make a real difference in how modern businesses harness cloud-native data solutions, come and help us grow our Data Engineering capability at CGI. We need a skilled Data Engineer with a focus on Snowflake to help us build scalable, impactful … travel in the London area. All applicants must have the right to live and work in the UK. Your future duties and responsibilities As a Data Engineer specialising in Snowflake, you'll contribute to the design, development, and optimisation of cloud data platforms, often working with a wide array of cloud services and tools. You'll play a hands-on … delivering data solutions that help clients extract insight and business value, while also promoting engineering best practices. Key responsibilities will include: - Designing and implementing scalable data warehouse solutions using Snowflake - Building efficient ELT/ETL pipelines using DBT and other modern tooling - Writing and optimising complex SQL queries for large datasets - Applying software engineering principles to data systems, including version
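For illustration, a minimal sketch of the kind of warehouse-side ELT step the responsibilities above describe, assuming the snowflake-connector-python package; the credentials, warehouse, and table names are hypothetical placeholders.

```python
# Illustrative only: a warehouse-side ELT step that aggregates staged data inside
# Snowflake. Credentials, warehouse, and table names are hypothetical placeholders.
import os

import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="TRANSFORM_WH",   # hypothetical warehouse name
    database="ANALYTICS",       # hypothetical database
    schema="REPORTING",         # hypothetical schema
)

# Transform staged rows into a reporting table with a single set-based SQL statement.
transform_sql = """
    CREATE OR REPLACE TABLE daily_order_totals AS
    SELECT order_date, customer_id, SUM(amount) AS total_amount
    FROM raw.staged_orders
    GROUP BY order_date, customer_id
"""

cur = conn.cursor()
try:
    cur.execute(transform_sql)
    cur.execute("SELECT COUNT(*) FROM daily_order_totals")
    print("Rows loaded:", cur.fetchone()[0])
finally:
    cur.close()
    conn.close()
```

Doing the aggregation inside the warehouse (ELT rather than ETL) keeps the heavy lifting on Snowflake compute, which is the pattern the role description points at.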
driven team culture. Effectively partner with stakeholders and other teams to define and deliver the right solution for the right business problem statement. Our Tech Stack: Cloud Data Warehouse - Snowflake; AWS Data Solutions - Kinesis, SNS, SQS, S3, ECS, Lambda; Data Governance & Quality - Collate & Monte Carlo; Infrastructure as Code - Terraform; Data Integration & Transformation - Python, DBT, Fivetran, Airflow; CI/CD - Github … Like To See: Extensive experience in data engineering, including designing and maintaining robust data pipelines. Proficient in SQL and relational databases; familiarity with modern data warehousing solutions such as Snowflake is preferred. Proficient in programming, capable of developing and maintaining tested, production-ready data solutions, preferably in Python. Familiarity with major cloud providers and their data services, concrete experience in
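For illustration, a minimal sketch of how a daily dbt run might be orchestrated with Airflow from the stack above; the DAG id, schedule, and dbt project path are hypothetical placeholders.

```python
# Illustrative only: a daily Airflow DAG that runs and then tests a dbt project.
# The DAG id, schedule, and project path are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_dbt_transformations",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="cd /opt/airflow/dbt && dbt run --target prod",  # placeholder path/target
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="cd /opt/airflow/dbt && dbt test --target prod",
    )
    dbt_run >> dbt_test  # only test models once they have been built
```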
bring... Experience as a Data Product Owner or Product Owner for data/analytics platforms. Experience managing data products or platforms, ideally customer data platforms (CDPs), data warehouses (e.g. Snowflake, BigQuery), or data lakes. Familiarity with data engineering workflows, data tools (dbt, Airflow), and cloud data platforms (AWS, GCP, Azure). Familiarity with data modelling, data pipelines, ETL/ELT
be great if you have: Experience of relevant cloud services within AWS, Azure or GCP. Experience working in an Agile environment. Experience working with common vendor products such as Snowflake or Databricks. Experience working with CI/CD tooling. What you'll get in return is: 25 days' annual leave, rising to 30 days with each year of service.
About You Strong experience as a Product Owner or Data Product Owner in Agile environments. Experience managing data products or platforms, ideally customer data platforms (CDPs), data warehouses (e.g. Snowflake, BigQuery), or data lakes. Familiarity with data modelling, data pipelines, ETL/ELT processes, and APIs. Knowledge of GDPR, data ethics, and governance practices. Strong understanding of data quality frameworks
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Motability Operations Limited
needs. About You Strong experience as a Product Owner or Data Product Owner in Agile environments. Ideally, experience managing Finance and data products or platforms, in particular data warehouses (e.g. Snowflake, BigQuery) or data lakes. Familiarity with data modelling, data pipelines, ETL/ELT processes, and APIs. Strong understanding of data quality frameworks, data contracts, and lineage. Proficient in using analytics
Edinburgh, Midlothian, Scotland, United Kingdom Hybrid / WFH Options
Motability Operations
customer needs. Qualifications Strong experience as a Product Owner or Data Product Owner in Agile environments. Ideally, experience managing Finance and data products or platforms, in particular data warehouses (e.g. Snowflake, BigQuery) or data lakes. Familiarity with data modelling, data pipelines, ETL/ELT processes, and APIs. Strong understanding of data quality frameworks, data contracts, and lineage. Proficient in using analytics
Employment Type: Permanent, Part Time, Work From Home
Bristol, Avon, South West, United Kingdom Hybrid / WFH Options
Motability Operations
customer needs. Qualifications Strong experience as a Product Owner or Data Product Owner in Agile environments. Ideally, experience managing Finance and data products or platforms, in particular data warehouses (e.g. Snowflake, BigQuery) or data lakes. Familiarity with data modelling, data pipelines, ETL/ELT processes, and APIs. Strong understanding of data quality frameworks, data contracts, and lineage. Proficient in using analytics
Employment Type: Permanent, Part Time, Work From Home
Edinburgh, Midlothian, Scotland, United Kingdom Hybrid / WFH Options
Motability Operations
needs. Qualifications Strong experience as a Product Owner or Data Product Owner in Agile environments. Experience managing data products or platforms, ideally customer data platforms (CDPs), data warehouses (e.g. Snowflake, BigQuery), or data lakes. Familiarity with data modelling, data pipelines, ETL/ELT processes, and APIs. Knowledge of GDPR, data ethics, and governance practices. Strong understanding of data quality frameworks
Employment Type: Permanent, Part Time, Work From Home
Bristol, Avon, South West, United Kingdom Hybrid / WFH Options
Motability Operations
needs. Qualifications Strong experience as a Product Owner or Data Product Owner in Agile environments. Experience managing data products or platforms, ideally customer data platforms (CDPs), data warehouses (e.g. Snowflake, BigQuery), or data lakes. Familiarity with data modelling, data pipelines, ETL/ELT processes, and APIs. Knowledge of GDPR, data ethics, and governance practices. Strong understanding of data quality frameworks
Employment Type: Permanent, Part Time, Work From Home
innovation. You'll contribute to major customer programmes and benefit from the support of a well-established knowledge-sharing community. What you'll do Design and implement data solutions using Snowflake across cloud platforms (Azure and AWS) Build and maintain scalable data pipelines and ETL processes Optimise data models, storage, and performance for analytics and reporting Ensure data integrity, security, and … best practice compliance Serve as a subject matter expert in Snowflake engineering efforts within delivery teams Collaborate with internal teams and customer stakeholders to define and deliver robust data solutions Participate in customer meetings to present and discuss technical solutions Identify and escalate technical risks where necessary What you'll need Proven experience designing and delivering enterprise-scale data warehouse … solutions using Snowflake In-depth understanding of Snowflake architecture, performance optimisation, and best practices Strong experience in ETL development, data modelling, and data integration Proficient in SQL, Python, and/or Java for data processing Hands-on experience with Azure and AWS cloud environments Familiarity with Agile methodologies such as Scrum, Kanban, or Lean Excellent communication skills, able to articulate complex
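For illustration, a minimal sketch of one routine Snowflake performance-optimisation task of the kind described above (clustering a large table and checking how well it prunes), assuming the snowflake-connector-python package; the connection details, table, and column names are hypothetical.

```python
# Illustrative only: adding a clustering key to a large Snowflake table and checking
# pruning effectiveness. Connection details and names are hypothetical placeholders.
import os

import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="TUNING_WH",
    database="ANALYTICS",
    schema="SALES",
)
cur = conn.cursor()
try:
    # Cluster a large fact table on its most common filter column.
    cur.execute("ALTER TABLE orders CLUSTER BY (order_date)")
    # Report clustering depth and overlap, a proxy for partition-pruning quality.
    cur.execute("SELECT SYSTEM$CLUSTERING_INFORMATION('orders', '(order_date)')")
    print(cur.fetchone()[0])
finally:
    cur.close()
    conn.close()
```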
Manchester, Lancashire, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
clients, are now looking for a hands-on Data Engineer to join their talented delivery team. This is a fantastic opportunity to work with a modern data stack - including Snowflake, dbt, AWS, Python, and SQL - on impactful projects that power reporting, automation, predictive analytics and Artificial Intelligence. This role is fully remote, and is therefore open to candidates across the … who wants to be part of a company that is growing and maturing whilst continually learning. Key responsibilities: Designing and building robust data pipelines using SQL, dbt, and Python within Snowflake Developing scalable, testable, and maintainable code Collaborating with analytics, product, and client teams to deliver high-quality data solutions Supporting the development of data products like dashboards, APIs, and predictive … and operational efficiency through data tooling Contributing to internal best practices and agile delivery processes Experience required: Proven experience as a Data Engineer or Analytics Engineer Strong experience with Snowflake Hands-on experience with dbt Proficiency in Python and SQL Solid understanding of Git and development lifecycle best practices Experience integrating APIs or working with event/log data streams
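For illustration, a minimal sketch of the sort of small, testable transformation helper such a SQL/dbt/Python pipeline might contain; the field names are hypothetical.

```python
# Illustrative only: a small, unit-testable transformation helper of the kind a
# SQL/dbt/Python pipeline might include. Field names are hypothetical placeholders.
from datetime import datetime
from typing import Dict, Iterable, List


def latest_record_per_key(records: Iterable[dict], key: str, ts_field: str) -> List[dict]:
    """Keep only the most recent record per business key - a common dedup step."""
    latest: Dict[str, dict] = {}
    for rec in records:
        ts = datetime.fromisoformat(rec[ts_field])
        current = latest.get(rec[key])
        if current is None or ts > datetime.fromisoformat(current[ts_field]):
            latest[rec[key]] = rec
    return list(latest.values())


# A tiny inline check, in the spirit of "scalable, testable, and maintainable code".
sample = [
    {"customer_id": "c1", "updated_at": "2024-05-01T10:00:00", "status": "old"},
    {"customer_id": "c1", "updated_at": "2024-06-01T10:00:00", "status": "new"},
]
assert latest_record_per_key(sample, "customer_id", "updated_at")[0]["status"] == "new"
```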
Cambridge, Cambridgeshire, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
clients, are now looking for a hands-on Data Engineer to join their talented delivery team. This is a fantastic opportunity to work with a modern data stack - including Snowflake, dbt, AWS, Python, and SQL - on impactful projects that power reporting, automation, predictive analytics and Artificial Intelligence. This role is fully remote, and is therefore open to candidates across the … who wants to be part of a company that is growing and maturing whilst continually learning. Key responsibilities: Designing and building robust data pipelines using SQL, dbt, and Python within Snowflake Developing scalable, testable, and maintainable code Collaborating with analytics, product, and client teams to deliver high-quality data solutions Supporting the development of data products like dashboards, APIs, and predictive … and operational efficiency through data tooling Contributing to internal best practices and agile delivery processes Experience required: Proven experience as a Data Engineer or Analytics Engineer Strong experience with Snowflake Hands-on experience with dbt Proficiency in Python and SQL Solid understanding of Git and development lifecycle best practices Experience integrating APIs or working with event/log data streams
Nottingham, Nottinghamshire, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
clients, are now looking for a hands-on Data Engineer to join their talented delivery team. This is a fantastic opportunity to work with a modern data stack - including Snowflake, dbt, AWS, Python, and SQL - on impactful projects that power reporting, automation, predictive analytics and Artificial Intelligence. This role is fully remote, and is therefore open to candidates across the … who wants to be part of a company that is growing and maturing whilst continually learning. Key responsibilities: Designing and building robust data pipelines using SQL, dbt, and Python within Snowflake Developing scalable, testable, and maintainable code Collaborating with analytics, product, and client teams to deliver high-quality data solutions Supporting the development of data products like dashboards, APIs, and predictive … and operational efficiency through data tooling Contributing to internal best practices and agile delivery processes Experience required: Proven experience as a Data Engineer or Analytics Engineer Strong experience with Snowflake Hands-on experience with dbt Proficiency in Python and SQL Solid understanding of Git and development lifecycle best practices Experience integrating APIs or working with event/log data streams
Leeds, West Yorkshire, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
clients, are now looking for a hands-on Data Engineer to join their talented delivery team. This is a fantastic opportunity to work with a modern data stack - including Snowflake, dbt, AWS, Python, and SQL - on impactful projects that power reporting, automation, predictive analytics and Artificial Intelligence. This role is fully remote, and is therefore open to candidates across the … who wants to be part of a company that is growing and maturing whilst continually learning. Key responsibilities: Designing and building robust data pipelines using SQL, dbt, and Python within Snowflake Developing scalable, testable, and maintainable code Collaborating with analytics, product, and client teams to deliver high-quality data solutions Supporting the development of data products like dashboards, APIs, and predictive … and operational efficiency through data tooling Contributing to internal best practices and agile delivery processes Experience required: Proven experience as a Data Engineer or Analytics Engineer Strong experience with Snowflake Hands-on experience with dbt Proficiency in Python and SQL Solid understanding of Git and development lifecycle best practices Experience integrating APIs or working with event/log data streams
Birmingham, West Midlands, West Midlands (County), United Kingdom Hybrid / WFH Options
Tenth Revolution Group
clients, are now looking for a hands-on Data Engineer to join their talented delivery team. This is a fantastic opportunity to work with a modern data stack - including Snowflake, dbt, AWS, Python, and SQL - on impactful projects that power reporting, automation, predictive analytics and Artificial Intelligence. This role is fully remote, and is therefore open to candidates across the … who wants to be part of a company that is growing and maturing whilst continually learning. Key responsibilities: Designing and building robust data pipelines using SQL, dbt, and Python within Snowflake Developing scalable, testable, and maintainable code Collaborating with analytics, product, and client teams to deliver high-quality data solutions Supporting the development of data products like dashboards, APIs, and predictive … and operational efficiency through data tooling Contributing to internal best practices and agile delivery processes Experience required: Proven experience as a Data Engineer or Analytics Engineer Strong experience with Snowflake Hands-on experience with dbt Proficiency in Python and SQL Solid understanding of Git and development lifecycle best practices Experience integrating APIs or working with event/log data streams
Build and optimize data pipelines using dbt, ensuring clean and accessible data. Monitor data quality and implement validation processes in collaboration with data engineers. Create scalable data models in Snowflake using dbt and identify opportunities for efficiency gains. Optimize workflows and monitor system performance for continuous improvements. Ensure data practices meet regulatory standards and assist in compliance reporting. Stay updated … processors. Proven experience in SQL, dbt and Snowflake. Proficiency in building and managing data transformations with dbt, with experience in optimizing complex transformations and documentation. Hands-on experience with Snowflake as a primary data warehouse, including knowledge of performance optimization, data modeling, and query tuning. Strong proficiency in data analysis tools and languages (e.g., SQL, Python). Strong understanding of
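For illustration, a minimal sketch of the kind of data-quality validation described above - a simple null-rate check on a key column - assuming the snowflake-connector-python package; the connection details, table, and column names are hypothetical.

```python
# Illustrative only: a simple data-quality gate - fail the run if the null rate of a
# key column breaches a threshold. Connection details and names are hypothetical.
import os

import snowflake.connector

NULL_RATE_THRESHOLD = 0.01  # tolerate at most 1% of rows missing a customer_id

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="QUALITY_WH",
    database="ANALYTICS",
    schema="MARTS",
)
cur = conn.cursor()
try:
    cur.execute(
        "SELECT COUNT_IF(customer_id IS NULL) / NULLIF(COUNT(*), 0) FROM fct_orders"
    )
    null_rate = cur.fetchone()[0] or 0
    if null_rate > NULL_RATE_THRESHOLD:
        raise ValueError(f"customer_id null rate {null_rate:.2%} exceeds threshold")
    print(f"Validation passed: customer_id null rate {null_rate:.2%}")
finally:
    cur.close()
    conn.close()
```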
drive transformation? We're looking for Data Engineers who have a broad range of data engineering skills, with a focus on Microsoft Databricks. Experience with Microsoft Fabric or Snowflake is also highly desirable. ABOUT THE ROLE As a Data Engineer, you will play a key role in delivering high-quality data solutions for our clients. This role offers the … Data Warehousing, Master Data Management, and Business Intelligence. Engineering Delivery Practices: Solid understanding of Agile, DevOps, Git, APIs, Containers, Microservices, and Data Pipelines. Modern Data Tools: Experience with Kafka, Snowflake, Azure Data Factory, Azure Synapse, or Microsoft Fabric (highly desirable). Data Architecture Frameworks: Knowledge of Inmon, Kimball, and Data Vault methodologies. Nice-to-have certifications: Databricks Certified Data Engineer
DI, SAS Viya). An ability to write complex SQL queries. Project experience using one or more of the following technologies: Tableau, Python, Power BI, Cloud (Azure, AWS, GCP, Snowflake, Databricks). Project lifecycle experience, having played a leading role in the delivery of end-to-end projects, as well as a familiarity with different development
business each day. You will work with the Lead Data Engineer and other members of the Data Engineering team to deliver our new strategic enterprise data platform based on Snowflake and DBT, while also maintaining our legacy data platform. Key Responsibilities: Data warehouse design and implementation working towards the creation of a single source of truth. Development of data ingestion … hours including weekends, evenings and public holidays. Your profile Key Skills and Competency Requirements: At least 2 years' experience designing and implementing a full-scale data warehouse solution using Snowflake. Excellent proficiency with Snowflake internals and with integrating Snowflake with other technologies for data processing and reporting. Data Modelling using the Kimball Methodology. Experience in developing CI/
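For illustration, a minimal sketch of a Kimball-style fact-table build of the kind described above, resolving business keys to surrogate keys on conformed dimensions, assuming the snowflake-connector-python package; all connection details and table/column names are hypothetical.

```python
# Illustrative only: a Kimball-style fact-table build that resolves business keys to
# surrogate keys on conformed dimensions. All names are hypothetical placeholders.
import os

import snowflake.connector

FACT_BUILD_SQL = """
    CREATE OR REPLACE TABLE fct_policy_transactions AS
    SELECT
        d.date_key,                -- surrogate key from the date dimension
        c.customer_key,            -- surrogate key from the customer dimension
        s.amount,
        s.transaction_type
    FROM staging.policy_transactions s
    JOIN dim_date     d ON d.calendar_date = s.transaction_date
    JOIN dim_customer c ON c.customer_id   = s.customer_id
"""

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="TRANSFORM_WH",
    database="DWH",
    schema="MARTS",
)
cur = conn.cursor()
try:
    cur.execute(FACT_BUILD_SQL)
finally:
    cur.close()
    conn.close()
```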
Familiarity with Database Replication & CDC technologies such as Debezium. Familiarity with message & event-driven architecture, including tools like AWS MQ, Kafka. Exposure to cloud database services (e.g., AWS RDS, Snowflake). 25 days of holiday; Bonus; Pension contribution; Private medical, dental, and vision coverage; Life assurance; Critical illness cover; Wellness contribution program with access to ClassPass
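For illustration, a minimal sketch of consuming Debezium change-data-capture events from Kafka with the kafka-python package; the topic name and broker address are hypothetical placeholders.

```python
# Illustrative only: consuming Debezium change-data-capture events from a Kafka topic
# with kafka-python. The topic name and broker address are hypothetical placeholders.
import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "dbserver1.public.orders",              # hypothetical Debezium topic
    bootstrap_servers=["localhost:9092"],   # placeholder broker address
    value_deserializer=lambda m: json.loads(m.decode("utf-8")) if m else None,
    auto_offset_reset="earliest",
    enable_auto_commit=True,
)

for message in consumer:
    event = message.value
    if event is None:                       # Debezium emits tombstones for deletes
        continue
    payload = event.get("payload", {})
    op = payload.get("op")                  # 'c' create, 'u' update, 'd' delete, 'r' snapshot read
    row = payload.get("after") or payload.get("before")
    print(op, row)
```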
and data lake architectures. Develop conceptual, logical, and physical data models to support analytical requirements. Build and optimise data pipelines (ETL/ELT) using tools such as Azure Synapse, Snowflake, Redshift, or similar. Ensure robust data governance, security, and quality management practices. Support cloud data migrations and architecture modernisation initiatives. Front-End BI & Analytics Translate complex data into clear, actionable