Cambridge, Cambridgeshire, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
clients, are now looking for a hands-on Data Engineer to join their talented delivery team. This is a fantastic opportunity to work with a modern data stack - including Snowflake, dbt, AWS, Python, and SQL - on impactful projects that power reporting, automation, predictive analytics and Artificial Intelligence. This role is fully remote, and is therefore open to candidates across the … who wants to be part of a company that is growing and maturing whilst continually learning.
Key responsibilities:
• Designing and building robust data pipelines using SQL, dbt, and Python within Snowflake
• Developing scalable, testable, and maintainable code
• Collaborating with analytics, product, and client teams to deliver high-quality data solutions
• Supporting the development of data products like dashboards, APIs, and predictive … and operational efficiency through data tooling
• Contributing to internal best practices and agile delivery processes
Experience required:
• Proven experience as a Data Engineer or Analytics Engineer
• Strong experience with Snowflake
• Hands-on experience with dbt
• Proficiency in Python and SQL
• Solid understanding of Git and development lifecycle best practices
• Experience integrating APIs or working with event/log data streams
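For illustration only, here is a minimal sketch in Python of the kind of Snowflake pipeline step this listing describes: one idempotent transformation materialised inside Snowflake. All names (the RAW.ORDERS source, the ANALYTICS.STG_ORDERS target, the TRANSFORM_WH warehouse) are hypothetical placeholders, not the client's actual schema.

```python
# Hypothetical example only - table, warehouse and credential names are
# placeholders, not a real client environment.
import os

import snowflake.connector  # pip install snowflake-connector-python

STG_ORDERS_SQL = """
CREATE OR REPLACE TABLE ANALYTICS.STG_ORDERS AS
SELECT
    order_id,
    customer_id,
    CAST(order_ts AS TIMESTAMP_NTZ) AS order_ts,
    amount
FROM RAW.ORDERS
WHERE order_id IS NOT NULL
"""

def main() -> None:
    # Credentials come from the environment, never from source control.
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="TRANSFORM_WH",
        database="ANALYTICS_DB",
    )
    try:
        with conn.cursor() as cur:
            cur.execute(STG_ORDERS_SQL)  # one idempotent transformation step
    finally:
        conn.close()

if __name__ == "__main__":
    main()
```

In practice a team like this would express the SELECT as a dbt model and let dbt handle materialisation; the connector call above just makes the moving parts visible.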
London, England, United Kingdom Hybrid / WFH Options
P&I Insurance Services
Salary: £70-80k with a 15% bonus. Hybrid working: a couple of days per week in the office, City of London.
We are looking for:
• A good understanding of data engineering principles
• A good technical grasp of Snowflake, automating it, and transforming complex datasets
• SnowPro Core certification
• AWS skills
• Delivery experience
• Experience building solutions in Snowflake
• Insurance experience (advantageous but not necessary)
Key Responsibilities:
• Lead the design and implementation of Snowflake [and Redshift] … architecture, ensuring optimal health and efficiency
Skills and Experience:
• Bachelor's degree or higher in a technical discipline
• Proven experience as a data engineer with a strong focus on Snowflake and AWS services in large-scale enterprise environments
• Extensive experience with AWS services, e.g. EC2, S3, RDS, DynamoDB, Redshift, Lambda, API Gateway
• Strong SQL skills for complex data queries and …
Build and optimize data pipelines using dbt, ensuring clean and accessible data. Monitor data quality and implement validation processes in collaboration with data engineers. Create scalable data models in Snowflake using dbt and identify opportunities for efficiency gains. Optimize workflows and monitor system performance for continuous improvements. Ensure data practices meet regulatory standards and assist in compliance reporting. Stay updated … processors. Proven experience in SQL, dbt and Snowflake. Proficiency in building and managing data transformations with dbt, with experience in optimizing complex transformations and documentation. Hands-on experience with Snowflake as a primary data warehouse, including knowledge of performance optimization, data modeling, and query tuning. Strong proficiency in data analysis tools and languages (e.g., SQL, Python). Strong understanding of …
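By way of illustration, a short sketch of the kind of validation process this listing mentions: post-build checks against a dbt-built model in Snowflake. The table name ANALYTICS.FCT_ORDERS and the checks themselves are assumptions for the example.

```python
# Hypothetical data-quality gate - table and column names are placeholders.
import os

import snowflake.connector  # pip install snowflake-connector-python

def run_quality_checks(conn) -> None:
    """Fail loudly if the model looks wrong before it is exposed downstream."""
    with conn.cursor() as cur:
        rows = cur.execute("SELECT COUNT(*) FROM ANALYTICS.FCT_ORDERS").fetchone()[0]
        nulls = cur.execute(
            "SELECT COUNT(*) FROM ANALYTICS.FCT_ORDERS WHERE order_id IS NULL"
        ).fetchone()[0]
    if rows == 0:
        raise ValueError("FCT_ORDERS is empty - the upstream load may have failed")
    if nulls > 0:
        raise ValueError(f"FCT_ORDERS contains {nulls} null order_id values")

if __name__ == "__main__":
    connection = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
    )
    try:
        run_quality_checks(connection)
    finally:
        connection.close()
```

dbt's built-in not_null and unique schema tests cover the same ground declaratively; the imperative version above just shows the mechanics.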
day. You will work with the Lead Data Engineer and other members of the Data Engineering team to design and deliver our new strategic enterprise data platform based on Snowflake and dbt, while also maintaining our legacy data platform.
Key Responsibilities:
• Data warehouse design and implementation, working towards the creation of a single source of truth
• Contributing to the architectural … hours including weekends, evenings and public holidays
Your profile - Key Skills and Competency Requirements:
• At least 3 years' experience designing and implementing a full-scale data warehouse solution using Snowflake
• Excellent proficiency with Snowflake internals and with integrating Snowflake with other technologies for data processing and reporting
• Data modelling using the Kimball methodology
• Experience in developing CI/CD …
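As an illustration of the Kimball-style dimensional modelling this listing names, here is a hedged sketch of a Type 2 slowly changing dimension load in Snowflake. It is not this employer's actual design; DW.DIM_CUSTOMER, STAGING.CUSTOMER and the tracked email column are hypothetical. Step 1 expires rows whose tracked attribute changed; step 2 inserts the new current versions.

```python
# Hypothetical SCD Type 2 load - all table and column names are placeholders.
EXPIRE_CHANGED_ROWS = """
MERGE INTO DW.DIM_CUSTOMER d
USING STAGING.CUSTOMER s
  ON d.customer_id = s.customer_id AND d.is_current
WHEN MATCHED AND d.email <> s.email THEN UPDATE SET
  is_current = FALSE,
  valid_to   = CURRENT_TIMESTAMP()
"""

INSERT_NEW_VERSIONS = """
INSERT INTO DW.DIM_CUSTOMER (customer_id, email, valid_from, valid_to, is_current)
SELECT s.customer_id, s.email, CURRENT_TIMESTAMP(), NULL, TRUE
FROM STAGING.CUSTOMER s
LEFT JOIN DW.DIM_CUSTOMER d
  ON d.customer_id = s.customer_id AND d.is_current
WHERE d.customer_id IS NULL
"""

def load_dim_customer(conn) -> None:
    """Run both steps in order with any Snowflake DB-API connection."""
    with conn.cursor() as cur:
        cur.execute(EXPIRE_CHANGED_ROWS)   # close out superseded versions
        cur.execute(INSERT_NEW_VERSIONS)   # add current versions (new + changed)
```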
• Experience with analytical and real-time/streaming data solutions
• Hands-on experience with data modeling tools (e.g., Erwin, Lucidchart, SAP PowerDesigner)
• Knowledge of Python and SQL (e.g., Snowflake or similar warehousing technology, real-time systems)
• Experience with AWS services such as Lambda, SNS, S3, EKS, API Gateway
• Knowledge of data warehouse design, ETL/ELT processes, and … big data technologies (e.g., Snowflake, Spark)
• Understanding of data governance and compliance frameworks (e.g., GDPR, HIPAA)
• Strong communication and stakeholder management skills
• An analytical mindset with attention to detail
• Leadership and mentoring abilities in data modeling best practices
Preferred Skills and Qualifications:
• Certifications in data modeling, cloud platforms, or database technologies
• Experience developing and implementing enterprise data models
• Experience with interface/API data modeling
• Knowledge of CI/CD tools like GitHub Actions or similar
• AWS certifications such as AWS Certified Data Engineer
• Knowledge of Snowflake, SQL, Apache Airflow, and dbt
• Familiarity with Atlan for data cataloging and metadata management
• Understanding of Apache Iceberg tables
Who we are: we're a global business empowering local teams with exciting work …
in their professional development. Promote Hippo's Engineering Herd internally and externally (for example through blog posts, workshops, seminars or conferences).
About the Candidate:
• Proven track record in Snowflake, Python, dbt, AWS and Terraform
• Proficiency in R, Bash, Java/.NET and/or PowerShell desirable
• Broad knowledge across cloud architectures, networking and distributed computing systems
• Experience of data … Lakes, Data Meshes and Data Warehouses
• Experience of a wide range of data sources: SQL, NoSQL and Graph
• A proven track record of infrastructure delivery on any data platform (Snowflake, Elastic, Redshift, Databricks, Splunk, etc.)
• Strong and demonstrable experience writing regular expressions and/or JSON parsing, etc.
• Strong experience in log processing (Cribl, Splunk, Elastic, Apache NiFi …
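Since the listing explicitly calls out regular expressions and JSON parsing for log processing, a small illustrative example follows. The log format (an access-log-style line carrying a JSON payload) is an assumption made for demonstration, not a format the employer uses.

```python
# Hypothetical log line parser: regex for the structured prefix, json for the payload.
import json
import re

LINE_RE = re.compile(
    r'(?P<ip>\d{1,3}(?:\.\d{1,3}){3}) - - '   # client IP
    r'\[(?P<ts>[^\]]+)\] '                     # timestamp in brackets
    r'"(?P<method>[A-Z]+) (?P<path>\S+) [^"]*" '  # request line
    r'(?P<status>\d{3}) (?P<payload>\{.*\})$'  # status code + trailing JSON
)

def parse_line(line: str) -> dict | None:
    """Return a flat record from one log line, or None if it doesn't match."""
    m = LINE_RE.match(line.strip())
    if m is None:
        return None
    record = m.groupdict()
    record["status"] = int(record["status"])
    record.update(json.loads(record.pop("payload")))  # merge the JSON fields
    return record

sample = ('203.0.113.9 - - [12/May/2025:10:01:33 +0000] '
          '"GET /api/orders HTTP/1.1" 200 {"user_id": 42, "latency_ms": 87}')
print(parse_line(sample))
```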
Build and optimise data pipelines using dbt, ensuring clean and accessible data. Monitor data quality and implement validation processes in collaboration with data engineers. Create scalable data models in Snowflake using dbt and identify opportunities for efficiency gains. Optimise workflows and monitor system performance for continuous improvements. Ensure data practices meet regulatory standards and assist in compliance reporting. Stay updated … DBT). Extensive experience in building and managing data transformations with dbt, with experience in optimising complex transformations and documentation. Hands-on experience with popular cloud data warehouses such as Snowflake or Redshift, including knowledge of performance optimisation, data modelling, and query tuning. Highly proficient in data analysis tools and languages (e.g., SQL, Python). Strong understanding of data modelling principles …
London, England, United Kingdom Hybrid / WFH Options
Whitehall Resources Ltd
engineer and modernize existing ETL processes from legacy systems into scalable cloud-native solutions
• Contribute to the development and optimization of a cloud-based data platform, leveraging tools like Snowflake, AWS and Airflow
• Work closely with data architects, analysts and other engineers to deliver high-quality, production-ready code
• Participate in code reviews, ensuring adherence to best practices and high … techniques
• Hands-on exposure to cloud platforms, especially AWS
• Experience working in agile teams and using version control and CI/CD practices
Desirable skills and experience:
• Experience with Snowflake or other cloud-native data warehouse technologies
• Familiarity with GraphQL and its use in data-driven APIs
• Exposure to data governance, data quality and metadata management tools
• Interest or experience …
London, England, United Kingdom Hybrid / WFH Options
Substance Global
best practices. Take initiative to improve and optimise analytics engineering workflows and platforms.
Key Responsibilities
• Design, develop, and maintain scalable data pipelines on GCP (or other cloud services, e.g. Snowflake) using services such as BigQuery and Cloud Functions
• Design and build data models for analytics, reporting, and data science applications in Looker/Metabase (visualisation solutions) using explores, views, etc. … which will enable users across the organisation to self-serve analytics
• Strong Python (pandas/SQLAlchemy) & SQL (BigQuery/Snowflake) skills for maintaining ETL/ELT pipelines
• Developing tools and processes to ensure the quality and observability of data
• Monitoring and improving the performance and efficiency of the data stack
• Producing excellent documentation on all of the above
Working … warehousing skills demonstrated in data environments:
• Excellent Python & SQL and data transformation skills (e.g. ideally proficient in dbt or similar)
• Familiarity with at least one of these cloud technologies: Snowflake, AWS, Google Cloud, Microsoft Azure
• Good attention to detail to highlight and address data quality issues
• Excellent time management and proactive problem-solving skills
What's in it for you …
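As a hedged illustration of the pandas-plus-warehouse skills listed above: a small ELT step that cleans a file with pandas and appends it to BigQuery. The project, dataset, table and column names are invented for the example.

```python
# Illustrative only - project/dataset/table and column names are placeholders.
import pandas as pd
from google.cloud import bigquery  # pip install google-cloud-bigquery pyarrow

def load_events(csv_path: str,
                table_id: str = "my-project.analytics.raw_events") -> None:
    df = pd.read_csv(csv_path, parse_dates=["event_ts"])
    df = df.dropna(subset=["event_id"]).drop_duplicates("event_id")  # basic hygiene

    client = bigquery.Client()  # uses application-default credentials
    job = client.load_table_from_dataframe(
        df,
        table_id,
        job_config=bigquery.LoadJobConfig(write_disposition="WRITE_APPEND"),
    )
    job.result()  # block until the load job completes

if __name__ == "__main__":
    load_events("events.csv")
```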
London, England, United Kingdom Hybrid / WFH Options
ScanmarQED
Professional Experience: 3–5 years in Data Engineering, Data Warehousing, or programming within a dynamic (software) project environment.
Data Infrastructure and Engineering Foundations:
• Data Warehousing: knowledge of tools like Snowflake, Databricks, ClickHouse and traditional platforms like PostgreSQL or SQL Server
• ETL/ELT Development: expertise in building pipelines using tools like Apache Airflow, dbt, Dagster
• Cloud providers: proficiency in Microsoft Azure or AWS
Programming and Scripting:
• Programming Languages: strong skills in Python and SQL
Data Modeling and Query Optimization:
• Data Modeling: designing star/snowflake schemas and understanding normalization and denormalization
• SQL Expertise: writing efficient queries and optimizing for performance
DevOps and CI/CD:
• Version Control: using Git and platforms like GitHub, GitLab, or Bitbucket
Data Governance and Security …
improvement and innovation.
• Build data integrations from multiple sources, including CRM, digital, and social platforms
• Design, implement and optimize data models with a medallion architecture, using star and snowflake schema techniques to enhance query performance and support analytical workloads
• Ensure data quality, consistency, and reliability across all marketing datasets
• Collaborate with analysts and data scientists to deliver clean, usable data
… with new tools, technologies, and approaches in data engineering and marketing analytics
YOU'LL THRIVE IN THIS ROLE IF YOU HAVE THE FOLLOWING SKILLS AND QUALITIES:
• Significant experience with Snowflake, dbt, Python, and Dagster, Airflow or a similar orchestration tool
• Knowledge of additional technologies is a plus: Azure, Microsoft SQL Server, Power BI
• Strong proficiency in SQL & Python
• Familiarity with additional languages …
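Since this listing pairs a medallion architecture with Dagster, here is a hedged sketch of how bronze/silver/gold layering might look as Dagster assets. The asset names, source file and aggregation are assumptions made for illustration, not the employer's actual models.

```python
# Hypothetical medallion chain as Dagster assets; names are placeholders.
import pandas as pd
from dagster import Definitions, asset

@asset
def bronze_campaign_events() -> pd.DataFrame:
    # Land raw marketing events as-is (hypothetical source file).
    return pd.read_json("raw/campaign_events.jsonl", lines=True)

@asset
def silver_campaign_events(bronze_campaign_events: pd.DataFrame) -> pd.DataFrame:
    # Cleaned, deduplicated layer.
    return bronze_campaign_events.drop_duplicates("event_id")

@asset
def gold_campaign_performance(silver_campaign_events: pd.DataFrame) -> pd.DataFrame:
    # Star-schema-style fact grain: one row per campaign per day.
    return (
        silver_campaign_events
        .groupby(["campaign_id", "event_date"], as_index=False)
        .agg(clicks=("event_id", "count"), spend_gbp=("spend_gbp", "sum"))
    )

defs = Definitions(
    assets=[bronze_campaign_events, silver_campaign_events, gold_campaign_performance]
)
```

Dagster infers the bronze → silver → gold dependency graph from the parameter names, which is what makes the layering explicit and testable.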
and architectural direction across entire programmes and client estates.
Key Responsibilities - Senior Data Architects:
• Lead the design and delivery of cloud-native data solutions using modern platforms (e.g. Databricks, Snowflake, Kafka, Confluent)
• Architect data lakes, lakehouses, streaming pipelines, and event-driven architectures
• Oversee engineering teams and collaborate with analysts and QA functions
• Translate complex requirements into scalable, robust data products
…
• Proven track record in data architecture, either from a delivery or enterprise strategy perspective
• Deep experience with cloud platforms (Azure, AWS, GCP) and modern data ecosystems (Spark, Databricks, Kafka, Snowflake)
• Strong understanding of Data Mesh, Data Fabric, and data product-led approaches
• Data modelling expertise (relational, dimensional) and familiarity with tools like Erwin, Sparx, Archi
• Experience with ETL/ELT …
London, England, United Kingdom Hybrid / WFH Options
Endava
business objectives.
Key Responsibilities
Data Pipeline Development: architect, implement and maintain real-time and batch data pipelines to handle large datasets efficiently. Employ frameworks such as Apache Spark, Databricks, Snowflake or Airflow to automate ingestion, transformation, and delivery.
Data Integration & Transformation: work with Data Analysts to understand source-to-target mappings and quality requirements. Build ETL/ELT workflows, validation … and ensure regulatory compliance (GDPR). Document data lineage and recommend improvements for data ownership and stewardship.
Qualifications
• Programming: Python, SQL, Scala, Java
• Big Data: Apache Spark, Hadoop, Databricks, Snowflake, etc.
• Cloud: AWS (Glue, Redshift), Azure (Synapse, Data Factory, Fabric), GCP (BigQuery, Dataflow)
• Data Modelling & Storage: relational (PostgreSQL, SQL Server), NoSQL (MongoDB, Cassandra), dimensional modelling
• DevOps & Automation: Docker, Kubernetes …
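By way of illustration for the batch-pipeline responsibilities above, a minimal PySpark job: read, transform, write. The S3 paths and column names are placeholders, not a real pipeline.

```python
# Hypothetical batch step - paths and columns are illustrative assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-batch").getOrCreate()

orders = spark.read.parquet("s3://example-bucket/raw/orders/")  # placeholder path

daily_revenue = (
    orders
    .filter(F.col("status") == "COMPLETED")           # keep settled orders only
    .withColumn("order_date", F.to_date("order_ts"))  # truncate timestamp to date
    .groupBy("order_date")
    .agg(F.sum("amount").alias("revenue"))
)

daily_revenue.write.mode("overwrite").parquet(
    "s3://example-bucket/curated/daily_revenue/"
)
```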
Edinburgh, Scotland, United Kingdom Hybrid / WFH Options
CreateFuture
• Automating and orchestrating workflows using tools like AWS Glue, Azure Data Factory, or Google Cloud Dataflow
• Working with platform services such as Amazon S3, Azure Synapse, Google BigQuery, and Snowflake
• Implementing lakehouse architectures using tools like Databricks or Snowflake
• Collaborating closely with engineers, analysts, and client teams to deliver value-focused data solutions
We'd love to talk to you …
London, England, United Kingdom Hybrid / WFH Options
Marshmallow
than others. Meaningful Work: your efforts will positively impact people who are often financially disadvantaged when they arrive in the UK. Innovative Tech Stack: work with modern tools like Snowflake, dbt, AWS Kinesis, Step Functions, Airflow, DataHub, and Looker.
What You'll Be Doing
• Platform Ownership: as the subject matter expert for our data platform, you'll manage and enhance … core components like dbt, Snowflake, AWS (S3, Kinesis, DMS, Step Functions, Dynamo CDC, Backup), Airbyte, Transcend, Segment and Looker
• Holistic Understanding: develop a deep understanding of our entire data platform to safely and effectively work across the data stack
• User Collaboration: work closely with platform users to ensure optimal usability and performance, driving long-term strategies to maximise value for …
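A hedged sketch of the orchestration layer a stack like this implies: a small Airflow DAG that runs an ingestion step and then a dbt build. The task bodies, schedule and dbt project path are assumptions, not this employer's actual pipeline.

```python
# Hypothetical DAG - schedule, tasks and paths are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator

def extract_events() -> None:
    # Placeholder for an ingestion step (e.g. pulling from a stream into S3).
    print("extracting events...")

with DAG(
    dag_id="daily_analytics",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(
        task_id="extract_events", python_callable=extract_events
    )
    dbt_build = BashOperator(
        task_id="dbt_build", bash_command="dbt build --project-dir /opt/dbt"
    )
    extract >> dbt_build  # run the dbt build only after ingestion succeeds
```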
London, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
solid grasp of data governance, data modeling, and business intelligence best practices. Knowledge of Agile, DevOps, Git, APIs, microservices, and data pipeline development. Familiarity with Spark, Kafka, or Snowflake is a plus.
Desirable Certifications: Microsoft Certified: Fabric Analytics Engineer Associate
Why Join Us?
• Competitive salary up to £70,000 per year
• Opportunities for growth, training, and development
• Collaborative, innovative …
bring...
• Experience as a Data Product Owner or Product Owner for data/analytics platforms
• Experience managing data products or platforms, ideally customer data platforms (CDPs), data warehouses (e.g. Snowflake, BigQuery), or data lakes
• Familiarity with data engineering workflows, data tools (dbt, Airflow), and cloud data platforms (AWS, GCP, Azure)
• Familiarity with data modelling, data pipelines, ETL/ELT …
skills, including an ability to effectively communicate with both business and technical teams. Some knowledge of Linux and the command line. Understanding of ETL and event streaming, e.g. Kafka. Snowflake, Kubernetes and Airflow experience is desirable. Experience with Amazon Web Services (AWS) would be beneficial. Basic knowledge of data science topics like machine learning, data mining, statistics, and visualisation.
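Since this listing names Kafka as an example of event streaming, here is a tiny illustrative consumer using the kafka-python client; the broker address, topic and message fields are placeholders.

```python
# Hypothetical consumer - broker, topic and field names are placeholders.
import json

from kafka import KafkaConsumer  # pip install kafka-python

consumer = KafkaConsumer(
    "user-events",                       # placeholder topic
    bootstrap_servers="localhost:9092",
    group_id="analytics-loader",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    auto_offset_reset="earliest",
)

for message in consumer:
    event = message.value
    # In a real pipeline these records would be batched into the warehouse.
    print(event.get("event_type"), event.get("user_id"))
```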
Leeds, England, United Kingdom Hybrid / WFH Options
Jet2
modelling stages of data warehouse projects and be comfortable with conceptual, logical and physical data modelling techniques, as well as dimensional modelling techniques. Experience using cloud data warehouse technology (Snowflake preferred, Google BigQuery, AWS Redshift or Azure Synapse) and experience working with key services on either GCP, AWS or Azure. Key services include cloud storage, containerisation, event-driven services, orchestration …