in Python with libraries like TensorFlow, PyTorch, or Scikit-learn for ML, and Pandas, PySpark, or similar for data processing. Experience designing and orchestrating data pipelines with tools like Apache Airflow, Spark, or Kafka. Strong understanding of SQL, NoSQL, and data modeling. Familiarity with cloud platforms (AWS, Azure, GCP) for deploying ML and data solutions. Knowledge of MLOps …
between systems Experience with Google Cloud Platform (GCP) is highly preferred. (Experience with other cloud platforms like AWS, Azure can be considered.) Familiarity with data pipeline scheduling tools like Apache Airflow Ability to design, build, and maintain data pipelines for efficient data flow and processing Understanding of data warehousing best practices and experience in organising and cleaning up …
driven decision-making. What we'd like to see from you: 3–5 years of experience in data integration, orchestration, or automation roles Solid experience with orchestration tools (e.g., Apache Airflow, MuleSoft, Dell Boomi, Informatica Cloud). Familiarity with cloud data platforms (e.g., AWS, Microsoft Azure, Google Cloud Platform) and related data movement technologies, including AWS Lambda and …
London, England, United Kingdom Hybrid / WFH Options
Workato
activation pipelines. Strong understanding of modern data architecture, including data lakes, data warehouses, structured and semi-structured data processing. Experience with data transformation tools (DBT, Coalesce) and orchestration frameworks (Airflow, Dagster) to build scalable pipelines. Knowledge of real-time data movement, databases (Oracle, SQL Server, PostgreSQL), and cloud analytics platforms (Snowflake, Databricks, BigQuery). Familiarity with emerging data technologies … like Open Table Format, Apache Iceberg, and their impact on enterprise data strategies. Hands-on experience with data virtualization and analytics platforms (Denodo, Domo) to enable seamless self-service data exploration and analytics. Strong background in cloud platforms (AWS, Azure, Google Cloud) and their data ecosystems. AI & Intelligent Data Automation Experience integrating AI/ML-driven insights into data …
Castleford, England, United Kingdom Hybrid / WFH Options
PTSG
or in a similar role. Strong proficiency in SQL and experience with database management systems (e.g., MySQL, PostgreSQL, MongoDB). Experience with data pipeline and workflow management tools (e.g., Apache Airflow, Luigi). Familiarity with cloud platforms and services, and particular knowledge of Google Cloud would be preferable. Proficiency in programming languages such as Python, Java, or …
Python, data analytics, deep learning (Scikit-learn, Pandas, PyTorch, Jupyter, pipelines), and practical knowledge of data tools like Databricks, Ray, Vector Databases, Kubernetes, and workflow scheduling tools such as Apache Airflow, Dagster, and Astronomer. GPU Computing: Familiarity with GPU computing, both on-premises and on cloud platforms, and experience in building end-to-end scalable ML infrastructure with …
in data modelling, specifically using star schema methodology, and building performant dimensional models to support high-velocity datasets. Strong experience with Google Cloud Platform (GCP), including BigQuery, Dataflow, Composer (Apache Airflow), Pub/Sub, Cloud Storage, DBT/Dataform, Datastream, and Cloud Run. Experience supporting agile, high-performing engineering teams in fluid delivery environments, with an emphasis on …
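By way of illustration only (not part of the listing), a minimal sketch of the kind of Composer/Apache Airflow task such a GCP stack implies; the bucket, dataset, and table names are hypothetical, and a recent Airflow 2.x plus the google-cloud-bigquery client are assumed:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator
from google.cloud import bigquery


def load_events_to_bq():
    """Load a day's worth of event files from Cloud Storage into BigQuery."""
    client = bigquery.Client()
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
        write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
        autodetect=True,
    )
    load_job = client.load_table_from_uri(
        "gs://example-raw-events/dt=2024-01-01/*.json",  # hypothetical bucket/path
        "example_project.analytics.fct_events",          # hypothetical table
        job_config=job_config,
    )
    load_job.result()  # block until the load job finishes


with DAG(
    dag_id="gcs_to_bigquery_daily",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="load_events", python_callable=load_events_to_bq)
```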
in the office on average. In this role you will be working within a new, greenfield division of the business, using a brand-new technology stack including Snowflake, dbt, Airflow and AWS. This function provides data for Machine Learning and Artificial Intelligence capabilities, helping them to provide the best possible service offering to their customers. You'll work on … a strong financial backing, and the chance to make a real impact! We're looking for the following experience: Extensive hands-on experience with Snowflake Extensive experience with dbt, Airflow, AWS and Terraform Excellent scripting skills in SQL Experience developing solutions entirely from scratch Great communication skills, with the ability to understand and translate complex requirements into technical solutions …
City of London, London, United Kingdom Hybrid / WFH Options
Immersum
Data Engineer (leading a team of 5). Salary: £130,000 – £150,000 + benefits Location: West London - Hybrid (3 days p/w in-office) Tech: AWS, Snowflake, Airflow, DBT, Python The Company: Immersum have engaged with a leading PropTech company on a mission to revolutionise how the property sector understands people, places, and data. By combining cutting … product, data science, and engineering teams Leading a small team of 5 data engineers What you’ll bring: Strong leadership experience in data engineering Deep expertise with AWS, Snowflake, Airflow, and DBT A pragmatic, product-first approach to building data systems Excellent communication and stakeholder management skills Solid understanding of agile data development lifecycles Why Join: Be a key …
pipelines using AWS Lambda, API Gateway, and Kinesis. Integrating third-party APIs into the data platform and transforming data for CRM delivery. Migrating R-based data streams into modern Airflow-managed Python/DBT pipelines. Ensuring observability and reliability using CloudWatch and automated monitoring. Supporting both BAU and new feature development within the data engineering function. KEY SKILLS AND REQUIREMENTS Proven experience with AWS services including Lambda, API Gateway, S3, Kinesis, and CloudWatch. Strong programming ability in Python and data transformation skills using SQL and DBT. Experience with Airflow for orchestration and scheduling. Familiarity with third-party API integration and scalable data delivery methods. Excellent communication and the ability to work in a collaborative, agile environment. HOW TO …
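Purely as an illustration of the Lambda, API Gateway, and Kinesis pattern this role describes (the stream name and payload fields are hypothetical, not taken from the listing), a minimal handler sketch might look like this:

```python
import json
import os

import boto3

# Kinesis client created once so it is reused across warm Lambda invocations
kinesis = boto3.client("kinesis")
STREAM_NAME = os.environ.get("STREAM_NAME", "crm-events")  # hypothetical stream name


def handler(event, context):
    """API Gateway proxy integration: validate the body and forward it to Kinesis."""
    body = json.loads(event.get("body") or "{}")
    if "customer_id" not in body:
        return {"statusCode": 400, "body": json.dumps({"error": "customer_id required"})}

    kinesis.put_record(
        StreamName=STREAM_NAME,
        Data=json.dumps(body).encode("utf-8"),
        PartitionKey=str(body["customer_id"]),
    )
    return {"statusCode": 202, "body": json.dumps({"status": "queued"})}
```

Returning 202 keeps the API Gateway response fast while downstream consumers (for example an Airflow-managed DBT pipeline) process the stream asynchronously.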
pipelines to power next-gen data products in the commodities industry. Ensure data quality using the latest analytics and monitoring tools. Design and build robust pipelines with tools like Airflow and DBT. Create scalable infrastructure on Azure using technologies like Terraform. Write clean, high-quality, reusable code aligned with best practices. Drive innovation by bringing your own ideas-your … in a fast-paced startup or agile environment. Strong background in schema design and dimensional data modeling. Able to communicate data architecture clearly with internal stakeholders. Experience with Azure, Airflow, DBT, Kubernetes, GitHub. Bonus points for: open-source contributions, an active GitHub profile, and curiosity for the latest in tech. A natural problem-solver who loves making things work. …
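As a hedged illustration of "robust pipelines with tools like Airflow and DBT", a daily dbt build could be orchestrated roughly like this; the project path is hypothetical and a recent Airflow 2.x with the Bash operator is assumed:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

DBT_DIR = "/opt/dbt/commodities"  # hypothetical dbt project location

with DAG(
    dag_id="dbt_daily_build",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Build the models, then run the project's tests against the fresh data
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command=f"dbt run --project-dir {DBT_DIR} --profiles-dir {DBT_DIR}",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command=f"dbt test --project-dir {DBT_DIR} --profiles-dir {DBT_DIR}",
    )
    dbt_run >> dbt_test
```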
degree in Computer Science. -Python and SQL experience. -1-5 years experience in a data or software engineering role. -Familiarity with cloud/data warehousing. -Experience with Snowflake, Kafka, Airflow would be helpful. -Experience with financial data sets/vendors would be helpful. …
Bristol, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
exploring the latest tools and tech. The team are using Databricks and AWS and they’re keen for someone who’s worked across data warehouse architecture, orchestration tools like Airflow, and configuration-driven development. You’ll also work closely with analysts, scientists and other business teams, so you’ll need to be able to explain complex technical concepts in … Solid understanding of data architecture, modelling, and ETL/ELT pipelines Experience using tools like Databricks, Redshift, Snowflake, or similar Comfortable working with APIs, CLIs, and orchestration tools like Airflow Confident using Git and familiar with CI/CD processes (Azure DevOps or similar) Experience working in an Agile environment A proactive mindset — you ask questions, think critically, and …
Senior Data Engineer, you will be responsible for the development of complex data sources and pipelines into our data platform (i.e. Snowflake) along with other data applications (e.g. Azure, Airflow) and automation. The Senior Data Engineer will work closely with the Data Architecture, Business Analyst, and Data Steward teams to integrate and align requirements, specifications and constraints of each element … and system reliability. Utilize data quality tools like Great Expectations or Soda to ensure the accuracy, reliability, and integrity of data throughout its lifecycle. Create & maintain data pipelines using Airflow & Snowflake as primary tools Create SQL stored procedures to perform complex transformations Understand data requirements and design optimal pipelines to fulfil the use-cases Creating logical & physical data models …
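To illustrate the Airflow-plus-Snowflake pattern and the stored-procedure step the listing mentions, here is a rough sketch using the Snowflake Python connector with Airflow's TaskFlow API; the connection details and procedure name are placeholders, not the employer's actual setup:

```python
from datetime import datetime

import snowflake.connector
from airflow.decorators import dag, task


@task
def call_transform_proc():
    """Run a Snowflake stored procedure that applies the complex transformation step."""
    conn = snowflake.connector.connect(
        account="example_account",  # placeholder credentials; in practice these come
        user="airflow_svc",         # from an Airflow connection or secrets backend
        password="***",
        warehouse="TRANSFORM_WH",
        database="ANALYTICS",
        schema="STAGING",
    )
    try:
        cur = conn.cursor()
        cur.execute("CALL STAGING.LOAD_CUSTOMER_DIM()")  # hypothetical stored procedure
    finally:
        conn.close()


@dag(
    dag_id="snowflake_daily_transform",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
)
def snowflake_daily_transform():
    call_transform_proc()


snowflake_daily_transform()
```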
services. Core Responsibilities CUBE Architecture Implementation: Lead the design and implementation of data pipeline migrations from legacy systems to CUBE Design and optimize ETL/ELT workflows using Airflow and cloud-native services Develop and maintain data infrastructure using Terraform and related IaC tools Contribute to the implementation of Kubernetes for containerized workloads Automate and productionise dbt … and performance Data Engineering Excellence: Utilize expert SQL skills (Postgres, Cloud SQL) for data manipulation and analysis Develop robust Python code for building, maintaining, and orchestrating data pipelines Create Airflow Operators to promote component reusability across the data pipelines. Implement and optimize cloud database solutions (BigQuery, Redshift) Ensure data quality through implementation of testing frameworks and monitoring tools Ensure … proven ability to optimize complex queries Strong Python development experience for data pipeline creation and automation Experience with cloud data warehouses (BigQuery, Redshift) Proficiency with workflow orchestration tools, particularly Airflow Experience with Terraform or similar IaC tools for infrastructure management Required Experience: 8+ years of experience in data engineering or related roles Proven track record implementing and optimizing ETL …
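One reading of "Create Airflow Operators to promote component reusability" is a small custom operator shared across DAGs. A sketch under that assumption (the row-count check and hook wiring are illustrative, not the CUBE implementation):

```python
from airflow.models import BaseOperator


class RowCountCheckOperator(BaseOperator):
    """Reusable operator that fails the task if a table has fewer rows than expected."""

    template_fields = ("table_name",)  # allow Jinja templating of the table name

    def __init__(self, *, table_name: str, min_rows: int = 1, db_hook=None, **kwargs):
        super().__init__(**kwargs)
        self.table_name = table_name
        self.min_rows = min_rows
        self.db_hook = db_hook  # any hook exposing get_first(), injected per DAG

    def execute(self, context):
        row = self.db_hook.get_first(f"SELECT COUNT(*) FROM {self.table_name}")
        count = row[0] if row else 0
        if count < self.min_rows:
            raise ValueError(
                f"{self.table_name} has {count} rows, expected at least {self.min_rows}"
            )
        self.log.info("%s passed row-count check with %s rows", self.table_name, count)
        return count
```

Keeping the hook injectable means the same operator can run the check against Postgres, Cloud SQL, or a warehouse connection without code changes.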
current data solution as well as advancing it to the next level. We have created an initial gem of a Data Lake and Lakehouse (Azure Data Lake, ADF, Databricks, Airflow, DBT) to enable Business Intelligence and Data Analytics (Superset, RStudio Connect). Our Data Lake is fully metadata driven, cost efficient, documented, and reproducible. We need our one-source … Engineering experience (Preferably Databricks/Azure - or Snowflake/Redshift/BigQuery) (Required) Experience with infrastructure as code (e.g. Terraform) (Required) Proficiency in using Python both for scheduling (e.g. Airflow) and manipulating data (PySpark) (Required) Experience building deployment pipelines (e.g. Azure Pipelines) (Required) Deployment of web apps using Kubernetes (Preferably ArgoCD & Helm) (Preferred) Experience working on Analytics and Data …
London, England, United Kingdom Hybrid / WFH Options
Zoopla
data engineering best practices, including CI/CD, observability, versioning, and testing in data workflows Architect and evolve our data platform, including data warehousing (Redshift), lakehouse (Databricks), and orchestration (Airflow, Step Functions) capabilities Lead efforts around data governance/cataloging, compliance, and security, ensuring data is trustworthy and well-managed Requirements Essential skills & experience: Proven experience in a technical … Track record of designing and implementing scalable data platforms and ETL/ELT pipelines Knowledge of data warehousing and data lake architectures, and modern orchestration tools (e.g. Step Functions, Airflow) Experience with infrastructure as code (e.g. Terraform) Understanding of data governance and data quality practices Ability to communicate technical concepts clearly and influence senior stakeholders Desirable: Experience building data …