pipelines. Implement and optimize automation processes using infrastructure-as-code (Terraform). Build and maintain data pipelines using Airflow. Manage our tech stack including Python, Node.js, PostgreSQL, MongoDB, Kafka, and Apache Iceberg. Optimize infrastructure costs and develop strategies for efficient resource utilization. Provide critical support by monitoring services and resolving production issues. Contribute to the development of new services as …
London, South East, England, United Kingdom Hybrid / WFH Options
Circle Recruitment
and practices. Proficiency in Python and SQL for data engineering tasks. Experience with DBT and a good understanding of data modelling approaches (e.g. star schema, dimensional modelling). Familiarity with Airflow or similar orchestration tools. Comfortable working with AWS services such as Glue and S3, or equivalent cloud infrastructure. Experience using version control and CI/CD tools like Git …
able to explain complex concepts clearly to technical and business audiences. Consulting mindset: comfortable collaborating with diverse stakeholders and adapting to different industries. Nice-to-Haves: Data pipelines (Kafka, Airflow, Celery) or real-time processing. Data modeling and warehousing. Transforming data ready for analytics/ML and storing it. Infrastructure as Code (Terraform, CloudFormation) and CI/CD experience …
CI/CD skills to automate the development lifecycle: proficiency with GitHub Actions preferred. Advanced proficiency in Python for data engineering tasks. Experience with data orchestration tools such as Airflow, Dagster, or similar. Solid understanding of data governance, security principles, and privacy best practices, ideally in regulated or sensitive data environments. Experience with dbt for data modelling and quality …
members to deliver outcomes. Highly motivated, with strong problem-solving skills and business acumen. Desirable Skills: Experience with data visualization tools (e.g., Looker, Tableau, Power BI). Familiarity with dbt, Airflow, or other modern data tools. Exposure to marketing or e-commerce industries. Additional Information: Publicis Groupe operates a hybrid working pattern with full-time employees being office-based three …
Desirable Skills: Background in e-commerce, marketing, or operations analytics. Experience with data visualization tools (e.g., Looker, Tableau, Power BI). Familiarity with modern data stack tools such as dbt, Airflow, and Git-based workflows. Experience contributing to data product development, working closely with engineers and product managers. Additional Information: Publicis Groupe operates a hybrid working pattern with full-time …
time event processing databases. On-prem data warehouse experience is valuable. Data Modeling: Expertise in designing and implementing data models for analytics. Data orchestration and transformation tools like dbt, Airflow, or similar. Product mindset: Able to work directly with business stakeholders to translate analytics needs into user-friendly, highly performant data products. Troubleshooter: Experience diagnosing and resolving data issues …
systems (e.g. Git). Desirables: Experience with cloud-based ML infrastructure, particularly GCP (Vertex AI, BigQuery), or equivalent (e.g. AWS, Azure). Exposure to orchestration tools such as Kubeflow Pipelines or Airflow. Familiarity with DBT or similar tools for modelling data in data warehouses. Desire to build interpretable and explainable ML models (using techniques such as SHAP). Desire to quantify the …
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
Starling Bank Limited
and reliability across our platform. Working format: full-time, remote. Schedule: Monday to Friday (the working day is 8+1 hours). Responsibilities: Design, develop, and maintain data pipelines using Apache Airflow. Create and support data storage systems (Data Lakes/Data Warehouses) based on AWS (S3, Redshift, Glue, Athena, etc.). Integrate data from various sources, including … pipeline into a standalone, scalable service. What we expect from you: 4+ years of experience as a Data Engineer, including 1+ year at a Senior level. Deep knowledge of Airflow: DAGs, custom operators, and monitoring. Strong command of PostgreSQL databases; familiarity with the AWS stack (S3, Glue or Redshift, Lambda, CloudWatch) is a significant plus. Excellent SQL skills and …
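For orientation, a minimal sketch of what "DAGs and custom operators" means in Airflow 2.x follows; the operator class, task IDs, bucket name, and schedule are illustrative assumptions, not details taken from the listing.

```python
from datetime import datetime

from airflow import DAG
from airflow.models.baseoperator import BaseOperator
from airflow.operators.python import PythonOperator


class ExtractToS3Operator(BaseOperator):
    """Hypothetical custom operator: pull data from a source and land it in S3."""

    def __init__(self, bucket: str, **kwargs):
        super().__init__(**kwargs)
        self.bucket = bucket

    def execute(self, context):
        # A real implementation would call a provider hook (e.g. S3Hook);
        # this placeholder only logs the target.
        self.log.info("Extracting to s3://%s (placeholder)", self.bucket)


def transform():
    """Placeholder transform step."""
    print("Transforming extracted data")


with DAG(
    dag_id="example_etl",              # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                 # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    extract = ExtractToS3Operator(task_id="extract", bucket="example-bucket")
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    extract >> transform_task          # simple two-step dependency
```

A production pipeline would replace the logging placeholder with a provider hook and add retries and alerting, which is where the monitoring skills named above come in.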
of ingestion across the portfolio. Key Requirements: Strong proficiency in Python and PySpark. Successful track record in a Data Engineering role, including Database Management (ideally SQL), Data Orchestration (ideally Apache Airflow or Dagster), Containerisation (ideally Kubernetes and Docker), Data Pipelines (big data technologies and architectures). Experience in Financial Services (ideally Commodity) Trading. Bachelor's degree in Information Systems …
Strong grasp of API design and integration methods (REST, GraphQL, Webhooks) - Knowledge of OAuth2, JWT, and secure authentication protocols - Experience with ETL/ELT pipelines, workflow orchestration tools (e.g., Apache Airflow) - Solid understanding of both SQL and NoSQL databases - Familiarity with AWS and its integration services - Strong problem-solving, communication, and collaboration skills - Agile team experience - Nice to …
streaming data solutions. Proficiency in Python, SQL, and data modelling tools (e.g. Erwin, Lucidchart). Strong knowledge of AWS services (Lambda, SNS, S3, EKS, API Gateway). Familiarity with Snowflake, Spark, Airflow, DBT, and data governance frameworks. Preferred: Certifications in cloud/data technologies. Experience with API/interface modelling and CI/CD (e.g. GitHub Actions). Knowledge of Atlan and …
and backend services. Experience designing, building, and managing cloud-native data pipelines. Comfortable collaborating with cross-functional teams (engineering, product, stakeholders). Nice to have: Familiarity with tools like dbt, Airflow, or Python scripting. Knowledge of data warehousing, ETL frameworks, and modern data stack best practices. Why this role: Our client has a clear vision for what success looks like …
in data leadership, ideally within fintech, SaaS, or regulated tech environments. Technical depth across data engineering, analytics, or data science. Hands-on familiarity with modern data stacks: SQL, dbt, Airflow, Snowflake, Looker/Power BI. Understanding of the AI/ML lifecycle, including tooling (Python, MLflow) and best-practice MLOps. Comfortable working across finance, risk, and commercial functions. Experience …
London (City of London), South East England, United Kingdom
Harnham
development. Solid understanding of data processing and engineering workflows. Experience building APIs or services to support data or ML applications. Familiarity with ML model lifecycle and tooling (e.g. MLflow, Airflow, Docker). Strong problem-solving skills and the ability to work autonomously in a dynamic environment. DESIRABLE SKILLS: Experience supporting LLM training or retrieval-augmented generation (RAG). Familiarity …
London, South East, England, United Kingdom Hybrid / WFH Options
Opus Recruitment Solutions Ltd
teams. Strong technical background (5+ years) in building scalable data platforms. Excellent communication and stakeholder management skills. Hands-on experience with modern data tools and technologies: Python, SQL, Snowflake, Airflow, dbt, Spark, AWS, Terraform. A collaborative mindset and a passion for mentoring and developing others. Comfortable balancing technical decisions with business needs. Nice to have: experience with Data Mesh …
leadership, ideally within fintech or cloud-native organisations (AWS preferred). Strong technical background in data engineering, analytics, or data science. Experience with modern data stacks (e.g., SQL, dbt, Airflow, Snowflake, Looker/Power BI) and AI/ML tooling (e.g., Python, MLflow, MLOps). A track record of building and managing high-performing data teams. Strategic thinking and …
uphold data quality, security, and governance standards. Collaborate with teams to establish KPIs and core business metrics. Innovation & Future Tools (10%): Explore and implement new tools (e.g. dbt, Fivetran, Airflow) to enhance data capabilities. Stay current with evolving trends in data engineering and BI. What They’re Looking For: Technical Experience: 7+ years’ experience across data engineering, analytics, or …