London, South East, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
with a passion for new technologies Experience in startups or top-tier consultancies is a plus Nice to Have: Familiarity with dashboarding tools, TypeScript, and API development Exposure to Airflow, dbt, Databricks Experience with ERP (e.g. SAP, Oracle) and CRM systems What's On Offer: Salary: £50,000-£75,000 + share options Hybrid working: 2-3 days per …
optimized. YOUR BACKGROUND AND EXPERIENCE 5 years of commercial experience working as a Data Engineer 3 years exposure to the Azure Stack - Databricks, Synapse, ADF Python and PySpark Airflow for Orchestration Test-Driven Development and Automated Testing ETL Development …
London, South East, England, United Kingdom Hybrid / WFH Options
Sanderson
on development/engineering background Machine Learning or Data Background Technical Experience: PySpark, Python, SQL, Jupyter Cloud: AWS, Azure (Cloud Environment) - Moving towards Azure Nice to Have: Astro/Airflow, Notebook Reasonable Adjustments: Respect and equality are core values to us. We are proud of the diverse and inclusive community we have built, and we welcome applications from people …
London, South East, England, United Kingdom Hybrid / WFH Options
Sanderson
. Ability to collaborate effectively with senior engineers, data scientists, and architects. Proactive, detail-oriented, and eager to contribute within a greenfield project environment. Nice to Have: Experience with Airflow/Astro. Prior work with notebook-based development environments. Reasonable Adjustments: Respect and equality are core values to us. We are proud of the diverse and inclusive community …
and Python programming languages. · Strong understanding of graph databases (e.g., RDF, Neo4j, GraphDB). · Experience with data modeling and schema design. · Knowledge of data pipeline tools and frameworks (e.g., Apache Airflow, Luigi). · Excellent problem-solving and analytical skills. · Ability to work independently and as part of a team. Clinical knowledge …
research, staging, and production environments. Design and implement model registries, versioning systems, and experiment tracking to ensure full reproducibility of all model releases. Deploy ML workflows using tools like Airflow or similar, managing dependencies from data ingestion through model deployment and serving. Instrument comprehensive monitoring for model performance, data drift, prediction quality, and system health. Manage infrastructure as code …
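The model-registry and versioning responsibilities described in this listing can be sketched in plain Python. This is a toy, in-memory illustration of the idea (fingerprinting each release so it is reproducible), not any particular vendor's registry; the model name, parameters, and snapshot IDs are invented for the example.

```python
import hashlib
import json
from dataclasses import dataclass, field


@dataclass
class ModelRegistry:
    """Toy in-memory model registry: each release is keyed by model name and
    an auto-incremented version, and fingerprinted for reproducibility."""
    _store: dict = field(default_factory=dict)

    def register(self, name: str, params: dict, training_data_id: str) -> dict:
        versions = self._store.setdefault(name, [])
        # Fingerprint the exact hyperparameters plus the training-data
        # snapshot ID, so a release can be traced back and reproduced later.
        payload = json.dumps(
            {"params": params, "data": training_data_id}, sort_keys=True
        ).encode()
        entry = {
            "version": len(versions) + 1,
            "fingerprint": hashlib.sha256(payload).hexdigest()[:12],
            "params": params,
            "training_data_id": training_data_id,
        }
        versions.append(entry)
        return entry

    def latest(self, name: str) -> dict:
        return self._store[name][-1]


registry = ModelRegistry()
v1 = registry.register("churn_model", {"lr": 0.1}, "snapshot-2024-01-01")
v2 = registry.register("churn_model", {"lr": 0.05}, "snapshot-2024-02-01")
print(registry.latest("churn_model")["version"])  # 2
```

Production systems (MLflow, SageMaker Model Registry, and similar) add artifact storage, stage promotion, and lineage on top of this same name/version/fingerprint core.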
data quality, or other areas directly relevant to data engineering responsibilities and tasks Proven project experience developing and maintaining data warehouses in big data solutions (Snowflake) Expert knowledge in Apache technologies such as Kafka, Airflow, and Spark to build scalable and efficient data pipelines Ability to design, build, and deploy data solutions that capture, explore, transform, and utilize …
London, South East, England, United Kingdom Hybrid / WFH Options
Xact Placements Limited
of building performant, maintainable, and testable systems Solid background in microservices architecture Proficiency with Postgres & MongoDB (relational + non-relational) Experience with event-driven architectures and asynchronous workflows (Kafka, Airflow, etc.) Solid coding practices (clean, testable, automated) The mindset of a builder: thrives in fast-paced startup environments, takes ownership, solves complex challenges Bonus points if you’ve worked …
london, south east england, united kingdom Hybrid / WFH Options
iO Associates
2 x Contract Data Engineers - Snowflake/AWS/Python/Airflow/Iceberg Location: London (Hybrid - 3 days per week onsite) Duration: 6 months Day Rate: £550 - £600 (Inside IR35) A highly reputable consultancy is seeking 2 x Contract Data Engineers to join their data team on a 6-month engagement. You will play a key role in building … and reporting capabilities. Key Skills & Experience: Strong experience with Snowflake data warehousing Solid AWS cloud engineering experience Proficient in Python for data engineering workflows Skilled in building and maintaining Airflow DAGs Familiarity with Apache Iceberg for table format and data lake optimisation If this could be of interest, please get in touch with Alex Lang at iO Associates …
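"Building and maintaining Airflow DAGs," as this role asks, means declaring tasks and the dependencies between them; the scheduler then runs tasks in a valid topological order. The sketch below shows that dependency-graph idea using only the Python standard library (`graphlib`), not real Airflow API calls; the task names are an invented extract/transform/load example.

```python
from graphlib import TopologicalSorter

# Each key is a task; its set holds the tasks that must finish first.
# This mirrors how an Airflow DAG wires dependencies between operators.
dag = {
    "extract_orders": set(),
    "extract_customers": set(),
    "transform_join": {"extract_orders", "extract_customers"},
    "load_warehouse": {"transform_join"},
}

# static_order() yields a valid execution order: predecessors always
# come before the tasks that depend on them.
order = list(TopologicalSorter(dag).static_order())
print(order)
```

In real Airflow the same shape is expressed with `DAG` and operator objects (or the TaskFlow API), and the scheduler additionally handles retries, backfills, and parallelism; the ordering guarantee is the same.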
london, south east england, united kingdom Hybrid / WFH Options
twentyAI
critical client outcomes. Analytics Engineering Leaders: Architect and optimise data pipelines, ELT workflows, and cloud warehouse platforms (Snowflake, BigQuery, Redshift). Lead teams working with dbt, SQL, Python, and Airflow to drive data transformation at scale. Ensure data governance, quality, and modelling standards are upheld across solutions. Work closely with data scientists and stakeholders to turn clean data into … engineering. Have led engineering teams and mentored technical talent in high-performance environments. Are proficient in either modern software stacks (Python, React, cloud-native) or analytics tooling (SQL, dbt, Airflow, cloud warehouses). Bring a strategic mindset, with the ability to connect technical execution to business value. Are committed to innovation, collaboration, and data-driven transformation. Meet eligibility requirements …
and consumption layers Design and implement secure, performant Snowflake environments including RBAC, data masking, and policies/entitlement understanding Build and optimise ELT pipelines (using tools such as dbt, Airflow, Fivetran, or native Snowflake tasks) to support batch and real-time use cases Collaborate with Kubrick and client stakeholders to inform delivery planning, data strategy, and architecture decisions Promote … language (Python preferred) for automation. Experience with cloud platforms (AWS, Azure, or GCP), including security, IAM, and storage services Experience deploying and maintaining production pipelines using tools such as Airflow or Dagster Understanding of CI/CD principles, version control (Git) and software development lifecycle. Strong communication and stakeholder-management skills with the ability to influence technical and business …
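The data-masking requirement above refers to column-level policies of the kind Snowflake applies at query time: privileged roles see raw values, everyone else sees a masked token. A minimal sketch of that rule in plain Python (not Snowflake SQL; the role name and token format are invented for illustration):

```python
import hashlib

# Roles allowed to see raw PII; the name is illustrative, not a real
# Snowflake role.
PRIVILEGED_ROLES = {"ANALYST_PII"}


def mask_email(value: str, current_role: str) -> str:
    """Return the raw email for privileged roles, otherwise a deterministic
    masked token. Determinism matters: the same input always masks to the
    same token, so joins and group-bys on the masked column still work."""
    if current_role in PRIVILEGED_ROLES:
        return value
    token = hashlib.sha256(value.lower().encode()).hexdigest()[:10]
    return f"masked-{token}@example.invalid"


print(mask_email("jane@corp.com", "ANALYST_PII"))  # raw value
print(mask_email("jane@corp.com", "REPORTING"))    # deterministic token
```

In Snowflake itself this logic would live in a `CREATE MASKING POLICY ... CASE WHEN CURRENT_ROLE() IN (...) THEN val ELSE ... END` definition attached to the column, so it is enforced centrally rather than in application code.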
with researchers, technologists, and analysts to enhance the quality, timeliness, and accessibility of data. Contribute to the evolution of modern cloud-based data infrastructure, working with tools such as Airflow, Kafka, Spark, and AWS. Monitor and troubleshoot data workflows, ensuring continuous delivery of high-quality, analysis-ready datasets. Play a visible role in enhancing the firm’s broader … Strong programming ability in Python (including libraries such as pandas and NumPy) and proficiency with SQL. Confident working with ETL frameworks, data modelling principles, and modern data tools (Airflow, Kafka, Spark, AWS). Experience working with large, complex datasets from structured, high-quality environments — e.g. consulting, finance, or enterprise tech. STEM degree in Mathematics, Physics, Computer Science, Engineering …
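"Monitoring workflows to ensure analysis-ready datasets," as this listing puts it, usually boils down to running rule-based quality gates before data is published. Below is a minimal sketch of such a gate in plain Python; the field names (`price`, `trade_date`) and the rules are invented for illustration, and real pipelines would typically use a framework such as Great Expectations or dbt tests instead.

```python
from datetime import date

# One validation rule per field: each callable returns True when the value
# is acceptable. These rules are examples, not a real firm's checks.
RULES = {
    "price": lambda v: v is not None and v > 0,
    "trade_date": lambda v: isinstance(v, date) and v <= date.today(),
}


def validate(rows):
    """Return (row_index, field_name) pairs for every rule violation."""
    failures = []
    for i, row in enumerate(rows):
        for field_name, ok in RULES.items():
            if not ok(row.get(field_name)):
                failures.append((i, field_name))
    return failures


rows = [
    {"price": 101.5, "trade_date": date(2024, 3, 1)},
    {"price": None, "trade_date": date(2024, 3, 1)},   # missing price
    {"price": 99.0, "trade_date": date(2999, 1, 1)},   # implausible future date
]
print(validate(rows))  # [(1, 'price'), (2, 'trade_date')]
```

A monitoring step would run a check like this on every pipeline execution and fail the run (or quarantine the bad rows) instead of silently publishing, which is what keeps downstream datasets analysis-ready.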