and Python programming languages. Strong understanding of graph databases (e.g., RDF, Neo4j, GraphDB). Experience with data modeling and schema design. Knowledge of data pipeline tools and frameworks (e.g., Apache Airflow, Luigi). Excellent problem-solving and analytical skills. Ability to work independently and as part of a team. Clinical knowledge
data quality, or other areas directly relevant to data engineering responsibilities and tasks Proven project experience developing and maintaining data warehouses in big data solutions (Snowflake) Expert knowledge in Apache technologies such as Kafka, Airflow, and Spark to build scalable and efficient data pipelines Ability to design, build, and deploy data solutions that capture, explore, transform, and utilize
London, South East, England, United Kingdom Hybrid / WFH Options
Xact Placements Limited
of building performant, maintainable, and testable systems Solid background in microservices architecture Proficiency with Postgres & MongoDB (relational + non-relational) Experience with event-driven architectures and asynchronous workflows (Kafka, Airflow, etc.) Solid coding practices (clean, testable, automated) The mindset of a builder: thrives in fast-paced startup environments, takes ownership, solves complex challenges Bonus points if you’ve worked
A/B testing Experiment design and hypothesis testing MLOps & Engineering Scalable ML systems (batch and real-time) ML pipelines, CI/CD, monitoring, deployment Familiarity with tools like MLflow, Kubeflow, Airflow, Docker, Kubernetes Strategic skills Align ML initiatives with business goals Prioritize projects based on ROI, feasibility, and risk Understand market trends and competitive ML strategies Communicate ML impact to
London, South East England, United Kingdom Hybrid / WFH Options
iO Associates
2 x Contract Data Engineers - Snowflake/AWS/Python/Airflow/Iceberg Location: London (Hybrid - 3 days per week onsite) Duration: 6 months Day Rate: £550 - £600 (Inside IR35) A highly reputable consultancy is seeking 2 x Contract Data Engineers to join their data team on a 6-month engagement. You will play a key role in building … and reporting capabilities. Key Skills & Experience: Strong experience with Snowflake data warehousing Solid AWS cloud engineering experience Proficient in Python for data engineering workflows Skilled in building and maintaining Airflow DAGs Familiarity with Apache Iceberg for table format and data lake optimisation If this could be of interest, please get in touch with Alex Lang at iO Associates
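As a minimal illustration of what "building and maintaining Airflow DAGs" involves, the sketch below shows the task-dependency graph a DAG encodes, in plain standard-library Python (hypothetical task names; a real Airflow DAG would be declared with `airflow.DAG` and operators rather than a dict):

```python
# Plain-Python sketch of the dependency graph an Airflow DAG encodes
# (hypothetical task names; not the real Airflow API).
from graphlib import TopologicalSorter

# Map each task to the set of upstream tasks it depends on.
pipeline = {
    "extract_source_a": set(),
    "extract_source_b": set(),
    "load_to_snowflake": {"extract_source_a", "extract_source_b"},
    "transform_iceberg_tables": {"load_to_snowflake"},
    "publish_reports": {"transform_iceberg_tables"},
}

# Resolve a valid execution order, as a scheduler would.
order = list(TopologicalSorter(pipeline).static_order())
print(order)
```

The scheduler's job is exactly this: run each task only after all of its upstream dependencies have completed.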
London, South East England, United Kingdom Hybrid / WFH Options
twentyAI
critical client outcomes. Analytics Engineering Leaders: Architect and optimise data pipelines, ELT workflows, and cloud warehouse platforms (Snowflake, BigQuery, Redshift). Lead teams working with dbt, SQL, Python, and Airflow to drive data transformation at scale. Ensure data governance, quality, and modelling standards are upheld across solutions. Work closely with data scientists and stakeholders to turn clean data into … engineering. Have led engineering teams and mentored technical talent in high-performance environments. Are proficient in either modern software stacks (Python, React, cloud-native) or analytics tooling (SQL, dbt, Airflow, cloud warehouses). Bring a strategic mindset, with the ability to connect technical execution to business value. Are committed to innovation, collaboration, and data-driven transformation. Meet eligibility requirements
fine-tuning, or deployment. Technical Skills: Proficiency in Python and frameworks such as PyTorch, TensorFlow, or JAX. Strong familiarity with distributed computing and data engineering tools (e.g., SLURM, Apache Spark, Airflow). Hands-on experience with LLM training, fine-tuning, and deployment (e.g., Hugging Face, LLamafactory, NVIDIA NeMo). Preferred Qualifications Advanced degree (MS/PhD) in
London (City of London), South East England, United Kingdom Hybrid / WFH Options
Datatech Analytics
self-organising, fast-paced environment. Nice-to-Have Skills: Experience with GCP/BigQuery or other cloud data warehouses (e.g., Snowflake, Redshift). Familiarity with data orchestration tools (e.g., Airflow). Experience with data visualisation platforms (such as Preset.io/Apache Superset or similar). Exposure to CI/CD pipelines, ideally using GitLab CI. Background working with
of data modelling and data warehousing concepts Familiarity with version control systems, particularly Git Desirable Skills: Experience with infrastructure as code tools such as Terraform or CloudFormation Exposure to Apache Spark for distributed data processing Familiarity with workflow orchestration tools such as Airflow or AWS Step Functions Understanding of containerisation using Docker Experience with CI/CD pipelines
and fast-paced, giving you the opportunity to work across a wide variety of data sources and tools. Day-to-day responsibilities include: Designing and developing dbt models and Airflow pipelines within a modern data stack. Building robust data ingestion pipelines across multiple sources - including external partners, internal platforms, and APIs. Implementing automated testing and CI/CD pipelines … on forecasting and predictive analytics initiatives. Bringing modern engineering practices, testing frameworks, and design patterns to the wider data function. Tech Stack & Skills Core skills: Strong experience with dbt, Airflow, Snowflake, and Python Proven background in automated testing, CI/CD, and test-driven development Experience building and maintaining data pipelines and APIs in production environments Nice to have
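The test-driven development this listing asks for can be sketched in plain Python: the test is written first and pins down the expected output of a pipeline transformation step (the function and field names below are hypothetical, not from any actual codebase):

```python
# Minimal sketch of test-driven development for a pipeline
# transformation step (hypothetical function and field names).

def normalise_record(raw: dict) -> dict:
    """Trim identifiers and cast the amount field to float."""
    return {
        "customer_id": raw["customer_id"].strip(),
        "amount": float(raw["amount"]),
    }

def test_normalise_record() -> None:
    # Written first: fixes the expected shape before implementation.
    raw = {"customer_id": "  C-101 ", "amount": "12.50"}
    assert normalise_record(raw) == {"customer_id": "C-101", "amount": 12.5}

test_normalise_record()
print("ok")
```

In practice tests like this run under pytest inside the CI/CD pipeline, so every ingestion change is verified before deployment.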
Support A/B testing, funnel analysis, and data modelling to enhance performance Contribute to the evolution of the company's data warehouse and pipelines (experience with dbt or Airflow a plus) Collaborate with product, marketing, and commercial teams to translate data into actionable recommendations Communicate insights clearly across teams to influence business outcomes Role Requirements Strong technical skills in SQL and dashboarding tools (Looker Studio/BigQuery) Experience with A/B testing, funnel analysis, and data modelling Familiarity with data warehouse concepts and pipeline development (dbt, Airflow experience advantageous) Ability to work collaboratively across multiple teams and communicate insights effectively Proactive, detail-oriented, and able to drive impact in a high-growth environment The Company You
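As a hedged sketch of the A/B-testing analysis this kind of role involves, the block below runs a standard two-proportion z-test using only the standard library (the conversion counts are illustrative; in practice this would usually be done with scipy or statsmodels):

```python
# Two-proportion z-test for an A/B experiment, standard library
# only (illustrative conversion counts, not real data).
import math

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Return the z statistic and two-sided p-value."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF via erf.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Variant B converts at 2.4% vs 2.0% for A over 10,000 users each.
z, p = two_proportion_z(200, 10_000, 240, 10_000)
print(f"z={z:.3f}, p={p:.4f}")
```

A p-value just above 0.05, as here, is exactly the borderline case where clear communication of the result to product and commercial teams matters most.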