and evaluation through continuous monitoring and scaling. Build & Optimise AI models in Python: fine-tune state-of-the-art architectures on our in-house GPU cluster. Orchestrate Workflows with Apache Airflow: schedule, monitor, and maintain complex data and model pipelines. Engineer Cloud Services on AWS (Lambda, ECS/EKS, S3, Redshift, etc.) and automate deployments using GitHub Actions … testing, and monitoring. Startup mindset: proactive, resourceful, ambitious, driven to innovate, eager to learn, and comfortable wearing multiple hats in a fast-moving environment. Desirable: hands-on experience with Apache Airflow, AWS services (especially Redshift, S3, ECS/EKS), and IaC tools like Pulumi. Why Permutable AI? Hybrid Flexibility: Spend 2+ days/week in our Vauxhall hub. …
analysis, and supporting complex client agreements. This is a hands-on engineering role working closely with stakeholders and system owners. You'll be expected to code daily (Python), manage Airflow pipelines (MWAA), build ETL processes from scratch, and improve existing workflows for better performance and scalability. Key responsibilities Design and build robust ETL pipelines using Python and AWS services … Own and maintain Airflow workflows Ensure high data quality through rigorous testing and validation Analyse and understand complex data sets before pipeline design Collaborate with stakeholders to translate business requirements into data solutions Monitor and improve pipeline performance and reliability Maintain documentation of systems, workflows, and configs Tech environment Python, SQL/PLSQL (MS SQL + Oracle), PySpark, Apache Airflow (MWAA), AWS Glue, Athena AWS services (CDK, S3, data lake architectures) Git, JIRA You should apply if you have: Strong Python and SQL skills Proven experience designing data pipelines in cloud environments Hands-on experience with Airflow (ideally MWAA) Background working with large, complex datasets Experience in finance or similar high-volume, regulated industries (preferred but …
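The extract-validate-load responsibilities described above can be sketched in plain Python. This is a hypothetical, Airflow-free sketch: `sqlite3` stands in for the AWS data stores, and the `raw_trades`/`clean_trades` tables and their columns are invented for illustration, not taken from the listing.

```python
import sqlite3

def extract(conn):
    """Pull raw rows from a source table (hypothetical schema)."""
    return conn.execute("SELECT id, amount FROM raw_trades").fetchall()

def transform(rows):
    """Validate and reshape: drop non-positive amounts, normalise precision."""
    return [(rid, round(amount, 2)) for rid, amount in rows if amount > 0]

def load(conn, rows):
    """Idempotent load into the curated table (safe to re-run)."""
    conn.executemany("INSERT OR REPLACE INTO clean_trades VALUES (?, ?)", rows)
    conn.commit()

def run_pipeline(conn):
    load(conn, transform(extract(conn)))

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE raw_trades (id INTEGER, amount REAL)")
    conn.execute("CREATE TABLE clean_trades (id INTEGER PRIMARY KEY, amount REAL)")
    conn.executemany("INSERT INTO raw_trades VALUES (?, ?)",
                     [(1, 10.456), (2, -3.0), (3, 7.1)])
    run_pipeline(conn)
    print(conn.execute("SELECT * FROM clean_trades ORDER BY id").fetchall())
```

In an MWAA deployment each of these functions would typically become a task in a DAG, with the idempotent load making retries safe.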
of real-time and analytical data pipelines, metadata, and cataloguing (e.g., Atlan) Strong communication, stakeholder management, and documentation skills Preferred (but not essential): AWS or Snowflake certifications Knowledge of Apache Airflow, dbt, GitHub Actions Experience with Iceberg tables and data product thinking Why Apply? Work on high-impact, high-scale client projects Join a technically elite team with …
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
/BI role Advanced SQL skills with hands-on experience using dbt for data modeling Strong familiarity with modern data platforms like Snowflake, Looker, AWS/GCP, Tableau, or Airflow Experience with version control tools (e.g., Git) Ability to design, build, and document scalable, reliable data models Comfortable gathering business requirements and translating them into data architecture Strong problem …
London, South East, England, United Kingdom Hybrid / WFH Options
Boston Hale
team. Key Responsibilities: Design and maintain scalable data pipelines across diverse sources including CMS, analytics, ad tech, and social platforms. Lead engineering efforts to automate workflows using tools like Airflow, dbt, and Spark. Build robust data models to support dashboards, A/B testing, and revenue analytics. Collaborate with cross-functional teams to deliver actionable insights and support strategic …
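The A/B-testing analytics mentioned above usually come down to comparing conversion rates between two variants. A minimal sketch of a two-proportion z-test using only the standard library (the function name and the sample figures are invented for illustration):

```python
from math import erf, sqrt

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-sided two-proportion z-test; returns (z statistic, p-value)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis (no difference)
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF via erf
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical experiment: 5.0% vs 6.25% conversion on 2,400 users per arm
z, p = two_proportion_z(120, 2400, 150, 2400)
print(round(z, 2), round(p, 3))
```

With these numbers the difference sits just above the conventional 0.05 threshold, which is exactly the kind of borderline result a dashboard should surface rather than hide.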
Strong interpersonal and communication skills with an ability to lead a team and keep them motivated. Mandatory Skills: Python, PostgreSQL, Azure Databricks, AWS (S3), Git, Azure DevOps CI/CD, Apache Airflow …
and managing cloud infrastructure as code Proficiency in programming languages such as Python, Spark, SQL Strong experience with SQL databases Expertise in data pipeline and workflow management tools (e.g., Apache Airflow, ADF) Experience with cloud platforms (Azure preferred) and related data services Excellent problem-solving skills and attention to detail Inclusive and curious, continuously seeks to build knowledge …
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
secure use of machine learning. Key Focus Areas Own and execute enterprise data strategy Build and lead a multi-disciplinary data & AI team Drive modern data platform development (dbt, Airflow, Snowflake, Looker/Power BI) Deliver business-critical analytics and reporting Support responsible AI/ML initiatives Define data governance, privacy, and compliance frameworks What We're Looking For …
London, South East, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
background in distributed systems (Hadoop, Spark, AWS) Skilled in SQL/NoSQL (PostgreSQL, Cassandra) and messaging tech (Kafka, RabbitMQ) Experience with orchestration tools (Chef, Puppet, Ansible) and ETL workflows (Airflow, Luigi) Familiarity with cloud platforms (AWS, GCP) and monitoring tools (ELK stack) Proven problem-solving mindset and ability to adapt solutions to complex challenges Hands-on use of Gen …
London, South East, England, United Kingdom Hybrid / WFH Options
Boston Hale
looking for a Data Engineer to join their London-based team. Key Responsibilities: Design and maintain scalable data pipelines across diverse sources. Automate and optimise workflows using tools like Airflow, dbt, and Spark. Support data modelling for analytics, dashboards, and A/B testing. Collaborate with cross-functional teams to deliver data-driven insights. Work with cloud platforms (GCP …
London, South East, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
in a commercial setting. Solid understanding of distributed systems (e.g. Hadoop, AWS, Kafka). Experience with SQL/NoSQL databases (e.g. PostgreSQL, Cassandra). Familiarity with orchestration tools (e.g. Airflow, Luigi) and cloud platforms (e.g. AWS, GCP). Passion for solving complex problems and mentoring others. Package: Salary from £100-140,000 depending on experience Remote-first with flexible …
London, South East, England, United Kingdom Hybrid / WFH Options
Circle Recruitment
and practices Proficiency in Python and SQL for data engineering tasks Experience with dbt and a good understanding of data modelling approaches (e.g. star schema, dimensional modelling) Familiarity with Airflow or similar orchestration tools Comfortable working with AWS services such as Glue and S3, or equivalent cloud infrastructure Experience using version control and CI/CD tools like Git …
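The star schema the listing above refers to separates descriptive attributes (dimension tables) from measurable events (fact tables). A minimal sketch, using `sqlite3` in place of a warehouse and an invented sales schema; in practice the same shape would be built as dbt models:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Dimension table: one row per product, holding descriptive attributes
conn.execute("CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT)")
# Fact table: one row per sale, referencing the dimension by key
conn.execute("CREATE TABLE fact_sales (sale_id INTEGER PRIMARY KEY, "
             "product_id INTEGER, revenue REAL)")
conn.executemany("INSERT INTO dim_product VALUES (?, ?)",
                 [(1, "books"), (2, "games")])
conn.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                 [(10, 1, 12.0), (11, 1, 8.0), (12, 2, 30.0)])
# Typical star-schema query: aggregate facts grouped by a dimension attribute
rows = conn.execute("""
    SELECT d.category, SUM(f.revenue)
    FROM fact_sales f JOIN dim_product d USING (product_id)
    GROUP BY d.category ORDER BY d.category
""").fetchall()
print(rows)
```

The payoff of the split is that dimension attributes can change or grow without rewriting the (much larger) fact table.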
Strong grasp of API design and integration methods (REST, GraphQL, Webhooks) - Knowledge of OAuth2, JWT, and secure authentication protocols - Experience with ETL/ELT pipelines, workflow orchestration tools (e.g., Apache Airflow) - Solid understanding of both SQL and NoSQL databases - Familiarity with AWS and its integration services - Strong problem-solving, communication, and collaboration skills - Agile team experience Nice to …
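A common pattern behind the webhook and secure-authentication requirements above is HMAC signature verification of incoming payloads. A stdlib-only sketch; the secret value and the payload are invented for illustration, and a real sender would carry the signature in an HTTP header:

```python
import hashlib
import hmac

SECRET = b"shared-webhook-secret"  # hypothetical shared secret

def sign(payload: bytes) -> str:
    """Compute the hex HMAC-SHA256 signature a sender would attach."""
    return hmac.new(SECRET, payload, hashlib.sha256).hexdigest()

def verify(payload: bytes, signature: str) -> bool:
    """Recompute and compare in constant time to defeat timing attacks."""
    return hmac.compare_digest(sign(payload), signature)

body = b'{"event": "order.created", "id": 42}'
sig = sign(body)
print(verify(body, sig))                      # signature matches the body
print(verify(b'{"event": "tampered"}', sig))  # any modification is rejected
```

`hmac.compare_digest` rather than `==` is the important detail: a short-circuiting comparison leaks how many leading characters matched.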
and practices Proficiency in Python and SQL for data engineering tasks Experience with dbt and a good understanding of data modelling approaches (e.g. star schema, dimensional modelling) Familiarity with Airflow or similar orchestration tools Comfortable working with AWS services such as Glue and S3, or equivalent cloud infrastructure Experience using version control and CI/CD tools like Git …
streaming data solutions Proficiency in Python, SQL, and data modelling tools (e.g. Erwin, Lucidchart) Strong knowledge of AWS services (Lambda, SNS, S3, EKS, API Gateway) Familiarity with Snowflake, Spark, Airflow, dbt, and data governance frameworks Preferred: Certifications in cloud/data technologies Experience with API/interface modelling and CI/CD (e.g. GitHub Actions) Knowledge of Atlan and …
development. Solid understanding of data processing and engineering workflows. Experience building APIs or services to support data or ML applications. Familiarity with ML model lifecycle and tooling (e.g. MLflow, Airflow, Docker). Strong problem-solving skills and the ability to work autonomously in a dynamic environment. DESIRABLE SKILLS Experience supporting LLM training or retrieval-augmented generation (RAG). Familiarity …
leadership, ideally within fintech or cloud-native organisations (AWS preferred). Strong technical background in data engineering, analytics, or data science. Experience with modern data stacks (e.g., SQL, dbt, Airflow, Snowflake, Looker/Power BI) and AI/ML tooling (e.g., Python, MLflow, MLOps). A track record of building and managing high-performing data teams. Strategic thinking and …
London, South East, England, United Kingdom Hybrid / WFH Options
Salt Search
communicating customer insights Comfortable working with large datasets from sources like CRM, web analytics, product telemetry, etc. Exposure to cloud platforms (AWS, GCP, Azure) and modern data pipelines (e.g. Airflow, dbt) is a plus Soft Skills: Business-oriented thinker with strong communication skills Able to clearly explain complex models to non-technical audiences Skilled in stakeholder engagement and translating …
expertise in PySpark, SQL, Java, Spark, Databricks, dbt, AWS, and Azure Familiarity with European jurisdictions and global reporting requirements Experience with orchestration and CI/CD tools such as Airflow, Databricks Workflows, and Azure DevOps Strong problem-solving skills and the ability to optimise data processes Excellent communication skills and the ability to engage technical and non-technical stakeholders.
GDPR). Strong communication and stakeholder management skills. Preferred: Cloud certifications (AWS, etc.). Experience with API data modeling, CI/CD (GitHub Actions). Familiarity with tools like Airflow, dbt, and Atlan. DON'T WAIT - APPLY NOW! Reference: DRI/AMC/DA Postcode: SW1 4DF #dari …
metadata, and reproducibility. Collaborate with cross-functional teams (science, software, product). Mentor engineers and contribute to institutional data strategy and standards. Tech Requirements Strong Python skills; experience with Airflow, Luigi, or Spark. Proven track record building robust, scalable ETL/ELT systems. Experience with scientific, semi-structured, or biodiversity data. Familiarity with semantic web standards (e.g. Darwin Core …
AND EXPERIENCE: The ideal Head of Data Platform will have: Extensive experience with Google Cloud Platform (GCP), particularly BigQuery Proficiency with a modern data tech stack, including SQL, Python, Airflow, dbt, Dataform, Terraform Experience in a mid-large sized company within a regulated industry, with a strong understanding of data governance. A strategic mindset, leadership skills, and a hands …
New Malden, Surrey, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
hands-on background in data engineering, with 5+ years working on modern data platforms Experience leading cloud data migrations - GCP and BigQuery strongly preferred Proficiency in SQL, Python, dbt, Airflow, Terraform and other modern tooling Excellent understanding of data architecture, governance, and DevOps best practices Proven leadership or team management experience within a regulated or mid-to-large tech …
Brighton, East Sussex, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
best practices in testing, data governance, and observability. Lead roadmap planning and explore emerging technologies (e.g. GenAI). Ensure operational stability and support incident resolution. Tech Stack Python, SQL, Airflow, AWS, Fivetran, Snowflake, Looker, Docker (You don't need to tick every box - if you've worked with comparable tools, that's great too.) What We're Looking For …
London, South East, England, United Kingdom Hybrid / WFH Options
Method Resourcing
contexts. Bonus Experience (Nice to Have) Exposure to large language models (LLMs) or foundational model adaptation. Previous work in cybersecurity, anomaly detection, or behavioural analytics. Familiarity with orchestration frameworks (Airflow or similar). Experience with scalable ML systems, pipelines, or real-time data processing. Advanced degree or equivalent experience in ML/AI research or applied science. Cloud platform …