CD. Proven track record building and maintaining scalable data platforms in production, enabling advanced users such as ML and analytics engineers. Hands-on experience with modern data stack tools - Airflow, DBT, Databricks, and data catalogue/observability solutions like Monte Carlo, Atlan, or DataHub. Solid understanding of cloud environments (AWS or GCP), including IAM, S3, ECS, RDS, or equivalent …
architectures (e.g. Databricks, Snowflake) Collaborating with multidisciplinary teams to deliver real business value What we're looking for Strong experience with Python, SQL, and pipeline tools such as dbt or Airflow Proven background in data modelling, warehousing, and performance optimisation Hands-on experience with cloud data services (Glue, Lambda, Synapse, BigQuery, etc.) A consultancy mindset – adaptable, collaborative, and delivery-focused …
Edinburgh, York Place, City of Edinburgh, United Kingdom
Bright Purple
research and technology teams. Exposure to low-latency or real-time systems. Experience with cloud infrastructure (AWS, GCP, or Azure). Familiarity with data engineering tools such as Kafka, Airflow, Spark, or Dask. Knowledge of equities, futures, or FX markets. Company: Rapidly growing hedge fund with offices globally, including London. Salary & Benefits: The salary range/rates of pay is …
London, South East, England, United Kingdom Hybrid/Remote Options
Harnham - Data & Analytics Recruitment
practices (testing, CI/CD, automation). Proven track record of designing, building, and scaling data platforms in production environments. Hands-on experience with big data technologies such as Airflow, DBT, Databricks, and data catalogue/observability tools (e.g. Monte Carlo, Atlan, DataHub). Knowledge of cloud infrastructure (AWS or GCP) - including services such as S3, RDS, EMR, ECS …
London, South East, England, United Kingdom Hybrid/Remote Options
Lorien
data storytelling and operational insights. Optimise data workflows across cloud and on-prem environments, ensuring performance and reliability. Skills & Experience: Strong experience in ETL pipeline development using tools like Apache Airflow, Informatica, or similar. Advanced SQL skills and experience with large-scale relational and cloud-based databases. Hands-on experience with Tableau for data visualisation and dashboarding. Exposure …
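For readers comparing these roles, the ETL pattern the listing above describes (extract, transform, load with SQL) can be sketched in a few lines. This is an illustrative example only, using Python's stdlib sqlite3 in place of a production warehouse; the table and column names are invented for the sketch:

```python
import sqlite3

def run_etl(conn):
    """Tiny extract-transform-load pass: read raw orders, derive
    per-customer totals, and load them into a reporting table."""
    cur = conn.cursor()
    # Extract: pull raw rows from a staging table
    rows = cur.execute("SELECT customer, amount FROM raw_orders").fetchall()
    # Transform: aggregate amounts per customer
    totals = {}
    for customer, amount in rows:
        totals[customer] = totals.get(customer, 0) + amount
    # Load: write the derived summary into a reporting table
    cur.execute("CREATE TABLE IF NOT EXISTS order_totals "
                "(customer TEXT PRIMARY KEY, total REAL)")
    cur.executemany("INSERT OR REPLACE INTO order_totals VALUES (?, ?)",
                    totals.items())
    conn.commit()
    return totals

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE raw_orders (customer TEXT, amount REAL)")
    conn.executemany("INSERT INTO raw_orders VALUES (?, ?)",
                     [("acme", 10.0), ("acme", 5.0), ("globex", 7.5)])
    print(run_etl(conn))  # {'acme': 15.0, 'globex': 7.5}
```

In a real pipeline, a tool like Airflow or Informatica would schedule and monitor each of these steps rather than running them in one function.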
solutions using Databricks within a secure environment Critical Skills Extensive experience with Databricks (Spark, Delta Lake, and MLflow). Proficiency in ETL/ELT development and orchestration tools (DBT, Airflow, or similar). Hands-on experience with cloud platforms (AWS, Azure, or GCP). Solid understanding of SQL, Python, and PySpark for data processing. Familiarity with CI/CD …
London, South East, England, United Kingdom Hybrid/Remote Options
Searchability
are giving express consent for us to process (subject to required skills) your application to our client in conjunction with this vacancy only. KEY SKILLS: GCP | Python | SQL | MongoDB | Airflow | dbt | Terraform | Docker | ETL | AI | Machine Learning …
Machine Learning: Experience supporting and understanding ML pipelines and models in a production setting. Direct experience with Google Cloud Platform, BigQuery, and associated tooling. Experience with workflow tools like Airflow or Kubeflow. Familiarity with dbt (Data Build Tool). Please send your CV for more information on these roles. Reasonable Adjustments: Respect and equality are core values to us. …
London, South East, England, United Kingdom Hybrid/Remote Options
Sanderson
for individuals with: Experience: Proven background as a Machine Learning Engineer. Technical Skills: Strong in SQL and Python (Pandas, Scikit-learn, Jupyter, Matplotlib). Data transformation & manipulation: experience with Airflow, DBT and Kubeflow. Cloud: Experience with GCP and Vertex AI (developing ML services). Expertise: Solid understanding of computer science fundamentals and time-series forecasting. Machine Learning: Strong grasp …
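The time-series forecasting this listing asks for often starts from a simple baseline before any Scikit-learn model is fitted. A minimal sketch, assuming nothing beyond stdlib Python: a trailing moving-average forecast (the window size here is an arbitrary choice for illustration):

```python
def moving_average_forecast(series, window=3):
    """Forecast the next value as the mean of the last `window` observations.

    A common baseline in time-series work: cheap, interpretable, and a
    sanity check against which fancier models are compared.
    """
    if len(series) < window:
        raise ValueError("series shorter than window")
    recent = series[-window:]
    return sum(recent) / window

if __name__ == "__main__":
    sales = [100, 102, 98, 104, 106]
    # Forecast = mean of the last three points: (98 + 104 + 106) / 3
    print(moving_average_forecast(sales))
```

In production, the same idea would typically be expressed with Pandas rolling windows and validated against held-out data.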
Poole, Dorset, United Kingdom Hybrid/Remote Options
Aspire Personnel Ltd
stakeholders, understanding and translating their needs into technical requirements. Possess outstanding communication and interpersonal skills, facilitating clear and effective collaboration within and outside the team. Desirables: Familiarity with the Apache Airflow platform. Basic knowledge of BI tools such as Power BI to support data visualization and insights. Experience with version control using Git for collaborative and organized code …
City of London, London, United Kingdom Hybrid/Remote Options
Sanderson Recruitment
on development/engineering background Machine Learning or Data Background Technical Experience: PySpark, Python, SQL, Jupyter. Cloud: AWS, Azure (Cloud Environment) - moving towards Azure. Nice to Have: Astro/Airflow, Notebook Reasonable Adjustments: Respect and equality are core values to us. We are proud of the diverse and inclusive community we have built, and we welcome applications from people …
Job Title: Airflow/AWS Data Engineer Location: Manchester Area (3 days per week in the office) Rate: Up to £400 per day inside IR35 Start Date: 03/11/2025 Contract Length: Until 31st December 2025 Job Type: Contract Company Introduction: An exciting opportunity has become available with one of our sector-leading financial services clients. They … to join their growing data engineering function. This role will play a key part in designing, deploying, and maintaining modern cloud infrastructure and data pipelines, with a focus on Airflow, AWS, and data platform automation. Key Responsibilities: Deploy and manage cloud infrastructure across Astronomer Airflow and AccelData environments. Facilitate integration between vendor products and core systems, including data … Establish and enforce best practices for cloud security, scalability, and performance. Configure and maintain vendor product deployments, ensuring reliability and optimized performance. Ensure high availability and fault tolerance for Airflow clusters. Implement and manage monitoring, alerting, and logging solutions for Airflow and related components. Perform upgrades, patches, and version management for platform components. Oversee capacity planning and resource …
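At its core, the Airflow work described in this listing is about running tasks in dependency order with monitoring around them. A pure-Python sketch of that core idea — resolving a task graph topologically before execution — using only the stdlib (the task names and dependencies are invented; a real deployment would define these as an Airflow DAG with operators, retries, and alerting):

```python
from graphlib import TopologicalSorter  # stdlib since Python 3.9

def run_pipeline(tasks, deps):
    """Execute callables in an order that respects `deps`
    (a mapping of task name -> set of upstream task names)."""
    order = list(TopologicalSorter(deps).static_order())
    for name in order:
        # In Airflow, this step would be an operator execution
        # wrapped in retry, logging, and alerting machinery.
        tasks[name]()
    return order

if __name__ == "__main__":
    log = []
    tasks = {
        "extract": lambda: log.append("extract"),
        "transform": lambda: log.append("transform"),
        "load": lambda: log.append("load"),
    }
    deps = {"transform": {"extract"}, "load": {"transform"}}
    print(run_pipeline(tasks, deps))  # ['extract', 'transform', 'load']
```

Airflow adds scheduling, distribution, and observability on top, but the dependency resolution shown here is the part a DAG definition actually encodes.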
cross-functional teams to build scalable data pipelines and contribute to digital transformation initiatives across government departments. Key Responsibilities Design, develop and maintain robust data pipelines using PostgreSQL and Airflow or Apache Spark Collaborate with frontend/backend developers using Node.js or React Implement best practices in data modelling, ETL processes and performance optimisation Contribute to containerised deployments … essential Operate within Agile teams and support DevOps practices What We're Looking For Proven experience as a Data Engineer in complex environments Strong proficiency in PostgreSQL and either Airflow or Spark Solid understanding of Node.js or React for integration and tooling Familiarity with containerisation technologies (Docker/Kubernetes) is a plus Excellent communication and stakeholder engagement skills Experience …
Newcastle Upon Tyne, Tyne and Wear, England, United Kingdom
Opus Recruitment Solutions Ltd
SC cleared Software Developers (Python & AWS) to join a contract till April 2026. Inside IR35. SC cleared. Weekly travel to Newcastle. Around £400 per day. Contract till April 2026. Skills: Python, AWS Services, Terraform, Apache Spark, Airflow, Docker …
code and testing principles. Develop tools and frameworks for data governance, privacy, and quality monitoring, ensuring full compliance with data protection standards. Create resilient data workflows and automation within Airflow, Databricks, and other modern big data ecosystems. Implement and manage data observability and cataloguing tools (e.g., Monte Carlo, Atlan, DataHub) to enhance visibility and reliability. Partner with ML engineers … deploy, and scale production-grade data platforms and backend systems. Familiarity with data governance frameworks, privacy compliance, and automated data quality checks. Hands-on experience with big data tools (Airflow, Databricks) and data observability platforms. Collaborative mindset and experience working with cross-functional teams including ML and analytics specialists. Curiosity and enthusiasm for continuous learning - you stay up to …
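The automated data quality checks this last listing mentions often reduce to declarative rules evaluated per batch. A minimal sketch of that idea, assuming dict-shaped records (the field names and thresholds are invented; tools like Monte Carlo or Atlan productise and scale this pattern):

```python
def check_batch(rows, required_fields, max_null_rate=0.0):
    """Validate a batch of dict records: each required field's null
    rate must not exceed `max_null_rate`.

    Returns a list of human-readable violations; an empty list means
    the batch passes and can be published downstream.
    """
    violations = []
    for field in required_fields:
        nulls = sum(1 for r in rows if r.get(field) is None)
        rate = nulls / len(rows) if rows else 1.0
        if rate > max_null_rate:
            violations.append(
                f"{field}: null rate {rate:.0%} exceeds {max_null_rate:.0%}")
    return violations

if __name__ == "__main__":
    batch = [{"id": 1, "email": "a@example.com"}, {"id": 2, "email": None}]
    # Flags the email column, whose null rate (50%) exceeds the 0% threshold
    print(check_batch(batch, ["id", "email"]))
```

In an orchestrated pipeline, a check like this would run as its own task, failing the run (or raising an alert) before bad data reaches consumers.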