search, RAG, and feature engineering
- Implement secure access and governance controls, including RBAC, SSO, token policies, and pseudonymisation frameworks
- Support batch and streaming data flows using technologies such as Kafka, Airflow, and Terraform
- Monitor and optimise cloud resource usage to ensure performance and cost efficiency
- Collaborate with cross-functional teams on architecture decisions, technical designs, and data governance standards
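The RBAC control mentioned above reduces to mapping roles to permission sets and resolving a user's roles at check time. A minimal sketch in Python; the role and permission names are illustrative, not taken from any specific framework:

```python
# Minimal role-based access control (RBAC) sketch: roles map to
# permission sets, and a check resolves a user's roles to permissions.
# Role and permission names here are illustrative assumptions.

ROLE_PERMISSIONS = {
    "data_engineer": {"pipelines:read", "pipelines:write"},
    "analyst": {"pipelines:read", "dashboards:read"},
    "admin": {"pipelines:read", "pipelines:write", "users:manage"},
}

def has_permission(user_roles, permission):
    """Return True if any of the user's roles grants the permission."""
    return any(permission in ROLE_PERMISSIONS.get(role, set())
               for role in user_roles)
```

Production systems layer SSO and token policies on top of this core check, but the role-to-permission resolution stays the same shape.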
Kubernetes & Helm: Deploying and managing containerized applications at scale with reliability and fault tolerance. Kafka (Confluent): Familiarity with event-driven architectures; experience with Flink or KSQL is a plus. Airflow: Experience configuring, maintaining, and optimizing DAGs. Energy or commodity trading: Understanding the data challenges and workflows in this sector. Trading domain knowledge: Awareness of real-time decision-making and …
Look For:
- 5+ years of experience
- Bachelor's degree in Computer Science or a similar field
- Experience in Product Management (inbound, outbound, and cross-functional processes)
- Solid understanding of major orchestration tools (Airflow, Dagster, Prefect, etc.)
- Technical understanding of data pipelines (Spark, dbt, Lakeflow Pipelines)
- Strong data analysis and operationalization skills (SQL, Python, building operational dashboards)
About Databricks: Databricks is the …
research, staging, and production environments. Design and implement model registries, versioning systems, and experiment tracking to ensure full reproducibility of all model releases. Deploy ML workflows using tools like Airflow or similar, managing dependencies from data ingestion through model deployment and serving. Instrument comprehensive monitoring for model performance, data drift, prediction quality, and system health. Manage infrastructure as code …
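Managing dependencies from ingestion through deployment, as described above, is at heart a topological ordering over a task graph; that is what orchestrators like Airflow compute from DAG definitions. A stdlib-only sketch (the task names are illustrative):

```python
# How an orchestrator orders dependent tasks: a topological sort over a
# task -> upstream-dependencies graph. Task names are illustrative.
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

pipeline = {
    "ingest": set(),
    "validate": {"ingest"},
    "train": {"validate"},
    "register_model": {"train"},
    "deploy": {"register_model"},
}

def execution_order(graph):
    """Return one valid run order respecting all dependencies."""
    return list(TopologicalSorter(graph).static_order())
```

Real orchestrators add scheduling, retries, and parallel execution of independent branches on top of this ordering.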
passion for new technologies Experience in startups or top-tier consultancies is a plus Nice to Have (not essential): Familiarity with dashboarding tools, TypeScript, and API development Exposure to Airflow, dbt, Databricks Experience with ERP (e.g. SAP, Oracle) and CRM systems What's On Offer: Salary: £50,000-£75,000 + share options Hybrid working: 2-3 days per …
This is a global role leading across Singapore, UK & US. Track record of driving Agile and DevOps practices in high-growth environments. Technical depth across: Node.js, TypeScript, React.js, ECS, Airflow, Kafka, AWS. 10+ years' experience building and managing distributed engineering teams across multiple regions globally.
collaborative environment. Expertise in time series modelling, backtesting, and extracting insights from complex multi-dimensional data. Hands-on experience with AWS, Docker, and ETL/ELT pipelines using Airflow, Dagster, or Prefect.
performance data What You’ll Need 5+ years’ experience in data analytics, ideally with a marketing or growth focus Strong SQL skills and experience building robust data pipelines (DBT, Airflow) Confident using tools like Looker, Amplitude, GA, and Optimizely A/B testing expertise and deep understanding of marketing KPIs and attribution models Great communication skills — able to turn …
driving clarity from data What We’re Looking For 5+ years of analytics experience, ideally in digital or performance marketing Advanced SQL and experience with tools like DBT or Airflow Hands-on experience with marketing analytics tools (GA, Amplitude, Optimizely, etc.) Confident running A/B tests, building attribution models, and defining marketing KPIs Skilled with visualisation platforms (Looker …
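The A/B testing expertise these roles ask for usually comes down to comparing conversion rates between two variants. A stdlib-only two-proportion z-test sketch; the sample counts in the usage below are illustrative:

```python
# Two-proportion z-test for an A/B conversion experiment: compare the
# conversion rate of variant B against variant A using a pooled
# standard error, then convert the z-score to a two-sided p-value.
import math

def ab_z_score(conv_a, n_a, conv_b, n_b):
    """Z-score for the difference in conversion rates (B minus A)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

def p_value_two_sided(z):
    """Two-sided p-value from a z-score via the normal CDF (erf)."""
    return 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
```

For example, 200/1000 conversions on A versus 260/1000 on B gives a z-score above 3, which is significant at the conventional 0.05 level.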
London, South East, England, United Kingdom Hybrid / WFH Options
Xact Placements Limited
of building performant, maintainable, and testable systems Solid background in microservices architecture Proficiency with Postgres & MongoDB (relational + non-relational) Experience with event-driven architectures and asynchronous workflows (Kafka, Airflow, etc.) Solid coding practices (clean, testable, automated) The mindset of a builder: thrives in fast-paced startup environments, takes ownership, solves complex challenges Bonus points if you’ve worked …
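The event-driven, asynchronous workflow pattern mentioned above can be shown in-process with a stdlib queue: a producer publishes events and a decoupled consumer processes them. Kafka replaces the queue with durable, partitioned topics, but the produce/consume shape is the same; the event names here are illustrative:

```python
# In-process sketch of the event-driven pattern: a producer thread-safely
# publishes events to a queue and a consumer handles them asynchronously.
import queue
import threading

events = queue.Queue()
processed = []

def consumer():
    """Drain events until a None sentinel signals shutdown."""
    while True:
        event = events.get()
        if event is None:
            break
        processed.append(f"handled:{event['type']}")
        events.task_done()

t = threading.Thread(target=consumer)
t.start()
events.put({"type": "order_created"})   # producer side
events.put({"type": "order_paid"})
events.put(None)                        # sentinel: stop the consumer
t.join()
```

The key property, as with Kafka consumers, is that the producer never waits on the handler: events are buffered and processed independently.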
City of London, London, United Kingdom Hybrid / WFH Options
Propel
for: 5+ years’ experience in analytics, ideally covering both data and business insight within a fast-paced environment. Strong SQL skills with experience building and maintaining data pipelines (dbt, Airflow, or similar). Comfortable working with marketing and growth data - experience with tools like Optimizely, Amplitude, or GA a plus. Proven experience designing and analysing A/B tests …
Telford, Shropshire, United Kingdom Hybrid / WFH Options
Experis - ManpowerGroup
execution. Ability to work under pressure and manage competing priorities. Desirable Qualifications: Familiarity with DevOps practices and cloud-hosted platforms (AWS, Azure, SAS PaaS). Knowledge of scheduling tools (e.g. Airflow). Exposure to SAS Migration Factory Delivery Models. The successful candidate will also need to be SC vetted. All profiles will be reviewed against the required skills and experience.
Bolton, Greater Manchester, UK Hybrid / WFH Options
Four Recruitment
marketing analytics (Google Analytics 4, Google Ads, Meta Ads, SEO/PPC metrics). Skilled in Google Tag Manager and familiar with CRM/data integration tools (HubSpot, Zapier, Airflow, etc.). Excellent analytical and problem-solving skills with great attention to detail. Strong communication skills and able to explain complex data clearly to non-technical audiences. Comfortable presenting …
Job Title: Airflow/AWS Data Engineer
Location: Manchester Area (3 days per week in the office)
Rate: Up to £400 per day inside IR35
Start Date: 03/11/2025
Contract Length: Until 31st December 2025
Job Type: Contract

Company Introduction: An exciting opportunity has become available with one of our sector-leading financial services clients. They … to join their growing data engineering function. This role will play a key part in designing, deploying, and maintaining modern cloud infrastructure and data pipelines, with a focus on Airflow, AWS, and data platform automation.

Key Responsibilities:
- Deploy and manage cloud infrastructure across Astronomer Airflow and AccelData environments.
- Facilitate integration between vendor products and core systems, including data …
- Establish and enforce best practices for cloud security, scalability, and performance.
- Configure and maintain vendor product deployments, ensuring reliability and optimized performance.
- Ensure high availability and fault tolerance for Airflow clusters.
- Implement and manage monitoring, alerting, and logging solutions for Airflow and related components.
- Perform upgrades, patches, and version management for platform components.
- Oversee capacity planning and resource …
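One common hook for the Airflow alerting responsibility described above is a failure callback, which Airflow invokes with a context dict when a task fails. A minimal sketch; the context keys used (`task_instance_key_str`, `ts`) exist in real Airflow task contexts, but the alert transport here is a stand-in log line, not a specific deployment's setup:

```python
# Sketch of an Airflow-style on_failure_callback for alerting. A real
# deployment would forward the message to Slack, PagerDuty, or email;
# here the "alert" is a log line so the shape stays self-contained.
import logging

logger = logging.getLogger("airflow_alerts")

def on_failure_alert(context):
    """Build and emit an alert message for a failed task instance."""
    message = (f"Task failed: {context.get('task_instance_key_str')} "
               f"at {context.get('ts')}")
    logger.error(message)
    return message
```

The callback would be wired up via a task's or DAG's `on_failure_callback` argument, so every task failure produces an alert without per-task code.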
in AWS. Strong expertise with AWS services, including Glue, Redshift, Data Catalog, and large-scale data storage solutions such as data lakes. Proficiency in ETL/ELT tools (e.g. Apache Spark, Airflow, dbt). Skilled in data processing languages such as Python, Java, and SQL. Strong knowledge of data warehousing, data lakes, and data lakehouse architectures. Excellent analytical …
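The ELT pattern these requirements describe (load raw data first, then transform with SQL inside the warehouse) can be sketched with stdlib sqlite3 standing in for a warehouse such as Redshift; the table and column names are illustrative:

```python
# Minimal ELT sketch: load raw rows as-is, then run the transform as SQL
# inside the database. sqlite3 stands in for a real warehouse here.
import sqlite3

conn = sqlite3.connect(":memory:")

# Load step: raw data lands untransformed.
conn.execute("CREATE TABLE raw_orders (id INTEGER, amount REAL, status TEXT)")
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?, ?)",
    [(1, 10.0, "paid"), (2, 5.0, "refunded"), (3, 7.5, "paid")],
)

# Transform step: derive a cleaned, aggregated table in SQL.
conn.execute("""
    CREATE TABLE order_totals AS
    SELECT status, SUM(amount) AS total
    FROM raw_orders
    GROUP BY status
""")
rows = dict(conn.execute("SELECT status, total FROM order_totals"))
```

Tools like dbt industrialise exactly this transform step, managing the SQL models, their dependencies, and tests; Airflow then schedules the load and transform as pipeline tasks.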
Strong background in SaaS and B2B product development. Experience in cybersecurity platforms or large-scale data systems is highly valued. Tech Stack: Node.js | TypeScript | React | MySQL | AWS | ECS | Kubernetes | Airflow | Kafka | Serverless