London (City of London), South East England, United Kingdom Hybrid / WFH Options
Realtime Recruitment
for smooth deployments, ensuring platform stability and scalability through upgrades, and managing the data analytics platform. Responsibilities: Utilize modern data platform tools (dbt Core, Airflow, Airbyte, Snowflake, etc.). Collaborate with DevOps, Data Engineering, Infrastructure, and InfoSec for seamless application integration. Design, implement, and maintain scalable cloud data platform …
in computer science or a relevant area. Excellent coding skills, specifically in Python. Commercial technical experience with tools such as Spark, Databricks, Airflow, Docker, etc. is very desirable. Commercial Containerisation & Infrastructure as Code experience. Previous work in a CI/CD environment. AWS is the preferred cloud platform; Azure and …
London (Chessington), South East England, United Kingdom
Omnis Partners
BigQuery or similar cloud platforms) • Proficiency in Tableau or other data visualisation tools • Understanding of data modelling (Kimball methodology preferred) • Experience with DBT/Airflow and orchestration tools • Strong communication and stakeholder management skills • Familiarity with GitHub for version control and documentation Bonus Points For: • Python skills • Knowledge of …
Oxford, Oxfordshire, United Kingdom Hybrid / WFH Options
Connect Centric LLC
to enhance scalability and efficiency. Collaboration & Leadership: Work closely with software and AI engineering teams while mentoring junior engineers. Legacy Workflow Integration: Manage ArgoCD, Airflow, Jenkins, Bitbucket, and Bamboo pipelines. Technical Ownership: Act as a tech owner for software products, liaising with stakeholders and presenting cloud solutions. Continuous Learning …
requirements impact analysis, platform selection, technical architecture design, application design and development, testing, and deployment. With proficiency in tools such as Snowflake, DBT, Glue, and Airflow, you will help define the technical strategy and ensure scalable, high-performing data architecture. Your Profile – Essential skills/knowledge/experience: Extensive experience …
Databricks/Snowflake, ideally on AWS. Strong Python experience, including deep knowledge of the Python data ecosystem, with hands-on expertise in Spark and Airflow. Hands-on experience in all phases of data modelling, from conceptualization to database optimization, supported by advanced SQL skills. Hands-on experience with implementing …
Staines, Middlesex, United Kingdom Hybrid / WFH Options
Industrial and Financial Systems
in data pipelines across cloud/on-premises, using Azure and other technologies. Experienced in orchestrating data workflows and Kubernetes clusters on AKS using Airflow, Kubeflow, Argo, Dagster or similar. Skilled with data ingestion tools like Airbyte, Fivetran, etc. for diverse data sources. Expert in large-scale data processing …
patterns and delivery practices. Continually improve with our internal development program, including mentoring and paid training/certifications. Skills: Data Modelling (SQLDBM), Snowflake, AWS, Airflow, DBT. Note: Experience in the energy industry is highly advantageous.
commodities markets (e.g., oil, gas). The PM is keen to get someone from a data engineering background. Tech: Python, SQL, Pandas, AWS, ETL, Airflow. Please apply if this fits.
years of commercial experience as a Data Engineer. Proven experience using Python, Spark, AWS & Databricks. Prior experience using some of the following: Docker/Airflow/Terraform. Solid experience in CI/CD and Software Engineering best practice. HOW TO APPLY: Please register your interest by clicking the Apply …
sources to ensure optimal value extraction. Required Skills: Proficiency in Python. Solid experience with Linux, SQL, relational databases, and version control systems. Familiarity with Airflow, Kafka, and GCP (AWS experience is also acceptable). If this sounds like you, please apply directly or reach out to Daniel O'Connell …
SQL expertise designing and maintaining ETL/data pipelines. Ideally proficiency in multiple cloud infrastructures, databases and data warehousing solutions – AWS and GCP being … Airflow. 👍 Bonus points for: Experience working in a small, fast-growing start-up, comfortable navigating unstructured/fuzzy environments. Experience with RudderStack, Expo and/…
modelling, and extracting value from large, disconnected datasets. Familiarity with data warehouse design and modelling. Familiarity with cloud data services (preferably Azure). Experience with Airflow. 🛂 Please note: unfortunately, this role does not offer visa sponsorship.
London, South East England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
So they offer very flexible working arrangements through both WFH options and flexi working hours. Experience required: expert in Snowflake; strong DBT experience; strong Airflow experience; expert knowledge and understanding of Data Warehousing; strong AWS experience …
Data Expertise – Experience with AWS, Azure, GCP, ETL pipelines, and scalable data architectures. DevOps & CI/CD – Strong knowledge of GitOps, Kubernetes, Terraform, Jenkins, Airflow, and related tools. Strategic Thinking – Ability to drive long-term engineering strategy while delivering incremental value. Technical Debt Management – Experience identifying and remediating inefficient …
Staines, Middlesex, United Kingdom Hybrid / WFH Options
Industrial and Financial Systems
Semantic Kernel, and tools such as MS tooling, Co-Pilot Studio, ML Studio, Prompt flow, Kedro, etc. Proficiency with pipeline orchestration tools, such as Airflow, Kubeflow, and Argo. Outstanding communication skills, combining subject matter expertise with a flair for statistics. A results-driven attitude, a passion for innovation, and …
of IAM, VPCs, networking, and cloud security. A love for solving problems, sharing knowledge, and working as a team. Experience with Kafka, Consul, HAProxy, Airflow, or Tableau. Cloud cost optimisation and disaster recovery experience. Container-first mindset and a passion for automation. You'll be working on high-impact …
to-Have: Experience with marketing data or customer-level models (e.g. uplift, attribution, causal inference, campaign optimization). Familiarity with MLOps tools (e.g. MLflow, FastAPI, Airflow). Exposure to A/B testing and experimentation frameworks. WHY THIS ROLE IS DIFFERENT: This isn't a narrow data science role — you won…
to-Have: Experience with marketing data, customer-level modelling, or decision science (e.g. uplift, attribution, causal AI, optimization). Familiarity with MLOps tooling (MLflow, FastAPI, Airflow, etc.). Experience designing and interpreting A/B tests or other experimental frameworks. Background in consulting, agency, or fast-paced environments where autonomy and …
sectors, and how data supports operational outcomes. Strong coding ability with SQL and Python, as well as experience working with data orchestration tools like Airflow or Dataform. Commercial experience with Spark and Databricks. Familiarity with leading integration and data platforms such as Mulesoft, Talend, or Alteryx. A natural ability …
technical foundation that enables our AI-driven workflows. Skills and Qualifications: Experience building end-to-end platform solutions that integrate workflow orchestration systems (like Airflow, Temporal, AWS Step Functions) with real-world business processes and data pipelines. Strong background in integration engineering and data modelling. Exceptional Python skills for …