robust way possible! Diverse training opportunities and social benefits (e.g. UK pension scheme). What do you offer? Strong hands-on experience working with modern Big Data technologies such as Apache Spark, Trino, Apache Kafka, Apache Hadoop, Apache HBase, Apache NiFi, Apache Airflow, OpenSearch. Proficiency in cloud-native technologies such as containerization and Kubernetes. …
and contribute to technical roadmap planning. Technical Skills: Strong SQL skills with experience in complex query optimisation. Strong Python programming skills with experience in data processing libraries (pandas, NumPy, Apache Spark). Hands-on experience building and maintaining data ingestion pipelines. Proven track record of optimising queries, code, and system performance. Experience with open-source data processing frameworks (Apache Spark, Apache Kafka, Apache Airflow). Knowledge of distributed computing concepts and big data technologies. Experience with version control systems (Git) and CI/CD practices. Experience with relational databases (PostgreSQL, MySQL, or similar). Experience with containerization technologies (Docker, Kubernetes). Experience with data orchestration tools (Apache Airflow or Dagster). Understanding of data warehousing concepts and …
London (City of London), South East England, United Kingdom
Vallum Associates
Note: This is a senior-level role! About the Role: We are looking for an experienced Senior Airflow Developer with over 5 years of experience to help transition our existing Windows scheduler jobs to Apache Airflow DAGs. In this role, you'll play a critical part in modernizing and optimizing our task automation processes by converting existing … into efficient, manageable, and scalable workflows in Airflow. You will also work on security hardening, implementing data pipelines, and designing ETL processes. Key Responsibilities: Convert Windows Scheduler Jobs to Airflow: Migrate existing Windows-based scheduled jobs into Airflow DAGs, ensuring smooth execution and reliability. Develop and Optimize DAGs: Author, schedule, and monitor DAGs (Directed Acyclic Graphs) to handle data workflows, ETL tasks, and various automation processes. Programming and Scripting: Use Python as the primary language for Airflow DAGs and task logic, with experience in SQL for data manipulation. Set Up and Configure Airflow: Provide comprehensive instructions and configurations for setting up Airflow environments, including deployment, resource allocation, and high availability. Security Hardening: Implement security best …
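For context on what such a migration involves: a Windows scheduled task typically becomes a small declarative DAG definition like the configuration sketch below. This is a minimal illustration assuming Airflow 2.4+; the DAG id, cron schedule, and script path are hypothetical, not taken from the listing.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator

# Hypothetical example: a nightly job that previously ran under the
# Windows Task Scheduler, re-expressed as an Airflow DAG with retries.
with DAG(
    dag_id="nightly_report_migration",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="0 2 * * *",               # 02:00 daily, same cron as the old scheduler entry
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    # The legacy .bat/.ps1 step, wrapped in a task so Airflow handles
    # scheduling, retries, and logging instead of the Windows scheduler.
    run_report = BashOperator(
        task_id="run_report",
        bash_command="python /opt/jobs/nightly_report.py",  # hypothetical path
    )
```

The win over the Windows scheduler is that retries, alerting, and run history come from the platform rather than per-job scripting.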
London, South East, England, United Kingdom Hybrid / WFH Options
Robert Half
monitor machine learning models for anomaly detection and failure prediction. Analyze sensor data and operational logs to support predictive maintenance strategies. Develop and maintain data pipelines using tools like Apache Airflow for efficient workflows. Use MLflow for experiment tracking, model versioning, and deployment management. Contribute to data cleaning, feature engineering, and model evaluation processes. Collaborate with engineers and … science libraries (Pandas, Scikit-learn, etc.). Solid understanding of machine learning concepts and algorithms. Interest in working with real-world industrial or sensor data. Exposure to Apache Airflow and/or MLflow (through coursework or experience) is a plus. A proactive, analytical mindset with a willingness to learn and collaborate. Why Join Us: Work on …
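To make the anomaly-detection work above concrete: a common first baseline before training any model is a z-score threshold over a sensor trace. This is an illustrative sketch only, not the team's actual method; the trace values are invented.

```python
import statistics

def zscore_anomalies(readings: list[float], threshold: float = 3.0) -> list[int]:
    """Return indices of readings more than `threshold` population
    standard deviations from the mean -- a simple anomaly baseline."""
    mean = statistics.fmean(readings)
    stdev = statistics.pstdev(readings)
    if stdev == 0:
        return []  # a flat trace has no outliers
    return [i for i, x in enumerate(readings)
            if abs(x - mean) / stdev > threshold]

# Hypothetical sensor trace: steady around 20 with one spike at index 5.
trace = [20.1, 19.9, 20.0, 20.2, 19.8, 55.0, 20.1, 19.9]
print(zscore_anomalies(trace, threshold=2.0))  # → [5]
```

In practice this baseline is what a learned model (isolation forest, autoencoder, etc.) has to beat before it earns a place in the pipeline.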
delivering end-to-end AI/ML projects. Nice to Have: Exposure to LLMs (Large Language Models), generative AI, or transformer architectures. Experience with data engineering tools (Spark, Airflow, Snowflake). Prior experience in fintech, healthtech, or similar domains is a plus. …
Snowflake, Databricks). Strong DevOps mindset with experience in CI/CD pipelines, monitoring, and observability tools (Grafana or equivalent). Exposure to analytics, reporting, and BI tools such as Apache Superset, Lightdash, or OpenSearch. Willingness to work across the stack by contributing to API development and, at times, UI components (Vue.js, Zoho, or similar). Excellent communication and collaboration …
in C# and Python, with additional skills in Java, JavaScript/TypeScript, Angular, very strong SQL, Windows Server, UNIX, and .NET. Strong research skills. Strong experience of Terraform, AWS, Airflow, Docker, GitHub/GitHub Actions, Jenkins/TeamCity. Strong AWS-specific skills for Athena, Lambda, ECS, ECR, S3, and IAM. Strong knowledge of industry best practices in development and …
move us towards our vision of scaling up through product-led growth. This role will be focused on our backend system (Symfony, PHP) and our data products (BigQuery, dbt, Airflow), but there will be opportunities to work across the platform, including agentic AI (Python, LangChain), frontend (React, TypeScript), the APIs (GraphQL, REST), our integration tool of choice (Tray.ai), and …
and Responsibilities: While in this position your duties may include but are not limited to: Support the design, development, and maintenance of scalable data pipelines using tools such as Apache Airflow, dbt, or Azure Data Factory. Learn how to ingest, transform, and load data from a variety of sources, including APIs, databases, and flat files. Assist in the …
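The ingest-transform-load pattern this listing describes can be sketched with nothing but the standard library. A minimal illustration; the CSV source, table, and column names are hypothetical.

```python
import csv
import io
import sqlite3

# Hypothetical flat-file source: daily order amounts, one bad (negative) row.
RAW_CSV = """order_id,amount
1,20.00
2,5.00
3,42.50
4,-3.00
"""

def extract(text: str) -> list[dict]:
    # Ingest: parse the flat file into dict rows.
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows: list[dict]) -> list[tuple]:
    # Transform: cast types and drop non-positive amounts.
    cleaned = [(int(r["order_id"]), float(r["amount"])) for r in rows]
    return [row for row in cleaned if row[1] > 0]

def load(rows: list[tuple], conn: sqlite3.Connection) -> None:
    # Load: write the cleaned rows into a target table.
    conn.execute("CREATE TABLE IF NOT EXISTS orders (order_id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO orders VALUES (?, ?)", rows)

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
total = conn.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
print(total)  # → 67.5
```

In an orchestrated setup, each of these three functions would typically become its own scheduled task, with the orchestrator handling ordering and retries.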
Southwark, London, United Kingdom Hybrid / WFH Options
Involved Productions Ltd
candidate for this role will likely have: a solid foundation in Python and JavaScript, ideally with proficiency in other programming languages. experience designing and implementing ETL pipelines, specifically using Apache Airflow (Astronomer). hands-on experience with ETL frameworks, particularly dbt (data build tool). SQL and various database management system skills. a good understanding of different database …
data modeling (star schema, snowflake schema). Version Control: Practical experience with Git (branching, merging, pull requests). Preferred Qualifications (A Plus): Experience with a distributed computing framework like Apache Spark (using PySpark). Familiarity with cloud data services (AWS S3/Redshift, Azure Data Lake/Synapse, or Google BigQuery/Cloud Storage). Exposure to workflow orchestration tools (Apache Airflow, Prefect, or Dagster). Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field.
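The star-schema modeling mentioned above can be illustrated with a toy warehouse: a central fact table of measurements joined to dimension tables of descriptive attributes. A minimal sketch; the schema and data are invented for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension tables: one row per entity, holding descriptive attributes.
cur.execute("CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT)")
cur.execute("CREATE TABLE dim_date (date_id INTEGER PRIMARY KEY, day TEXT)")
# Fact table: numeric measurements keyed to the dimensions (the star's centre).
cur.execute("CREATE TABLE fact_sales (product_id INTEGER, date_id INTEGER, units INTEGER)")

cur.executemany("INSERT INTO dim_product VALUES (?, ?)",
                [(1, "widget"), (2, "gadget")])
cur.executemany("INSERT INTO dim_date VALUES (?, ?)",
                [(10, "2024-01-01"), (11, "2024-01-02")])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                [(1, 10, 5), (1, 11, 3), (2, 10, 7)])

# The typical query shape: join the fact to a dimension, then aggregate.
rows = cur.execute("""
    SELECT p.name, SUM(f.units)
    FROM fact_sales f
    JOIN dim_product p ON p.product_id = f.product_id
    GROUP BY p.name
    ORDER BY p.name
""").fetchall()
print(rows)  # → [('gadget', 7), ('widget', 8)]
```

A snowflake schema differs only in that the dimensions themselves are normalized into further sub-tables (e.g. product → category).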
deploy solutions in cloud and on-prem environments. Required Skills & Experience: Strong proficiency in Python, including libraries such as Pandas, NumPy, and PySpark. Experience with data engineering tools (e.g., Airflow, Kafka, SQL, Parquet). Solid understanding of commodities markets, trading workflows, and financial instruments. Familiarity with cloud platforms (AWS, Azure, or GCP) and containerization (Docker, Kubernetes). Proven ability …
London (City of London), South East England, United Kingdom
Vertus Partners
with cloud platforms (AWS/GCP/Azure/Snowflake). Passion for data quality, scalability, and collaboration. Experience with SaaS products, analytics tooling, or modern data stack tools (dbt, Airflow). EMI share options. Training budget. Private healthcare. Pension. 25 days holiday + bank holidays. Central London office & socials. Work abroad up to 1 month/year. Join a …
or may consider candidates with exposure to credit, rates, equities, options, etc. Experience with market data providers (Bloomberg, Refinitiv, etc.) would be useful. Any familiarity with tools such as Airflow, Prefect, or other orchestration frameworks would be advantageous. Experience building internal tools or dashboards using Dash, Streamlit, or similar web-based data analytics platforms would be nice to have. …