London (City of London), South East England, United Kingdom
Vallum Associates
and contribute to technical roadmap planning Technical Skills: Strong SQL skills with experience in complex query optimisation Strong Python programming skills with experience in data processing libraries (pandas, NumPy, Apache Spark) Hands-on experience building and maintaining data ingestion pipelines Proven track record of optimising queries, code, and system performance Experience with open-source data processing frameworks (Apache Spark, Apache Kafka, Apache Airflow) Knowledge of distributed computing concepts and big data technologies Experience with version control systems (Git) and CI/CD practices Experience with relational databases (PostgreSQL, MySQL or similar) Experience with containerisation technologies (Docker, Kubernetes) Experience with data orchestration tools (Apache Airflow or Dagster) Understanding of data warehousing concepts and …
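The query-optimisation skill named above can be illustrated with a minimal sketch: SQLite (Python stdlib) reports its execution strategy via `EXPLAIN QUERY PLAN`, and adding an index turns a full table scan into an index search. The table and column names here are invented for the example, not taken from any listing.

```python
# Sketch: observing a query plan change after adding an index (SQLite, stdlib only).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, user_id INTEGER, ts TEXT)")
conn.executemany(
    "INSERT INTO events (user_id, ts) VALUES (?, ?)",
    [(i % 50, f"2024-01-{i % 28 + 1:02d}") for i in range(1000)],
)

def plan(sql):
    # EXPLAIN QUERY PLAN rows carry the plan description in their last column
    return " | ".join(row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT COUNT(*) FROM events WHERE user_id = 7"
print(plan(query))   # before the index: a full table scan (e.g. "SCAN events")
conn.execute("CREATE INDEX idx_events_user ON events(user_id)")
print(plan(query))   # after: a search using idx_events_user instead of a scan
```

The same habit — inspect the plan, then add or adjust indexes — carries over to PostgreSQL/MySQL via their `EXPLAIN` statements.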
Integrate asynchronous workflows using message brokers such as AWS SQS. *Own and evolve containerised deployment pipelines using Docker and CI/CD principles. *Develop and manage data pipelines with Apache Airflow, with data transformation using Python and Pandas. *Guide and mentor a team of engineers, setting high standards for clean code, testing, and technical design. *Promote engineering excellence … and CI/CD practices. Data Engineering & Scripting *Strong scripting and data manipulation skills using Python. *Proficient with Pandas for handling and transforming complex datasets. *Hands-on experience with Apache Airflow for data orchestration and pipeline scheduling. Architecture & Communication *Proven experience designing and leading architectural changes in enterprise systems. *Exceptional communication skills with the ability to present complex …
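The asynchronous message-broker pattern this listing mentions (AWS SQS) can be sketched without AWS credentials by using Python's stdlib `queue` as a stand-in broker; a real SQS integration would go through boto3, which is not shown here, and the message payloads are invented.

```python
# Sketch: decoupled producer/consumer, the core pattern behind SQS-style workflows.
import queue
import threading

broker = queue.Queue()   # stand-in for an SQS queue
results = []

def worker():
    while True:
        msg = broker.get()               # blocks until a message arrives
        if msg is None:                  # sentinel: shut the worker down
            break
        results.append(msg.upper())      # "process" the message
        broker.task_done()

t = threading.Thread(target=worker)
t.start()
for payload in ["order-created", "order-paid"]:
    broker.put(payload)                  # producer publishes without waiting
broker.put(None)
t.join()
print(results)                           # prints ['ORDER-CREATED', 'ORDER-PAID']
```

The producer never waits on the consumer — the same decoupling SQS provides across services rather than threads.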
London, South East, England, United Kingdom Hybrid / WFH Options
Robert Half
monitor machine learning models for anomaly detection and failure prediction. Analyze sensor data and operational logs to support predictive maintenance strategies. Develop and maintain data pipelines using tools like Apache Airflow for efficient workflows. Use MLflow for experiment tracking, model versioning, and deployment management. Contribute to data cleaning, feature engineering, and model evaluation processes. Collaborate with engineers and … science libraries (Pandas, Scikit-learn, etc.). Solid understanding of machine learning concepts and algorithms. Interest in working with real-world industrial or sensor data. Exposure to Apache Airflow and/or MLflow (through coursework or experience) is a plus. A proactive, analytical mindset with a willingness to learn and collaborate. Why Join Us Work on …
delivering end-to-end AI/ML projects. Nice to Have: Exposure to LLMs (Large Language Models), generative AI, or transformer architectures. Experience with data engineering tools (Spark, Airflow, Snowflake). Prior experience in fintech, healthtech, or similar domains is a plus.
London (City of London), South East England, United Kingdom
NP Group
or may consider with an exposure to credit, rates, equities, options, etc. Experience with market data providers (Bloomberg, Refinitiv, etc.) would be useful Any familiarity with tools such as Airflow, Prefect, or other orchestration frameworks would be advantageous. Experience building internal tools or dashboards using Dash, Streamlit, or similar web-based data analytics platforms would be nice to have.
following engineering disciplines: Cloud Engineering, Data Engineering (not building pipelines but designing and building the framework), DevOps, MLOps/LLMOps. Often work with the following technologies: Azure, AWS, GCP; Airflow, dbt, Databricks, Snowflake, etc.; GitHub, Azure DevOps and related developer tooling and CI/CD platforms; Terraform or other Infra-as-Code; MLflow, AzureML or similar for MLOps; LangSmith …
Role Within the Kingdom Work closely with stakeholders to understand their data needs and design scalable solutions Build, maintain and optimise data pipelines and models using SQL, Python and Airflow Design and develop BI and reporting products such as Looker models, dashboards and data visualisations Contribute to our data modelling standards and best practices to ensure quality, reliability and … Thrills Strong SQL skills, able to write complex and performant queries with ease. Solid experience in Python development for data workflows Experience building and maintaining ETL pipelines, ideally with Apache Airflow or a similar orchestration tool Hands-on experience with Google Cloud Platform (BigQuery, GCS, etc.) or another major cloud provider Good understanding of data modelling principles and …
East London, London, United Kingdom Hybrid / WFH Options
InfinityQuest Ltd
Hybrid) (3 days onsite & 2 days remote) Role Type: 6 Months Contract with possibility of extensions Mandatory Skills: Python, PostgreSQL, Azure Databricks, AWS (S3), Git, Azure DevOps CI/CD, Apache Airflow, Energy Trading experience Job Description: Data Engineer (Python enterprise developer): 6+ years of experience in Python scripting. Proficient in developing applications in Python. Exposed to Python …
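A minimal sketch of the extract-transform-load flow a role like this involves, using only the Python stdlib (`csv` for extraction, `sqlite3` as the load target); in the listing's actual stack the transform step would use pandas and the scheduling Apache Airflow. All data values below are made up.

```python
# Sketch: tiny ETL pipeline — extract CSV rows, transform types, load into SQLite.
import csv
import io
import sqlite3

raw = "symbol,price\nBP,512.3\nSHEL,2710.0\n"   # hypothetical extracted feed
# transform: parse text rows into typed tuples
rows = [(r["symbol"], float(r["price"])) for r in csv.DictReader(io.StringIO(raw))]

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE prices (symbol TEXT, price REAL)")
db.executemany("INSERT INTO prices VALUES (?, ?)", rows)   # load step
print(db.execute("SELECT COUNT(*), MAX(price) FROM prices").fetchone())  # prints (2, 2710.0)
```

Each step (extract, transform, load) is kept separate so an orchestrator can retry or schedule them independently — the design idea behind tasks in an Airflow DAG.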
London, South East England, United Kingdom Hybrid / WFH Options
Fruition Group
platforms. Proven experience in a leadership or technical lead role, with official line management responsibility. Strong experience with modern data stack technologies, including Python, Snowflake, AWS (S3, EC2, Terraform), Airflow, dbt, Apache Spark, Apache Iceberg, and Postgres. Skilled in balancing technical excellence with business priorities in a fast-paced environment. Strong communication and stakeholder management skills, able …
frameworks, and clear documentation within your pipelines Experience in the following areas is not essential but would be beneficial: Data Orchestration Tools: Familiarity with modern workflow management tools like Apache Airflow, Prefect, or Dagster Modern Data Transformation: Experience with dbt (Data Build Tool) for managing the transformation layer of the data warehouse BI Tool Familiarity: An understanding of …
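What orchestration tools like Apache Airflow, Prefect, and Dagster fundamentally provide is dependency-ordered task execution. A hedged sketch of that core idea using Python's stdlib `graphlib` (the task names are invented for illustration):

```python
# Sketch: resolving task dependencies into an execution order,
# the scheduling problem workflow orchestrators solve.
from graphlib import TopologicalSorter

# each task maps to the set of tasks that must finish before it
deps = {
    "transform": {"extract"},
    "load": {"transform"},
    "report": {"load"},
}
order = list(TopologicalSorter(deps).static_order())
print(order)   # prints ['extract', 'transform', 'load', 'report']
```

Real orchestrators layer scheduling, retries, and observability on top of this ordering; in Airflow the same chain would be declared with `extract >> transform >> load >> report` inside a DAG.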
record in full stack data development (from ingestion to visualization). Strong expertise in Snowflake, including data modeling, warehousing, and performance optimization. Hands-on experience with ETL tools (e.g., Apache Airflow, dbt, Fivetran) and integrating data from ERP systems like NetSuite. Proficiency in SQL, Python, and/or other scripting languages for data processing and automation. Familiarity with …
in Java/Scala Cloud (AWS preferred) + Infrastructure-as-Code familiarity Solid understanding of ETL/ELT and data modeling Nice to have: Iceberg/Delta/Hudi, Airflow/Dagster/Prefect, Python/TypeScript, data governance.
City of London, London, United Kingdom Hybrid / WFH Options
twentyAI
AWS/GCP/Azure/Snowflake) Passion for data quality, scalability, and collaboration Nice to have: Experience with SaaS products, analytics tooling, or modern data stack tools (dbt, Airflow). Perks: EMI share options • Training budget • Private healthcare • Pension • 25 days holiday + bank holidays • Central London office & socials • Work abroad up to 1 month/year Why …