London (City of London), South East England, United Kingdom
Vallum Associates
and contribute to technical roadmap planning
Technical Skills:
● Strong SQL skills with experience in complex query optimisation
● Strong Python programming skills with experience in data processing libraries (pandas, NumPy, Apache Spark)
● Hands-on experience building and maintaining data ingestion pipelines
● Proven track record of optimising queries, code, and system performance
● Experience with open-source data processing frameworks (Apache Spark, Apache Kafka, Apache Airflow)
● Knowledge of distributed computing concepts and big data technologies
● Experience with version control systems (Git) and CI/CD practices
● Experience with relational databases (PostgreSQL, MySQL or similar)
● Experience with containerisation technologies (Docker, Kubernetes)
● Experience with data orchestration tools (Apache Airflow or Dagster)
● Understanding of data warehousing concepts and …
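By way of illustration for the pandas data-processing skills listed above, a minimal ingestion-and-aggregation sketch — the file, column names and output path are hypothetical, not taken from any specific role:

```python
import pandas as pd

# Hypothetical ingestion step: read raw events, coerce a numeric column,
# and compute daily aggregates
df = pd.read_csv("raw_events.csv")  # assumed input file
df["amount"] = pd.to_numeric(df["amount"], errors="coerce")  # bad records become NaN
df = df.dropna(subset=["amount"])

daily = (
    df.groupby("event_date")["amount"]
      .agg(["count", "sum", "mean"])
      .reset_index()
)
daily.to_parquet("daily_amounts.parquet", index=False)  # columnar output for downstream use
```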
the success of the data department overall. TLA works with the modern data stack, utilising Snowflake for our data warehouse, dbt to transform data across our medallion architecture, and Apache Airflow for orchestration. Microsoft Azure is our choice of cloud provider for hosting infrastructure. Within the role you will be hands-on with all these exciting technologies. Many …
Nice-to-Have Skills:
● Experience with both batch and near real-time data pipelines
● Familiarity with Infrastructure as Code (Terraform)
● Experience with dbt and medallion architecture patterns
● Knowledge of Apache Airflow or similar orchestration tools
● Azure cloud platform experience
Why Join TLA? TLA is a fast-moving, innovative digital business that partners with some of the biggest automotive …
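A minimal sketch of the orchestration pattern a Snowflake/dbt/Airflow stack like this typically implies — an Airflow 2.x DAG that lands raw data and then triggers dbt. The DAG id, schedule and commands are illustrative assumptions, not TLA's actual pipeline:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator

def extract_to_bronze():
    """Placeholder: land raw source data in the bronze layer."""
    pass

with DAG(
    dag_id="daily_medallion_refresh",  # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                 # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_to_bronze", python_callable=extract_to_bronze)
    transform = BashOperator(task_id="dbt_run", bash_command="dbt run")
    extract >> transform  # run transformations only after raw data has landed
```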
IR35 | Immediate start | 12 month contract
Essential:
● Been to school in the UK
● Data Ingestion of APIs
● GCP based (Google Cloud Platform)
● Snowflake
● BigQuery
● DBT
● Semantic layer (Cube/Looker)
Desirable:
● Airflow (Apache Airflow) …
London, South East, England, United Kingdom Hybrid / WFH Options
Robert Half
monitor machine learning models for anomaly detection and failure prediction. Analyze sensor data and operational logs to support predictive maintenance strategies. Develop and maintain data pipelines using tools like Apache Airflow for efficient workflows. Use MLflow for experiment tracking, model versioning, and deployment management. Contribute to data cleaning, feature engineering, and model evaluation processes. Collaborate with engineers and … science libraries (Pandas, Scikit-learn, etc.). Solid understanding of machine learning concepts and algorithms. Interest in working with real-world industrial or sensor data. Exposure to Apache Airflow and/or MLflow (through coursework or experience) is a plus. A proactive, analytical mindset with a willingness to learn and collaborate. Why Join Us: Work on …
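To give a flavour of the MLflow experiment tracking and anomaly-detection work described above, a small sketch — the experiment name is invented and the sensor data is synthetic:

```python
import numpy as np
import mlflow
import mlflow.sklearn
from sklearn.ensemble import IsolationForest

# Synthetic stand-in for a sensor feature matrix
rng = np.random.default_rng(0)
X_train = rng.normal(size=(1000, 4))

mlflow.set_experiment("sensor-anomaly-detection")  # hypothetical experiment name
with mlflow.start_run():
    model = IsolationForest(n_estimators=100, random_state=0).fit(X_train)
    mlflow.log_param("n_estimators", 100)
    # IsolationForest.predict returns -1 for anomalies, 1 for inliers
    anomaly_rate = float((model.predict(X_train) == -1).mean())
    mlflow.log_metric("train_anomaly_rate", anomaly_rate)
    mlflow.sklearn.log_model(model, "model")  # version the fitted model as a run artifact
```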
and Responsibilities While in this position your duties may include but are not limited to: Support the design, development, and maintenance of scalable data pipelines using tools such as Apache Airflow, dbt, or Azure Data Factory. Learn how to ingest, transform, and load data from a variety of sources, including APIs, databases, and flat files. Assist in the …
of the programming languages like Python, C++, Java
● Experience with version control and code review tools such as Git
● Knowledge of latest data pipeline orchestration tools such as Airflow
● Experience with cloud platforms (AWS, GCP, or Azure) and infrastructure-as-code tools (e.g., Docker, Terraform, CloudFormation).
● Familiarity with data quality, data governance, and observability tools (e.g., Great …
Role Within the Kingdom:
● Work closely with stakeholders to understand their data needs and design scalable solutions
● Build, maintain and optimise data pipelines and models using SQL, Python and Airflow
● Design and develop BI and reporting products such as Looker models, dashboards and data visualisations
● Contribute to our data modelling standards and best practices to ensure quality, reliability and …
Thrills:
● Strong SQL skills, able to write complex and performant queries with ease.
● Solid experience in Python development for data workflows
● Experience building and maintaining ETL pipelines, ideally with Apache Airflow or a similar orchestration tool
● Hands-on experience with Google Cloud Platform (BigQuery, GCS, etc.) or another major cloud provider
● Good understanding of data modelling principles and …
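As one illustration of the BigQuery work named above, a minimal query via the official Python client — the project, dataset and column names are invented:

```python
from google.cloud import bigquery

client = bigquery.Client()  # credentials and project resolved from the environment

# Hypothetical table and columns
sql = """
    SELECT user_id, COUNT(*) AS sessions
    FROM `my-project.analytics.sessions`
    WHERE session_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY)
    GROUP BY user_id
    ORDER BY sessions DESC
"""
df = client.query(sql).to_dataframe()  # pandas extras (db-dtypes) assumed installed
print(df.head())
```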
frameworks, and clear documentation within your pipelines Experience in the following areas is not essential but would be beneficial:
● Data Orchestration Tools: Familiarity with modern workflow management tools like Apache Airflow, Prefect, or Dagster
● Modern Data Transformation: Experience with dbt (Data Build Tool) for managing the transformation layer of the data warehouse
● BI Tool Familiarity: An understanding of …
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
/BI role Advanced SQL skills with hands-on experience using dbt for data modeling Strong familiarity with modern data platforms like Snowflake, Looker, AWS/GCP, Tableau, or Airflow Experience with version control tools (e.g., Git) Ability to design, build, and document scalable, reliable data models Comfortable gathering business requirements and translating them into data architecture Strong problem …
research and technology teams. Exposure to low-latency or real-time systems. Experience with cloud infrastructure (AWS, GCP, or Azure). Familiarity with data engineering tools such as Kafka, Airflow, Spark, or Dask. Knowledge of equities, futures, or FX markets.
Company: Rapidly growing hedge fund with offices globally, including London.
Salary & Benefits: The salary range/rates of pay is …
inclusive and collaborative culture, encouraging peer to peer feedback and evolving healthy, curious and humble teams. Tech Stack: Python, Javascript/Typescript, React/React Native, AWS, GraphQL, Snowflake, Airflow, DDD. This is an incredible opportunity for a Senior Software Engineer to join a unique company as they embark on a period of significant growth to take their fantastic …
Edinburgh, York Place, City of Edinburgh, United Kingdom
Bright Purple
architectures (e.g. Databricks, Snowflake) Collaborating with multidisciplinary teams to deliver real business value What we’re looking for Strong experience with Python, SQL , and pipeline tools such as dbt or Airflow Proven background in data modelling, warehousing, and performance optimisation Hands-on experience with cloud data services (Glue, Lambda, Synapse, BigQuery, etc.) A consultancy mindset – adaptable, collaborative, and delivery-focused …
Machine Learning: Experience supporting and understanding ML pipelines and models in a production setting. Direct experience with Google Cloud Platform, BigQuery, and associated tooling. Experience with workflow tools like Airflow or Kubeflow. Familiarity with dbt (Data Build Tool). Please send your CV for more information on these roles. Reasonable Adjustments: Respect and equality are core values to us. …
Manchester, England, United Kingdom Hybrid / WFH Options
Client Server
As a Senior Data Engineer you will take ownership of the data platform, optimising it for scalability to ensure successful client onboarding. You'll use modern tools (such as Airflow, Prefect, Dagster or AWS Step Functions) for ETL design and orchestration, work on transformation logic to clean, validate and enrich data (including handling missing values, standardising formats and duplication …
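A short sketch of the transformation logic described above (missing values, format standardisation, de-duplication) using pandas — the records and rules are toy examples, not the platform's actual logic:

```python
import pandas as pd

# Toy records exhibiting the issues named above: missing values,
# inconsistent formats, and duplicates (all values hypothetical)
df = pd.DataFrame({
    "email": ["A@X.COM ", "a@x.com", None, "b@y.com"],
    "country": ["gb", "GB", "GB", "United Kingdom"],
})

df["email"] = df["email"].str.strip().str.lower()               # standardise format
df["country"] = df["country"].str.upper().replace({"UNITED KINGDOM": "GB"})
df = df.dropna(subset=["email"])                                # handle missing values
df = df.drop_duplicates(subset=["email"], keep="first")         # de-duplicate
```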
communication skills - able to collaborate across technical and business functions. A passion for data-driven problem solving, innovation, and continuous improvement. Nice-to-haves: Experience with data orchestration tools (Airflow, Prefect). Knowledge of data observability or cataloguing tools (Monte Carlo, OpenMetadata). Familiarity with large-scale consumer data or survey environments.