Oxford, England, United Kingdom Hybrid / WFH Options
Akrivia Health
development lifecycles, cloud technologies and modern engineering practices.
● Experience with the following technologies:
  o Cloud Provider: AWS
  o Languages: Python, PHP, Rust & SQL
  o Hosting: Kubernetes
  o Tooling & Analytics: Airflow, RabbitMQ, Apache Spark, Power BI
● Proven ability to complete projects according to outlined scope, budget, and timeline
● Experience with industry-standard tools such as Microsoft products, Jira, Confluence, project …
Manchester, North West, United Kingdom Hybrid / WFH Options
Birchwell Associates Ltd
automation, reliability, and agility. Key Responsibilities Design, build, and optimise data pipelines across a modern data platform. Ingest, clean, and transform data using tools such as dbt, Snowflake, and Airflow. Collaborate with cross-functional teams to deliver data products aligned to business priorities. Develop scalable data models that support BI and analytics platforms including Tableau and Power BI. … and optimise complex queries. Hands-on experience with dbt (including testing and layered modelling). Practical knowledge of Snowflake for loading, transforming, and exporting datasets. Experience building and managing Airflow DAGs for pipeline orchestration. Understanding of BI tool requirements (e.g., Tableau, Power BI) and related performance considerations. Advanced Excel capability, including pivot tables and complex formulas. Familiarity with data …
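The Airflow DAGs this listing describes boil down to tasks executed in dependency order. A minimal stdlib sketch of that ordering guarantee (the task names are hypothetical; a real Airflow DAG would declare operators and wire them with `>>` rather than a plain dict):

```python
from graphlib import TopologicalSorter

def run_pipeline():
    # Hypothetical task names mapped to their upstream dependencies.
    tasks = {
        "ingest": [],
        "clean": ["ingest"],
        "transform_dbt": ["clean"],
        "publish_bi": ["transform_dbt"],
    }
    # TopologicalSorter yields each task only after its upstreams,
    # the same ordering guarantee an orchestrator like Airflow provides.
    order = TopologicalSorter({t: set(deps) for t, deps in tasks.items()})
    results = []
    for task in order.static_order():
        results.append(task)  # placeholder for the real task callable
    return results
```

For this linear chain the order is fully determined: ingest, clean, transform_dbt, publish_bi.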
automated pipelines, and shaping the foundational framework for how we leverage data to succeed. What You'll Do You'll develop and maintain data pipelines and automated processes in Airflow and Python You'll create SQL data models with dbt to power dashboards and applications You'll integrate third-party APIs and databases into our data flows You'll … notebook analytics and collaboration CircleCI for continuous deployment AWS cloud infrastructure Kubernetes for data services and task orchestration Google Analytics, Amplitude and Firebase for client application event processing Airflow for job scheduling and tracking Parquet and Delta file formats on S3 for data lake storage Streamlit for data applications Why else you'll love it here Wondering what …
City of London, London, United Kingdom Hybrid / WFH Options
Medialab Group
colleagues. Nice to Have Skills Experience working with GCP (BigQuery) or other modern cloud-native data warehouses (e.g. Snowflake, Redshift). Familiarity with data pipelining and orchestration systems (e.g. Airflow). Understanding of modern analytics architectures and data visualisation tools (we use Preset.io/Apache Superset). Exposure to CI/CD pipelines (GitLab CI preferred). Experience …
Reigate, Surrey, England, United Kingdom Hybrid / WFH Options
esure Group
and governance through robust access controls, including RBAC, SSO, token policies, and pseudonymisation frameworks. Develop resilient data flows for both batch and streaming workloads using technologies such as Kafka, Airflow, dbt, and Terraform. Shape data strategy and standards by contributing to architectural decisions, authoring ADRs, and participating in reviews, data councils, and platform enablement initiatives. Qualifications What we’d … requirements Direct exposure to cloud-native data infrastructures (Databricks, Snowflake), especially in AWS environments, is a plus Experience in building and maintaining batch and streaming data pipelines using Kafka, Airflow, or Spark Familiarity with governance frameworks, access controls (RBAC), and implementation of pseudonymisation and retention policies Exposure to enabling GenAI and ML workloads by preparing model-ready and vector …
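The pseudonymisation frameworks this role mentions typically reduce to a deterministic keyed mapping from identifiers to stable tokens. A minimal stdlib sketch using HMAC (the key handling is an assumption; a real deployment would pull the key from a secrets manager, never from source code):

```python
import hashlib
import hmac

# Placeholder key for illustration only - in practice this comes from a
# secrets manager or KMS, not from code.
SECRET_KEY = b"replace-with-managed-secret"

def pseudonymise(value: str) -> str:
    """Deterministically map an identifier to a stable pseudonym.

    Keyed hashing means the same input always yields the same token
    (so joins still work), but the mapping cannot be reversed without
    the key.
    """
    return hmac.new(SECRET_KEY, value.encode("utf-8"), hashlib.sha256).hexdigest()
```

Determinism is what distinguishes pseudonymisation from anonymisation: records for the same person remain linkable across tables, while the raw identifier is withheld.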
Join our rapidly expanding team as a hands-on Cloud Data Analytics Platform Engineer and play a pivotal role in shaping the future of data at Citi. We're building a cutting-edge, multi-cloud data analytics platform that empowers …
Newcastle Upon Tyne, Tyne and Wear, England, United Kingdom
Norton Rose Fulbright LLP
Azure/Microsoft Fabric/Data Factory) and modern data warehouse technologies (Databricks, Snowflake) Experience with database technologies such as RDBMS (SQL Server, Oracle) or NoSQL (MongoDB) Knowledge of Apache technologies such as Spark, Kafka and Airflow to build scalable and efficient data pipelines Ability to design, build, and deploy data solutions that explore, capture, transform, and utilize …
Huddersfield, West Yorkshire, Yorkshire, United Kingdom Hybrid / WFH Options
Oscar Associates (UK) Limited
Job Title: Data Engineer Salary: £40k - £60k + Excellent Benefits Package Location: Huddersfield (Hybrid) Key Skills: SQL, Power BI, Airflow Summary A new role has opened up for a Data Engineer with SQL, BI, cloud and Airflow DAG experience to join a media-focused business. The role has opened up as the company are heavily investing, and have exciting plans … responsibilities will cover: Develop data models to support company Business Intelligence Write and optimize complex SQL queries Build, maintain and improve data pipelines Transform data using dbt, Snowflake and Airflow Ensure data is handled correctly and to relevant standards Collaborate with tech teams to collectively solve shared data challenges Key Skills SQL dbt Snowflake Airflow DAG Cloud Platform … successful candidate to Oscar. Email: to recommend someone for this role Oscar Associates (UK) Limited is acting as an Employment Agency in relation to this vacancy. To understand more about what we do with your data please review our privacy …
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
methodology and a Medallion architecture (bronze, silver, gold layers). Develop and maintain dbt projects and configure incremental loads with built-in unit testing. Support data pipeline orchestration with Airflow and work with AWS cloud tools. Help deliver a production-ready Data Mart with star schema design to power business reporting and dashboards (Power BI experience a plus). Skills … Experience: Strong SQL expertise and hands-on experience with dbt. Familiarity with Kimball dimensional modelling concepts. Experience working with cloud data warehouses such as Redshift or Snowflake. Knowledge of Airflow for workflow management. Comfortable in AWS environments and data orchestration. Bonus: Python programming skills and familiarity with dashboarding tools. Contract Details: Duration: 3 months Rate: £450/day onsite …
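Incremental loads of the kind dbt's incremental materialisation automates follow an upsert pattern: merge only new or changed rows into the target instead of rebuilding it. A sketch against SQLite (table and column names are illustrative, not from the role description):

```python
import sqlite3

def incremental_merge(conn, rows):
    """Upsert (id, amount) pairs into a fact table keyed on id.

    Existing ids are updated in place, new ids are inserted - the core
    of an incremental load between, say, a silver and gold layer.
    """
    conn.execute(
        "CREATE TABLE IF NOT EXISTS fact_orders (id INTEGER PRIMARY KEY, amount REAL)"
    )
    conn.executemany(
        "INSERT INTO fact_orders (id, amount) VALUES (?, ?) "
        "ON CONFLICT(id) DO UPDATE SET amount = excluded.amount",
        rows,
    )
    conn.commit()
```

Running it twice with overlapping keys leaves one row per id, with the latest amount winning, which is exactly the idempotence an orchestrated pipeline relies on for safe retries.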
City of London, London, United Kingdom Hybrid / WFH Options
Hlx Technology
Collaborate with ML researchers and biologists to translate raw data into actionable insights and high-quality training data Scale distributed systems using Kubernetes, Terraform, and orchestration tools such as Airflow, Flyte, or Temporal Write clean, extensible, and well-tested code to ensure long-term maintainability and collaboration across teams About You We are looking for data and platform engineers … Experience designing and implementing large-scale data storage systems (feature stores, timeseries databases, warehouses, or object stores) Strong distributed systems and infrastructure skills (Kubernetes, Terraform, orchestration frameworks such as Airflow/Flyte/Temporal) Hands-on cloud engineering experience (AWS, GCP, or Azure) Strong software engineering fundamentals, with a track record of writing maintainable, testable, and extensible code Familiarity …
the development of internal ML platforms, and help identify high-impact opportunities for rapid capability delivery. You'll also be responsible for building robust MLOps pipelines using tools like Airflow and MLflow, ensuring scalable and automated model deployment and governance. What you'll need to succeed Essential Skills: * Bachelor's degree (2:1 or above) in science, engineering, mathematics … or computer science * Strong Python development skills * Practical experience with machine learning or deep learning in scientific/engineering contexts * Familiarity with MLOps tools and practices (Airflow, MLflow) and containerisation (e.g., Docker) * Passion for working with real-world datasets and data visualisation * Interest in material discovery, computer vision, big data, and optimisation techniques * Excellent communication and collaboration skills * Problem …
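MLflow's role in such a pipeline is experiment tracking: recording parameters and metrics per run so that model selection and deployment are auditable. A stdlib stand-in for that pattern (the class and field names are illustrative, not the MLflow API):

```python
import time

class RunTracker:
    """Minimal experiment tracker: one record per training run."""

    def __init__(self):
        self.runs = []

    def log_run(self, params: dict, metrics: dict) -> dict:
        # Each run captures what was tried (params) and how it scored
        # (metrics), timestamped for auditability.
        run = {"timestamp": time.time(), "params": params, "metrics": metrics}
        self.runs.append(run)
        return run

    def best_run(self, metric: str) -> dict:
        # Pick the run that maximises the given metric - the basis for
        # promoting a model to deployment.
        return max(self.runs, key=lambda r: r["metrics"][metric])
```

A real MLflow setup adds artifact storage, a model registry, and stage transitions (staging/production) on top of this same record-and-compare core.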
concepts in simple, actionable terms. Collaborate with the central data engineering team managing the cloud infrastructure (BigQuery, GCP). Coordinate with third-party partners responsible for pipeline orchestration and Airflow maintenance. Drive standardisation across naming conventions, definitions, and documentation to strengthen governance and data discoverability. Influence the organisation’s data roadmap, ensuring it aligns with evolving business priorities. Take … cloud data warehouses (ideally BigQuery). Proficiency in modular data modelling frameworks such as Dataform or dbt, with an emphasis on testing and documentation. Experience with orchestration tools (e.g. Airflow) for managing data pipelines. Familiarity with modern data stacks, version control (Git), and CI/CD best practices. Strong ability to validate and interrogate large datasets, with a high …
them into SDKs, and use them as the foundation for first-class integrations with platforms like Snowflake, Databricks, and BigQuery, as well as orchestration and streaming technologies such as Airflow, Kafka, and dbt. You will ensure these APIs and SDKs are designed with a best-in-class developer experience - consistent, intuitive, well-documented, and secure. Your work will be … ideally with a focus on APIs, SDKs, data platforms, data integration, or enterprise SaaS. Data platform knowledge: Strong familiarity with data warehouses/lakehouses (Snowflake, Databricks, BigQuery), orchestration tools (Airflow, Prefect), streaming (Kafka, Flink), and transformation (dbt). Technical proficiency: Solid understanding of REST/GraphQL APIs, SDK development, authentication/authorization standards (OAuth, SSO), and best practices in …
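The SDK pattern this role describes, a thin client wrapping REST endpoints with consistent auth, can be sketched with the stdlib. The base URL, endpoint path, and bearer-token scheme below are illustrative assumptions, not a real platform's API:

```python
from urllib.request import Request

class WarehouseClient:
    """Hypothetical SDK client: centralises base URL and auth headers so
    every call through the SDK is authenticated the same way."""

    def __init__(self, token: str, base_url: str = "https://api.example.com"):
        self.token = token
        self.base_url = base_url.rstrip("/")

    def build_request(self, path: str) -> Request:
        # Normalise the path join and attach standard headers once,
        # rather than in every caller.
        req = Request(f"{self.base_url}/{path.lstrip('/')}")
        req.add_header("Authorization", f"Bearer {self.token}")
        req.add_header("Accept", "application/json")
        return req
```

Centralising request construction like this is what makes an SDK "consistent and secure" in practice: auth, content negotiation, and URL handling live in one audited place instead of being re-implemented per integration.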