London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
Experience in ELT processes and best practices for cloud-based data warehousing. Knowledge of performance tuning techniques for optimising BigQuery queries and costs. Familiarity with cloud services (GCP, Terraform, Airflow, etc.) and their integration with BigQuery. HOW TO APPLY: Please register your interest by sending your CV via the apply link on this page.
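As an aside on the BigQuery tuning skills this listing asks for: a common cost-control technique is to filter on a partition column and to dry-run queries before executing them. Below is a minimal, illustrative Python sketch using the google-cloud-bigquery client; the project, dataset, table, and column names are invented for the example.

```python
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

# Filtering on the partition column prunes partitions, cutting the bytes
# scanned -- which is what BigQuery's on-demand pricing bills for.
sql = """
    SELECT user_id, SUM(amount) AS total_spend
    FROM `my-project.analytics.events`   -- hypothetical partitioned table
    WHERE event_date BETWEEN '2024-01-01' AND '2024-01-07'
    GROUP BY user_id
"""

# A dry run estimates the bytes a query would scan without running it.
job = client.query(sql, job_config=bigquery.QueryJobConfig(dry_run=True))
print(f"Estimated bytes scanned: {job.total_bytes_processed:,}")
```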
System Reliability Engineer. Experience with building, maintaining and continuously enhancing the automations needed for scalability and efficiency in running the network infrastructure. Experience with infrastructure automation and orchestration frameworks, e.g. Ansible, Airflow, Terraform, Chef, Salt. Proven experience with object-oriented programming languages, preferably Python. A bachelor's or master's degree in computer science, engineering, mathematics, or a similar field of …
… /medical devices preferred but not required). Strong Python programming and data engineering skills (Pandas, PySpark, Dask). Proficiency with databases (SQL/NoSQL), ETL processes, and modern data frameworks (Apache Spark, Airflow, Kafka). Solid experience with cloud platforms (AWS, GCP, or Azure) and CI/CD for data pipelines. Understanding of data privacy and healthcare compliance (GDPR, HIPAA …
Experience with modern data stacks is essential for building robust, scalable data pipelines across cloud and hybrid platforms. Key technologies include Spark, Databricks, Python/Scala, SQL, Delta Lake, and Airflow. Containerization and orchestration: proficiency in containerization tools like Docker and Kubernetes, as well as an understanding of orchestration workflows, is highly beneficial for Data Architects.
… optimisation. This is an ideal role for someone looking to hit the ground running and work on complex challenges with autonomy. Role Requirements: Exceptional ability with tools such as Python, Airflow, SQL, and at least one cloud provider. Experience with forecasting, customer, and propensity models. Experience building machine learning models and deploying them at scale. 2:1 or above in Mathematics …
… centred around a software product, with solid Python coding skills and expertise in cloud infrastructure (preferably AWS). Familiarity with containers and MLE tools such as MLflow and Airflow is essential, with any knowledge of AI SaaS or GenAI APIs being a bonus. But what truly matters is your passion for learning and advancing technology. In return …
… in AWS. Strong expertise with AWS services, including Glue, Redshift, Data Catalog, and large-scale data storage solutions such as data lakes. Proficiency in ETL/ELT tools (e.g. Apache Spark, Airflow, dbt). Skilled in data processing languages such as Python, Java, and SQL. Strong knowledge of data warehousing, data lakes, and data lakehouse architectures. Excellent analytical …
… data quality, or other areas directly relevant to data engineering responsibilities and tasks. Proven project experience developing and maintaining data warehouses in big data solutions (Snowflake). Expert knowledge of Apache technologies such as Kafka, Airflow, and Spark to build scalable and efficient data pipelines. Ability to design, build, and deploy data solutions that capture, explore, transform, and utilize …
… in delta one, store of value, and/or FICC options trading. Experience with Linux-based, concurrent, high-throughput, low-latency software systems. Experience with pipeline orchestration frameworks (e.g. Airflow, Dagster). Experience with streaming platforms (e.g. Kafka), data lake platforms (e.g. Delta Lake, Apache Iceberg), and relational databases. Have a Bachelor's or advanced degree in Computer Science, Mathematics …
in Python, ensuring scalability and reliability. Extract data from multiple external sources via APIs and, where necessary, web scraping/browser automation (Playwright, Selenium, Puppeteer). Orchestrate pipelines using Airflow, and manage data quality workflows. Model and transform data in SQL and Snowflake to create clean, analytics-ready datasets. Ensure data quality, observability, and governance across workflows. Collaborate closely … who bring: Strong hands-on experience with Python for API ingestion, pipeline automation, and data transformation. Solid SQL skills with Snowflake (or similar cloud data warehouses). Experience with Airflow or other orchestration tools. Knowledge of data modelling, warehouse performance optimisation, and governance. Cloud experience (AWS preferred; Terraform/Docker a plus). Nice-to-have: browser automation/…
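As a rough sketch of the API-ingestion work described above (not this employer's actual code): paginated extraction with the requests library, written so that a failure raises and lets the orchestrator retry. The endpoint, parameters, and response field names are hypothetical.

```python
import requests

BASE_URL = "https://api.example.com/v1/orders"  # hypothetical endpoint

def fetch_all(api_key: str) -> list[dict]:
    """Page through a cursor-paginated API and collect every record."""
    records: list[dict] = []
    cursor = None
    session = requests.Session()
    session.headers["Authorization"] = f"Bearer {api_key}"
    while True:
        params = {"limit": 500}
        if cursor:
            params["cursor"] = cursor
        resp = session.get(BASE_URL, params=params, timeout=30)
        resp.raise_for_status()  # let the scheduler (e.g. Airflow) handle retries
        payload = resp.json()
        records.extend(payload["data"])          # assumed response shape
        cursor = payload.get("next_cursor")
        if not cursor:
            return records
```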
Oxford, England, United Kingdom Hybrid / WFH Options
Akrivia Health
… development lifecycles, cloud technologies and modern engineering practices.
● Experience with the following technologies:
  o Cloud provider: AWS
  o Languages: Python, PHP, Rust & SQL
  o Hosting: Kubernetes
  o Tooling & analytics: Airflow, RabbitMQ, Apache Spark, PowerBI
● Proven ability to complete projects according to outlined scope, budget, and timeline
● Experience with industry-standard tools such as Microsoft products, Jira, Confluence, project …
… a petabyte-scale Data Lake and create secure, efficient, and scalable environments for our data platforms. Leveraging cloud-native technologies and AWS tools such as AWS S3, EKS, Glue, Airflow, Trino, and Parquet, you will prepare to adopt Apache Iceberg for greater performance and flexibility. You'll address high-performance data workloads, ensuring seamless execution of massive queries …
Manchester, North West, United Kingdom Hybrid / WFH Options
Birchwell Associates Ltd
… automation, reliability, and agility. Key Responsibilities: Design, build, and optimise data pipelines across a modern data platform. Ingest, clean, and transform data using tools such as dbt, Snowflake, and Airflow. Collaborate with cross-functional teams to deliver data products aligned to business priorities. Develop scalable data models that support BI and analytics platforms including Tableau and Power BI.
… and optimise complex queries. Hands-on experience with dbt (including testing and layered modelling). Practical knowledge of Snowflake for loading, transforming, and exporting datasets. Experience building and managing Airflow DAGs for pipeline orchestration. Understanding of BI tool requirements (e.g., Tableau, Power BI) and related performance considerations. Advanced Excel capability, including pivot tables and complex formulas. Familiarity with data …
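For readers unfamiliar with the Airflow DAG work this listing mentions: a DAG is a Python file declaring tasks and their dependencies. The sketch below shows one plausible shape for a daily ingest-then-dbt pipeline; the DAG id, schedule, and shell commands are assumptions for illustration (Airflow 2.4+ syntax), not this team's actual setup.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# A daily pipeline: load raw data, then rebuild dbt models on top of it.
with DAG(
    dag_id="daily_warehouse_build",   # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    ingest = BashOperator(
        task_id="ingest_raw",
        bash_command="python /opt/pipelines/ingest.py",   # assumed script
    )
    transform = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt",    # builds warehouse models
    )

    ingest >> transform  # transform only runs once ingestion has succeeded
```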
… automated pipelines, and shaping the foundational framework for how we leverage data to succeed. What You'll Do: You'll develop and maintain data pipelines and automated processes in Airflow and Python. You'll create SQL data models with dbt to power dashboards and applications. You'll integrate third-party APIs and databases into our data flows. You'll
… notebook analytics and collaboration; CircleCI for continuous deployment; AWS cloud infrastructure; Kubernetes for data services and task orchestration; Google Analytics, Amplitude and Firebase for client application event processing; Airflow for job scheduling and tracking; Parquet and Delta file formats on S3 for data lake storage; Streamlit for data applications. Why else you'll love it here: Wondering what …
City of London, London, United Kingdom Hybrid / WFH Options
Medialab Group
… colleagues. Nice-to-Have Skills: Experience working with GCP (BigQuery) or other modern cloud-native data warehouses (e.g. Snowflake, Redshift). Familiarity with data pipelining and orchestration systems (e.g. Airflow). Understanding of modern analytics architectures and data visualisation tools (we use Preset.io/Apache Superset). Exposure to CI/CD pipelines (GitLab CI preferred). Experience …
What you'll do: Lead the design of scalable, secure data architectures on AWS. Build and optimise ETL/ELT pipelines for batch and streaming data. Deploy and manage Apache Spark jobs on Databricks and Delta Lake. Write production-grade Python and SQL for large-scale data transformations. Drive data quality, governance, and automation through CI/CD and
… scientists, analysts, and business stakeholders. Mentor and guide data engineering teams. What we're looking for: Proven experience in senior/principal data engineering roles. Expertise in AWS, Databricks, Apache Spark, Python, and SQL. Strong background in cloud-native data platforms, real-time processing, and data lakes. Hands-on experience with tools such as Airflow, Kafka, Docker …
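To make the Spark-on-Databricks responsibility concrete, here is a small, generic PySpark batch transformation reading and writing Delta tables. The paths, columns, and aggregation are placeholders, not the client's pipeline.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily_revenue").getOrCreate()

# Read a bronze Delta table, aggregate, and publish a silver table.
orders = spark.read.format("delta").load("s3://lake/bronze/orders")  # placeholder path

daily_revenue = (
    orders
    .where(F.col("status") == "complete")
    .groupBy(F.to_date("created_at").alias("order_date"))
    .agg(F.sum("amount").alias("revenue"))
)

(daily_revenue.write
    .format("delta")
    .mode("overwrite")
    .save("s3://lake/silver/daily_revenue"))  # placeholder path
```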
Reigate, Surrey, England, United Kingdom Hybrid / WFH Options
esure Group
… and governance through robust access controls, including RBAC, SSO, token policies, and pseudonymisation frameworks. Develop resilient data flows for both batch and streaming workloads using technologies such as Kafka, Airflow, DBT, and Terraform. Shape data strategy and standards by contributing to architectural decisions, authoring ADRs, and participating in reviews, data councils, and platform enablement initiatives. Qualifications: What we’d
… requirements: Direct exposure to cloud-native data infrastructures (Databricks, Snowflake), especially in AWS environments, is a plus. Experience in building and maintaining batch and streaming data pipelines using Kafka, Airflow, or Spark. Familiarity with governance frameworks, access controls (RBAC), and implementation of pseudonymisation and retention policies. Exposure to enabling GenAI and ML workloads by preparing model-ready and vector …
Join our rapidly expanding team as a hands-on Cloud Data Analytics Platform Engineer and play a pivotal role in shaping the future of data at Citi. We're building a cutting-edge, multi-cloud data analytics platform that empowers …
Huddersfield, West Yorkshire, Yorkshire, United Kingdom Hybrid / WFH Options
Oscar Associates (UK) Limited
Job Title: Data Engineer. Salary: £40k - £60k + Excellent Benefits Package. Location: Huddersfield (Hybrid). Key Skills: SQL, PowerBI, Airflow. Summary: A new role has opened up for a Data Engineer with SQL, BI, Cloud and Airflow DAG experience to join a media-focused business. The role has opened up as the company are heavily investing, and have exciting plans …
… responsibilities will cover: Develop data models to support company Business Intelligence. Write and optimize complex SQL queries. Build, maintain and improve data pipelines. Transform data using DBT, Snowflake and Airflow. Ensure data is handled correctly and to relevant standards. Collaborate with tech teams to collectively solve shared data challenges. Key Skills: SQL, DBT, Snowflake, Airflow DAG, Cloud Platform
… successful candidate to Oscar. Email: to recommend someone for this role. Job Title: Data Engineer. Salary: £40k - £60k + Excellent Benefits Package. Location: Huddersfield (Hybrid). Key Skills: SQL, PowerBI, Airflow. Oscar Associates (UK) Limited is acting as an Employment Agency in relation to this vacancy. To understand more about what we do with your data please review our privacy …
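As background on the SQL/Snowflake side of this role, a minimal sketch of running a transformation against Snowflake from Python using the snowflake-connector-python package. All connection parameters, table, and column names are placeholders; real credentials should come from a secrets manager.

```python
import snowflake.connector

# Placeholder connection details -- never hard-code real credentials.
conn = snowflake.connector.connect(
    account="my_account",
    user="etl_user",
    password="...",
    warehouse="TRANSFORM_WH",
    database="ANALYTICS",
    schema="MARTS",
)

try:
    cur = conn.cursor()
    # A typical pipeline step: rebuild a reporting table from staging data.
    cur.execute("""
        CREATE OR REPLACE TABLE daily_sales AS
        SELECT sale_date, SUM(amount) AS revenue
        FROM staging.sales
        GROUP BY sale_date
    """)
finally:
    conn.close()
```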
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
… methodology and a Medallion architecture (bronze, silver, gold layers). Develop and maintain DBT projects and configure incremental loads with built-in unit testing. Support data pipeline orchestration with Airflow and work with AWS cloud tools. Help deliver a production-ready Data Mart with star schema design to power business reporting and dashboards (PowerBI experience a plus). Skills
… Experience: Strong SQL expertise and hands-on experience with DBT. Familiarity with Kimball dimensional modelling concepts. Experience working with cloud data warehouses such as Redshift or Snowflake. Knowledge of Airflow for workflow management. Comfortable in AWS environments and data orchestration. Bonus: Python programming skills and familiarity with dashboarding tools. Contract Details: Duration: 3 months. Rate: £450/day onside …
City of London, London, United Kingdom Hybrid / WFH Options
Hlx Technology
Collaborate with ML researchers and biologists to translate raw data into actionable insights and high-quality training data. Scale distributed systems using Kubernetes, Terraform, and orchestration tools such as Airflow, Flyte, or Temporal. Write clean, extensible, and well-tested code to ensure long-term maintainability and collaboration across teams. About You: We are looking for data and platform engineers
… Experience designing and implementing large-scale data storage systems (feature stores, timeseries databases, warehouses, or object stores). Strong distributed systems and infrastructure skills (Kubernetes, Terraform, orchestration frameworks such as Airflow/Flyte/Temporal). Hands-on cloud engineering experience (AWS, GCP, or Azure). Strong software engineering fundamentals, with a track record of writing maintainable, testable, and extensible code. Familiarity …