and managing cloud infrastructure as code
- Proficiency in programming languages such as Python, Spark, SQL
- Strong experience with SQL databases
- Expertise in data pipeline and workflow management tools (e.g., Apache Airflow, ADF)
- Experience with cloud platforms (Azure preferred) and related data services
- Excellent problem-solving skills and attention to detail
- Inclusive and curious, continuously seeks to build knowledge
expertise in PySpark, SQL, Java, Spark, Databricks, dbt, AWS, and Azure
- Familiarity with European jurisdictions and global reporting requirements
- Experience with orchestration and CI/CD tools such as Airflow, Databricks Workflows, and Azure DevOps
- Strong problem-solving skills and the ability to optimise data processes
- Excellent communication skills and the ability to engage technical and non-technical stakeholders.
Bletchley, Buckinghamshire, United Kingdom Hybrid / WFH Options
Tria
knowledge. A key part of your role will be setting up monitoring and alerting across the platform.

Experience Required:
- Hands-on with Data Platform tools: Snowflake (Must Have), dbt, Airflow, and Fivetran
- Experience implementing DataOps best practices: CI/CD, modular pipelines, automated testing, code promotion
- Access management using Azure Active Directory (Entra ID)
- Proven experience setting up monitoring
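The automated-testing practice listed above can be illustrated with a minimal data-quality gate. This is a hedged sketch only: the `customers` table, `id` column, and `check_no_null_ids` helper are hypothetical, and a real DataOps stack would more likely use dbt tests or a similar framework than hand-rolled SQL checks.

```python
# Minimal sketch of an automated data-quality check, assuming a hypothetical
# `customers` table with an `id` column; stdlib sqlite3 is used for illustration.
import sqlite3

def check_no_null_ids(conn: sqlite3.Connection, table: str) -> bool:
    """Raise if any row in `table` has a NULL id; return True otherwise."""
    nulls = conn.execute(
        f"SELECT COUNT(*) FROM {table} WHERE id IS NULL"
    ).fetchone()[0]
    if nulls:
        raise ValueError(f"{table}: {nulls} rows with NULL id")
    return True

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO customers VALUES (?, ?)", [(1, "a"), (2, "b")])
print(check_no_null_ids(conn, "customers"))  # → True
```

In a CI/CD pipeline, a failing check of this kind would block promotion of the pipeline code to the next environment.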
using cloud-based architectures and tools
- Experience delivering data engineering solutions on cloud platforms, preferably Oracle OCI, AWS, or Azure
- Proficient in Python and workflow orchestration tools such as Airflow or Prefect
- Expert in data modeling, ETL, and SQL
- Experience with real-time analytics from telemetry and event-based streaming (e.g., Kafka)
- Experience managing operational data stores with high availability, performance, and scalability
- Expertise in data lakes, lakehouses, Apache Iceberg, and data mesh architectures
- Proven ability to build, deliver, and support modern data platforms at scale
- Strong knowledge of data governance, data quality, and data cataloguing
- Experience with modern database technologies, including Iceberg, NoSQL, and vector databases
- Embraces innovation and works closely with scientists and partners to explore
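The orchestration tools named above (Airflow, Prefect) all reduce, at their core, to running tasks in dependency order. The toy runner below sketches that idea with the standard library only; the task names and the `deps` graph are invented for the example, and real orchestrators add scheduling, retries, and monitoring on top.

```python
# Toy dependency-ordered task runner illustrating the core idea behind
# workflow orchestrators such as Airflow or Prefect. Requires Python 3.9+
# for graphlib.
from graphlib import TopologicalSorter

def run_pipeline(tasks, deps):
    """Execute the callables in `tasks` in an order that respects `deps`."""
    order = list(TopologicalSorter(deps).static_order())
    for name in order:
        tasks[name]()
    return order

log = []
tasks = {name: (lambda n=name: log.append(n))
         for name in ("extract", "transform", "load")}
# Each key depends on the tasks in its value set.
deps = {"transform": {"extract"}, "load": {"transform"}}
print(run_pipeline(tasks, deps))  # → ['extract', 'transform', 'load']
```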
Oxford, Oxfordshire, United Kingdom Hybrid / WFH Options
Elsevier
we are relied upon to ensure our systems are trusted, reliable and available. The technology underpinning these capabilities includes industry-leading data and analytics products such as Snowflake, Astronomer/Airflow, Kubernetes, DBT, Tableau, Sisense, Collibra, and Kafka/Debezium. Our mission is to enable frictionless experiences for our AIS colleagues and customers so that they can openly and … governance by identifying capability gaps, implementing necessary tooling and processes, and promoting DataOps through leadership and user feedback initiatives.

Requirements:
- Deploy and govern modern data stack technologies (e.g., Snowflake, Airflow, DBT, Fivetran, Airbyte, Tableau, Sisense, AWS, GitHub, Terraform, Docker) at enterprise scale for data engineering workloads.
- Develop deployable, reusable ETL/ELT solutions using Python, advanced SQL, and Jinja
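The Python/SQL/Jinja combination in the last requirement typically means generating SQL from templates, as dbt does. The sketch below uses the stdlib `string.Template` as a dependency-free stand-in for Jinja; the model and table names (`mart_orders`, `raw_orders`) are hypothetical.

```python
# Sketch of templated SQL generation in the dbt style; string.Template stands
# in for Jinja so the example needs no third-party packages. Table names are
# invented for illustration.
from string import Template

MODEL = Template(
    "CREATE TABLE $target AS\n"
    "SELECT id, SUM(amount) AS total\n"
    "FROM $source\n"
    "GROUP BY id"
)

sql = MODEL.substitute(target="mart_orders", source="raw_orders")
print(sql)
```

Parameterising the source and target this way is what makes such models reusable across environments (dev, staging, production) without editing the SQL itself.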