MySQL. Exposure to Docker, Kubernetes, AWS, Helm, Terraform, Vault, Grafana, ELK Stack, New Relic. Relevant experience in the maintenance of data APIs and data lake architectures, including experience with Apache Iceberg, Trino/Presto, ClickHouse, Snowflake, BigQuery. Master's degree in Computer Science or an Engineering-related field. Get to know us better: YouGov is a global online research company …
Manchester, England, United Kingdom Hybrid / WFH Options
Made Tech
strategies. Strong experience in IaC and the ability to guide how one could deploy infrastructure into different environments. Knowledge of handling and transforming various data types (JSON, CSV, etc.) with Apache Spark, Databricks, or Hadoop (a minimal sketch follows below). Good understanding of possible architectures involved in modern data system design (Data Warehouse, Data Lakes, Data Meshes). Ability to create data pipelines on a cloud …
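As a rough illustration of the JSON/CSV handling this posting describes, here is a minimal PySpark sketch. The bucket paths, column names, and join key are invented placeholders, not anything from the employer's actual stack.

```python
# Minimal PySpark sketch: read raw JSON and CSV, normalise, and write Parquet.
# All paths and column names below are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("json-csv-normalise").getOrCreate()

orders = spark.read.json("s3://raw-bucket/orders/*.json")            # hypothetical path
customers = spark.read.csv("s3://raw-bucket/customers/*.csv",
                           header=True, inferSchema=True)            # hypothetical path

# Join the two sources and derive a clean, typed output table.
result = (orders
          .join(customers, on="customer_id", how="left")
          .withColumn("order_date", F.to_date("order_ts"))
          .select("order_id", "customer_id", "order_date", "total"))

result.write.mode("overwrite").parquet("s3://curated-bucket/orders/")
```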
London, England, United Kingdom Hybrid / WFH Options
Intercom
might be more valuable than your direct technical contributions on a project. You care about your craft. In addition, it would be a bonus if you have worked with Apache Airflow - we use Airflow extensively to orchestrate and schedule all of our data workflows, and a good understanding of the quirks of operating Airflow at scale would be helpful. Experience …
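For context on the kind of orchestration mentioned here, a minimal Airflow 2.x DAG sketch follows. The DAG id, schedule, and task bodies are illustrative only, not this team's actual workflows.

```python
# Minimal Airflow 2.x DAG sketch; names and schedule are placeholders.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull data from source")

def load():
    print("load data into warehouse")

with DAG(
    dag_id="example_daily_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # "schedule" is the Airflow 2.4+ spelling
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task  # ">>" declares the dependency edge of the DAG
```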
London, England, United Kingdom Hybrid / WFH Options
Deel
based languages is a plus. Data Warehousing: Hands-on experience with cloud-based data warehouses. Data Modeling: Proficiency in designing efficient database schemas. Workflow Orchestration: Familiarity with tools like Apache Airflow. Data Streaming: Experience with data streaming and Change Data Capture (CDC). Infrastructure: Proficiency in Terraform and GitHub Actions. Compliance: Experience in setting up PII anonymization and RBAC (one common approach is sketched below).
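The posting does not say how PII anonymization is done there; one common technique is pseudonymisation via keyed hashing, sketched below. The salt handling and record shape are assumptions for illustration.

```python
# Hedged sketch of PII pseudonymisation with a keyed hash: values become
# stable, non-reversible tokens that can still be joined on.
import hashlib
import hmac

SALT = b"load-from-a-secret-manager"  # placeholder; never hard-code in practice

def pseudonymise(value: str) -> str:
    """Replace a PII value with a stable, non-reversible token."""
    return hmac.new(SALT, value.encode("utf-8"), hashlib.sha256).hexdigest()

record = {"email": "jane@example.com", "plan": "pro"}   # hypothetical record
record["email"] = pseudonymise(record["email"])
print(record)  # the email is now a hash: joinable across tables, not readable
```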
City of London, London, United Kingdom Hybrid / WFH Options
Roc Search
Manage deployments with Helm and configuration in YAML. Develop shell scripts and automation for deployment and operational workflows (a sketch of such automation follows below). Work with Data Engineering to integrate and manage data workflows using Apache Airflow and DAG-based models. Perform comprehensive testing, debugging, and optimization of backend components. Required Skills: Bachelor's degree in Computer Science, Software Engineering, or a related field (or … and YAML for defining deployment configurations and managing releases. Proficiency in shell scripting for automating deployment and maintenance tasks. Understanding of DAG (Directed Acyclic Graph) models and experience with Apache Airflow for managing complex data processing workflows. Familiarity with database systems (SQL and NoSQL) and proficiency in writing efficient queries. Solid understanding of software development best practices, including version …
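As a sketch of the deployment automation described here, the following small Python wrapper drives a Helm release. The release, chart, and namespace names are hypothetical, and error handling is deliberately minimal.

```python
# Hedged sketch: automate "helm upgrade --install" for a release.
# All names below are placeholders, not this employer's actual services.
import subprocess

def deploy(release: str, chart: str, values_file: str, namespace: str) -> None:
    cmd = [
        "helm", "upgrade", "--install", release, chart,
        "--values", values_file,
        "--namespace", namespace,
        "--wait",  # block until the rollout reports healthy
    ]
    subprocess.run(cmd, check=True)  # raises CalledProcessError if helm fails

deploy("backend", "./charts/backend", "values/prod.yaml", "prod")
```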
London, England, United Kingdom Hybrid / WFH Options
Nadara
lake, data warehouse, lakehouse, and cloud-native designs. Experience with Inmon, Data Vault 2.0, Kimball, and dimensional modelling (an illustrative star schema follows below). Knowledge of integration patterns, ETL/ELT processes, and tools (e.g., Apache Airflow, Azure Data Factory, Informatica, Talend) to orchestrate data workflows. Familiarity with DevOps/MLOps principles, CI/CD pipelines, and infrastructure as code (e.g., Terraform, CloudFormation). Basic …
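To make the Kimball/dimensional-modelling reference concrete, here is a toy star schema (one fact table, one dimension) using SQLite so the sketch is self-contained; a real warehouse would use its own DDL dialect, and the tables are invented.

```python
# Illustrative Kimball-style star schema: facts reference dimensions
# through surrogate keys, and queries join on those keys.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_customer (
    customer_key INTEGER PRIMARY KEY,   -- surrogate key
    customer_id  TEXT NOT NULL,         -- natural/business key
    segment      TEXT
);
CREATE TABLE fact_sales (
    sale_id      INTEGER PRIMARY KEY,
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    sale_date    TEXT,
    amount       REAL
);
""")

# A typical analytical query: aggregate facts by a dimension attribute.
rows = conn.execute("""
    SELECT d.segment, SUM(f.amount)
    FROM fact_sales f JOIN dim_customer d USING (customer_key)
    GROUP BY d.segment
""").fetchall()
```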
technical direction to a growing team of developers globally. The platform is a greenfield build using standard modern technologies such as Java, Spring Boot, Kubernetes, Kafka, MongoDB, RabbitMQ, Solace, and Apache Ignite. The platform runs in a hybrid mode, both on-premise and in AWS, utilising technologies such as EKS, S3, and FSx. The main purpose of this role is to …
Nottingham, England, United Kingdom Hybrid / WFH Options
Akkodis
Data Engineer (AI-Driven SaaS Platform). Technologies: Python, Snowflake, Data Modelling, ETL/ELT, Apache Airflow, Kafka, AWS. Environment: Large-scale data environment, fully remote (UK), microservices architecture. About the Role: Are you a Data Engineering enthusiast who thrives on designing and implementing robust ETL processes, highly scalable data structures, and data pipelines within an enterprise-scale data processing … infrastructure background, understanding of system migrations, and experience with data warehousing concepts. Technical Skills: Deep understanding of SQL and NoSQL databases (MongoDB or similar). Experience with streaming platforms like Apache Kafka (a consumer sketch follows below). Development and maintenance of ELT pipelines. Knowledge of data warehousing best practices. High proficiency in Apache Kafka and Apache Airflow. Strong AWS experience. Additional Attributes: Agile …
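For the Kafka streaming piece, a minimal consumer sketch using the kafka-python client is shown below. The topic, broker address, consumer group, and message shape are all assumptions for illustration.

```python
# Hedged sketch: consume JSON events from Kafka as the ingest step of an
# ELT pipeline. Topic, brokers, and payload shape are placeholders.
import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "events",                                    # hypothetical topic
    bootstrap_servers=["localhost:9092"],
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    auto_offset_reset="earliest",
    group_id="elt-loader",
)

for message in consumer:
    event = message.value
    # A real ELT job would stage the raw event into the warehouse
    # (e.g. Snowflake) before transforming; here we just inspect it.
    print(event["type"], event.get("payload"))
```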
Nottingham, Nottinghamshire, United Kingdom Hybrid / WFH Options
Akkodis
Data Engineer (AI-Driven SaaS platform) (Python, Snowflake, Data Modelling, ETL/ELT, Apache Airflow, Kafka, AWS). Large-scale data environment. Up to £70,000 plus benefits. FULLY REMOTE UK. Are you a Data Engineering enthusiast who thrives on designing and implementing robust ETL processes, highly scalable data structures, and data pipelines within a truly enterprise-scale data processing … platform integrates Python and Snowflake, and you'll need a deep understanding of SQL and NoSQL databases (MongoDB or similar!). You'll also have experience with streaming platforms like Apache Kafka and be able to develop and maintain ELT pipelines, bringing a solid understanding of data warehousing concepts and best practice. You will understand Apache Kafka to a high standard and have solid knowledge of Apache Airflow - from a Cloud perspective, you will be an AWS enthusiast! Naturally you will have a good understanding of AWS. I'd love you to be an advocate of Agile too - these guys are massive on Agile Delivery and Scrum - so it's important you share a similar mind-set …
Glasgow, Scotland, United Kingdom Hybrid / WFH Options
Eden Scott
cutting-edge technologies. About the Role: You'll be part of an agile, cross-functional team building a powerful data platform and intelligent search engine. Working with technologies like Apache Lucene, Solr, and Elasticsearch, you'll contribute to the design and development of scalable systems, with opportunities to explore machine learning, AI-driven categorisation models, and vector search. What You'll Be Doing: Design and build high-performance data pipelines and search capabilities. Develop solutions using Apache Lucene, Solr, or Elasticsearch (a query sketch follows below). Implement scalable, test-driven code in Java and Python. Work collaboratively with Business Analysts, Data Engineers, and UI Developers. Contribute across the stack, from the React/TypeScript front end to Java-based backend services. Leverage cloud infrastructure … leading data sets. Continuously improve how data is processed, stored, and presented. Your Profile: Strong experience in Java development, with some exposure to Python. Hands-on knowledge of Apache Lucene, Solr, or Elasticsearch (or willingness to learn). Experience in large-scale data processing and building search functionality. Skilled with SQL and NoSQL databases. Comfortable working in Agile …
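As a feel for the search side of the role, here is a minimal full-text query using the official Elasticsearch Python client (8.x style). The index name, field, and cluster address are invented for illustration.

```python
# Hedged sketch: a match query against a hypothetical "products" index.
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")   # placeholder cluster address

resp = es.search(
    index="products",                                      # hypothetical index
    query={"match": {"title": "wireless headphones"}},     # full-text match
    size=10,
)
for hit in resp["hits"]["hits"]:
    # Hits come back ranked by relevance score.
    print(hit["_score"], hit["_source"].get("title"))
```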
London, England, United Kingdom Hybrid / WFH Options
Trudenty
real-time data pipelines for processing large-scale data. Experience with ETL processes for data ingestion and processing. Proficiency in Python and SQL. Experience with big data technologies like Apache Hadoop and Apache Spark. Familiarity with real-time data processing frameworks such as Apache Kafka or Apache Flink (a streaming sketch follows below). MLOps & Deployment: Experience deploying and maintaining large-scale ML inference …
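One way the Spark and Kafka pieces fit together is Spark Structured Streaming reading a Kafka topic; a sketch under that assumption follows. The topic and broker are placeholders, and the job needs the spark-sql-kafka connector package on its classpath.

```python
# Hedged sketch: windowed count over a Kafka stream with Structured Streaming.
# Topic/broker are placeholders; requires the spark-sql-kafka package.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("stream-demo").getOrCreate()

stream = (spark.readStream.format("kafka")
          .option("kafka.bootstrap.servers", "localhost:9092")
          .option("subscribe", "transactions")             # hypothetical topic
          .load())

# The Kafka source exposes a "timestamp" column; count events per minute.
counts = stream.groupBy(F.window(F.col("timestamp"), "1 minute")).count()

query = counts.writeStream.outputMode("complete").format("console").start()
query.awaitTermination()
```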
and well-tested solutions to automate data ingestion, transformation, and orchestration across systems. Own data operations infrastructure: Manage and optimise key data infrastructure components within AWS, including Amazon Redshift, Apache Airflow for workflow orchestration, and other analytical tools. You will be responsible for ensuring the performance, reliability, and scalability of these systems to meet the growing demands of data … pipelines, data warehouses, and leveraging AWS data services (a Redshift loading sketch follows below). Strong proficiency in DataOps methodologies and tools, including experience with CI/CD pipelines, containerized applications, and workflow orchestration using Apache Airflow. Familiar with ETL frameworks, with bonus experience in Big Data processing (Spark, Hive, Trino) and data streaming. Proven track record: you've made a demonstrable impact in …
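A common Redshift ingestion pattern is issuing a COPY statement over a regular PostgreSQL connection so the cluster pulls files straight from S3; a sketch under that assumption follows. The DSN, table, bucket, and IAM role ARN are all placeholders.

```python
# Hedged sketch: load Parquet files from S3 into Redshift via COPY.
# Connection details, table, bucket, and role ARN are placeholders.
import psycopg2

conn = psycopg2.connect(
    host="example-cluster.abc123.eu-west-1.redshift.amazonaws.com",
    port=5439, dbname="analytics", user="loader", password="***",
)
copy_sql = """
    COPY analytics.events
    FROM 's3://example-bucket/events/'
    IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-loader'
    FORMAT AS PARQUET;
"""
with conn, conn.cursor() as cur:
    cur.execute(copy_sql)   # Redshift reads the files directly from S3
```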
London, England, United Kingdom Hybrid / WFH Options
Endava Limited
delivering high-quality solutions aligned with business objectives. Key Responsibilities: Architect, implement, and maintain real-time and batch data pipelines to handle large datasets efficiently. Employ frameworks such as Apache Spark, Databricks, Snowflake, or Airflow to automate ingestion, transformation, and delivery. Data Integration & Transformation: Work with Data Analysts to understand source-to-target mappings and quality requirements. Build ETL … security measures (RBAC, encryption) and ensure regulatory compliance (GDPR). Document data lineage and recommend improvements for data ownership and stewardship. Qualifications: Programming: Python, SQL, Scala, Java. Big Data: Apache Spark, Hadoop, Databricks, Snowflake, etc. Data Modelling: Designing dimensional, relational, and hierarchical data models. Scalability & Performance: Building fault-tolerant, highly available data architectures. Security & Compliance: Enforcing role-based access …
London, England, United Kingdom Hybrid / WFH Options
Merantix
Linux systems and bash terminals. Preferred Qualifications: Hands-on experience with distributed computing frameworks, such as Ray Data and Spark; databases and/or data warehousing technologies, such as Apache Hive; data transformation via SQL and DBT; orchestration platforms, such as Apache Airflow; data catalogs and metadata management tools; vector data stores. Familiarity with data lake architectures …
Bath, England, United Kingdom Hybrid / WFH Options
Autodesk
such as AWS, Azure, or GCP; Docker; documenting code, architectures, and experiments; Linux systems and bash terminals. Preferred Qualifications: databases and/or data warehousing technologies, such as Apache Hive, Iceberg, etc. (an Iceberg write sketch follows below); data transformation via SQL and DBT; orchestration platforms such as Apache Airflow, Argo Workflows, etc.; data catalogs and metadata management tools; …
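For the Iceberg mention, here is a sketch of writing an Iceberg table from PySpark. Catalog configuration varies by deployment, so the local Hadoop catalog settings below are illustrative, and the job needs the iceberg-spark-runtime package available.

```python
# Hedged sketch: create/replace an Apache Iceberg table from a DataFrame.
# Catalog name, warehouse path, and table name are placeholders; requires
# the iceberg-spark-runtime package on the Spark classpath.
from pyspark.sql import SparkSession

spark = (SparkSession.builder.appName("iceberg-demo")
         .config("spark.sql.catalog.local", "org.apache.iceberg.spark.SparkCatalog")
         .config("spark.sql.catalog.local.type", "hadoop")
         .config("spark.sql.catalog.local.warehouse", "/tmp/iceberg-warehouse")
         .getOrCreate())

df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "label"])
df.writeTo("local.db.events").createOrReplace()   # DataFrameWriterV2 API
spark.sql("SELECT * FROM local.db.events").show()
```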
London, England, United Kingdom Hybrid / WFH Options
Autodesk
platforms such as AWS, Azure, or GCP; Docker; documenting code, architectures, and experiments; Linux systems and bash terminals. Preferred Qualifications: databases and/or data warehousing technologies, such as Apache Hive, Iceberg, etc.; data transformation via SQL and DBT; orchestration platforms such as Apache Airflow, Argo Workflows, etc.; data catalogs and metadata management tools; vector databases; relational and …
Columbia, South Carolina, United States Hybrid / WFH Options
Systemtec Inc
technologies and cloud-based technologies: AWS services, State Machines, CDK, Glue, TypeScript, CloudWatch, Lambda, CloudFormation, S3, Glacier Archival Storage, DataSync, Lake Formation, AppFlow, RDS PostgreSQL, Aurora, Athena, Amazon MSK, Apache Iceberg, Spark, Python (a Lambda sketch follows below). ONSITE: Partially onsite three days per week (Tue, Wed, Thu) and as needed. Standard work hours: 8:30 AM - 5:00 PM. Required Qualifications of the …
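As one building block of the serverless stack listed above, here is a small Lambda handler reacting to an S3 object-created event. The bucket/key extraction follows the standard S3 event shape; the processing itself is a placeholder.

```python
# Hedged sketch: AWS Lambda handler for S3 events. The event parsing
# follows the standard S3 notification shape; the body handling is a stub.
import json
import boto3

s3 = boto3.client("s3")

def handler(event, context):
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        obj = s3.get_object(Bucket=bucket, Key=key)
        body = obj["Body"].read()
        print(f"received {len(body)} bytes from s3://{bucket}/{key}")
    return {"statusCode": 200, "body": json.dumps({"ok": True})}
```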
Newcastle upon Tyne, England, United Kingdom Hybrid / WFH Options
Noir
They're Looking For: Experience in a data-focused role, with a strong passion for working with data and delivering value to stakeholders. Strong proficiency in SQL, Python, and Apache Spark, with hands-on experience using these technologies in a production environment. Experience with Databricks and Microsoft Azure is highly desirable. Financial Services experience is a plus but not …
Bath, England, United Kingdom Hybrid / WFH Options
Noir
They’re Looking For: Experience in a data-focused role, with a strong passion for working with data and delivering value to stakeholders. Strong proficiency in SQL, Python, and Apache Spark, with hands-on experience using these technologies in a production environment. Experience with Databricks and Microsoft Azure is highly desirable. Financial Services experience is a plus but not …