quickly and apply new skills Desirable Solid understanding of microservices development SQL and NoSQL databases working set Familiar with or able to quickly learn Apache NiFi, Apache Airflow, Apache Kafka, Keycloak, Serverless Computing, GraphQL, APIs, APIM Good skills working with JSON, XML, YAML files Knowledge in …
Saffron Walden, Essex, South East, United Kingdom Hybrid / WFH Options
EMBL-EBI
services, including microservice deployment Developing and maintaining data pipelines to process and manage large-scale structural biology data Working with workflow orchestration tools like Apache Airflow and Nextflow To liaise with the EMBL-EBI Technical Services Cluster To support the development and deployment of other scientific software The … Proficiency in containerisation Proficiency in relational databases (Oracle, PostgreSQL) Experience in developing and maintaining data pipelines Hands-on experience with workflow orchestration tools (e.g., Apache Airflow, Nextflow) Strong interpersonal and communication skills Proficiency in oral and written English You may also have PhD in computer science, IT or … a related field, or in bioinformatics with a demonstrated IT expertise Experience in using Kubernetes Experience with web servers (Apache/Nginx) Hands-on experience with CI/CD (GitLab CI/GitHub Actions) Familiarity with networking Familiarity with Java Knowledge of, or affinity with, structural biology and bioinformatics …
AWS preferred) Solid understanding of libraries like Pandas and NumPy Experience with data warehousing tools like Snowflake, Databricks, BigQuery Familiar with AWS Step Functions, Airflow, Dagster, or other workflow orchestration tools Commercial experience with performant database programming in SQL Capability to solve complex technical issues, comprehending risks prior to …
platforms (AWS, Azure, GCP) Experience building ETL/ELT pipelines, specifically using DBT for structured and semi-structured datasets Any orchestration tooling such as Airflow, Dagster, Azure Data Factory, Fivetran etc Nice to have: Software engineering background Exposure to building or deploying AI/ML models …
Science or equivalent Experience in developing Finance or HR related applications Working experience with Tableau Working experience with Terraform Experience in creating workflows for Apache Airflow and Jenkins Benefits Roku is committed to offering a diverse range of benefits as part of our compensation package to support our …
Java, Python, Spark, Linux and shell scripting, TDD (JUnit), build tools (Maven/Gradle/Ant) Experience in working with process scheduling platforms like Apache Airflow. Open to working in GS proprietary technology like Slang/SECDB An understanding of compute resources and the ability to interpret performance metrics …
Nottinghamshire, United Kingdom Hybrid / WFH Options
Akkodis
Data Engineer (Python, Data Modelling, ETL/ELT, Apache Airflow, DBT, AWS) Enterprise-scale tech firm Up to £70,000 plus benefits - FULLY REMOTE UK Are you a Data Engineering enthusiast who thrives on designing and implementing robust ETL processes, highly scalable data structures and data pipelines within …
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Duel
Snowflake. You understand event-driven architectures and real-time data processing. You have experience implementing and maintaining scalable data pipelines using tools like dbt, Apache Airflow, or similar. You have no issue working with either structured or semi-structured data. You are comfortable working with data engineering and …
Engineer II in distributed systems, you'll typically be working in Java or Python, and with a technology stack which includes AWS, Kubernetes, Spark, Airflow, gRPC, New Relic and more. Don't worry if you're not familiar with these, though - much more important is your understanding of how …
skills in languages commonly used for data work (e.g., Python, Java, Scala) Deep understanding of ETL/ELT tools and workflow orchestration platforms (e.g., Airflow, Fivetran, dbt) Proficiency with SQL and solid grounding in data modeling concepts Familiarity with cloud services and architectures (AWS, GCP, or Azure) Proven experience …
as Redshift, Snowflake, or BigQuery. Strong command of SQL and programming languages like Python, Scala, or Java. Familiarity with ETL/ELT tools (e.g., Airflow, Fivetran, dbt) and cloud data stacks (AWS/GCP/Azure). A deep understanding of data modelling, access controls, and infrastructure performance tuning. …
Bristol, South West England, United Kingdom Hybrid / WFH Options
Peaple Talent
and familiar with setting up CI/CD workflows using platforms like Azure DevOps or similar tools. Hands-on experience with orchestration tools like Apache Airflow for managing complex data workflows. Practical familiarity with low-code or no-code platforms such as Talend and SnapLogic for streamlined pipeline …
London, South East England, United Kingdom Hybrid / WFH Options
Block MB
integrate data best practices across teams Champion data quality, governance, and documentation Key Requirements: Strong experience with Python, SQL, and modern ETL tools (e.g., Airflow, dbt) Solid grasp of cloud platforms (AWS/GCP/Azure) and data warehouses (e.g., BigQuery, Snowflake) Familiarity with streaming technologies (Kafka, Kinesis, etc.) …
Have touched cloud services (AWS, GCP, etc.) Enjoy solving problems, learning fast, and working with good people Bonus points if you’ve played with Airflow, Docker, or big data tools — but they’re more interested in mindset than buzzwords. The team’s based onsite in London — they’re collaborative …
platforms (AWS preferred) and Infrastructure-as-Code tools. Solid understanding of relational databases and SQL. Proven track record building robust ETL pipelines, ideally using Airflow or a similar tool. Familiarity with best practices in software engineering: version control, testing, packaging, and code reviews. Quantitative problem-solving skills with an …
Cambridge, England, United Kingdom Hybrid / WFH Options
Intellect Group
in Cambridge Nice to Have Experience working within a consultancy or client-facing environment Familiarity with tools and frameworks such as: Databricks PySpark Pandas Airflow or dbt Experience deploying solutions using cloud-native services (e.g., BigQuery, AWS Glue, S3, Lambda) What’s On Offer Fully remote working with the …
in Python Comfortable implementing data architectures in analytical data warehouses such as Snowflake, Redshift or BigQuery Hands-on experience with data orchestrators such as Airflow Knowledge of Agile development methodologies Awareness of cloud technology, particularly AWS Knowledge of automated delivery processes Hands-on experience of best engineering practices (handling …
develop and deploy Feature Engineering and Modeling applications to data platforms built on Databricks or similar platforms and platform components (e.g., Snowflake, MLflow, Airflow, etc.). Demonstrated experience in using Azure-based cloud applications, services and infrastructure or significant, transferable experience with other Cloud Providers (e.g., AWS or …
and storage. Strong programming skills in Python, Java, or Scala. Proficiency in SQL, NoSQL, and time-series databases. Knowledge of orchestration tools (Apache Airflow, Kubernetes). If you are a passionate and experienced Senior Data Engineer seeking a Lead role, or a Lead Data Engineer aiming …
London, South East England, United Kingdom Hybrid / WFH Options
Noir
robust data pipelines. Work with Python and SQL for data processing, transformation, and analysis. Leverage a wide range of GCP services including: Cloud Composer (Apache Airflow) BigQuery Cloud Storage Dataflow Pub/Sub Cloud Functions IAM Design and implement data models and ETL processes. Apply infrastructure-as-code …
TensorFlow, PyTorch, or Scikit-learn for ML, and Pandas, PySpark, or similar for data processing. Experience designing and orchestrating data pipelines with tools like Apache Airflow, Spark, or Kafka. Strong understanding of SQL, NoSQL, and data modeling. Familiarity with cloud platforms (AWS, Azure, GCP) for deploying ML and …
and flexible systems. Influence opinion and decision-making across AI and ML Skills Python SQL/Pandas/Snowflake/Elasticsearch Docker/Kubernetes Airflow/Spark Familiarity with GenAI models/libraries Requirements 6+ years of relevant software engineering experience post-graduation A degree (ideally a Master’s …