Chelmsford, East Anglia, United Kingdom Hybrid / WFH Options
EMBL-EBI
services, including microservice deployment Developing and maintaining data pipelines to process and manage large-scale structural biology data Working with workflow orchestration tools like Apache Airflow and Nextflow To liaise with the EMBL-EBI Technical Services Cluster To support the development and deployment of other scientific software The … Proficiency in containerisation Proficiency in relational databases (Oracle, PostgreSQL) Experience in developing and maintaining data pipelines Hands-on experience with workflow orchestration tools (e.g., Apache Airflow, Nextflow) Strong interpersonal and communication skills Proficiency in oral and written English You may also have PhD in computer science, IT or … a related field, or in bioinformatics with demonstrated IT expertise Experience in using Kubernetes Experience with web servers (Apache/Nginx) Hands-on experience with CI/CD (GitLab CI/GitHub Actions) Familiarity with networking Familiarity with Java Knowledge of, or affinity with, structural biology and bioinformatics
Saffron Walden, East Anglia, United Kingdom Hybrid / WFH Options
EMBL-EBI
services, including microservice deployment Developing and maintaining data pipelines to process and manage large-scale structural biology data Working with workflow orchestration tools like Apache Airflow and Nextflow To liaise with the EMBL-EBI Technical Services Cluster To support the development and deployment of other scientific software The … Proficiency in containerisation Proficiency in relational databases (Oracle, PostgreSQL) Experience in developing and maintaining data pipelines Hands-on experience with workflow orchestration tools (e.g., Apache Airflow, Nextflow) Strong interpersonal and communication skills Proficiency in oral and written English You may also have PhD in computer science, IT or … a related field, or in bioinformatics with demonstrated IT expertise Experience in using Kubernetes Experience with web servers (Apache/Nginx) Hands-on experience with CI/CD (GitLab CI/GitHub Actions) Familiarity with networking Familiarity with Java Knowledge of, or affinity with, structural biology and bioinformatics
Edinburgh, Scotland, United Kingdom Hybrid / WFH Options
TalentHawk
early and providing strategic guidance. Support the ongoing development of integration strategies involving Managed File Transfer solutions (e.g., GoAnywhere) and data orchestration platforms (e.g., Apache Airflow). Provide hands-on support and detailed guidance on particularly complex integration designs where necessary. Maintain current knowledge of industry trends, technology … and migrations. Familiarity with IBM Maximo asset management platform. Knowledge and experience with Managed File Transfer solutions (e.g., GoAnywhere). Understanding and experience with Apache Airflow orchestration platform. Strong grasp of integration best practices, security considerations, and data flow management. Ability to work collaboratively across distributed teams and …
diagram of proposed tables to enable discussion. Good communicator and comfortable with presenting ideas and outputs to technical and non-technical users. Worked on Apache Airflow before to create DAGs. Ability to work within Agile, considering minimum viable products, story pointing, and sprints. More information: Enjoy fantastic perks …
lakehouse platforms like Databricks, Redshift, and Snowflake. Experience using Git for version control. Ability to automate tasks using Python or workflow orchestration tools like Apache Airflow. Skill in integrating data from various sources, including APIs, databases, and third-party systems. Ensuring data quality through monitoring and validation throughout pipelines.
Intelligence, Statistical & Data Analysis, Computational Algorithms, Data Engineering, etc. Experience working with a variety of complex, large datasets. Experience building automated pipelines (e.g., Jenkins, Airflow, etc.). Experience building or understanding end-to-end, distributed, and high-performance software infrastructures. Proven ability to work collaboratively as part of a …
across the company. Role requirements 4+ years of experience You have an understanding of developing ETL pipelines using Python frameworks such as Luigi or Airflow; You have experience with the development of Python-based REST APIs/services and their integration with databases (e.g. Postgres); You are familiar with …
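Several of the listings above ask for experience building ETL pipelines in Python. As a minimal, library-free sketch of the extract-transform-load pattern that frameworks like Luigi or Airflow orchestrate as dependent tasks (the field names and data here are hypothetical, purely for illustration):

```python
import csv
import io

def extract(raw_csv: str) -> list[dict]:
    """Extract: parse raw CSV text into records."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows: list[dict]) -> list[dict]:
    """Transform: normalise names and cast amounts to floats."""
    return [
        {"name": r["name"].strip().title(), "amount": float(r["amount"])}
        for r in rows
    ]

def load(rows: list[dict], sink: list) -> None:
    """Load: append only validated (non-negative) records to the target."""
    sink.extend(r for r in rows if r["amount"] >= 0)

raw = "name,amount\n alice ,10.5\nBOB,-3\n"
warehouse: list[dict] = []
load(transform(extract(raw)), warehouse)
print(warehouse)  # negative-amount rows are filtered out during load
```

In a real orchestrator each stage would typically be a separate task or operator with its own retries and scheduling, rather than plain function calls.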
Nottinghamshire, United Kingdom Hybrid / WFH Options
Akkodis
Data Engineer (Python, Data Modelling, ETL/ELT, Apache Airflow, DBT, AWS) Enterprise-scale tech firm Up to £70,000 plus benefits - FULLY REMOTE UK Are you a Data Engineering enthusiast who thrives from designing and implementing robust ETL processes, highly scalable data structures and data pipelines within …
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Duel
Snowflake. You understand event-driven architectures and real-time data processing. You have experience implementing and maintaining scalable data pipelines using tools like dbt, Apache Airflow, or similar. You have no issue working with either structured or semi-structured data. You are comfortable working with data engineering and …
maintaining modular, scalable data models that follow best practices Strong understanding of dimensional modelling Familiarity with AWS data services (S3, Athena, Glue) Experience with Airflow for scheduling and orchestrating workflows Experience working with data lakes or modern data warehouses (Snowflake, Redshift, BigQuery) A pragmatic problem solver who can balance …
City of London, London, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
3+ Years data engineering experience Snowflake experience Proficiency across an AWS tech stack DBT Expertise Terraform Experience Nice to Have: Data Modelling Data Vault Apache Airflow Benefits: Up to 10% Bonus Up to 14% Pensions Contribution 29 Days Annual Leave + Bank Holidays Free Company Shares Interviews ongoing
Greater Bristol Area, United Kingdom Hybrid / WFH Options
Peaple Talent
and familiar with setting up CI/CD workflows using platforms like Azure DevOps or similar tools. Hands-on experience with orchestration tools like Apache Airflow for managing complex data workflows. Practical familiarity with low-code or no-code platforms such as Talend and SnapLogic for streamlined pipeline …
or a similar language. Proficiency in database technologies (SQL, NoSQL, time-series databases) and data modelling. Strong understanding of data pipeline orchestration tools (e.g., Apache Airflow, Kubernetes). You thrive when working as part of a team. Comfortable in a fast-paced environment. Have excellent written and verbal …
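At their core, the orchestration tools named in these listings (Airflow in particular) resolve a directed acyclic graph of task dependencies and run each task only once everything it depends on has finished. A stdlib-only sketch of that scheduling idea, using hypothetical task names:

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline DAG: each task maps to the set of tasks it
# depends on, mirroring the dependency declarations in an Airflow DAG.
dag = {
    "extract": set(),
    "clean": {"extract"},
    "aggregate": {"clean"},
    "report": {"aggregate", "clean"},
}

# static_order() yields tasks so that every task appears only after
# all of its dependencies, i.e. a valid execution order.
order = list(TopologicalSorter(dag).static_order())
print(order)
```

Real orchestrators add scheduling intervals, retries, and parallel execution of independent branches on top of this ordering, but the dependency-resolution core is the same.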
London, South East England, United Kingdom Hybrid / WFH Options
Datatech Analytics
transformation, cleaning, and loading. Strong coding experience with Python and Pandas. Experience with any flavour of data pipeline and workflow management tools: Dagster, Celery, Airflow, etc. Build processes supporting data transformation, data structures, metadata, dependency and workload management. Experience supporting and working with cross-functional teams in a dynamic …
Cardiff, Wales, United Kingdom Hybrid / WFH Options
Creditsafe
in Python Comfortable implementing data architectures in analytical data warehouses such as Snowflake, Redshift or BigQuery Hands on experience with data orchestrators such as Airflow Knowledge of Agile development methodologies Awareness of cloud technology particularly AWS. Knowledge of automated delivery processes Hands on experience of best engineering practices (handling …
West Midlands, England, United Kingdom Hybrid / WFH Options
Aubay UK
migration projects, particularly large-scale migrations to distributed database platforms. Hands-on experience with big data processing technologies, including Spark (PySpark and Spark Scala) and Apache Airflow. Expertise in distributed databases and computing environments. Familiarity with Enterprise Architecture methodologies, ideally TOGAF. Strong leadership experience, including managing technology teams and delivering …
for data processing, analysis and automation. Proficiency in building and maintaining batch and streaming ETL/ELT pipelines at scale, employing tools such as Airflow, Fivetran, Kafka, Iceberg, Parquet, Spark, Glue for developing end-to-end data orchestration, leveraging AWS services to ingest, transform and process large volumes …
environments (e.g., Snowflake). Proficiency in SQL and at least one scripting language (e.g., Python, Bash). Familiarity with data pipeline orchestration tools (e.g., Airflow, dbt). Strong understanding of data quality principles and best practices. Excellent communication and collaboration skills. Working with AWS, Twilio Segment and Google Analytics.
Central London, London, United Kingdom Hybrid / WFH Options
167 Solutions Ltd
. Develop and manage data warehouse and lakehouse solutions for analytics, reporting, and machine learning. Implement ETL/ELT processes using tools such as Apache Airflow, AWS Glue, and Amazon Athena. Work with cloud-native technologies to support scalable, serverless architectures. Collaborate with data science teams to …
robust data pipelines. Work with Python and SQL for data processing, transformation, and analysis. Leverage a wide range of GCP services including: Cloud Composer (Apache Airflow) BigQuery Cloud Storage Dataflow Pub/Sub Cloud Functions IAM Design and implement data models and ETL processes. Apply infrastructure-as-code …
London, South East England, United Kingdom Hybrid / WFH Options
Noir
robust data pipelines. Work with Python and SQL for data processing, transformation, and analysis. Leverage a wide range of GCP services including: Cloud Composer (Apache Airflow) BigQuery Cloud Storage Dataflow Pub/Sub Cloud Functions IAM Design and implement data models and ETL processes. Apply infrastructure-as-code …
or a similar language. Proficiency in database technologies (SQL, NoSQL, time-series databases) and data modelling. Strong understanding of data pipeline orchestration tools (e.g., Apache Airflow, Kubernetes). You thrive when working as part of a team. Comfortable in a fast-paced environment. Have excellent written and verbal …