data quality, or other areas directly relevant to data engineering responsibilities and tasks. Proven project experience developing and maintaining data warehouses in big data solutions (Snowflake). Expert knowledge in Apache technologies such as Kafka, Airflow, and Spark to build scalable and efficient data pipelines. Ability to design, build, and deploy data solutions that capture, explore, transform, and utilize data More ❯
of data modelling and data warehousing concepts. Familiarity with version control systems, particularly Git. Desirable Skills: Experience with infrastructure as code tools such as Terraform or CloudFormation. Exposure to Apache Spark for distributed data processing. Familiarity with workflow orchestration tools such as Airflow or AWS Step Functions. Understanding of containerisation using Docker. Experience with CI/CD pipelines and More ❯
data engineering tasks. Experience building and maintaining web scraping pipelines. Strong SQL skills, with expertise in performance tuning. Strong proficiency with dbt for data transformations. Hands-on experience with Apache Airflow or Prefect. Proficiency with GitHub, GitHub Actions, and CI/CD pipelines. Nice to have: Experience with GCP (BigQuery, Dataflow, Composer, Pub/Sub) or AWS. Familiarity with More ❯
London, South East England, United Kingdom Hybrid / WFH Options
iO Associates
Skills & Experience: Strong experience with Snowflake data warehousing. Solid AWS cloud engineering experience. Proficient in Python for data engineering workflows. Skilled in building and maintaining Airflow DAGs. Familiarity with Apache Iceberg for table format and data lake optimisation. If this could be of interest, please get in touch with Alex Lang at iO Associates to apply and for more More ❯
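As a purely illustrative aside on the Airflow DAG skills the listing above asks for, the sketch below shows what a minimal extract-and-load DAG can look like. It assumes Airflow 2.4 or later; the DAG id, schedule, and the extract/load callables are hypothetical examples, not taken from the advertiser.

```python
# Minimal illustrative Airflow DAG (assumes Airflow 2.4+); the DAG id, schedule,
# and the extract/load callables are hypothetical, not the employer's code.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_orders(**context):
    # Placeholder: pull raw rows from an upstream source (e.g. an S3 bucket).
    return [{"order_id": 1, "amount": 42.0}]


def load_rows(**context):
    # Placeholder: push the extracted rows into a warehouse staging table.
    rows = context["ti"].xcom_pull(task_ids="extract_orders")
    print(f"Would load {len(rows)} rows")


with DAG(
    dag_id="example_orders_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_orders", python_callable=extract_orders)
    load = PythonOperator(task_id="load_rows", python_callable=load_rows)

    extract >> load  # extract must finish before load runs
```

In a real pipeline the load step would typically go through a provider hook (for example the Snowflake provider) rather than a print statement.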
and maintenance of IDBS's software platforms adheres to IDBS's architecture vision. What We'll Get You Doing: Design, develop, and maintain scalable data pipelines using Databricks and Apache Spark (PySpark) to support analytics and other data-driven initiatives. Support the elaboration of requirements, formulation of the technical implementation plan and backlog refinement. Provide technical perspective to products More ❯
and managing cloud infrastructure as code. Proficiency in programming languages such as Python, Spark, SQL. Strong experience with SQL databases. Expertise in data pipeline and workflow management tools (e.g., Apache Airflow, ADF). Experience with cloud platforms (Azure preferred) and related data services. Excellent problem-solving skills and attention to detail. Inclusive and curious, continuously seeks to build knowledge and More ❯
of automation. IT WOULD BE NICE FOR THE SENIOR SOFTWARE ENGINEER TO HAVE.... Cloud-based experience. Microservice architecture or serverless architecture. Big Data/Messaging technologies such as Apache NiFi/MiNiFi/Kafka. TO BE CONSIDERED.... Please either apply by clicking online or emailing me directly to For further information please call me on . I can More ❯
London (City of London), South East England, United Kingdom Hybrid / WFH Options
Datatech Analytics
GCP/BigQuery or other cloud data warehouses (e.g., Snowflake, Redshift). Familiarity with data orchestration tools (e.g., Airflow). Experience with data visualisation platforms (such as Preset.io/Apache Superset or other). Exposure to CI/CD pipelines, ideally using GitLab CI. Background working with media, marketing, or advertising data. The Opportunity: Work alongside smart, supportive teammates More ❯
London (City of London), South East England, United Kingdom
iO Associates
ideal candidate will lead the delivery of modern data solutions across multiple projects, leveraging Azure and Databricks technologies in an agile environment. Core Requirements: Cloud & Data Engineering: Azure, Databricks, Apache Spark, Azure Data Factory, Delta Lake. Programming & Querying: Python, SQL (complex, high-performance queries). Data Governance & DevOps: Unity Catalog/Purview, Terraform, Azure DevOps. Consulting: Requirements gathering, stakeholder engagement More ❯
Winchester, Hampshire, England, United Kingdom Hybrid / WFH Options
Ada Meher
days a week – based on business need. To Be Considered: Demonstrable expertise and experience working on large-scale Data Engineering projects. Strong experience in Python/PySpark, Databricks & Apache Spark. Hands-on experience with both batch & streaming pipelines. Strong experience in AWS and associated tooling (e.g. S3, Glue, Redshift, Lambda, Terraform, etc.). Experience designing Data Engineering platforms from scratch. Alongside More ❯
London, South East England, United Kingdom Hybrid / WFH Options
Hunter Bond
in Python for data pipelines, transformation, and orchestration. Deep understanding of the Azure ecosystem (e.g., Data Factory, Blob Storage, Synapse, etc.). Proficiency in Databricks (or strong equivalent experience with Apache Spark). Proven ability to work within enterprise-level environments with a focus on clean, scalable, and secure data solutions. If you are the right fit - contact me directly More ❯
S3, VPC, IAM, CloudFormation/Terraform, and backups. Set up and improve DevOps pipelines: CI/CD, Docker, Kubernetes, automated tests, load testing, and web/application servers (e.g. Apache). Troubleshoot effectively using CLI tools (bash). Guide integrations with external CAFM/CRM systems via APIs. Apply a strong ownership mentality — taking features from design through deployment More ❯
London (City of London), South East England, United Kingdom
Vallum Associates
Technical Expertise: Solid experience in Python programming, particularly using data manipulation and processing libraries such as Pandas, NumPy, and Apache Spark. Hands-on experience with open-source data frameworks like Apache Spark, Apache Kafka, and Apache Airflow. Strong proficiency in SQL, including advanced query development and performance tuning. Good understanding of distributed computing principles and big … automation pipelines. Experience working with relational databases such as PostgreSQL, MySQL, or equivalent platforms. Skilled in using containerization technologies including Docker and Kubernetes. Experience with workflow orchestration tools like Apache Airflow or Dagster. Familiar with streaming data pipelines and real-time analytics solutions. More ❯
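As an illustrative aside on the Pandas/Spark skills listed above, the snippet below sketches a small PySpark batch aggregation; the schema, column names, and local SparkSession setup are assumptions for the example and do not come from the advertiser.

```python
# Illustrative PySpark batch aggregation (assumes pyspark is installed); the
# schema and column names are hypothetical, not taken from the job listing.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("example_daily_revenue").getOrCreate()

# Tiny in-memory stand-in for data that would normally be read from storage.
orders = spark.createDataFrame(
    [(1, "2024-01-01", 42.0), (2, "2024-01-01", 13.5), (3, "2024-01-02", 99.9)],
    ["order_id", "order_date", "amount"],
)

# Aggregate revenue per day - the kind of transformation step a pipeline might run.
daily_revenue = (
    orders.groupBy("order_date")
    .agg(F.sum("amount").alias("total_amount"), F.count("*").alias("order_count"))
    .orderBy("order_date")
)

daily_revenue.show()
spark.stop()
```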
SQL Server. Good proficiency in any OOP language (Python, Java, C#). Experience developing reliable and efficient data pipeline solutions. Experience with on-prem or cloud data integration systems (e.g. Apache NiFi, Apache Airflow, AWS Glue). Familiarity with CI/CD pipelines and DevOps practices. Excellent problem-solving and communication skills. Bachelor's degree in Computer Science or Engineering More ❯
fine-tuning, or deployment. Technical Skills: Proficiency in Python and frameworks such as PyTorch, TensorFlow, or JAX. Strong familiarity with distributed computing and data engineering tools (e.g., SLURM, Apache Spark, Airflow). Hands-on experience with LLM training, fine-tuning, and deployment (e.g., Hugging Face, LLamafactory, NVIDIA NeMo). Preferred Qualifications: Advanced degree (MS/PhD) in Computer More ❯
Saffron Walden, Essex, South East, United Kingdom Hybrid / WFH Options
EMBL-EBI
in developing applications using Spring Boot. Experience in developing web infrastructure (Solr, Kubernetes). Experience in Git and basic Unix commands. You may also have: Experience with large data processing technologies (Apache Spark). Apply now! Benefits and Contract Information: Financial incentives: depending on circumstances, monthly family/marriage allowance of £278, monthly child allowance of £336 per child. Non-resident allowance More ❯
integration of software. IT WOULD BE NICE FOR THE BIG DATA ENGINEER TO HAVE.... Cloud-based architectures. Microservice architecture or serverless architecture. Messaging/routing technologies such as Apache NiFi/RabbitMQ. Experience of DevSecOps automated deployment tools such as Jenkins, Ansible, Docker. TO BE CONSIDERED.... Please either apply by clicking online or emailing me directly to For More ❯