data quality, or other areas directly relevant to data engineering responsibilities and tasks. Proven project experience developing and maintaining data warehouses in big data solutions (Snowflake). Expert knowledge in Apache technologies such as Kafka, Airflow, and Spark to build scalable and efficient data pipelines. Ability to design, build, and deploy data solutions that capture, explore, transform, and utilize data…
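As a flavour of the Kafka pipeline work this posting describes, a minimal Python producer sketch (assuming the kafka-python package; the broker address, topic name, and payload are hypothetical):

# Minimal Kafka producer sketch (kafka-python); broker and topic are placeholders.
import json
from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",  # hypothetical broker
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

# Publish a single event to a hypothetical topic.
producer.send("trades.raw", {"symbol": "ABC", "price": 101.5})
producer.flush()  # block until the message is delivered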
of data modelling and data warehousing concepts. Familiarity with version control systems, particularly Git. Desirable Skills: Experience with infrastructure as code tools such as Terraform or CloudFormation. Exposure to Apache Spark for distributed data processing. Familiarity with workflow orchestration tools such as Airflow or AWS Step Functions. Understanding of containerisation using Docker. Experience with CI/CD pipelines and…
data engineering tasks. Experience building and maintaining web scraping pipelines. Strong SQL skills, with expertise in performance tuning. Strong proficiency with dbt for data transformations. Hands-on experience with Apache Airflow or Prefect. Proficiency with GitHub, GitHub Actions, and CI/CD pipelines. Nice to have: Experience with GCP (BigQuery, Dataflow, Composer, Pub/Sub) or AWS. Familiarity with…
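To illustrate the web-scraping pipeline skill listed above, a minimal sketch (assuming the requests and BeautifulSoup libraries; the URL and CSS selector are hypothetical):

# Minimal scrape-to-rows sketch; URL and selector are placeholders.
import requests
from bs4 import BeautifulSoup

resp = requests.get("https://example.com/listings", timeout=30)
resp.raise_for_status()

soup = BeautifulSoup(resp.text, "html.parser")
rows = [
    {"title": item.get_text(strip=True), "href": item.get("href")}
    for item in soup.select("a.listing-link")  # hypothetical selector
]
print(f"Scraped {len(rows)} rows")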
East London, London, United Kingdom Hybrid / WFH Options
InfinityQuest Ltd
Hybrid) (3 days onsite & 2 days remote). Role Type: 6-month contract with possibility of extensions. Mandatory Skills: Python, PostgreSQL, Azure Databricks, AWS (S3), Git, Azure DevOps CI/CD, Apache Airflow, Energy Trading experience. Job Description: Data Engineer (Python enterprise developer): 6+ years of experience in Python scripting. Proficient in developing applications in Python. Exposed to Python-oriented…
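As a sketch of the Python-plus-PostgreSQL side of this role (assuming the psycopg2 driver; the connection details and table name are hypothetical):

# Query PostgreSQL from Python with psycopg2; credentials and table are placeholders.
import psycopg2

conn = psycopg2.connect(
    host="localhost", dbname="trading", user="etl_user", password="***"
)
try:
    with conn.cursor() as cur:
        cur.execute(
            "SELECT trade_date, SUM(volume) FROM trades GROUP BY trade_date ORDER BY trade_date"
        )
        for trade_date, total_volume in cur.fetchall():
            print(trade_date, total_volume)
finally:
    conn.close()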
City of London, England, United Kingdom Hybrid / WFH Options
iO Associates
Skills & Experience: Strong experience with Snowflake data warehousing. Solid AWS cloud engineering experience. Proficient in Python for data engineering workflows. Skilled in building and maintaining Airflow DAGs. Familiarity with Apache Iceberg for table format and data lake optimisation. If this could be of interest, please get in touch with Alex Lang at iO Associates to apply and for more…
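For context on the "building and maintaining Airflow DAGs" requirement, a minimal sketch (assuming Airflow 2.x; the DAG id, schedule, and task logic are hypothetical):

# Minimal Airflow 2.x DAG with a single Python task; names are placeholders.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_snowflake_usage():
    # Placeholder for the real extract/load logic.
    print("extracting...")

with DAG(
    dag_id="example_daily_extract",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # use schedule_interval on older Airflow 2.x releases
    catchup=False,
) as dag:
    extract = PythonOperator(
        task_id="extract_snowflake_usage",
        python_callable=extract_snowflake_usage,
    )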
record in full stack data development (from ingestion to visualization). Strong expertise in Snowflake, including data modeling, warehousing, and performance optimization. Hands-on experience with ETL tools (e.g., Apache Airflow, dbt, Fivetran) and integrating data from ERP systems like NetSuite. Proficiency in SQL, Python, and/or other scripting languages for data processing and automation. Familiarity with LLM…
of automation IT WOULD BE NICE FOR THE SENIOR SOFTWARE ENGINEER TO HAVE.... Cloud based experience. Microservice architecture or server-less architecture. Big Data/Messaging technologies such as Apache NiFi/MiNiFi/Kafka. TO BE CONSIDERED.... Please either apply by clicking online or emailing me directly. For further information please call me on 07704 152 640.
sharing. Required Skills & Qualifications: 3-6 years of experience as a Data Engineer (or similar). Hands-on expertise with: DBT (dbt-core, dbt-databricks adapter, testing & documentation). Apache Airflow (DAG design, operators, scheduling, dependencies). Databricks (Spark, SQL, Delta Lake, job clusters, SQL Warehouses). Strong SQL skills and understanding of data modeling (Kimball, Data Vault, or…
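As an illustration of the Databricks/Delta Lake work mentioned here, a minimal upsert sketch (assuming a Databricks or delta-spark session; the table paths and join key are hypothetical):

# Upsert (MERGE) incoming rows into a Delta table; paths and key are placeholders.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # on Databricks a session already exists

updates = spark.read.parquet("/mnt/landing/customers/")        # hypothetical source
target = DeltaTable.forPath(spark, "/mnt/delta/dim_customer")  # hypothetical target

(
    target.alias("t")
    .merge(updates.alias("s"), "t.customer_id = s.customer_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)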
City of London, England, United Kingdom Hybrid / WFH Options
Datatech Analytics
GCP/BigQuery or other cloud data warehouses (e.g., Snowflake, Redshift). Familiarity with data orchestration tools (e.g., Airflow). Experience with data visualisation platforms (such as Preset.io/Apache Superset or other). Exposure to CI/CD pipelines, ideally using GitLab CI. Background working with media, marketing, or advertising data. The Opportunity: Work alongside smart, supportive teammates…
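To illustrate the GCP/BigQuery requirement, a minimal query sketch (assuming the google-cloud-bigquery client library; the project, dataset, and table names are hypothetical):

# Run a query against BigQuery from Python; dataset and table are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-analytics-project")  # hypothetical project

sql = """
    SELECT campaign_id, SUM(impressions) AS impressions
    FROM `my-analytics-project.marketing.daily_delivery`
    GROUP BY campaign_id
    ORDER BY impressions DESC
    LIMIT 10
"""

for row in client.query(sql).result():
    print(row.campaign_id, row.impressions)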
London (City of London), South East England, United Kingdom
iO Associates
ideal candidate will lead the delivery of modern data solutions across multiple projects, leveraging Azure and Databricks technologies in an agile environment. Core Requirements Cloud & Data Engineering: Azure, Databricks, Apache Spark, Azure Data Factory, Delta Lake. Programming & Querying: Python, SQL (complex, high-performance queries). Data Governance & DevOps: Unity Catalog/Purview, Terraform, Azure DevOps. Consulting: Requirements gathering, stakeholder engagement…
City of London, London, United Kingdom Hybrid / WFH Options
ECS
cloud data engineering, with a strong focus on building scalable data pipelines. Expertise in Azure Databricks, including building and managing ETL pipelines using PySpark or Scala. Solid understanding of Apache Spark, Delta Lake, and distributed data processing concepts. Hands-on experience with Azure Data Lake Storage, Azure Data Factory, and Azure Synapse Analytics. Proficiency in SQL and Python for…
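As an illustration of the Azure Databricks ETL requirement, a minimal PySpark batch transform sketch (assuming a Databricks cluster with access to the storage account; the container, account, and paths are hypothetical):

# Batch ETL: read raw CSV from ADLS Gen2, aggregate, write curated parquet; paths are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()  # provided automatically on Databricks

raw = spark.read.option("header", "true").csv(
    "abfss://raw@mystorageaccount.dfs.core.windows.net/transactions/"
)

daily = (
    raw.withColumn("amount", F.col("amount").cast("double"))
    .groupBy("transaction_date")
    .agg(F.sum("amount").alias("total_amount"))
)

daily.write.mode("overwrite").parquet(
    "abfss://curated@mystorageaccount.dfs.core.windows.net/daily_totals/"
)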
City of London, London, England, United Kingdom Hybrid / WFH Options
Ada Meher
days a week – based on business need. To Be Considered: Demonstrable expertise and experience working on large-scale Data Engineering projects. Strong experience in Python/PySpark, Databricks & Apache Spark. Hands-on experience with both batch & streaming pipelines. Strong experience in AWS and associated tooling (e.g., S3, Glue, Redshift, Lambda, Terraform, etc.). Experience designing Data Engineering platforms from scratch. Alongside…
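For the batch-and-streaming point above, a minimal PySpark Structured Streaming sketch (assuming a Spark session with S3 access configured; the bucket paths and schema are hypothetical):

# Stream JSON events from S3 into a parquet sink; paths and schema are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.getOrCreate()

schema = StructType([
    StructField("event_id", StringType()),
    StructField("amount", DoubleType()),
])

events = (
    spark.readStream
    .schema(schema)                   # streaming file sources need an explicit schema
    .json("s3a://my-bucket/events/")  # hypothetical bucket
)

query = (
    events.writeStream
    .format("parquet")
    .option("path", "s3a://my-bucket/curated/events/")
    .option("checkpointLocation", "s3a://my-bucket/checkpoints/events/")
    .outputMode("append")
    .start()
)
query.awaitTermination()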
Solid understanding of DevOps principles and agile delivery. Excellent problem-solving skills and a proactive, team-oriented approach. Confident client-facing communication skills. Desirable Skills & Experience: Experience with Apache NiFi and Node.js. Familiarity with JSON, XML, XSD, and XSLT. Knowledge of Jenkins, Maven, BitBucket, and Jira. Exposure to AWS and cloud technologies. Experience working within…
City of London, London, United Kingdom Hybrid / WFH Options
Hexegic
to create, test and validate data models and outputs. Set up monitoring and ensure data health for outputs. What we are looking for: Proficiency in Python, with experience in Apache Spark and PySpark. Previous experience with data analytics software. Ability to scope new integrations and translate user requirements into technical specifications. What’s in it for you? Base salary…
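As a sketch of the "test and validate data models and outputs" responsibility (assuming a PySpark session; the table name and the checks shown are hypothetical):

# Simple data-health checks on a model output; table name and rules are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()
output = spark.table("analytics.position_summary")  # hypothetical output table

row_count = output.count()
null_keys = output.filter(F.col("account_id").isNull()).count()
dupe_keys = (
    output.groupBy("account_id", "as_of_date").count().filter(F.col("count") > 1).count()
)

assert row_count > 0, "output is empty"
assert null_keys == 0, f"{null_keys} rows have a null account_id"
assert dupe_keys == 0, f"{dupe_keys} duplicate account_id/as_of_date keys"
print("all output checks passed")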
City of London, London, United Kingdom Hybrid / WFH Options
Hunter Bond
in Python for data pipelines, transformation, and orchestration. Deep understanding of the Azure ecosystem (e.g., Data Factory, Blob Storage, Synapse, etc.). Proficiency in Databricks (or strong equivalent experience with Apache Spark). Proven ability to work within enterprise-level environments with a focus on clean, scalable, and secure data solutions. If you are the right fit - contact me directly…
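To round off the Azure ecosystem point, a minimal Blob Storage read sketch (assuming the azure-storage-blob package; the connection string, container, and blob path are hypothetical):

# Read a CSV from Azure Blob Storage into pandas; connection details are placeholders.
import io
import pandas as pd
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string("<connection-string>")
container = service.get_container_client("raw")

blob_bytes = container.download_blob("sales/2024/01/orders.csv").readall()
orders = pd.read_csv(io.BytesIO(blob_bytes))
print(orders.head())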