data quality, or other areas directly relevant to data engineering responsibilities and tasks. Proven project experience developing and maintaining data warehouses in big data solutions (Snowflake). Expert knowledge in Apache technologies such as Kafka, Airflow, and Spark to build scalable and efficient data pipelines. Ability to design, build, and deploy data solutions that capture, explore, transform, and utilize data …
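By way of illustration of the Kafka/Spark pipeline work this listing describes, here is a minimal sketch of a Spark Structured Streaming job that consumes a Kafka topic and stages events for a warehouse load. The broker address, topic, schema, and paths are invented for the example, not taken from the listing.

```python
# A minimal sketch, assuming invented broker, topic, schema, and paths, of
# a Spark Structured Streaming job that lands Kafka events for a warehouse load.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import DoubleType, StringType, StructType

spark = SparkSession.builder.appName("orders-ingest").getOrCreate()

schema = StructType().add("order_id", StringType()).add("amount", DoubleType())

# Kafka delivers values as bytes; cast to string before parsing the JSON.
raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")
       .option("subscribe", "orders")
       .load())

parsed = (raw.select(from_json(col("value").cast("string"), schema).alias("v"))
          .select("v.*"))

# Stage as Parquet; a separate job (or a Snowflake connector) would load it on.
(parsed.writeStream
 .format("parquet")
 .option("path", "/data/staging/orders")
 .option("checkpointLocation", "/data/checkpoints/orders")
 .start())
```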
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
options. Hybrid working - 1 day a week in a central London office. High-growth scale-up with a strong mission and serious funding. Modern tech stack: Python, SQL, Snowflake, Apache Iceberg, AWS, Airflow, dbt, Spark. Work cross-functionally with engineering, product, analytics, and data science leaders. What You'll Be Doing: Lead, mentor, and grow a high-impact team …
Basingstoke, Hampshire, South East, United Kingdom
Anson Mccade
processes. Monitor integration health and implement alerting, logging, and performance tracking. Contribute to continuous improvement of integration architecture and practices. Key Skills & Experience: Experience with workflow orchestration tools (e.g., Apache Airflow). Proven backend development skills using Node.js, Python, Java, or similar. Strong understanding of API design and integration techniques (REST, webhooks, GraphQL). Familiarity with authentication protocols (OAuth2 …
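To make the orchestration-plus-REST-integration requirement concrete, a minimal Airflow DAG sketch follows. The endpoint URL, dag_id, and schedule are placeholder assumptions; the `schedule` argument assumes Airflow 2.4+ (older releases use `schedule_interval`).

```python
# Illustrative Airflow (2.x) DAG pairing orchestration with a REST pull.
from datetime import datetime

import requests
from airflow import DAG
from airflow.operators.python import PythonOperator

def fetch_orders():
    # A production task would add auth (e.g. an OAuth2 bearer token),
    # retries, and pagination on top of this bare request.
    resp = requests.get("https://api.example.com/v1/orders", timeout=30)
    resp.raise_for_status()
    return resp.json()  # returned value is pushed to XCom for downstream tasks

with DAG(
    dag_id="orders_integration",
    start_date=datetime(2024, 1, 1),
    schedule="@hourly",
    catchup=False,
) as dag:
    PythonOperator(task_id="fetch_orders", python_callable=fetch_orders)
```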
of real-time and analytical data pipelines, metadata, and cataloguing (e.g., Atlan). Strong communication, stakeholder management, and documentation skills. Preferred (but not essential): AWS or Snowflake certifications; knowledge of Apache Airflow, dbt, GitHub Actions; experience with Iceberg tables and data product thinking. Why Apply? Work on high-impact, high-scale client projects. Join a technically elite team with a …
London (City of London), South East England, United Kingdom
Sahaj Software
technology fundamentals and experience with languages like Python, or functional programming languages like Scala. Demonstrated experience in design and development of big data applications using tech stacks like Databricks, Apache Spark, HDFS, HBase and Snowflake. Commendable skills in building data products by integrating large sets of data from hundreds of internal and external sources are highly critical. A …
Catalog). Familiarity with Data Mesh, Data Fabric, and product-led data strategies. Expertise in cloud platforms (AWS, Azure, GCP, Snowflake). Technical Skills: Proficiency in big data tools (Apache Spark, Hadoop). Programming knowledge (Python, R, Java) is a plus. Understanding of ETL/ELT, SQL, NoSQL, and data visualisation tools. Awareness of ML/AI integration into …
of ingestion across the portfolio. Key Requirements: Strong proficiency in Python and PySpark. A successful track record in a Data Engineering role including Database Management (ideally SQL), Data Orchestration (ideally Apache Airflow or Dagster), Containerisation (ideally Kubernetes and Docker), and Data Pipelines (big data technologies and architectures). Experience in Financial Services (ideally Commodity) Trading. Bachelor's degree in Information Systems, Computer …
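As a sketch of the orchestration style this listing names, here is a minimal Dagster asset pipeline. Asset names and the sample trade records are invented, and it assumes a recent (1.x) Dagster API.

```python
# A minimal Dagster (1.x-style) asset pipeline, sketched for illustration.
from dagster import Definitions, asset

@asset
def raw_trades():
    # In practice this would read from a landing bucket or market data feed.
    return [{"symbol": "BRN", "qty": 100}, {"symbol": "WTI", "qty": 50}]

@asset
def enriched_trades(raw_trades):
    # Dagster wires the dependency by matching the parameter name to the
    # upstream asset; here we add a notional value per trade.
    return [{**t, "notional": t["qty"] * 80.0} for t in raw_trades]

defs = Definitions(assets=[raw_trades, enriched_trades])
```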
London, South East England, United Kingdom Hybrid / WFH Options
Medialab Group
data warehouses (e.g. Snowflake, Redshift). Familiarity with data pipelining and orchestration systems (e.g. Airflow). Understanding of modern analytics architectures and data visualisation tools (we use Preset.io/Apache Superset). Exposure to CI/CD pipelines (GitLab CI preferred). Experience with advertising or media data is a plus.
London, South East England, United Kingdom Hybrid / WFH Options
Hexegic
to create, test and validate data models and outputs. Set up monitoring and ensure data health for outputs. What we are looking for: Proficiency in Python, with experience in Apache Spark and PySpark. Previous experience with data analytics software. Ability to scope new integrations and translate user requirements into technical specifications. What’s in it for you? Base salary …
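A minimal sketch, assuming invented paths, column names, and a 1% null threshold, of the kind of PySpark output-validation and data-health check the role mentions:

```python
# A minimal sketch of a PySpark output-validation step; the input path,
# column name, and 1% null threshold are invented for the example.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("output-validation").getOrCreate()

df = spark.read.parquet("/data/outputs/daily_model")

total = df.count()
null_ids = df.filter(col("record_id").isNull()).count()

# Failing the job loudly lets whatever alerting hook wraps it (Slack,
# PagerDuty, an orchestrator callback) surface the bad batch.
if total == 0 or null_ids / total > 0.01:
    raise ValueError(f"Data health check failed: {null_ids}/{total} null record ids")
```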
London, South East England, United Kingdom Hybrid / WFH Options
Hunter Bond
in Python for data pipelines, transformation, and orchestration. Deep understanding of the Azure ecosystem (e.g., Data Factory, Blob Storage, Synapse, etc.). Proficiency in Databricks (or strong equivalent experience with Apache Spark). Proven ability to work within enterprise-level environments with a focus on clean, scalable, and secure data solutions. If you are the right fit - contact me directly …
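For illustration, a hedged sketch of a Databricks-style PySpark job over Azure storage: the `abfss://` paths, column names, and Delta output are placeholder assumptions, not details from the listing.

```python
# A hedged sketch of a Databricks-style PySpark job over Azure storage.
# The abfss:// paths, column names, and Delta output format are assumptions.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, to_date

spark = SparkSession.builder.getOrCreate()

src = "abfss://raw@examplelake.dfs.core.windows.net/trades/"
dst = "abfss://curated@examplelake.dfs.core.windows.net/trades_clean/"

# Read raw JSON, derive a date column, and drop non-positive amounts.
clean = (spark.read.format("json").load(src)
         .withColumn("trade_date", to_date(col("trade_ts")))
         .filter(col("amount") > 0))

# Delta is the usual table format on Databricks.
clean.write.format("delta").mode("overwrite").save(dst)
```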
management disciplines, including data integration, modeling, optimisation, data quality and Master Data Management. Experience with database technologies such as RDBMS (SQL Server, Oracle) or NoSQL (MongoDB). Knowledge of Apache technologies such as Spark, Kafka and Airflow to build scalable and efficient data pipelines. Have worked on migration projects, with some experience of management systems such as SAP, ERP …
London, South East England, United Kingdom Hybrid / WFH Options
Vallum Associates
create, and optimize classifiers. Qualifications: Programming Skills: Expertise in Python; Scala/Java knowledge is an advantage. OpenTelemetry Expertise: Strong understanding of the OpenTelemetry stack (VMs, middleware like IIS/Apache, hypervisors, databases, etc.) and OTLP formatting/structuring. Ability to correlate streaming OTLP data from multiple components and generate actionable insights using ML models. Strong analytical and problem-solving …
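To ground the OpenTelemetry/OTLP requirement, here is a minimal Python SDK setup that exports spans to a collector over OTLP/gRPC; the collector endpoint and service name are assumptions for the sketch.

```python
# A minimal OpenTelemetry Python setup exporting spans over OTLP/gRPC;
# the collector endpoint and service name are invented for the sketch.
from opentelemetry import trace
from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor

provider = TracerProvider(resource=Resource.create({"service.name": "ingest-worker"}))
provider.add_span_processor(
    BatchSpanProcessor(OTLPSpanExporter(endpoint="collector:4317", insecure=True))
)
trace.set_tracer_provider(provider)

tracer = trace.get_tracer(__name__)
with tracer.start_as_current_span("process_batch"):
    pass  # work whose spans a collector can correlate across components
```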
Gerrards Cross, Buckinghamshire, England, United Kingdom
Ikhoi Recruitment
hands-on and enthusiastic person who is quick to learn. You'll have at least 2 years' 2nd line support experience. Experience working with service desk ticketing tools (Jira). Apache or similar server experience. Work effectively with a high degree of autonomy. Excellent interpersonal and communication skills, and enjoy working in a fast-paced environment. You will be working …
analytical data models using dbt (data build tool). Develop and optimize complex SQL queries for data extraction, transformation, and loading in Snowflake. Orchestrate and schedule data workflows using Apache Airflow, including developing custom Python operators and DAGs. Write efficient and maintainable Python scripts for data processing, automation, and integration with various data sources and APIs. Ensure data quality … and cost-effectiveness. Qualifications: Strong proficiency in SQL, particularly with Snowflake's features and functionalities. Extensive experience with dbt for data modeling, transformations, testing, and documentation. Solid experience with Apache Airflow for workflow orchestration and scheduling. Proficiency in Python for data manipulation, scripting, and automation. Experience with cloud platforms (e.g., AWS, Azure, GCP) and relevant data services. Understanding of …
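As a sketch of the "custom Python operators" this listing mentions, here is a minimal custom Airflow operator that runs a row-count check against Snowflake. The operator name, connection id, and threshold logic are invented, and it assumes the apache-airflow-providers-snowflake package is installed.

```python
# A sketch of a custom Airflow operator in the spirit of the listing; the
# operator, connection id, and row-count check are invented for illustration.
from airflow.models import BaseOperator

class SnowflakeRowCountCheckOperator(BaseOperator):
    """Fail the task if a Snowflake table holds fewer rows than expected."""

    def __init__(self, table: str, min_rows: int, **kwargs):
        super().__init__(**kwargs)
        self.table = table
        self.min_rows = min_rows

    def execute(self, context):
        # Imported lazily so the DAG file parses even without the provider.
        from airflow.providers.snowflake.hooks.snowflake import SnowflakeHook

        hook = SnowflakeHook(snowflake_conn_id="snowflake_default")
        rows = hook.get_first(f"select count(*) from {self.table}")[0]
        if rows < self.min_rows:
            raise ValueError(
                f"{self.table} has {rows} rows; expected at least {self.min_rows}"
            )
```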