data cataloging and metadata management using tools like AWS Glue Data Catalog. Demonstrated self-sufficiency in exploring new tools, troubleshooting issues, and continuously improving processes. Hands-on experience with Apache Airflow for orchestrating complex data workflows and ensuring reliable execution. Understanding of cloud security and governance practices including IAM, KMS, and data access policies. Experience with monitoring and observability …
London (City of London), South East England, United Kingdom
HCLTech
data quality, or other areas directly relevant to data engineering responsibilities and tasks. Proven project experience developing and maintaining data warehouses in big data solutions (Snowflake). Expert knowledge of Apache technologies such as Kafka, Airflow, and Spark to build scalable and efficient data pipelines. Ability to design, build, and deploy data solutions that capture, explore, transform, and utilize data …
London, South East England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
options Hybrid working - 1 day a week in a central London office. High-growth scale-up with a strong mission and serious funding. Modern tech stack: Python, SQL, Snowflake, Apache Iceberg, AWS, Airflow, dbt, Spark. Work cross-functionally with engineering, product, analytics, and data science leaders. What You'll Be Doing: Lead, mentor, and grow a high-impact team …
Basingstoke, Hampshire, South East, United Kingdom
Anson Mccade
processes. Monitor integration health and implement alerting, logging, and performance tracking. Contribute to continuous improvement of integration architecture and practices. Key Skills & Experience: Experience with workflow orchestration tools (e.g., Apache Airflow). Proven backend development skills using Node.js, Python, Java, or similar. Strong understanding of API design and integration techniques (REST, Webhooks, GraphQL). Familiarity with authentication protocols (OAuth2 …)
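Several of these roles centre on workflow orchestration with tools such as Apache Airflow. The core idea, running tasks only after their dependencies complete, can be sketched with nothing but the Python standard library; the task names below are hypothetical illustrations, not taken from any listing, and a real Airflow DAG would declare operators rather than plain dicts.

```python
# Sketch of the dependency-ordering idea behind orchestrators like Airflow,
# using only the standard library (Python 3.9+). Task names are invented.
from graphlib import TopologicalSorter

# Each task maps to the set of tasks that must finish before it runs.
pipeline = {
    "extract": set(),
    "transform": {"extract"},
    "quality_check": {"transform"},
    "load": {"quality_check"},
    "notify": {"load"},
}

def run_order(tasks):
    """Return one valid execution order respecting all dependencies."""
    return list(TopologicalSorter(tasks).static_order())

if __name__ == "__main__":
    print(run_order(pipeline))
```

For this linear chain there is exactly one valid order, `extract` through `notify`; orchestrators add scheduling, retries, and alerting on top of this ordering idea.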
of real-time and analytical data pipelines, metadata, and cataloguing (e.g., Atlan). Strong communication, stakeholder management, and documentation skills. Preferred (but not essential): AWS or Snowflake certifications. Knowledge of Apache Airflow, DBT, GitHub Actions. Experience with Iceberg tables and data product thinking. Why Apply? Work on high-impact, high-scale client projects. Join a technically elite team …
technology fundamentals and experience with languages like Python, or functional programming languages like Scala. Demonstrated experience in design and development of big data applications using tech stacks like Databricks, Apache Spark, HDFS, HBase and Snowflake. Commendable skills in building data products by integrating large sets of data from hundreds of internal and external sources would be highly critical. …
London (City of London), South East England, United Kingdom
Sahaj Software
Catalog). Familiarity with Data Mesh, Data Fabric, and product-led data strategies. Expertise in cloud platforms (AWS, Azure, GCP, Snowflake). Technical Skills: Proficiency in big data tools (Apache Spark, Hadoop). Programming knowledge (Python, R, Java) is a plus. Understanding of ETL/ELT, SQL, NoSQL, and data visualisation tools. Awareness of ML/AI integration into …
of ingestion across the portfolio. Key Requirements: Strong proficiency in Python and PySpark. A successful track record in a Data Engineering role, including Database Management (ideally SQL), Data Orchestration (ideally Apache Airflow or Dagster), Containerisation (ideally Kubernetes and Docker), and Data Pipelines (big data technologies and architectures). Experience in Financial Services (ideally Commodity Trading). Bachelor's degree in Information Systems, Computer …
Slough, South East England, United Kingdom Hybrid / WFH Options
Medialab Group
data warehouses (e.g. Snowflake, Redshift). Familiarity with data pipelining and orchestration systems (e.g. Airflow). Understanding of modern analytics architectures and data visualisation tools (we use Preset.io/Apache Superset). Exposure to CI/CD pipelines (GitLab CI preferred). Experience with advertising or media data is a plus.
Stockbridge, Hampshire, South East, United Kingdom
Morson Talent
a CAMS Engineer (Continuing Airworthiness Management Support) on a contracting basis for our large client based out of Middle Wallop. About the Role: Our client is actively seeking an Apache Continuing Airworthiness Management Support (CAMS) Engineer to provide Military Aviation Authority (MAA) Regulatory Publication (MRP) compliant support to the Apache Military Continuing Airworthiness Manager Organisation (Mil CAMO). … air systems allocated to our client at Middle Wallop and Wattisham Flying Station, undertaking maintenance, modification and flying training activities. In this role, you will: In accordance with the Apache Long Term Training Support Services (LTTSS) contract, provide Military Aviation Authority (MAA) MRP compliant support to the Apache Military Continuing Airworthiness Manager (Mil CAM). The Apache CAMS Engineer will provide direct assistance to the Apache Deputy Continuing Airworthiness Manager (DCAM) and Apache Mil CAMO with the undertaking of CAM activities, in general and specific to those air systems allocated to the client at Middle Wallop and Wattisham Flying Station. Some UK travel is expected. Responsibilities include: The role is responsible to the Apache …
Slough, South East England, United Kingdom Hybrid / WFH Options
Hexegic
to create, test and validate data models and outputs. Set up monitoring and ensure data health for outputs. What we are looking for: Proficiency in Python, with experience in Apache Spark and PySpark. Previous experience with data analytics software. Ability to scope new integrations and translate user requirements into technical specifications. What’s in it for you? Base salary …
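The monitoring and data-health duties this listing describes can be illustrated with a minimal check in plain Python. The schema and threshold below are invented for the example ("amount" must be a non-negative number, and a batch is flagged when more than 5% of rows fail); a real pipeline would compute such metrics per batch and feed them into an alerting system.

```python
# Minimal sketch of a batch data-health check: count null or out-of-range
# values and flag the batch when the bad-row ratio breaches a threshold.
# Field name ("amount") and threshold (5%) are hypothetical.

def batch_health(rows, max_bad_ratio=0.05):
    """Return (is_healthy, bad_ratio) for a list of row dicts."""
    if not rows:
        return False, 1.0  # treat an empty batch as unhealthy
    bad = 0
    for row in rows:
        amount = row.get("amount")
        # A row is bad if "amount" is missing, non-numeric, or negative.
        if amount is None or not isinstance(amount, (int, float)) or amount < 0:
            bad += 1
    ratio = bad / len(rows)
    return ratio <= max_bad_ratio, ratio

if __name__ == "__main__":
    sample = [{"amount": 10.0}, {"amount": None}, {"amount": 5}, {"amount": -2}]
    print(batch_health(sample))  # two bad rows out of four
```

Keeping the check as a pure function makes it easy to unit-test and to wire into whichever monitoring stack a given team uses.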
London, South East England, United Kingdom Hybrid / WFH Options
Hunter Bond
in Python for data pipelines, transformation, and orchestration. Deep understanding of the Azure ecosystem (e.g., Data Factory, Blob Storage, Synapse, etc.). Proficiency in Databricks (or strong equivalent experience with Apache Spark). Proven ability to work within enterprise-level environments with a focus on clean, scalable, and secure data solutions. If you are the right fit, contact me directly.
management disciplines, including data integration, modeling, optimisation, data quality and Master Data Management. Experience with database technologies such as RDBMS (SQL Server, Oracle) or NoSQL (MongoDB). Knowledge of Apache technologies such as Spark, Kafka and Airflow to build scalable and efficient data pipelines. Experience of migration projects and of management systems such as SAP, ERP …
London, South East England, United Kingdom Hybrid / WFH Options
Vallum Associates
create, and optimize classifiers. Qualifications: Programming Skills: Expertise in Python; Scala/Java knowledge is an advantage. OpenTelemetry Expertise: Strong understanding of the OpenTelemetry stack (VMs, middleware like IIS/Apache, hypervisors, databases, etc.) and OTLP formatting/structuring. Ability to correlate streaming OTLP data from multiple components and generate actionable insights using ML models. Strong analytical and problem-solving …