London (City of London), South East England, United Kingdom
HCLTech
data cataloging and metadata management using tools like AWS Glue Data Catalog. Demonstrated self-sufficiency in exploring new tools, troubleshooting issues, and continuously improving processes. Hands-on experience with Apache Airflow for orchestrating complex data workflows and ensuring reliable execution. Understanding of cloud security and governance practices including IAM, KMS, and data access policies. Experience with monitoring and observability …
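The data access policies mentioned above follow a common pattern: explicit Deny always beats Allow. A minimal sketch of that evaluation logic in plain Python (illustrative only; real AWS IAM adds wildcards, conditions, permission boundaries, and much more — all names here are hypothetical):

```python
# Minimal sketch of allow/deny policy evaluation (NOT real AWS IAM).
from dataclasses import dataclass

@dataclass
class Statement:
    effect: str       # "Allow" or "Deny"
    actions: set      # e.g. {"s3:GetObject"}
    resources: set    # exact-match resource names for this toy example

def is_allowed(statements, action, resource):
    """Explicit Deny wins; otherwise any matching Allow grants access."""
    allowed = False
    for s in statements:
        if action in s.actions and resource in s.resources:
            if s.effect == "Deny":
                return False  # Deny short-circuits everything
            allowed = True
    return allowed  # default is implicit deny

policy = [
    Statement("Allow", {"s3:GetObject"}, {"curated/sales.parquet"}),
    Statement("Deny",  {"s3:GetObject"}, {"raw/pii.parquet"}),
]
print(is_allowed(policy, "s3:GetObject", "curated/sales.parquet"))  # True
print(is_allowed(policy, "s3:GetObject", "raw/pii.parquet"))        # False
```

The same Deny-overrides-Allow ordering applies whether policies are attached to identities or to resources.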
data quality, or other areas directly relevant to data engineering responsibilities and tasks Proven project experience developing and maintaining data warehouses in big data solutions (Snowflake) Expert knowledge in Apache technologies such as Kafka, Airflow, and Spark to build scalable and efficient data pipelines Ability to design, build, and deploy data solutions that capture, explore, transform, and utilize data …
solutions. Backend Development: Willingness to develop proficiency in backend technologies (e.g., Python with Django) to support data pipeline integrations. Cloud Platforms: Familiarity with AWS or Azure, including tools like Apache Airflow, Terraform, or SageMaker. Data Quality Management: Experience with data versioning and quality assurance practices. Automation and CI/CD: Knowledge of build and deployment automation processes. Experience within …
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
options Hybrid working - 1 day a week in a central London office High-growth scale-up with a strong mission and serious funding Modern tech stack: Python, SQL, Snowflake, Apache Iceberg, AWS, Airflow, dbt, Spark Work cross-functionally with engineering, product, analytics, and data science leaders What You'll Be Doing Lead, mentor, and grow a high-impact team …
Basingstoke, Hampshire, South East, United Kingdom
Anson Mccade
processes. Monitor integration health and implement alerting, logging, and performance tracking. Contribute to continuous improvement of integration architecture and practices. Key Skills & Experience Experience with workflow orchestration tools (e.g., Apache Airflow). Proven backend development skills using Node.js, Python, Java, or similar. Strong understanding of API design and integration techniques (REST, Webhooks, GraphQL). Familiarity with authentication protocols (OAuth2 …
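Webhook integration like the listing above describes usually hinges on verifying a payload signature before processing it. A stdlib-only sketch with HMAC-SHA256 (the header name and signature format vary by provider — GitHub, for instance, sends `X-Hub-Signature-256` with a `sha256=` prefix; the values below are made up):

```python
# Verify a webhook payload signature with HMAC-SHA256 (stdlib only).
import hmac
import hashlib

def sign(secret: bytes, payload: bytes) -> str:
    """Compute the hex HMAC-SHA256 signature the sender would attach."""
    return hmac.new(secret, payload, hashlib.sha256).hexdigest()

def verify(secret: bytes, payload: bytes, received_sig: str) -> bool:
    """Recompute and compare; compare_digest avoids timing leaks."""
    return hmac.compare_digest(sign(secret, payload), received_sig)

secret = b"shared-secret"                 # hypothetical shared secret
body = b'{"event": "order.created"}'      # raw request body, pre-parsing
sig = sign(secret, body)

print(verify(secret, body, sig))          # True
print(verify(secret, b"tampered", sig))   # False
```

The important detail is verifying against the raw request bytes, not a re-serialized JSON object, since serialization differences would change the digest.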
North West London, London, United Kingdom Hybrid / WFH Options
Anson Mccade
knowledge of Kafka , Confluent , and event-driven architecture Hands-on experience with Databricks , Unity Catalog , and Lakehouse architectures Strong architectural understanding across AWS, Azure, GCP , and Snowflake Familiarity with Apache Spark, SQL/NoSQL databases, and programming (Python, R, Java) Knowledge of data visualisation, DevOps principles, and ML/AI integration into data architectures Strong grasp of data governance …
Hucclecote, Gloucestershire, United Kingdom Hybrid / WFH Options
Omega Resource Group
GitLab) Contributing across the software development lifecycle from requirements to deployment Tech Stack Includes: Java, Python, Linux, Git, JUnit, GitLab CI/CD, Oracle, MongoDB, JavaScript/TypeScript, React, Apache NiFi, Elasticsearch, Kibana, AWS, Hibernate, Atlassian Suite What's on Offer: Hybrid working and flexible schedules (4xFlex) Ongoing training and career development Exciting projects within the UK's secure …
of real-time and analytical data pipelines, metadata, and cataloguing (e.g., Atlan) Strong communication, stakeholder management, and documentation skills Preferred (but not essential): AWS or Snowflake certifications Knowledge of Apache Airflow, dbt, GitHub Actions Experience with Iceberg tables and data product thinking Why Apply? Work on high-impact, high-scale client projects Join a technically elite team with a …
London (City of London), South East England, United Kingdom
Sahaj Software
technology fundamentals and experience with languages like Python, or functional programming languages like Scala Demonstrated experience in design and development of big data applications using tech stacks like Databricks, Apache Spark, HDFS, HBase and Snowflake Commendable skills in building data products, by integrating large sets of data from hundreds of internal and external sources would be highly critical A …
applications and technical processes within a technical discipline (e.g., cloud, artificial intelligence, machine learning, mobile, etc.) Preferred qualifications, capabilities, and skills Familiarity with modern front-end technologies Familiarity with Apache- and Tomcat-based applications. Exposure to cloud technologies and Ansible Evolven exposure is a plus ABOUT US J.P. Morgan is a global leader in financial services, providing strategic advice …
Leeds, West Yorkshire, Yorkshire, United Kingdom Hybrid / WFH Options
Fruition Group
best practices for data security and compliance. Collaborate with stakeholders and external partners. Skills & Experience: Strong experience with AWS data technologies (e.g., S3, Redshift, Lambda). Proficient in Python, Apache Spark, and SQL. Experience in data warehouse design and data migration projects. Cloud data platform development and deployment. Expertise across data warehouse and ETL/ELT development in AWS …
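The ETL/ELT work described above always reduces to the same three stages. A pure-Python sketch of that shape — the kind of logic a Glue or Lambda job would wrap, minus the AWS SDK calls (the CSV data and function names are invented for illustration):

```python
# Illustrative extract-transform-load step in pure Python.
import csv
import io

RAW = "order_id,amount\n1,10.50\n2,bad\n3,4.25\n"  # stand-in for an S3 object

def extract(text):
    """Parse raw CSV text into dict rows."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Type-cast rows; drop malformed ones (a real job would quarantine them)."""
    clean = []
    for r in rows:
        try:
            clean.append({"order_id": int(r["order_id"]),
                          "amount": float(r["amount"])})
        except ValueError:
            continue
    return clean

def load(rows):
    """Stand-in for a Redshift COPY or warehouse write: return total loaded."""
    return sum(r["amount"] for r in rows)

print(load(transform(extract(RAW))))  # 14.75 (the "bad" row is dropped)
```

In an ELT variant the `transform` step would instead run as SQL inside the warehouse after loading the raw rows.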
Catalog). Familiarity with Data Mesh, Data Fabric, and product-led data strategies. Expertise in cloud platforms (AWS, Azure, GCP, Snowflake). Technical Skills Proficiency in big data tools (Apache Spark, Hadoop). Programming knowledge (Python, R, Java) is a plus. Understanding of ETL/ELT, SQL, NoSQL, and data visualisation tools. Awareness of ML/AI integration into …
techniques You might also have: Familiarity with data visualization tools and libraries (e.g., Power BI) Background in database administration or performance tuning Familiarity with data orchestration tools, such as Apache Airflow Previous exposure to big data technologies (e.g., Hadoop, Spark) for large data processing Experience with ServiceNow integration …
of ingestion across the portfolio. Key Requirements: Strong proficiency in Python and PySpark Proven track record in a Data Engineering role including Database Management (ideally SQL), Data Orchestration (ideally Apache Airflow or Dagster), Containerisation (ideally Kubernetes and Docker), and Data Pipelines (big data technologies and architectures) Experience in Financial Services (ideally Commodity) Trading. Bachelor's degree in Information Systems, Computer …
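The core guarantee an orchestrator such as Airflow or Dagster provides is that tasks run only after their upstream dependencies complete. The stdlib `graphlib` module can illustrate that dependency resolution in miniature (the task names below are invented; real orchestrators add scheduling, retries, and state on top):

```python
# Toy model of DAG dependency resolution, as an orchestrator performs it.
from graphlib import TopologicalSorter

# Each key depends on the tasks in its value set.
dag = {
    "transform": {"extract"},
    "load":      {"transform"},
    "validate":  {"extract"},
    "report":    {"load", "validate"},
}

# static_order() yields a valid execution order (and raises CycleError
# on circular dependencies, which orchestrators reject at parse time).
order = list(TopologicalSorter(dag).static_order())

print(order.index("extract") < order.index("transform"))  # True
print(order.index("load") < order.index("report"))        # True
```

Independent tasks ("transform" and "validate" here) have no ordering constraint between them, which is exactly where a real orchestrator schedules work in parallel.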
or another language such as Python Good knowledge of developing in a Linux environment Working knowledge of Git version control and GitLab CI/CD pipelines Experience working with Apache NiFi Some exposure to front-end elements like JavaScript, TypeScript or React Some data interrogation with Elasticsearch and Kibana Exposure to working with Atlassian products Looking for a role …
Bedford, Bedfordshire, England, United Kingdom Hybrid / WFH Options
Reed Talent Solutions
source systems into our reporting solutions. Pipeline Development: Develop and configure metadata-driven data pipelines using data orchestration tools such as Azure Data Factory and engineering tools like Apache Spark to ensure seamless data flow. Monitoring and Failure Recovery: Implement monitoring procedures to detect failures or unusual data profiles and establish recovery processes to maintain data integrity. Azure …
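"Metadata-driven" pipelines of the kind described above mean one generic runner whose behaviour comes from configuration rather than per-source code. A small sketch of the pattern (field names are invented; in Azure Data Factory the metadata would typically live in a control table, not inline):

```python
# Sketch of a metadata-driven pipeline: config drives behaviour,
# so onboarding a new source means adding a row, not writing code.
PIPELINE_META = [
    {"source": "crm_contacts", "key": "email",    "dedupe": True},
    {"source": "web_events",   "key": "event_id", "dedupe": False},
]

def run_pipeline(meta, fetch):
    """Run every configured source through the same generic steps."""
    out = {}
    for cfg in meta:
        rows = fetch(cfg["source"])
        if cfg["dedupe"]:
            seen, unique = set(), []
            for r in rows:
                k = r[cfg["key"]]
                if k not in seen:
                    seen.add(k)
                    unique.append(r)
            rows = unique
        out[cfg["source"]] = rows
    return out

# Fake extractor standing in for real source-system connectors.
fake = {"crm_contacts": [{"email": "a@x.com"}, {"email": "a@x.com"}],
        "web_events":   [{"event_id": 1}, {"event_id": 1}]}
result = run_pipeline(PIPELINE_META, fake.__getitem__)
print(len(result["crm_contacts"]), len(result["web_events"]))  # 1 2
```

The dedupe flag is just one example; real control tables typically also carry load mode (full vs incremental), watermark columns, and target paths.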
Slough, South East England, United Kingdom Hybrid / WFH Options
Medialab Group
data warehouses (e.g. Snowflake, Redshift). Familiarity with data pipelining and orchestration systems (e.g. Airflow). Understanding of modern analytics architectures and data visualisation tools (we use Preset.io/Apache Superset). Exposure to CI/CD pipelines (GitLab CI preferred). Experience with advertising or media data is a plus.
Meta, Amazon, OpenAI) Proficiency with essential data science libraries including Pandas, NumPy, scikit-learn, Plotly/Matplotlib, and Jupyter Notebooks Knowledge of ML-adjacent technologies, including AWS SageMaker and Apache Airflow. Strong skills in data preprocessing, wrangling, and augmentation techniques Experience deploying scalable AI solutions on cloud platforms (AWS, Google Cloud, or Azure) with enthusiasm for MLOps tools and …