robust way possible! Diverse training opportunities and social benefits (e.g. UK pension scheme) What do you offer? Strong hands-on experience working with modern Big Data technologies such as Apache Spark, Trino, Apache Kafka, Apache Hadoop, Apache HBase, Apache NiFi, Apache Airflow, OpenSearch Proficiency in cloud-native technologies such as containerization and Kubernetes Strong …
data cataloging and metadata management using tools like AWS Glue Data Catalog. Demonstrated self-sufficiency in exploring new tools, troubleshooting issues, and continuously improving processes. Hands-on experience with Apache Airflow for orchestrating complex data workflows and ensuring reliable execution. Understanding of cloud security and governance practices including IAM, KMS, and data access policies. Experience with monitoring and observability …
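For illustration, a minimal sketch of the kind of Airflow DAG this orchestration experience refers to, assuming Airflow 2.4+; the DAG id, task names, schedule, and retry settings are hypothetical, not taken from any listing above.

from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder for a real extraction step (e.g. pulling from an API or S3)
    print("extracting source data")


def load():
    # Placeholder for a real load step (e.g. writing to a warehouse)
    print("loading into the warehouse")


# Retries with a delay between attempts are what "ensuring reliable
# execution" typically comes down to in practice.
default_args = {
    "retries": 2,
    "retry_delay": timedelta(minutes=5),
}

with DAG(
    dag_id="example_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args=default_args,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> load_task  # extract must succeed before load runs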
data quality, or other areas directly relevant to data engineering responsibilities and tasks Proven project experience developing and maintaining data warehouses in big data solutions (Snowflake) Expert knowledge in Apache technologies such as Kafka, Airflow, and Spark to build scalable and efficient data pipelines Ability to design, build, and deploy data solutions that capture, explore, transform, and utilize data …
solutions. Backend Development: Willingness to develop proficiency in backend technologies (e.g., Python with Django) to support data pipeline integrations. Cloud Platforms: Familiarity with AWS or Azure, including tools and services like Apache Airflow, Terraform, or SageMaker. Data Quality Management: Experience with data versioning and quality assurance practices. Automation and CI/CD: Knowledge of build and deployment automation processes. Experience within …
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
options Hybrid working - 1 day a week in a central London office High-growth scale-up with a strong mission and serious funding Modern tech stack: Python, SQL, Snowflake, Apache Iceberg, AWS, Airflow, dbt, Spark Work cross-functionally with engineering, product, analytics, and data science leaders What You'll Be Doing Lead, mentor, and grow a high-impact team …
Basingstoke, Hampshire, South East, United Kingdom
Anson Mccade
processes. Monitor integration health and implement alerting, logging, and performance tracking. Contribute to continuous improvement of integration architecture and practices. Key Skills & Experience Experience with workflow orchestration tools (e.g., Apache Airflow). Proven backend development skills using Node.js, Python, Java, or similar. Strong understanding of API design and integration techniques (REST, Webhooks, GraphQL). Familiarity with authentication protocols (OAuth2 …
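For illustration, a hedged Python sketch of the REST integration pattern described here: an authenticated call with logging and retry, the kind of behaviour the alerting above would watch. It assumes the third-party requests library; the endpoint URL, token, and function name are hypothetical.

import logging
import time

import requests

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("integration")


def fetch_orders(token: str, attempts: int = 3) -> dict:
    # Hypothetical orders endpoint; the Bearer header carries an OAuth2 token.
    url = "https://api.example.com/v1/orders"
    headers = {"Authorization": f"Bearer {token}"}
    for attempt in range(1, attempts + 1):
        try:
            resp = requests.get(url, headers=headers, timeout=10)
            resp.raise_for_status()
            log.info("fetched orders on attempt %d", attempt)
            return resp.json()
        except requests.RequestException as exc:
            # Logged warnings like this are what integration alerting hooks into.
            log.warning("attempt %d failed: %s", attempt, exc)
            time.sleep(2 ** attempt)  # exponential backoff before retrying
    raise RuntimeError("integration call failed after retries")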
North West London, London, United Kingdom Hybrid / WFH Options
Anson Mccade
knowledge of Kafka, Confluent, and event-driven architecture Hands-on experience with Databricks, Unity Catalog, and Lakehouse architectures Strong architectural understanding across AWS, Azure, GCP, and Snowflake Familiarity with Apache Spark, SQL/NoSQL databases, and programming (Python, R, Java) Knowledge of data visualisation, DevOps principles, and ML/AI integration into data architectures Strong grasp of data governance …
Experience with multiple programming languages. Technologies: Scala, Java, Python, Spark, Linux, shell scripting, TDD (JUnit), build tools (Maven/Gradle/Ant). Experience with process scheduling platforms like Apache Airflow. Willingness to work with proprietary technologies like Slang/SECDB. Understanding of compute resources and performance metrics. Knowledge of distributed computing and parallel processing on cloud platforms. Experience …
Hucclecote, Gloucestershire, United Kingdom Hybrid / WFH Options
Omega Resource Group
GitLab) Contributing across the software development lifecycle from requirements to deployment Tech Stack Includes: Java, Python, Linux, Git, JUnit, GitLab CI/CD, Oracle, MongoDB, JavaScript/TypeScript, React, Apache NiFi, Elasticsearch, Kibana, AWS, Hibernate, Atlassian Suite What's on Offer: Hybrid working and flexible schedules (4xFlex) Ongoing training and career development Exciting projects within the UK's secure …
equivalent. Must be proficient in the English language, both written and verbal. What makes you uncommon? Hands-on experience with one or more of the following technologies (Elasticsearch, Kafka, Apache Spark, Logstash, Hadoop/Hive, TensorFlow, Kibana, Athena/Presto/BigTable, Angular, React). Experience with cloud platforms such as AWS, GCP, or Azure. Solid understanding of unit …
of real-time and analytical data pipelines, metadata, and cataloguing (e.g., Atlan) Strong communication, stakeholder management, and documentation skills Preferred (but not essential): AWS or Snowflake certifications Knowledge of Apache Airflow, dbt, GitHub Actions Experience with Iceberg tables and data product thinking Why Apply? Work on high-impact, high-scale client projects Join a technically elite team with a …
London (City of London), South East England, United Kingdom
Sahaj Software
technology fundamentals and experience with languages like Python, or functional programming languages like Scala Demonstrated experience in design and development of big data applications using tech stacks like Databricks, Apache Spark, HDFS, HBase and Snowflake Commendable skills in building data products by integrating large sets of data from hundreds of internal and external sources would be highly critical A …
West London, London, United Kingdom Hybrid / WFH Options
Young's Employment Services Ltd
a Senior Data Engineer, Tech Lead, Data Engineering Manager etc. Proven success with modern data infrastructure: distributed systems, batch and streaming pipelines Hands-on knowledge of tools such as Apache Spark, Kafka, Databricks, dbt or similar Experience building, defining, and owning data models, data lakes, and data warehouses Programming proficiency in Python, PySpark, Scala or Java. Experience operating in …
Employment Type: Permanent
Salary: £80,000 - £95,000/annum Attractive Bonus and Benefits
applications and technical processes within a technical discipline (e.g., cloud, artificial intelligence, machine learning, mobile, etc.) Preferred qualifications, capabilities, and skills Familiarity with modern front-end technologies Familiarity with Apache- and Tomcat-based applications. Exposure to cloud technologies and Ansible Evolven exposure is a plus ABOUT US J.P. Morgan is a global leader in financial services, providing strategic advice …
Leeds, Yorkshire, United Kingdom Hybrid / WFH Options
Fruition Group
best practices for data security and compliance. Collaborate with stakeholders and external partners. Skills & Experience: Strong experience with AWS data technologies (e.g., S3, Redshift, Lambda). Proficient in Python, Apache Spark, and SQL. Experience in data warehouse design and data migration projects. Cloud data platform development and deployment. Expertise across data warehouse and ETL/ELT development in AWS …
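For illustration, a minimal PySpark sketch of the S3-based ETL work these requirements describe; the bucket paths and column names are hypothetical, not taken from the listing.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("s3-etl-sketch").getOrCreate()

# Read raw CSV files landed in S3 (path is illustrative)
orders = spark.read.csv("s3://example-bucket/raw/orders/", header=True, inferSchema=True)

# Simple transform: daily revenue per customer
daily = (
    orders
    .withColumn("order_date", F.to_date("created_at"))
    .groupBy("customer_id", "order_date")
    .agg(F.sum("amount").alias("revenue"))
)

# Write partitioned Parquet back to S3 for the warehouse layer
daily.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-bucket/curated/daily_revenue/"
)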
Catalog). Familiarity with Data Mesh, Data Fabric, and product-led data strategies. Expertise in cloud platforms (AWS, Azure, GCP, Snowflake). Technical Skills Proficiency in big data tools (Apache Spark, Hadoop). Programming knowledge (Python, R, Java) is a plus. Understanding of ETL/ELT, SQL, NoSQL, and data visualisation tools. Awareness of ML/AI integration into …
techniques You might also have: Familiarity with data visualization tools and libraries (e.g., Power BI) Background in database administration or performance tuning Familiarity with data orchestration tools, such as Apache Airflow Previous exposure to big data technologies (e.g., Hadoop, Spark) for large data processing Experience with ServiceNow integration …
applicable. Essential Skills & Experience Proven experience owning and operating production data platforms within AWS. Strong understanding of AWS core services: EventBridge, Lambda, EC2, S3, and MWAA (Managed Workflows for Apache Airflow). Experience with infrastructure reliability, observability tooling, and platform automation. Solid experience with CI/CD pipelines, preferably Bitbucket Pipelines. Familiarity with Snowflake administration and deployment practices. Comfortable …
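For illustration, a hedged boto3 sketch of the EventBridge-driven automation referenced above: scheduling a rule and pointing it at a pipeline-starting Lambda. The rule name, cron expression, and ARN are hypothetical, and the target Lambda would additionally need a resource policy allowing events.amazonaws.com to invoke it.

import boto3

events = boto3.client("events")

# Create (or update) a scheduled rule: every day at 02:00 UTC
events.put_rule(
    Name="nightly-pipeline-trigger",
    ScheduleExpression="cron(0 2 * * ? *)",
    State="ENABLED",
)

# Point the rule at the Lambda that kicks off the pipeline (ARN is illustrative)
events.put_targets(
    Rule="nightly-pipeline-trigger",
    Targets=[
        {
            "Id": "pipeline-lambda",
            "Arn": "arn:aws:lambda:eu-west-1:123456789012:function:start-pipeline",
        }
    ],
)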