London, England, United Kingdom Hybrid / WFH Options
Elsevier
pairing and committed to ongoing learning and mentoring colleagues. Key Responsibilities Designing, prototyping, and implementing robust recommendation applications using best-practice agile development processes Working with technologies including Java, Scala, Spark, EMR, Kubernetes, and Airflow Building cloud infrastructure in AWS to host and monitor the applications, and automating common tasks mercilessly. Collaborating as part of a tight-knit, agile, quality … deliver tangible value to our customers. Requirements Experience in commercial software engineering to deliver server-side applications. Strong programming skills on the JVM with either Java 8+ or Scala Experience with agile practices for rapid development of quality software, such as CI/TD A candidate who can rapidly grasp modern technologies, languages, and tools to maintain our product's More ❯
teams (Product, Finance, Ops, Customer Success) to deliver actionable insights within fast-moving or finance-oriented environments Qualifications Proficiency in Python (preferred), with experience in other languages like Java, Scala, or TypeScript Strong knowledge of cloud-based big data tools (AWS, Spark, EMR, Lambda, etc.) Excellent communication and problem-solving skills Data Engineer High-Growth Tech Scale-Up Strategic + More ❯
City of London, London, United Kingdom Hybrid / WFH Options
Atarus
data solutions 💡 What You’ll Bring Strong hands-on experience building streaming data platforms Deep understanding of tools like Kafka, Flink, Spark Streaming, etc. Proficiency in Python, Java, or Scala Cloud experience with AWS, GCP, or Azure Familiarity with orchestration tools like Airflow, Kubernetes Collaborative, solutions-focused mindset and a willingness to lead from the front 🌟 What’s on Offer More ❯
container orchestration (Kubernetes, OpenShift, Docker, Mesos, etc.) Experience with different distributed technologies (e.g. Spark, S3, Snowflake, DynamoDB, CockroachDB, HDFS, Hive, etc.) Experienced with Java/Go/Python/Scala/other languages Proficiency in English Team Data Reply, as part of the Reply Group, offers a wide range of services to help clients become data-driven. The team is More ❯
City of London, London, United Kingdom Hybrid / WFH Options
Atarus
real-time data processing tools (e.g. Kafka, Flink, Spark Streaming) Solid hands-on knowledge of cloud platforms (AWS, GCP or Azure) Strong proficiency in languages like Python, Java or Scala Familiarity with orchestration tools such as Airflow or Kubernetes Strong stakeholder management and communication skills Passion for mentoring and developing engineering talent 🎁 What’s On Offer Competitive salary + quarterly More ❯
Birmingham, England, United Kingdom Hybrid / WFH Options
technologies such as Azure and Snowflake. Knowledge of containers like Docker and Kubernetes is advantageous but not mandatory. Familiarity with at least one programming language (e.g., Python, Java, or Scala). Experience with data warehousing concepts and ETL processes. Strong analytical skills and attention to detail. If interested in this exciting opportunity, please apply today! Additional Information Experience: Not required More ❯
Optimize and monitor deployed models for performance, latency, and cost-effectiveness using tools such as CloudWatch, CloudTrail, and Prometheus. Write clean, maintainable code in Python (and optionally Java/Scala) following best software engineering practices. Automate model training, validation, and deployment workflows using CI/CD pipelines (e.g., CodePipeline, Jenkins, GitHub Actions). Ensure security and compliance in data handling More ❯
system in use in terms of data structure and process improvement. What we value These skills will help you succeed in this role Strong skill-set and experience in Python, Scala/PySpark, PL/SQL, Perl/scripting. Skilled Data Engineer for Cloud Data Lake activities. The candidate should have industry experience (preferably in Financial Services) in navigating enterprise More ❯
City of London, London, United Kingdom Hybrid / WFH Options
Signify Technology
of distributed systems and modern data tools (e.g., Apache Spark, Kafka, DBT, Databricks). Experience with both batch and real-time data processing architectures. Strong programming background in Python, Scala, or Java. Familiarity with cloud platforms such as AWS, GCP, or Azure. Demonstrated experience working with privacy-sensitive data and enforcing governance standards. Strong leadership and mentoring skills, with the More ❯
Belfast, Northern Ireland, United Kingdom Hybrid / WFH Options
JR United Kingdom
Role As a Data Engineer , you will: Design and deploy production data pipelines from ingestion to consumption within a big data architecture. Work with technologies such as Python, Java, Scala, Spark, and SQL to extract, clean, transform, and integrate data. Build scalable solutions using AWS services like EMR, Glue, Redshift, Kinesis, Lambda, and DynamoDB. Process large volumes of structured and More ❯
of experience in DevOps/Data Engineering Extensive familiarity with AWS services to include CloudFormation, EC2, S3, and RDS; CloudFormation, Ansible, Git, Jenkins, Bash; Programming Languages: Python, Java or Scala; Processing Tools: Elasticsearch, Spark, NiFi, and/or Docker; Datastore Types: Graph, NoSQL, and/or Relational; US Citizenship and an active TS/SCI with Polygraph security clearance required More ❯
Experience with at least one Cloud provider (AWS, Azure) and traditional Data Platforms Knowledge of high load and high throughput Data Platform architectures Hands-on experience with Python/Scala/Java and SQL development Experience with continuous delivery tools and technologies Knowledge of Serverless Designs, MPP Architectures, Containers, and Resource Management systems Ability to research, compare, and select appropriate More ❯
teams. A thorough understanding of Java and SQL and a solid grasp of best practices in software development. Experience using big data and related technologies (like Spark, Python, Scala, Kafka). Willingness to become AWS or Confluent Certified Developer. Very good knowledge of Linux Systems and shell scripting. A positive attitude and willingness to feed our family feel, share More ❯
data engineering experience - Experience with data modeling, warehousing and building ETL pipelines - Experience with SQL - Experience in at least one modern scripting or programming language, such as Python, Java, Scala, or NodeJS PREFERRED QUALIFICATIONS - Experience with AWS technologies like Redshift, S3, AWS Glue, EMR, Kinesis, FireHose, Lambda, and IAM roles and permissions - Experience with non-relational databases/data stores More ❯
your development journey! Key skills and experience: Experience with automated testing frameworks like Playwright Experience with profiling tools and load testing Commercial experience with Java or JVM languages (Groovy, Scala, Kotlin) API Engineering expertise Relational database knowledge, e.g., PostgreSQL DevOps skills: CI/CD, Docker, Git Additional skills that are a plus: Understanding of Software Engineering Principles: SOLID, design patterns More ❯
experience: Experience with automated testing frameworks such as Playwright Experience with profiling tools and load testing Commercial experience with a core programming language like Java or JVM languages (Groovy, Scala, Kotlin) Expertise in API Engineering Experience with relational databases like PostgreSQL DevOps: CI/CD, Docker, Git The following skills and technologies are a plus: Understanding of Software Engineering Principles More ❯
London, England, United Kingdom Hybrid / WFH Options
Endava Limited
Governance & Compliance Apply robust security measures (RBAC, encryption) and ensure regulatory compliance (GDPR). Document data lineage and recommend improvements for data ownership and stewardship. Qualifications Programming: Python, SQL, Scala, Java. Big Data: Apache Spark, Hadoop, Databricks, Snowflake, etc. Data Modelling: Designing dimensional, relational, and hierarchical data models. Scalability & Performance: Building fault-tolerant, highly available data architectures. Security & Compliance: Enforcing More ❯
data engineering, data platforms, and analytics with a strong track record of successful projects and in-depth knowledge of industry best practices Comfortable writing code in either Python or Scala Working knowledge of two or more common Cloud ecosystems (AWS, Azure, GCP) with expertise in at least one Deep experience with distributed computing with Apache Spark and knowledge of Spark More ❯
London, England, United Kingdom Hybrid / WFH Options
Ten10 Group
Azure DevOps). Excellent communication and stakeholder management skills. Bonus Points for: Previous experience in a consultancy environment. Hands-on coding experience in additional languages like Python, Ruby, Scala, PHP, or C++. What’s in It for You? At Ten10, we believe in recognizing and rewarding great work. Here’s what you can expect: 25 Days of Annual Leave More ❯
City of Westminster, England, United Kingdom Hybrid / WFH Options
VIOOH
its inner workings. Experience managing AWS or GCP. Experience in building or integrating Monitoring Tools (Datadog/Kibana/Grafana/Prometheus). Write software using either Java/Scala/Python. The following are nice to have, but not required - Apache Spark jobs and pipelines. Experience with any functional programming language. Writing and analysing SQL queries. Application over VIOOH Our More ❯
building ETL pipelines - Experience with one or more scripting languages (e.g., Python, KornShell) - Experience with one or more query languages (e.g., SQL, PL/SQL, DDL, MDX, HiveQL, SparkSQL, Scala) PREFERRED QUALIFICATIONS - Experience with big data technologies such as: Hadoop, Hive, Spark, EMR - Experience with any ETL tool like Informatica, ODI, SSIS, BODI, Datastage, etc. Our inclusive culture empowers Amazonians More ❯