architectures. Experienced with Matillion and modern data visualisation tools (QuickSight, Tableau, Looker, etc.). Strong scripting skills and familiarity with Linux/cloud environments. Desirable: exposure to big data tools (Spark, Hadoop, MapReduce); experience with microservice-based data APIs; AWS certifications (Solutions Architect or Big Data Specialty); knowledge of machine learning or advanced analytics. Interested? This is a great …
environment. Relevant domains: FinTech, analytics, data science & engineering, software development, ML, LLMs, SaaS, etc. Very strong understanding of computer science. Knowledge of diverse data-related solutions and topics: Spark, Hadoop, Elasticsearch, ETL, ML, LLM, etc. Programming skills: Scala or Java (preferred), or Python (or any other object-oriented language). Strong understanding of learning theories and instructional design principles. Experience …
City of London, London, United Kingdom Hybrid / WFH Options
Mars
help shape our digital platforms. 🧠 What we’re looking for: proven experience as a Data Engineer in cloud environments (Azure ideal); proficiency in Python, SQL, Spark, Databricks; familiarity with Hadoop, NoSQL, Delta Lake. Bonus: Azure Functions, Logic Apps, Django, CI/CD tools. 💼 What you’ll get from Mars: a competitive salary & bonus; hybrid working with flexibility built in …
and continuous delivery. Excellent problem-solving skills and a collaborative mindset. Agile development experience in a team setting. Bonus skills (nice to have): experience with big data tools like Hadoop, Spark, or Scala; exposure to fraud, payments, or financial services platforms; understanding of cloud-native development and container orchestration; knowledge of test-driven development and modern code quality practices …
if you: Have experience with Cloud-based or SaaS products and a good understanding of Digital Marketing and Marketing Technologies. Have experience working with Big Data technologies (such as Hadoop, MapReduce, Hive/Pig, Cassandra, MongoDB, etc.). An understanding of web technologies such as JavaScript, Node.js and HTML. Some level of understanding or experience in AI/ML. Physical …
Go). University degree (IT/math) or equivalent experience. The following additional qualifications are a significant plus: Kubernetes knowledge and operating experience; experience with big data stack components like Hadoop, Spark, Kafka, NiFi; experience with data science/data analysis; knowledge of SRE/DevOps stacks - monitoring/system management tools (Prometheus, Ansible, ELK, …); version control using Git. A …
City of London, London, United Kingdom Hybrid / WFH Options
Anson McCade
Knowledge of CI/CD processes and infrastructure-as-code. • Eligible for SC clearance (active clearance highly advantageous). Desirable Skills • Exposure to large data processing frameworks (e.g., Spark, Hadoop). • Experience deploying data via APIs and microservices. • AWS certifications (Solutions Architect Associate, Data Analytics Specialty, etc.). • Experience in public sector programmes or government frameworks. Package & Benefits …
Cheltenham, Gloucestershire, United Kingdom Hybrid / WFH Options
Gemba Advantage
as Java, TypeScript, Python, and Go; web libraries and frameworks such as React and Angular; designing, building, and maintaining CI/CD pipelines; big data technologies such as NiFi, Hadoop, Spark; cloud and containerization technologies such as AWS, OpenShift, Kubernetes, Docker; DevOps methodologies, such as infrastructure as code and GitOps; database technologies, e.g. relational databases, Elasticsearch, MongoDB. Why join …
to implement them through libraries. Experience with programming, ideally Python, and the ability to quickly pick up handling large data volumes with modern data processing tools, e.g. by using Hadoop/Spark/SQL. Experience with, or the ability to quickly learn, open-source software including machine learning packages such as Pandas and scikit-learn, along with data visualisation technologies. Experience …
/product management environment. Relevant experience with core Java and Spark. Experience in systems analysis and programming of Java applications. Experience using big data technologies (e.g. Java Spark, Hive, Hadoop). Ability to manage multiple/competing priorities and manage deadlines or unexpected changes in expectations or requirements. Prior financial services/trade surveillance experience is desirable. Strong analytical and …
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Curo Resourcing Ltd
and CI/CD paradigms and systems such as Ansible, Terraform, Jenkins, Bamboo, Concourse, etc. Observability/SRE. Big Data solutions (ecosystems) and technologies such as Apache Spark and the Hadoop ecosystem. Excellent knowledge of YAML or similar languages. The following Technical Skills & Experience would be desirable: JupyterHub awareness; RabbitMQ or other common queue technology, e.g. ActiveMQ; NiFi; Rego …
excellent project management skills and technical experience to ensure successful delivery of complex data projects on time and within budget. Responsibilities: Lead and manage legacy data platform migration (Teradata, Hadoop), data lake build, and data analytics projects from initiation to completion. Develop comprehensive project plans, including scope, timelines, resource allocation, and budgets. Monitor project progress, identify risks, and implement …
Greater London, England, United Kingdom Hybrid / WFH Options
Quant Capital
multi-threaded Java – if you haven’t been siloed in a big firm then don’t worry. Additional exposure to the following is desired. Tech stack you will learn: Hadoop and Flink; Rust, JavaScript, React, Redux, Flow; Linux, Jenkins; Kafka, Avro, Kubernetes, Puppet. Involvement in the Java community. My client is based in London. Home working is encouraged but you …
an asset: testing performance with JMeter or similar tools; web services technology such as REST, JSON or Thrift; testing web applications with Selenium WebDriver; big data technology such as Hadoop, MongoDB, Kafka or SQL; network principles and protocols such as HTTP, TLS and TCP; continuous integration systems such as Jenkins or Bamboo. What you can expect: At Global Relay …
City of London, London, United Kingdom Hybrid / WFH Options
Tec Partners
Ansible. Comfortable working with cloud platforms (e.g., AWS, Azure, GCP) and container orchestration tools like Kubernetes. Excellent troubleshooting skills and a collaborative approach. Bonus: experience with Cloudera, Hadoop, CDSW, or CML is a plus. What's on Offer: flexible hybrid working arrangements; core benefits including private healthcare, dental, life assurance and pension; optional benefits including health cash …
and machine learning. PREFERRED QUALIFICATIONS - Experience with modeling tools such as R, scikit-learn, Spark MLlib, MXNet, TensorFlow, NumPy, SciPy, etc. - Experience with large-scale distributed systems such as Hadoop, Spark, etc. Amazon is an equal opportunity employer and does not discriminate on the basis of protected veteran status, disability, or other legally protected status. Los Angeles County applicants …
enterprise-level systems; Excellent object-oriented design skills, including OOA/OOD; Experience with multi-tier architectures and service-oriented architecture; Exposure to and understanding of RDBMS, NoSQL, and Hadoop is desirable; Knowledge of the software development lifecycle and agile practices, including TDD/BDD; Strategic thinking, collaboration, and consensus-building skills. Please note: Familiarity with DevOps is important …
East London, London, United Kingdom Hybrid / WFH Options
World Wide Technology
Kubernetes experience, we want to see people who have worked on Kubernetes implementation, been involved in architecture, hands-on migration, etc. K8s, Spark S3 Engine, Terraform, Ansible, CI/CD, Hadoop, Linux/RHEL – on-prem background/container management, Grafana or Elasticsearch – for observability, Kubernetes security. Nice to have: Observability – OpenTelemetry, Argo. Candidates will be required to …
City of London, London, United Kingdom Hybrid / WFH Options
World Wide Technology
Kubernetes experience, we want to see people who have worked on Kubernetes implementation, been involved in architecture, hands-on migration, etc. K8s, Spark S3 Engine, Terraform, Ansible, CI/CD, Hadoop, Linux/RHEL – on-prem background/container management, Grafana or Elasticsearch – for observability, Kubernetes security. Nice to have: Observability – OpenTelemetry, Argo. Candidates will be required to …
Leeds, West Yorkshire, United Kingdom Hybrid / WFH Options
World Wide Technology
Kubernetes experience, we want to see people who have worked on Kubernetes implementation, been involved in architecture, hands-on migration, etc. K8s, Spark S3 Engine, Terraform, Ansible, CI/CD, Hadoop, Linux/RHEL – on-prem background/container management, Grafana or Elasticsearch – for observability, Kubernetes security. Nice to have: Observability – OpenTelemetry, Argo. Candidates will be required to …