modern engineering tools, languages, and practices, including: (1) Java, Spring, and Hibernate; (2) Relational Databases (RDBMS) including Oracle, DB2, SQL Server, and MySQL; (3) middleware packages including MQ (IBM MQ, ActiveMQ, RabbitMQ), Kafka, Unit Testing, and Integration Testing; (4) Continuous Integration tools including Jenkins, UrbanCode, and GitLab CI; (5) designing and delivering technology solutions using micro-service architecture, service-oriented architecture, asynchronous … Manage, prioritize, and prune a backlog of work items and help the team align …
What are we looking for? Excellent written and verbal communication skills in English Solid background in Java (9-10+ years of experience) Experience delivering/managing projects using Kafka and RabbitMQ frameworks Ability to define the design and implementation of microservices Provide technical leadership to our engineering team Experience in projects with microservices architectures, pipeline implementation, and API development …
log4j2, JEE stack) Fluent in English A big plus if you have knowledge of: Previous experience in NLU, GPT, LLMs, Gen AI or machine learning is a strong plus Kafka, Elasticsearch, Cassandra, Data Mining or AI Offered: Flexible remuneration: transportation, restaurant tickets, childcare Flexible schedule Medical insurance for you and your family, 100% paid by the company …
for this position. Nice to haves (in order of priority – at least one or two is needed): Python (commercial experience) AWS services – S3, ECS, AppSync GraphQL or REST APIs Kafka or message queue systems PostgreSQL or NoSQL database experience To apply – click the link or for a faster response, email Barry.Ansell@HarringtonStarr.com
Compensation: 8 yrs. experience, $253,000/year Bachelor's in Computer Science or 4 additional years of experience Job Description: Develop and maintain ETL pipelines and microservices using Java, Kafka, MongoDB, and Docker Implement data streaming and UI interfaces using Angular or React Support high-availability operations in production environments Coordinate closely with operations and analysts to tailor workflow …
City of London, London, United Kingdom Hybrid / WFH Options
Block MB
Python, SQL, and modern ETL tools (e.g., Airflow, dbt) Solid grasp of cloud platforms (AWS/GCP/Azure) and data warehouses (e.g., BigQuery, Snowflake) Familiarity with streaming technologies (Kafka, Kinesis, etc.) Passion for clean, maintainable code and robust data systems Previous experience in a fintech or regulated environment is a plus Benefits: Join a collaborative, mission-driven team …
experience as a Data Engineer, preferably in a freelance or consulting capacity. Strong expertise in SQL, Python, and/or Scala. Experience with big data technologies (Spark, Hadoop, Kafka) is a plus. Hands-on experience with cloud platforms (AWS, Azure, Google Cloud). Knowledge of ETL tools, data warehouses (Snowflake, BigQuery, Redshift) and pipeline orchestration (Airflow, dbt).
experience building out SRE capabilities within App Support teams as that is the bulk of the headcount. They would also want good knowledge of: Cloud (AWS, OnPrem) Microservices (K8s, Kafka) IaC (Terraform) CI/CD (GitOps, GitHub Actions, ArgoCD) Monitoring (OpenTelemetry, Prometheus, Grafana) Security (Vault, IAM, OPA, SOC2, GDPR) What’s in it for you? Annual bonus Share Options …
Cardiff, Wales, United Kingdom Hybrid / WFH Options
Identify Solutions
CD, IaC, observability). Experience with Cloud and big data technologies (e.g. Spark/Databricks/Delta Lake/BigQuery). Familiarity with eventing technologies (e.g. Event Hubs/Kafka) and file formats such as Parquet/Delta/Iceberg. Want to learn more? Get in touch for an informal chat.
Greater London, England, United Kingdom Hybrid / WFH Options
InterEx Group
Data implementation projects Experience in the definition of Big Data architecture with different tools and environments: Cloud (AWS, Azure and GCP), Cloudera, NoSQL databases (Cassandra, MongoDB), ELK, Kafka, Snowflake, etc. Past experience in Data Engineering and data quality tools (Informatica, Talend, etc.) Previous involvement in working in a multilanguage and multicultural environment Proactive, tech-passionate and highly …
Kubernetes. Strong ownership, accountability, and communication skills. Bonus Points For: Experience leading projects in SCIF environments. Expertise in Cyber Analytics, PCAP, or network monitoring. Familiarity with Spark, Dask, Snowpark, Kafka, or task schedulers like Airflow and Celery.
City of London, London, United Kingdom Hybrid / WFH Options
Durlston Partners
systems to facilitate research and idea generation Improving core infrastructure and system scalability Working across the full technology stack - frontend to database Tech Stack Java | TypeScript/React | Python | Kafka | SQL Server | Snowflake | Docker | InfluxDB What they are Looking For 3+ years commercial Java experience Previous experience at a bank or hedge fund Basic knowledge of at least one …
framework Build knowledge of all data resources within ND and prototype new data sources internally and externally Skills and Experience Essential Proficiency in Spark (SQL and/or Scala), Kafka Strong analytical and problem-solving skills applied to data solutions Proficiency with traditional SQL database technologies Experience integrating data from multiple sources Experience with CI/CD best practices …
infrastructure health Partner with cross-functional teams to deliver robust data solutions 💡 What You’ll Bring Strong hands-on experience building streaming data platforms Deep understanding of tools like Kafka, Flink, Spark Streaming, etc. Proficiency in Python, Java, or Scala Cloud experience with AWS, GCP, or Azure Familiarity with orchestration tools like Airflow, Kubernetes Collaborative, solutions-focused mindset and …
e.g., Terraform) and observability tooling (e.g., Prometheus, Grafana) Comfortable working on distributed systems and improving developer workflows A product mindset and a collaborative approach to problem-solving Experience with Kafka, gRPC, or open-source contributions is a bonus If you’re a Platform Engineer who thrives in high-trust, fast-moving environments and enjoys building tools that empower others …
designing and optimizing complex database schemas and queries (PostgreSQL, Redis) have built event-driven architectures and implemented asynchronous processing systems have experience with message queues and distributed systems (e.g., Kafka, RabbitMQ) have shipped and maintained production services from scratch, with a focus on reliability and performance are comfortable working with cloud services (AWS/GCP) and infrastructure orchestration tools …
functional change requests/new requirements into architectural concepts ('data architect') regarding: - Knowledge graph-based applications (CMEM by Eccenca, web frontend) - Interfaces to consumer/provider systems based on Kafka streaming technology - Workflows for ETL automation based on third-party software • Implementing or adjusting workflows in an enterprise knowledge graph platform You must have: • Proven experience as a Data …
methodologies appropriate to the development environment. Minimum Qualifications: TS/SCI with polygraph clearance is required Programming: React, Java, Python, SQL, YAML, shell scripting Tools: Docker, Docker Swarm, NGINX, Kafka, Elasticsearch, MySQL, NiFi, Logstash Bachelor's degree in Computer Science or related discipline from an accredited college or university Four (4) years' experience as a Software Engineer in …