Job Title: CLM IIB Developer
Location: London (Hybrid – 3 days in office)
Contract: 6 months initially
Rate: £415.16 per day (Umbrella)
About the Role: We are looking for an experienced CLM IIB Developer to join a dynamic development team. You …
the ground up with a deep understanding of core Java, data structures and concurrency, rather than relying on frameworks such as Spring. You have built event-driven applications using Kafka and solutions with event-streaming frameworks at scale (Flink/Kafka Streams/Spark) that go beyond basic ETL pipelines. You know how to orchestrate the deployment of …
Data Engineer - Azure Databricks, Apache Kafka
Permanent
Basingstoke (Hybrid - x2 PW)
Circa £70,000 + Excellent Package
Overview: We're looking for a skilled Data Analytics Engineer to help drive the evolution of our client's data platform. This role is ideal for someone who thrives on building scalable data solutions and is confident working with modern tools such as … Azure Databricks, Apache Kafka, and Spark. In this role, you'll play a key part in designing, delivering, and optimising data pipelines and architectures. Your focus will be on enabling robust data ingestion and transformation to support both operational and analytical use cases. If you're passionate about data engineering and want to make a meaningful impact in … hear from you!
Role and Responsibilities: Designing and building scalable data pipelines using Apache Spark in Azure Databricks. Developing real-time and batch data ingestion workflows, ideally using Apache Kafka. Collaborating with data scientists, analysts, and business stakeholders to build high-quality data products. Supporting the deployment and productionisation of machine learning pipelines. Contributing to the ongoing development of …
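For illustration of the kind of pipeline this posting describes: a minimal sketch of a Spark Structured Streaming job in Databricks that reads from Kafka and lands events in a Delta table. The topic, broker address, schema, and paths are placeholder assumptions, not details from the ad.

```python
# Sketch: Kafka -> Spark Structured Streaming -> Delta, as run in Azure Databricks.
# Topic, bootstrap servers, schema, and paths are illustrative placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

spark = SparkSession.builder.appName("events-ingest").getOrCreate()

schema = StructType([
    StructField("event_id", StringType()),
    StructField("event_time", TimestampType()),
    StructField("payload", StringType()),
])

raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "broker1:9092")  # placeholder
       .option("subscribe", "events")                      # placeholder topic
       .option("startingOffsets", "latest")
       .load())

# Kafka values arrive as bytes; decode and parse the JSON payload.
events = (raw.selectExpr("CAST(value AS STRING) AS json")
          .select(from_json(col("json"), schema).alias("e"))
          .select("e.*"))

# Write continuously to a Delta table; the checkpoint makes the stream restartable.
query = (events.writeStream
         .format("delta")
         .option("checkpointLocation", "/mnt/checkpoints/events")  # placeholder path
         .outputMode("append")
         .start("/mnt/delta/bronze/events"))                       # placeholder path

query.awaitTermination()
```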
and maintaining highly reliable and scalable streaming solutions that empower all engineers at Adyen to leverage real-time data. Our team is responsible for providing robust services around Apache Kafka and Apache Flink, ensuring the utmost reliability for real-time data processing. We strive to create a seamless and efficient streaming experience, enabling our product engineering teams to build … innovative features and make data-driven decisions at lightning speed. What You'll Do: Design, develop, and deploy scalable, high-performance streaming services and platforms, primarily involving Apache Kafka and Apache Flink. Enhance the quality, reliability, and performance of our existing streaming infrastructure. Collaborate with cross-functional teams, including product engineers and other platform engineering teams, to build … util.concurrent and concurrency primitives, dependency injection principles. Good scripting skills and the ability to pick up new languages. Deep understanding of stream processing concepts and hands-on experience with Apache Kafka or Apache Flink. Experience with other streaming technologies is a strong plus. Experience with highly available, fault-tolerant, replicated data storage systems and large-scale data processing systems …
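As a rough sketch of the reliability concerns such a streaming platform team owns: a Kafka producer configured for idempotent, fully acknowledged delivery, shown here with the confluent-kafka Python client. The broker address, topic, and payload are assumptions for illustration.

```python
# Sketch: a reliability-focused Kafka producer using the confluent-kafka Python
# client. Broker address and topic are illustrative placeholders.
import json
from confluent_kafka import Producer

producer = Producer({
    "bootstrap.servers": "broker1:9092",  # placeholder
    "enable.idempotence": True,  # no duplicates per partition from this producer
    "acks": "all",               # wait for all in-sync replicas before acking
    "retries": 5,
    "linger.ms": 10,             # small batching window for throughput
})

def on_delivery(err, msg):
    # Called from poll()/flush(); surfaces broker-side failures to the app.
    if err is not None:
        print(f"delivery failed: {err}")
    else:
        print(f"delivered to {msg.topic()}[{msg.partition()}]@{msg.offset()}")

for i in range(3):
    payload = json.dumps({"payment_id": i, "status": "authorised"})
    producer.produce("payments", value=payload, callback=on_delivery)
    producer.poll(0)  # serve delivery callbacks without blocking

producer.flush()  # block until all outstanding messages are acked
```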
high-pressure, performance-driven teams. Roles and Responsibilities: Develop and maintain high-performance, low-latency Java-based systems for front office trading or pricing platforms. Build reactive systems using Kafka Streams, Akka, Eclipse Vert.x, or Apache Flink. Utilize multithreading, concurrency models, and Executor Services to optimize system performance and throughput. Write clean, efficient, and maintainable code using functional … Key Requirements: 6+ years of hands-on Java development experience, preferably in front office systems (e.g., trading platforms, pricing engines, market data systems). Proven expertise in reactive programming (Kafka Streams, Akka, Vert.x, Flink). Solid understanding of multithreading and Executor Services in Java. Strong background in functional programming and Java 8+ features. Adherence to robust engineering practices: SOLID …
based client is urgently seeking a Java Developer for an initial 3-month contract, with extension possibilities. Skills needed: strong hands-on skills in Java and Spring (DI, scopes, transactions); Kafka (partitions, consumer groups, scaling); Kubernetes; relational/NoSQL databases. End-to-end involvement in building, deploying, and supporting applications. Strong troubleshooting skills: Kafka lag, Spring bean misconfigurations, DB …
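For context on the "partitions, consumer groups, scaling" requirement: a minimal consumer-group sketch using the confluent-kafka Python client (the posting's stack is Java, but the group semantics are identical). Broker, topic, and group names are placeholders.

```python
# Sketch: how Kafka consumer groups scale consumption. Every process started
# with the same group.id splits the topic's partitions among themselves; "lag"
# is the gap between the last produced offset and the group's committed offset.
from confluent_kafka import Consumer

def handle(value: bytes) -> None:
    # Placeholder for real processing (deserialise, validate, persist, ...).
    print(value)

consumer = Consumer({
    "bootstrap.servers": "broker1:9092",  # placeholder
    "group.id": "orders-service",         # same id in N processes => N-way split
    "auto.offset.reset": "earliest",      # where to start with no committed offset
    "enable.auto.commit": False,          # commit manually, after processing
})
consumer.subscribe(["orders"])            # placeholder topic

try:
    while True:
        msg = consumer.poll(timeout=1.0)
        if msg is None:
            continue
        if msg.error():
            print(f"consumer error: {msg.error()}")
            continue
        handle(msg.value())
        consumer.commit(message=msg)  # committing after processing keeps at-least-once semantics
finally:
    consumer.close()  # leaves the group cleanly, triggering a rebalance
```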
Our Scala code uses the Typelevel ecosystem extensively. Our Python code uses FastAPI and Pydantic extensively. We use open-source technologies to underpin our backend services, things like Apache Kafka, Postgres, nginx, and Kubernetes. Our infrastructure is expressed as code using OpenTofu (the FOSS fork of Terraform). We're looking for someone who's passionate about code, curious … problem-solving; someone diligent and thoughtful. Senior Software Engineer or above, with experience in: Scala, domain modelling, HTTP and REST APIs, writing robust, fault-tolerant and observable software, Apache Kafka and event-based architecture, Postgres or other relational databases, Kubernetes, Python. Plus, experience in the following would be a bonus: Terraform, OpenTofu, CircleCI. What you'll do: You and …
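A minimal sketch of the FastAPI + Pydantic pattern the posting mentions: a typed request model validated at the edge of an HTTP endpoint. The model and route names are assumed for illustration.

```python
# Sketch: FastAPI + Pydantic, with validation done by the framework before the
# handler runs. All names here are illustrative placeholders.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class OrderIn(BaseModel):
    customer_id: str
    amount_pence: int

@app.post("/orders")
def create_order(order: OrderIn) -> dict:
    # Pydantic has already rejected malformed bodies with a 422 by this point,
    # so the handler only ever sees well-typed data.
    return {
        "customer_id": order.customer_id,
        "amount_pence": order.amount_pence,
        "status": "accepted",
    }

# Run with: uvicorn app:app --reload   (assuming this file is app.py)
```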
Fort Lauderdale, Florida, United States Hybrid / WFH Options
Vegatron Systems
DESCR: Candidates will start out with REMOTE WORK and then will eventually be sitting in Ft. Lauderdale, FL. Candidates should be senior Data Engineers with big data tools (Hadoop, Spark, Kafka) as well as AWS (cloud services: EC2, EMR, RDS, Redshift) and NoSQL. This is a phone-and-Skype-to-hire process. Candidates in Florida with a LinkedIn profile preferred but … analytics technologies (e.g. Hadoop, Spark, Elastic, AWS, etc.) • Experience working and developing in cloud platforms such as AWS, Azure, or Google Cloud • Experience with big data tools: Hadoop, Spark, Kafka, etc. • Experience with relational SQL and NoSQL databases, including Postgres and Cassandra • Experience with AWS cloud services: EC2, EMR, RDS, Redshift • Experience with stream-processing systems: Storm, Spark Streaming …
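As a generic illustration of the batch side of this stack: a small PySpark job that reads raw JSON from S3 and writes partitioned Parquet back (for example for EMR or Redshift Spectrum downstream). Bucket names, columns, and paths are assumptions, not details from the posting.

```python
# Sketch: a batch PySpark ETL job. Paths and column names are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, count, to_date

spark = SparkSession.builder.appName("daily-rollup").getOrCreate()

events = spark.read.json("s3://raw-bucket/events/")  # placeholder path

# Roll raw events up to one row per day and event type.
daily = (events
         .withColumn("day", to_date(col("event_time")))
         .groupBy("day", "event_type")
         .agg(count("*").alias("events")))

# Partitioning by day keeps downstream scans cheap.
daily.write.mode("overwrite").partitionBy("day").parquet(
    "s3://curated-bucket/daily_rollup/")  # placeholder path
```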
London, England, United Kingdom Hybrid / WFH Options
Client Server
Senior Data Engineer (AWS, Kafka, Python) London/WFH to £85k. Are you a tech-savvy Data Engineer with AWS expertise combined with client-facing skills? You could be joining a global technology consultancy with a range of banking, financial services and insurance clients in a senior, hands-on Data Engineer role. As a Senior Data Engineer you will … design and build end-to-end real-time data pipelines using AWS-native tools, Kafka and modern data architectures, applying AWS Well-Architected principles to ensure scalability, security and resilience. You'll collaborate directly with clients to analyse requirements, define solutions and deliver production-grade systems, leading the development of robust, well-tested and fault-tolerant data engineering solutions. … financial services environments. You have expertise with AWS, including Lake Formation and transformation layers. You have strong Python coding skills. You have experience with real-time data streaming using Kafka. You're collaborative and pragmatic, with excellent communication and stakeholder management skills. You're comfortable taking ownership of projects and working end-to-end. You have a good knowledge …
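A minimal sketch of the end-to-end shape the ad describes: consuming from Kafka and landing micro-batches in S3 with boto3. The broker, topic, bucket, and batch size are assumed placeholders, not details from the posting.

```python
# Sketch: Kafka -> micro-batches -> S3, using confluent-kafka and boto3.
# All names and sizes below are illustrative placeholders.
import time
import boto3
from confluent_kafka import Consumer

s3 = boto3.client("s3")
consumer = Consumer({
    "bootstrap.servers": "msk-broker:9092",  # placeholder (e.g. an MSK cluster)
    "group.id": "s3-lander",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["trades"])  # placeholder topic

batch: list[str] = []
try:
    while True:
        msg = consumer.poll(timeout=1.0)
        if msg is not None and not msg.error():
            batch.append(msg.value().decode("utf-8"))
        if len(batch) >= 500:  # flush in fixed-size micro-batches
            key = f"landing/trades/{int(time.time())}.jsonl"
            s3.put_object(Bucket="data-lake-raw",  # placeholder bucket
                          Key=key,
                          Body="\n".join(batch).encode("utf-8"))
            batch.clear()
finally:
    consumer.close()
```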
pipelines across Bronze → Silver → Gold layers, ensuring quality, scalability, and reliability. Develop and optimise Spark (PySpark) jobs for large-scale distributed processing. Design and implement streaming data pipelines with Kafka/MSK, applying best practices for late-event handling and throughput. Use Terraform and CI/CD pipelines (GitHub Actions or similar) to manage infrastructure and automate deployments. (For … Expected: proven experience delivering end-to-end data pipelines in Databricks and Spark environments. Strong understanding of data modelling, schema evolution, and data contract management. Hands-on experience with Kafka, streaming architectures, and real-time processing principles. Proficiency with Docker, Terraform, and cloud platforms (AWS, GCP, or Azure) for scalable data infrastructure. Demonstrated ability to lead, coach, and elevate …
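On "best practices for late-event handling": a minimal Spark Structured Streaming sketch in which a watermark bounds how late events may arrive and still be counted in their event-time window. Topic, servers, schema, and paths are assumed for illustration.

```python
# Sketch: watermarked event-time windows in Spark Structured Streaming.
# All names and paths below are illustrative placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json, window
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

spark = SparkSession.builder.appName("late-events").getOrCreate()

schema = StructType([
    StructField("user_id", StringType()),
    StructField("event_time", TimestampType()),
])

events = (spark.readStream.format("kafka")
          .option("kafka.bootstrap.servers", "broker1:9092")  # placeholder
          .option("subscribe", "clicks")                       # placeholder
          .load()
          .selectExpr("CAST(value AS STRING) AS json")
          .select(from_json(col("json"), schema).alias("e"))
          .select("e.*"))

# Events more than 10 minutes late (by event_time) are dropped rather than
# reopening already-finalised windows; everything within bound is counted.
counts = (events
          .withWatermark("event_time", "10 minutes")
          .groupBy(window(col("event_time"), "5 minutes"), col("user_id"))
          .count())

(counts.writeStream
 .outputMode("append")   # append emits a window only once the watermark passes it
 .format("parquet")
 .option("path", "/data/silver/click_counts")          # placeholder
 .option("checkpointLocation", "/checkpoints/clicks")  # placeholder
 .start()
 .awaitTermination())
```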
and Healthcare technology sector. Salary up to £150k + equity. Paddington-based (4–5 days per week in-office). Tech environment: Python | SQL | Airflow | dbt | GCP | AWS | Data Architecture | Kafka. If you wish to keep your CV/data private, feel free to WhatsApp your details/CV to me, Dan – 07704 152638. WHO WE ARE: We're a rapidly growing … orchestration tools (Airflow, dbt, Dagster, or Prefect). Solid understanding of cloud data platforms (AWS, GCP, or Azure) and data warehousing (Snowflake, BigQuery, Redshift). Experience with streaming technologies (Kafka, Kinesis, or similar). Strong knowledge of data modelling, governance, and architecture best practices. Excellent leadership, communication, and stakeholder management skills. NICE TO HAVE: Background in data management … to process (subject to required skills) your application to our client in conjunction with this vacancy only. KEY SKILLS: Data Management | Python | SQL | Airflow | dbt | GCP | AWS | Data Architecture | Kafka | Governance | Data Quality | Cloud Data Platforms
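As a sketch of the Airflow + dbt orchestration named in the tech stack: a toy daily DAG in which a Python extract step gates a dbt run. This assumes Airflow 2.4+ (for the `schedule` parameter); DAG and task contents are placeholders.

```python
# Sketch: a daily Airflow DAG that extracts and then runs dbt.
# Assumes Airflow 2.4+; all names and commands are illustrative placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator

def extract():
    # Placeholder: pull yesterday's data from the source into the warehouse.
    print("extracting...")

with DAG(
    dag_id="daily_warehouse_build",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # `schedule` replaced `schedule_interval` in Airflow 2.4
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    dbt_run = BashOperator(task_id="dbt_run",
                           bash_command="dbt run --profiles-dir .")  # placeholder
    extract_task >> dbt_run  # dbt transforms only after extraction succeeds
```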
City of London, London, United Kingdom Hybrid / WFH Options
Fruition Group
Developer will design, develop, and implement data-intensive applications across the full engineering lifecycle. You'll architect and deliver microservices-based systems using Go (Golang), AWS, Kubernetes, Docker, and Kafka, working closely with cross-functional teams to build scalable, reliable, and resilient platforms. You'll also play a key role in optimising system performance, improving reliability, and ensuring scalability … including system design and architecture. Background in complex, large-scale, data-driven applications. Product-focused approach, ideally within fast-paced tech organisations (start-ups or scale-ups). Knowledge of Kafka, Cassandra, gRPC, and microservices is a strong advantage. Open-source contributions are beneficial. If you're a Senior Go Developer looking for a challenging 6-month contract with a …
City of London, London, United Kingdom Hybrid / WFH Options
Fynity
building asynchronous, event-driven systems using modern Java technologies. Design and build scalable, high-availability systems processing millions of real-time transactions. Work with Java 17+, Spring WebFlux, AKKA, Kafka, and more. Write clean, testable code using TDD and BDD. Contribute to architectural decisions in a fast-evolving codebase. Collaborate within Agile teams (Kanban/Scrum). 🧠 What You Bring … problems. Strong hands-on experience with Java 11+ (ideally Java 17). Solid knowledge of reactive programming (e.g., Spring WebFlux, AKKA). Experience with event-driven architecture and real-time messaging systems (Kafka, JMS). Familiarity with asynchronous request handling, scalability, and system resilience. Agile mindset, with TDD/BDD and CI/CD experience. Bonus: a background in Banking/Payments is helpful …
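The role's reactive stack is Java (Spring WebFlux, Akka); purely as a language-neutral illustration of asynchronous request handling, here is the same shape in Python's asyncio, with a semaphore standing in for backpressure. Everything here is a toy assumption, not the employer's code.

```python
# Illustration only: many in-flight operations multiplexed on one event loop,
# with an explicit concurrency limit instead of a thread per request.
import asyncio
import random

SEMAPHORE = asyncio.Semaphore(100)  # cap concurrent in-flight work

async def handle_transaction(txn_id: int) -> str:
    async with SEMAPHORE:
        # Simulate non-blocking I/O (a payment call, a DB write, a publish).
        await asyncio.sleep(random.uniform(0.01, 0.05))
        return f"txn-{txn_id}: ok"

async def main():
    # 1,000 "transactions" processed concurrently without 1,000 threads.
    results = await asyncio.gather(*(handle_transaction(i) for i in range(1000)))
    print(len(results), "transactions settled")

asyncio.run(main())
```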
SR2 | Socially Responsible Recruitment | Certified B Corporation™
platform teams to make sure the company's AI models are learning and improving in production. 🧠 What You'll Be Doing: Architect and maintain real-time streaming data systems (Kafka, Kinesis, or Flink). Build robust feature pipelines using Airflow, Prefect, or Dagster. Manage and optimise data storage solutions (Snowflake, BigQuery, Redshift, or Delta Lake). Automate and scale model training … Terraform, or dbt. Champion data reliability, scalability, and performance across the platform. 🧩 The Tech Environment: You'll likely be working with some combination of: Languages: Python, Scala, Go. Streaming: Kafka/Flink/Spark Structured Streaming. Workflow orchestration: Airflow/Prefect/Dagster. Data storage & processing: Snowflake/Databricks/BigQuery/Redshift. Infrastructure: Docker/Kubernetes/Terraform …
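A minimal sketch of a feature pipeline in the Airflow/Prefect/Dagster mould, written here with Prefect 2's flow/task decorators. The source, feature definition, and sink are toy placeholders.

```python
# Sketch: a tiny feature pipeline with Prefect 2. All contents are placeholders.
from prefect import flow, task

@task(retries=2)
def load_events() -> list[dict]:
    # Placeholder: in practice this would read from Kafka/Kinesis or a lake table.
    return [{"user": "a", "clicks": 3}, {"user": "b", "clicks": 7}]

@task
def compute_features(events: list[dict]) -> dict:
    # A toy feature: average clicks per user in the batch.
    return {"avg_clicks": sum(e["clicks"] for e in events) / len(events)}

@task
def write_features(features: dict) -> None:
    # Placeholder for a write to a feature store / warehouse table.
    print("materialised features:", features)

@flow
def feature_pipeline():
    events = load_events()
    write_features(compute_features(events))

if __name__ == "__main__":
    feature_pipeline()
```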
Birmingham, West Midlands, England, United Kingdom Hybrid / WFH Options
SF Recruitment
of the following key skills: strong data engineering skills including ETL pipeline creation, data warehousing and visualization; strong database architecture understanding (PostgreSQL, MongoDB, MS SQL); event sourcing/messaging (Kafka, CQRS, etc.); experience working in a high-growth, product-focused business within small cross-functional engineering teams; solid cloud provisioning (GCP, Azure, AWS, etc.); a good appreciation of software … lead role in shaping the technical output of a successful, scaling organisation. Please apply now to be considered and for further info. Lead Developer: ETL, data warehousing, MongoDB, SQL, Kafka, event messaging, software engineering
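On "event sourcing/messaging (Kafka, CQRS)": a dependency-free sketch of the idea, where state is never stored directly, only derived by replaying an append-only event log (in production the log would typically live in Kafka and the read model in a database). All names are illustrative.

```python
# Sketch: event sourcing with a CQRS-style read model, in plain Python.
from dataclasses import dataclass

@dataclass(frozen=True)
class Event:
    kind: str      # e.g. "deposited" / "withdrawn"
    amount: int

event_log: list[Event] = []  # the write side: append-only

def command_deposit(amount: int) -> None:
    event_log.append(Event("deposited", amount))

def command_withdraw(amount: int) -> None:
    event_log.append(Event("withdrawn", amount))

def project_balance(events: list[Event]) -> int:
    # The read side: any view can be derived by replaying the log from scratch.
    balance = 0
    for e in events:
        balance += e.amount if e.kind == "deposited" else -e.amount
    return balance

command_deposit(100)
command_withdraw(30)
print(project_balance(event_log))  # 70
```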
Position: Platform Engineer
Department: Analytics Platform
Full Time
London or Bristol
Ready for a challenge? Then Just Eat Takeaway might be the place for you. We're a leading global online delivery platform, and our vision is to empower everyday …
About PostGrid: PostGrid is a global leader in enterprise SaaS for automated offline communications. Our robust APIs empower organizations to create, personalize, and dispatch physical mail (letters, postcards, checks, and more) to any address on the globe without ever …
Main Requirements: Platforms, preferably on AWS, using Kubernetes and microservices. Hands-on experience with modern backend stacks such as TypeScript/Node.js, Java, Postgres, and Kafka. Comfortable working with data models, APIs, service-oriented architecture, and event-driven systems. Resolution …
Role – Technology Lead
Technology – Kafka, Confluent Cloud and Data Integration
Location – London, UK
Job Description: We are looking for an experienced Kafka and Confluent Cloud specialist to join our team onsite as a Subject Matter Expert (SME) and primary technical lead for all data onboarding activities across on-premises and cloud environments. This role demands strong expertise in … the design, implementation, and optimization of enterprise-grade data integration solutions, with a strong focus on real-time streaming and cloud-native platforms. You will lead initiatives involving Apache Kafka and Confluent Cloud, enabling scalable, event-driven architectures that support high-throughput, low-latency data movement across systems. You will anchor engagements from requirement gathering and solution architecture to … will be key in enabling organizations to unlock the value of real-time data, improve operational efficiency, and accelerate digital transformation.
Responsibilities: Act as the primary technical owner for Kafka and Confluent Cloud implementations. Design, configure, and maintain Kafka clusters, topics, and streaming pipelines. Ensure secure, reliable, and scalable data onboarding from various sources to both on-prem …
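As a sketch of the topic-provisioning work such a lead typically automates: creating a configured topic with confluent-kafka's AdminClient against a Confluent Cloud endpoint. The endpoint, credentials, topic name, and settings below are illustrative assumptions, not details from the posting.

```python
# Sketch: provisioning a Kafka topic on Confluent Cloud via the AdminClient.
# Endpoint, credentials, and topic settings are illustrative placeholders.
from confluent_kafka.admin import AdminClient, NewTopic

admin = AdminClient({
    "bootstrap.servers": "pkc-xxxxx.europe-west2.gcp.confluent.cloud:9092",  # placeholder
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "API_KEY",     # placeholder Confluent Cloud API key
    "sasl.password": "API_SECRET",  # placeholder Confluent Cloud API secret
})

topic = NewTopic(
    "customer-onboarding-events",  # placeholder topic name
    num_partitions=6,
    replication_factor=3,
    config={"retention.ms": "604800000",  # 7 days
            "cleanup.policy": "delete"},
)

# create_topics is asynchronous; each topic maps to a future that resolves on success.
for name, future in admin.create_topics([topic]).items():
    try:
        future.result()
        print(f"created topic {name}")
    except Exception as exc:
        print(f"failed to create {name}: {exc}")
```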