Greater Portsmouth Area, United Kingdom Hybrid / WFH Options
Trust In SODA
or AWS in regulated environments. GitOps workflows (ArgoCD, Flux). Observability tooling: Prometheus, Grafana, Splunk, ELK/EFK. Service mesh technologies: Istio, Linkerd. Experience with containerised services: Postgres, Redis, Kafka, Keycloak. Familiarity with MOD delivery processes and SSDF-aligned development environments.
within engineering teams 💡 What You’ll Bring 8+ years’ QA/engineering experience, including 5+ years within financial or trading systems Strong hands-on experience with Azure, AKS, Solace (or Kafka/RabbitMQ), SQL, and C#/.NET Proven track record designing automation frameworks and performance/security test suites Excellent communication and leadership skills with the ability to influence
of working with complex, large-scale data applications Product-driven mindset and experience working in fast-moving, tech-focused organisations (e.g. start-ups or scale-ups) Additional skills with Kafka, Cassandra, gRPC, and microservices design are highly advantageous Open-source contributions are a plus This is an excellent opportunity for a Senior Backend Engineer who thrives on tackling technical
London, South East, England, United Kingdom Hybrid / WFH Options
Client Server Ltd
work from home on Fridays. About you: You have strong full stack web development experience using TypeScript and Node.js You understand AWS, microservices, APIs, Event Driven systems, Data Streaming (Kafka), DevOps environments, CI/CD pipelines and how to deploy to the cloud; they use Cloudflare You're happy to do some front end work with React You have
or PostgreSQL. Knowledge of CI/CD principles and Agile/Scrum development practices. Excellent communication skills and a proactive, solution-focused mindset. Desirable extras: Experience with GraphQL, gRPC, Kafka, or Reactive Extensions (RX). Exposure to the energy, commodities, or financial services sectors. This is an opportunity to contribute to an ambitious, data-driven organisation where your work
St Albans, England, United Kingdom Hybrid / WFH Options
Client Server
enhancements to complex Payments and client systems within a microservices environment (300 services). You'll be working with a modern tech stack using C# .Net Core, AWS, Kubernetes, Kafka, Redis and TypeScript/Angular; using the right tool for the job, you'll be able to pick up new technologies and make recommendations for improvements. WFH Policy: You
code quality, reliability, performance optimization, and observability. Preferred Qualifications Exposure to machine learning workflows, model lifecycle management, or data engineering platforms. Experience with distributed systems, event-driven architectures (e.g., Kafka), and big data platforms (e.g., Spark, Databricks). Familiarity with banking or financial domain use cases, including data governance and compliance-focused development. Knowledge of platform security, monitoring, and
London, South East, England, United Kingdom Hybrid / WFH Options
CV TECHNICAL LTD
Candidate Profile: Proven experience as a Data Engineer, with strong expertise in designing and managing large-scale data systems. Hands-on proficiency with modern data technologies such as Spark, Kafka, Airflow, or dbt. Strong SQL skills and experience with cloud platforms (Azure preferred). Solid programming background in Python, Scala, or Java. Knowledge of data warehousing solutions (e.g. Snowflake
York, England, United Kingdom Hybrid / WFH Options
WRK digital
of ML approaches such as supervised/unsupervised machine learning, reinforcement learning, and Bayesian inference. AWS Certification is a strong benefit. Experience with Google Cloud’s Big Data tools. Proficiency with Kafka is a plus but not essential. If you’re a data visionary with the technical expertise and leadership qualities to shape the future of data in a fast-moving business
observability best practices. Knowledge of DevOps methodologies, CI/CD, and infrastructure as code (IaC). Proficiency in Python or Bash scripting for automation and integration. Experience working with Kafka, REST APIs, and webhook-based integrations. Exposure to cloud environments (AWS, Azure, Cloud) and container technologies (Docker, Kubernetes). Compensation: Hourly Rate: $70 - $75 per hour This range reflects
supplemented by commensurate work experience. Proven experience in solution architecture within supply chain technology. Expertise in cloud platforms (Azure, AWS, GCP), ERP integration (SAP, Oracle), and event-driven systems (Kafka). Hands-on experience with Kubernetes, Snowflake, and ML frameworks (including BYOML) Develop messaging, drive internal enablement, and demonstrate key capabilities of the Blue Yonder platform, extensibility, value added
track record of collaborating with traders, product managers, and infrastructure teams in fast-paced environments. Nice to have skills: Strong background in API-based and message-driven architectures, including Kafka, JMS, or MQ-based middleware. Proficiency with SQL for trade/order data analysis and reconciliation. Exposure to Java/Python for scripting, debugging, or developing integration utilities. Familiarity
development of scalable applications and integrations. The candidate will work closely with cross-functional teams to ensure the seamless integration of systems and services. Familiarity with technologies such as Kafka, GitLab, SQL, and Unix is a plus. Key Responsibilities: Lead the development and architecture of scalable and high-performance applications and services using Java, Node.js, and Cloud technologies. Design … for designing schemas, queries, and optimizing performance. Unix/Linux: Experience working in Unix/Linux environments, including scripting, command-line tools, and performance troubleshooting. Good to Have: Apache Kafka: Familiarity with Kafka for real-time data streaming, event-driven architecture, and message queuing. NoSQL Databases: Experience with NoSQL databases like MongoDB, Cassandra, or Firebase. Docker & Kubernetes: Experience
Toubro Infotech and Mindtree in solving the most complex business challenges and delivering transformation at scale. For more information, please visit . Job Title: Senior Specialist - Cloud Engineering (AWS Kafka Platform Engineer) Work Location- Jersey City, NJ Job Description: 8 years of experience as an AWS Kafka platform engineer Bachelor's degree in computer science or related fields Minimum … Implementations are mandated Proficiency in Kafka Streams, Kafka Connect, and ksqlDB. Experience with AWS MSK, Confluent Cloud, and monitoring tools like Prometheus and Grafana Familiarity with Kubernetes, CI/CD pipelines, and GitOps workflows Proficiency in a wide range of public cloud technologies, e.g. CloudWatch, AWS EC2, EKS, EBS, RDS, S3, etc. Experience with Amazon AWS cloud services such as … EC2, S3, SQS, Kafka, and RDS is preferred Design and manage AWS Kafka-based streaming data pipelines. Configure and monitor Kafka clusters for performance and scalability. Integrate Kafka with AWS services like EC2, RDS, Lambda, and S3. Implement security protocols, disaster recovery, and high availability. Experience working with Kafka, Confluent Cloud, Schema Registry, and KStreams.
Senior Vice President, Full-Stack Engineer At BNY, our culture allows us to run our company better and enables employees' growth and success. As a leading global financial services company at the heart of the global financial system, we influence More ❯
training opportunities and social benefits (e.g. UK pension scheme) What do you offer? Strong hands-on experience working with modern Big Data technologies such as Apache Spark, Trino, Apache Kafka, Apache Hadoop, Apache HBase, Apache NiFi, Apache Airflow, OpenSearch Proficiency in cloud-native technologies such as containerization and Kubernetes Strong knowledge of DevOps tools (Terraform, Ansible, ArgoCD, GitOps, etc.)
and delays of even milliseconds can have big consequences. Essential skills: 3+ years of experience in Python development. 3+ years with open-source real-time data feeds (Amazon Kinesis, Apache Kafka, Apache Pulsar or Redpanda) Exposure to building and managing data pipelines in production. Experience integrating serverless functions (AWS, Azure or GCP). Passion for fintech and building products that make
City of London, London, United Kingdom Hybrid / WFH Options
Advanced Resource Managers
data warehouse knowledge Redshift and Snowflake preferred Working with IaC – Terraform and CloudFormation Working understanding of scripting languages including Python and Shell Experience working with streaming technologies inc. Kafka, Apache Flink Experience working with ETL environments Experience working with the Confluent Cloud platform
of Linux operating systems and scripting languages (e.g. Python) Knowledge of infrastructure as code and container technologies (e.g. Puppet, Docker) Knowledge of COTS integration technologies (e.g. Apache Camel, Apache Kafka) Experience of Atlassian tools (e.g. Jira, Confluence) Experience with public cloud platforms (e.g. AWS) Experience of the complete system life cycle from problem definition through to deployment Understanding of
/or FICC options trading Experience with Linux-based, concurrent, high-throughput, low-latency software systems Experience with pipeline orchestration frameworks (e.g. Airflow, Dagster) Experience with streaming platforms (e.g. Kafka), data lake platforms (e.g. Delta Lake, Apache Iceberg), and relational databases Have a Bachelor or advanced degree in Computer Science, Mathematics, Statistics, Physics, Engineering, or equivalent work experience For