functional product like Security Posture Management. A natural sense of curiosity and a can-do attitude. Nice-to-Have Skills: Experience with PostgreSQL, Apache Kafka, Python, Go, and Unix systems will be highly regarded. You do not need to have experience with any of these technologies! The ability to More ❯
ideally including experience with CloudWatch 👍 Bonus points for experience with: Banking systems/FS experience; API mocking and stubbing experience (e.g. WireMock); experience with Kafka/event-driven architecture; end-to-end testing, experience working with test data More ❯
as well as deploying microservice APIs. Experience needed: C#, .NET, Azure/AWS; Azure suite - Data Factory, Event Hubs, Bicep, Cognitive, etc.; message brokers - Kafka, RabbitMQ, etc. Please apply with an up-to-date CV or get in touch with Tom Parker at Source Technology. More ❯
.NET or Python environments; experience operating and supporting bespoke trading platforms; commercial experience deploying applications into AKS clusters; experience operating one or more of Kafka, Redis, Atlassian Suite, Elastic, Datadog, etc. Sponsorship cannot be offered for this role. Apply below with an up-to-date CV to set More ❯
to ensure optimal value extraction. Required Skills: Proficiency in Python. Solid experience with Linux, SQL, relational databases, and version control systems. Familiarity with Airflow, Kafka, and GCP (AWS experience is also acceptable). If this sounds like you, please apply directly or reach out to Daniel O'Connell at More ❯
as star schemas, de-normalised views, and normalisation. What you'll need - Proficiency in ER Studio. - Hands-on experience with GCP (e.g. GCS, BigQuery, Kafka). - Strong knowledge of Advanced SQL, SSIS, Python, and Unix shell scripting. - Experience working within Agile teams and delivery frameworks. More ❯
Lambda, Glue, Redshift, Athena, S3). Strong understanding of ML model development, deployment, and monitoring. Experience with big data processing frameworks (Spark, EMR, Kafka) is a plus. Background in FinTech or financial services is advantageous. What’s on Offer? Competitive salary of £80,000-£90,000; hybrid working More ❯
to success. You'll also have: Excellent understanding of one or more of the key components in our tech stack (React Native/Java/Kafka/Kubernetes/AWS). Solid practical and theoretical knowledge of modern software development. Experience building and designing scalable and performant systems and making More ❯
using technology such as JUnit, Cucumber, Gherkin, contract testing (PACT), TestContainers or other similar technology; knowledge of using eventing and messaging infrastructure such as Kafka and MQ; experience with cloud (preferably AWS); desire to learn, improve and innovate, with high development standards. No sponsorship offered. Lead Scala Engineer | HYBRID More ❯
with scalable, large-scale, or distributed systems and service-oriented architecture. Expertise in optimizing code for high-performance applications. Familiarity with messaging systems (e.g., Kafka, AMPS, QPID). Experience with Python, bash, and/or Q/KDB. Why This Opportunity? Work with a global team that values innovation More ❯
Greater Manchester, North West England, United Kingdom
ECOM
Mathematics, or related field. - Proven experience (5+ years) in developing and deploying data engineering pipelines and products - Strong proficiency in Python - Experienced in Hadoop, Kafka or Spark - Experience leading/mentoring junior team members - Strong communication and interpersonal skills, with the ability to effectively communicate complex technical concepts to More ❯
engineering experience. Experience building ETL pipelines using Python. Experience of SQL and relational databases. Experience with AWS or similar Cloud technology. Experience with S3, Kafka, Airflow, and Iceberg will be beneficial. Experience in the financial markets with a focus on securities & derivatives trading. Exceptional communication skills, attention to detail More ❯
Advanced SQL, SSIS packages, Python and shell scripting expertise; experience using ER Studio; hands-on experience with Google Cloud Platform tools (e.g., GCS, BigQuery, Kafka); Agile methodology experience; excellent communication and collaboration skills in client-facing roles. Contract Details: Rate: £400 per day inside IR35; Duration: 6 months; Start More ❯
experience; excellent written and spoken English. Added bonus if you have: Unix/Linux experience; Oracle experience; experience of event-driven distributed messaging (e.g. Kafka); experience of financial markets and the trade lifecycle beneficial; C# and any GUI development experience. What we offer you: At FIS, you can learn More ❯
Experience with web services technologies such as REST, JSON, or Thrift; testing web applications with Selenium WebDriver; big data technologies such as Hadoop, MongoDB, Kafka, or SQL; understanding of network principles and protocols like TLS, TCP; experience with continuous integration systems such as Jenkins or Bamboo; knowledge of continuous More ❯
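As a rough illustration of the Selenium WebDriver testing mentioned in the listing above, here is a minimal Python sketch; the URL, element locator, and title check are hypothetical placeholders, not details from the role.

```python
# Minimal Selenium WebDriver sketch (Python). The target URL, field name, and
# expected title are illustrative assumptions, not real application details.
from selenium import webdriver
from selenium.webdriver.common.by import By


def test_login_page_loads():
    driver = webdriver.Chrome()  # assumes a local chromedriver is available
    try:
        driver.get("https://example.com/login")              # hypothetical URL
        username = driver.find_element(By.NAME, "username")  # hypothetical field name
        username.send_keys("test-user")
        assert "Login" in driver.title                       # hypothetical title check
    finally:
        driver.quit()
```

In practice a check like this would typically run headless inside a continuous integration system such as the Jenkins or Bamboo setups mentioned above.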
London, South East England, United Kingdom Hybrid / WFH Options
twentyAI
enterprise data lake solutions. Hands-on experience with data modelling, ETL/ELT pipelines, and data integration across multiple systems. Familiarity with tools like Kafka, Spark, and modern API-based architectures. Experience with relational databases such as Oracle and SQL Server. Knowledge of data governance platforms like More ❯
smarter business decision-making. Key Responsibilities: Design and develop homogeneous data repositories for enterprise reporting and analytics. Ingest data from SQL databases, REST APIs, Kafka streams and other sources. Apply data cleansing rules to ensure high data quality standards. Model data into a single source of truth using Kimball More ❯
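To give a flavour of the ingest-and-cleanse responsibilities this listing describes, here is a minimal Python sketch using the kafka-python client; the topic name, broker address, required field, and cleansing rules are assumptions for illustration only, not details from the role.

```python
# Minimal Kafka ingestion-and-cleansing sketch (Python, kafka-python client).
# Topic, broker address, and cleansing rules are illustrative assumptions.
import json
from typing import Optional

from kafka import KafkaConsumer


def clean(record: dict) -> Optional[dict]:
    """Apply simple data-quality rules; drop records missing required fields."""
    if not record.get("customer_id"):           # hypothetical required field
        return None
    record["amount"] = round(float(record.get("amount", 0.0)), 2)
    return record


def ingest(max_records: int = 100) -> list:
    consumer = KafkaConsumer(
        "transactions",                          # hypothetical topic
        bootstrap_servers="localhost:9092",      # hypothetical broker
        auto_offset_reset="earliest",
        value_deserializer=lambda m: json.loads(m.decode("utf-8")),
        consumer_timeout_ms=5000,                # stop iterating when the topic is idle
    )
    cleaned = []
    for message in consumer:
        record = clean(message.value)
        if record is not None:
            cleaned.append(record)               # in practice, load into a warehouse table
        if len(cleaned) >= max_records:
            break
    consumer.close()
    return cleaned
```

In a production pipeline the cleaned records would normally be batch-loaded into Kimball-style fact and dimension tables rather than returned in memory as above.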