OpenLink and processes developed by the group. Participate in capacity planning and performance/throughput analysis. Consuming and publishing transaction data in Avro over Kafka. Automation of system maintenance tasks, end-of-day processing jobs, data integrity checks and bulk data loads/extracts. Release planning and deployment. Build …
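The "publishing and consuming transaction data over Kafka" duty above can be illustrated as a minimal produce/consume round trip. This is a sketch only: a `queue.Queue` stands in for a Kafka topic, and stdlib `json` stands in for Avro encoding; a real pipeline would use a Kafka client library with an Avro serializer and schema registry, none of which appear in the listing and all of which are assumptions here.

```python
import json
import queue

# In-memory stand-in for a Kafka topic partition; a real system would
# use a Kafka producer/consumer with Avro serialization instead.
topic = queue.Queue()

def publish(record: dict) -> None:
    # Serialize the transaction record to bytes before publishing,
    # mirroring how an Avro serializer encodes records on the wire.
    topic.put(json.dumps(record).encode("utf-8"))

def consume() -> dict:
    # Deserialize the wire bytes back into a record on the consumer side.
    return json.loads(topic.get(timeout=1).decode("utf-8"))

if __name__ == "__main__":
    publish({"txn_id": 1, "amount": 250.0, "currency": "GBP"})
    print(consume())
```

The point of the pattern is that producer and consumer agree only on the topic and the serialized format, not on each other's internals — which is what makes the schema (Avro in the listing) the contract between systems.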
plug skill gaps as needed. Essential Skills: 5+ years of experience with Java. 2+ years of experience with Python. Experience with Spring. Experience with Kafka. Proficiency in SQL. Experience with AWS. Additional Skills & Qualifications: Experience with Perl. Experience supporting highly distributed systems in enterprise environments. Familiarity with CI/…
london, south east england, United Kingdom Hybrid / WFH Options
DeFinitive
a focus on high availability. Requirements: Experience with AWS infrastructure, networking, IAM, and load balancing. Proficiency in containerisation (Docker, Kubernetes, Helm) and messaging systems (Kafka). Hands-on CI/CD pipeline development (GitHub Actions/GitOps preferred). Strong PostgreSQL management skills, including scaling and optimisation. Familiarity with …
vendors and managing tech partnerships. Nice to have: AWS Solution Architect certification. Docker, Kubernetes, or container orchestration tools. Observability tools (e.g. New Relic) experience. Kafka, Flink, or IoT streaming tech exposure. Background in financial services or regulated environments. What's in it for you? Competitive salary: up to …
people who have experience in Microsoft Power Platform. Name-dropping just some other techs we use: Java, Spring Boot, Jenkins, TypeScript, React, Postgres, SQL Server, Kafka, ActiveMQ, Elasticsearch, AWS, Ansible, WSO2, REST, Docker, API Gateways, JavaScript, Mongo. The Central Government Team: We work hard and often go the extra mile …
BGP, BMP, ARP, SNMP, CDP/LLDP) and network engineering, management, and operations. Experience with search and analytics engines/big data tools (OpenSearch, Kafka, Kibana, Telegraf, InfluxDB, Prometheus). Our Preferred Qualifications for this role: Basic understanding of AI and ML algorithms, including model training, testing, and deployment. …
Out in Science, Technology, Engineering, and Mathematics
roles, preferably for a customer-facing data product. Expertise in designing and implementing large-scale data processing systems with data tooling such as Spark, Kafka, Airflow, dbt, Snowflake, Databricks, or similar. Strong programming skills in languages such as SQL, Python, Go or Scala. Demonstrable use and an understanding of …
PostgreSQL). Bonus: Advanced LookML knowledge and experience building data visualisation tools. Skilled in building and managing real-time and batch data pipelines using Kafka and dbt. Familiarity with Docker, Terraform, and Kubernetes for application orchestration and deployment. A strong numerical or technical background, ideally with a degree in …
troubleshooting in live production systems. Experience with Messaging Systems: You have experience with distributed systems that use some form of messaging system (e.g. RabbitMQ, Kafka, Pulsar). The role focuses on RabbitMQ and you will have time to acquire deep knowledge of it. Programming Proficiency: You have …
and in the digitalization of technical process development through modeling, including the end-to-end digital process setup, testing and maintenance. Experience in Confluent Kafka, Mendix, the MS Azure stack and Agile toolkits such as JIRA or Azure DevOps. Experience of AI/ML model metrics (e.g., F1 and …
TO HAVE: Cloud-based experience. Microservice architecture or serverless architecture. Big Data/messaging technologies such as Apache NiFi/MiNiFi/Kafka. TO BE CONSIDERED: Please either apply by clicking online or emailing me directly to . For further information please call me on or . …
patterns and their application in hybrid and cloud-native environments. You have experience with message brokers and event-driven systems using tools like Apache Kafka, RabbitMQ, or Azure Service Bus. Expertise in designing and managing RESTful APIs, GraphQL, or gRPC as well as API management tools like Boomi API …
ELT pipelines using Python and pandas within a financial environment. Strong knowledge of relational databases and SQL. Familiarity with various technologies such as S3, Kafka, Airflow, Iceberg. Proficiency working with large financial datasets from various vendors. A commitment to engineering excellence and pragmatic technology solutions. A desire to work …
non-technical stakeholders alike. The Head of BI will work within a modern, cloud-based BI ecosystem, including: Data Integration: Fivetran, HVR, Databricks, Apache Kafka, Google BigQuery, Google Analytics 4; Data Lake & Storage: Databricks Delta Lake, Amazon S3; Data Transformation: dbt Cloud; Data Warehouse: Snowflake; Analytics & Reporting: Power BI …
London, England, United Kingdom Hybrid / WFH Options
Infused Solutions
with Azure Data Factory, Synapse, and Fabric ✅ Strong SQL & Python skills ✅ Experience across ETL/ELT, data lakes, modelling, and governance ✅ Exposure to Spark, Kafka, or Snowflake a plus ✅ Agile/DevOps mindset and experience with Git, APIs, microservices. Bonus: DP-203 or Fabric Analytics certification highly desirable. Important …
real-time achievable. Requirements: 8+ years commercial experience building high-performance systems in Core Java. Good knowledge of AWS and Kubernetes. Good knowledge of Kafka. STEM degree from a top university. Benefits: Flexible hybrid working. Competitive compensation packages. Market-leading pension scheme. Genuine exposure to open source tech …
haves (in order of priority – at least one or two is needed): Python (commercial experience); AWS services – S3, ECS, AppSync; GraphQL or REST APIs; Kafka or message queue systems; PostgreSQL or NoSQL database experience. To apply, click the link or, for a faster response, email Barry.Ansell@HarringtonStarr.com
tools (e.g., Airflow, dbt). Solid grasp of cloud platforms (AWS/GCP/Azure) and data warehouses (e.g., BigQuery, Snowflake). Familiarity with streaming technologies (Kafka, Kinesis, etc.). Passion for clean, maintainable code and robust data systems. Previous experience in a fintech or regulated environment is a plus. Benefits: Join …