are always willing to dive into specific areas to gain the expertise needed to be successful in your role. Nice-to-Haves Tech stack knowledge. Our stack is Clojure, Kafka, PostgreSQL and Redshift on AWS, so experience with these is ideal. Our Role Champion work-life harmony. We'll give you the flexibility you need in your work More ❯
data from diverse sources, transform it into usable formats, and load it into data warehouses, data lakes or lakehouses. Big Data Technologies: Utilize big data technologies such as Spark, Kafka, and Flink for distributed data processing and analytics. Cloud Platforms: Deploy and manage data solutions on cloud platforms such as AWS, Azure, or Google Cloud Platform (GCP), leveraging cloud … for data manipulation and scripting. Strong understanding of data modelling concepts and techniques, including relational and dimensional modelling. Experience in big data technologies and frameworks such as Databricks, Spark, Kafka, and Flink. Experience in using modern data architectures, such as lakehouse. Experience with CI/CD pipelines, version control systems like Git, and containerization (e.g., Docker). Experience with … and analytical skills with the ability to diagnose and resolve complex data-related issues. SQL (for database management and querying) Apache Spark (for distributed data processing) Apache Spark Streaming, Kafka or similar (for real-time data streaming) Experience using data tools in at least one cloud service - AWS, Azure or GCP (e.g. S3, EMR, Redshift, Glue, Azure Data Factory More ❯
Description About this role BlackRock is one of the world's leading providers of investment, advisory, and risk management solutions, including Aladdin, our investment and risk management technology. Aladdin is a comprehensive technology platform used by BlackRock and delivered to More ❯
analysis, release co-ordination) developing in Linux, multi-threaded, large scale, fault-tolerant systems Azul Zing, FIX Trading Protocol messaging services like TIBCO RV/EMS, JMS, AMPS, Apache Kafka Understanding of latency, garbage collection Contract: 12 Months Rolling Rate: £600-800 p/d Via Umbrella Location: London - 1/2 days per week in the office. If More ❯
VETTING/DEVELOPED VETTED/DEEP VETTING/DEEP VETTED/SC CLEARED/SC CLEARANCE/SECURITY CLEARED/SECURITY CLEARANCE/NIFI/CLOUDERA/HADOOP/KAFKA/ELASTIC SEARCH/LEAD BIG DATA ENGINEER/LEAD BIG DATA DEVELOPER More ❯
modeling, optimisation, data quality and Master Data Management. Experience with database technologies such as RDBMS (SQL Server, Oracle) or NoSQL (MongoDB). Knowledge of Apache technologies such as Spark, Kafka and Airflow to build scalable and efficient data pipelines. Have worked on migration projects; some experience with management systems such as SAP, ERP, or CRM would be desirable More ❯
Employment Type: Contract
Rate: £700 - £750/day (Inside IR35)
with cross-functional teams to deliver scalable and robust system solutions. Key Skills Required: Strong development experience in Java (primary skill). Site Reliability Engineering (SRE) experience. Proficiency with Kafka, Mule, and Oracle Database. Ability to work at a managerial level while remaining hands-on with technical tasks. Nice to Have: Knowledge of payment systems. Banking/ More ❯
You have: · Advanced Python proficiency, especially in scalable, clean code architecture and microservices (e.g., FastAPI, Flask, asyncio) · Solid understanding of API integration patterns and inter-service communication (e.g. REST, Kafka) · Experience with authentication and authorization mechanisms (e.g. OAuth2, JWT, Azure AD) · At least two ML/AI solutions delivered to production, ideally involving document understanding, NLP or search/ More ❯
and database monitoring alerts. Required Skills & Experience Strong hands-on knowledge of Snowflake and SQL Server (including SSIS). Experience in ETL, data migration, and performance optimization. Familiarity with Oracle, Kafka, ODI, and Azure SQL DB. Proven ability in incident, problem, and change management. Good understanding of data governance, security, and auditing frameworks. Exposure to monitoring and observability tools (e.g. More ❯
CO AU Alfa Financial Software Australia Pty Limited
often in an evolving environment. Experience in mentoring and coaching. Preferred: Experience in the Auto/Equipment Finance Industry. Integration experience with APIs, web services, event streaming and messaging (Kafka, JMS, SQS, etc.). Experience with market-leading SaaS or iPaaS solutions. Database experience, including with reporting systems/data warehousing. Performance analysis, with tools like JProfiler/CodeGuru. More ❯
Overview Python Technical Architect - 6 Month contract initially. Hybrid working in Bradford (Max 3 days p/w onsite). Rate: Market rates (via Umbrella company). We have a great opportunity with a world leading organisation where you will More ❯
London, South East, England, United Kingdom Hybrid / WFH Options
Sanderson
Excellent communication and collaboration skills Strong understanding of data movement patterns including ELT, event-oriented architecture, and data streaming Snowflake expertise is essential Experience with dbt, dbt Cloud, and Kafka is highly desirable Knowledge of the London Markets is a must Reasonable Adjustments: Respect and equality are core values to us. We are proud of the diverse and inclusive More ❯
City of London, London, Vintry, United Kingdom Hybrid / WFH Options
Deerfoot Recruitment Solutions Limited
unit testing frameworks (e.g., JUnit). Experience testing backend systems or APIs, with knowledge of REST, JSON, or Thrift. Bonus: Familiarity with Selenium WebDriver, Jenkins, big data technologies (Hadoop, Kafka), or performance testing tools like JMeter. Why Apply? Competitive bonus structure (from 4%, up to 8% after 3 years). Comprehensive benefits including private pension plans, health & dental coverage More ❯
Employment Type: Permanent
Salary: £90000 - £120000/annum Benefits + Bonus + Hybrid Working
Mondays, Wednesdays, and Thursdays. About the role As a Product Manager at Conduktor, you will be instrumental in solving customer problems for organisations working in the data streaming space (Kafka), driving growth in adoption of our Enterprise platform. You will partner with Product & Engineering leadership to extend our platform capabilities with a focus on data security and observability. This More ❯
stack in designing and building data & AI solutions Experience with data modeling, ETL processes, and data warehousing Knowledge of big data tools and frameworks such as Spark, Hadoop, or Kafka More ❯
least 5+ years of experience building highly reliable and scalable backend services in Go Experience with RDBMSs such as Postgres, MySQL, etc. Experience with message-broker technologies such as Kafka, RabbitMQ, etc., working within event-driven architectures. You have excellent knowledge of the best practices in designing, developing and deploying those services in a cloud environment You have experience More ❯
South East London, London, United Kingdom Hybrid / WFH Options
Stepstone UK
over 50 million visits each month, deliver the best user experiences and personalisation for talent and talent seekers. We built an in-house large-scale tracking platform, powered by Tealium, Kafka and Adobe Analytics, and successfully rolled it out to 10 of our brands. Along with the tracking we are owners of our AB testing tool Optimizely, our in-house … integrating data sources via web and hypermedia APIs Proficient in Web Technologies like HTML, CSS, JavaScript, React.js, Next.js and Node.js, and in event-driven and eventual-consistency systems using Kafka, .Net, Java, REST APIs, AWS, Terraform and DevOps Nice to have: experience in data pipelining and modelling using SQL, DBT, ETL, Data Warehousing, Redshift and Python, and ecommerce and More ❯
platform. Lead/mentor a global engineering team (~30 people across Europe & Asia). Shape architecture and design for scalable, resilient systems. Drive adoption of modern technologies: containers, Kubernetes, Kafka, microservices. Collaborate with business and tech teams to balance strategy with delivery. Champion agile, DevOps, and CI/CD practices. Key Requirements Proven experience in application architecture for distributed … systems. Strong engineering background: Java, Spring, Oracle SQL. Solid understanding of REST, messaging patterns, design best practices. Experience in FX. Exposure to Kubernetes, containers, and Kafka is a strong advantage. Confident communicator with the gravitas to influence both technical and non-technical stakeholders. TOGAF or similar certifications Robert Walters Operations Limited is an employment business and employment agency and welcomes applications More ❯
systems and shell scripting Familiarity with monitoring tools and performance tuning techniques Experience with backup and disaster recovery processes Review the current Debezium deployment architecture, including Oracle connector configuration, Kafka integration, and downstream consumers. Analyze Oracle database setup for CDC compatibility (e.g., redo log configuration, supplemental logging, privileges). Evaluate connector performance, lag, and error handling mechanisms. Identify bottlenecks … on experience with Debezium, especially the Oracle connector (LogMiner). Deep understanding of Oracle internals relevant to CDC: redo logs, SCNs, archive log mode, supplemental logging. Proficiency with Apache Kafka and Kafka ecosystem tools. Experience with monitoring and debugging Debezium connectors in production environments. Ability to analyze logs, metrics, and connector configurations to identify root causes of issues. … communication skills for delivering technical assessments. Edge: MongoDB Certified DBA Associate or higher. Exposure to containerized environments (Docker, Kubernetes) and Infrastructure as Code (Terraform, etc.). RedHat Certifications concerning Kafka or Debezium More ❯
in high-volume, distributed processing systems. Familiarity with DevOps tools such as GitHub Actions or Jenkins. Solid grounding in modern engineering principles and full-stack development. Bonus Skills: Airflow, Kafka/Kafka Connect, Delta Lake, JSON/XML/Parquet/YAML, cloud-based data services. Why Apply? Work for a global payments innovator shaping the future of More ❯