working with modern tools in a fast-moving, high-performance environment. Your responsibilities may include: Build and maintain scalable, efficient ETL/ELT pipelines for both real-time and batch processing. Integrate data from APIs, streaming platforms, and legacy systems, with a focus on data quality and reliability. Design and manage data storage solutions, including databases, warehouses, and lakes. … Leverage cloud-native services and distributed processing tools (e.g., Apache Flink, AWS Batch) to support large-scale data workloads. Operations & Tooling: Monitor, troubleshoot, and optimize data pipelines to ensure performance and cost efficiency. Implement data governance, access controls, and security measures in line with best practices and regulatory standards. Develop observability and anomaly detection tools to support Tier … maintaining scalable ETL/ELT pipelines and data architectures. Hands-on expertise with cloud platforms (e.g., AWS) and cloud-native data services. Comfortable with big data tools and distributed processing frameworks such as Apache Flink or AWS Batch. Strong understanding of data governance, security, and best practices for data quality. Effective communicator with the ability to work across technical …
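The observability and anomaly-detection duties named above often start with something as simple as flagging outliers in pipeline metrics. A minimal sketch follows; the rolling window, z-score test, and threshold are illustrative assumptions, not tooling named in the listing.

```python
from statistics import mean, stdev

def detect_anomalies(values, window=5, z_threshold=3.0):
    """Return indices of points whose z-score against the trailing window
    exceeds the threshold. Window size and threshold are illustrative."""
    anomalies = []
    for i in range(window, len(values)):
        trailing = values[i - window:i]
        mu, sigma = mean(trailing), stdev(trailing)
        # Skip flat windows (sigma == 0) to avoid division by zero.
        if sigma > 0 and abs(values[i] - mu) / sigma > z_threshold:
            anomalies.append(i)
    return anomalies
```

In practice this logic would run over metrics emitted by the pipelines being monitored, such as row counts or job latencies.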
London, South East, England, United Kingdom Hybrid / WFH Options
INTEC SELECT LIMITED
Responsibilities: Design, implement, and maintain robust ETL pipelines to integrate data from diverse sources, including APIs like Facebook, Google Analytics, and payment providers. Develop and optimize data models for batch processing and real-time streaming using tools like AWS Redshift, S3, and Kafka. Lead efforts in acquiring, storing, processing, and provisioning data to meet evolving business requirements.
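Integrating sources as different as Facebook and Google Analytics usually begins with normalising each payload onto one common schema before loading. A minimal sketch, with entirely hypothetical field names chosen for illustration:

```python
def normalize_event(source: str, record: dict) -> dict:
    """Map source-specific payloads onto one schema.
    The field names here are hypothetical, not real API fields."""
    if source == "facebook":
        return {"source": source, "campaign": record["campaign_name"],
                "spend": float(record["spend"])}
    if source == "google_analytics":
        return {"source": source, "campaign": record["campaign"],
                "spend": float(record["adCost"])}
    raise ValueError(f"unknown source: {source}")
```

A downstream loader (Redshift, S3, or a Kafka producer) then only ever sees the unified shape.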
City of London, London, United Kingdom Hybrid / WFH Options
Intec Select
Responsibilities: Design, implement, and maintain robust ETL pipelines to integrate data from diverse sources, including APIs like Facebook, Google Analytics, and payment providers. Develop and optimize data models for batch processing and real-time streaming using tools like AWS Redshift, S3, and Kafka. Lead efforts in acquiring, storing, processing, and provisioning data to meet evolving business requirements.
a high-impact engineering team, you'll help design and implement fault-tolerant data pipelines and services that operate on massive, time-series datasets and support real-time and batch analytics. This is an opportunity to solve challenging problems at scale in a domain where precision, performance, and reliability are critical. What Will You Be Involved With? Design and … build scalable, distributed systems using Java, Python, and Apache Spark. Develop and optimize Spark jobs on AWS Serverless EMR for processing large-scale time-series datasets. Build event-driven and batch processing workflows using Lambda, SNS/SQS, and DynamoDB. Read, process, and transform data from a variety of sources: flat files, APIs, and streaming systems. Participate … ideas clearly to both engineers and non-technical stakeholders. What Will You Bring to the Table? Strong backend software engineering experience, ideally with distributed systems and large-scale data processing. Strong programming skills in Java (multithreading, concurrency, performance tuning). Deep experience with Apache Spark and Spark Streaming. Proficiency with AWS services, including Lambda, DynamoDB, S3, SNS, SQS, and Serverless …
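The time-series batch workloads described above largely reduce to windowed aggregation. A plain-Python sketch of a tumbling-window average, standing in for the kind of job Spark would distribute across a cluster (the window size is an assumption):

```python
from collections import defaultdict

def tumbling_window_avg(events, window_secs):
    """events: iterable of (epoch_seconds, value) pairs.
    Returns {window_start: mean value} for non-overlapping windows."""
    buckets = defaultdict(list)
    for ts, value in events:
        # Align each timestamp down to the start of its window.
        buckets[ts - ts % window_secs].append(value)
    return {start: sum(vals) / len(vals) for start, vals in sorted(buckets.items())}
```

In a real Spark job the same grouping would be expressed with `groupBy` over a window column, with the shuffle and parallelism handled by the engine.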
infrastructure, and ensure data is reliable, secure, and accessible in a fast-paced environment. Responsibilities: Build and maintain efficient, scalable ETL/ELT pipelines for both real-time and batch processing. Design and manage data storage solutions across databases, warehouses, and data lakes. Integrate data from APIs, streaming platforms, and legacy systems. Optimize data infrastructure for performance, cost, and … related field. Proficiency in Python, Java, and SQL; familiarity with Rust is a plus. Proven track record with cloud platforms (e.g., AWS) and distributed data tools (e.g., Flink, AWS Batch). Strong understanding of data security, quality, and governance principles. Excellent communication and collaboration skills across technical and non-technical teams. Bonus Points For: Experience with orchestration tools like … Apache Airflow. Familiarity with real-time data processing and event-driven systems. Knowledge of observability and anomaly detection in production environments. Exposure to visualization tools like Tableau or Looker. Relevant cloud or data engineering certifications. What’s Offered: Competitive salary with two annual discretionary bonuses. Generous benefits, including healthcare, dental, vision, and retirement planning. 30 days of holiday plus …
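Orchestration tools such as Airflow, mentioned in the bonus points, schedule pipeline tasks from their declared dependencies. A minimal stand-in using Kahn's topological sort, with task names invented for illustration:

```python
def schedule(tasks):
    """tasks: {task_name: set of upstream dependencies}.
    Returns a valid run order, raising on dependency cycles."""
    tasks = {t: set(deps) for t, deps in tasks.items()}  # defensive copy
    order = []
    while tasks:
        # Tasks with no unmet dependencies can run now (sorted for determinism).
        ready = sorted(t for t, deps in tasks.items() if not deps)
        if not ready:
            raise ValueError("dependency cycle")
        for t in ready:
            order.append(t)
            del tasks[t]
        for deps in tasks.values():
            deps -= set(ready)
    return order
```

An actual Airflow DAG expresses the same graph with operators and `>>` dependencies; the scheduler performs this ordering for you.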
the AcadiaSoft MarginSphere adapter. Work closely with the vendor to identify and implement changes to improve the performance of the system. Create and update in-house SQL extracts. Optimize batch processing to decouple margin types and reduce business/SLA impact when failures occur. Key Requirements: Previous experience upgrading and supporting the TLM Collateral Management application for OTC …
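Decoupling margin types so that one failing batch no longer blocks the others amounts to isolating each batch's execution and collecting failures separately. A minimal sketch; the batch names and processing function are hypothetical, not part of the TLM product:

```python
def run_decoupled(batches, process):
    """Run each named batch independently so a failure in one
    does not abort the rest. Returns (results, failures)."""
    results, failures = {}, {}
    for name, items in batches.items():
        try:
            results[name] = process(items)
        except Exception as exc:
            # Record the failure and carry on with the remaining batches.
            failures[name] = str(exc)
    return results, failures
```

Successful batches meet their SLA even when another margin type fails and needs a rerun.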
engineering, and stakeholder management in an investment banking environment. Key Responsibilities: Design and maintain Power BI dashboards for trading, risk, and regulatory reporting. Build data pipelines for real-time and batch processing of financial data. Partner with traders, portfolio managers, and risk teams to deliver analytics solutions. Ensure compliance with regulatory reporting requirements. Optimize data models for front office …
from either a technical or business area. Comfortable making presentations covering business, technical, or sales topics. Detailed knowledge of: project phasing and reporting; on-site delivery; systems support; operational procedures; batch processing; system sizing and capacity planning; system backup, contingency, and disaster recovery; regional roll-outs. Education & Preferred Qualifications: Third-level qualification essential, ideally a technical Bachelor's degree. Fluency …
Business Intelligence, with 5+ years in a BI leadership role in a global or matrixed organisation. Proven expertise in modern BI architecture (Data Lake, EDW, Streaming, APIs, Real-Time & Batch Processing). Demonstrated experience delivering cloud-based analytics platforms (Azure, AWS, GCP). Strong knowledge of data governance, cataloguing, security, automation, and self-service analytics. Excellent leadership and …
Bank client are looking for a Power BI Developer to design and maintain dashboards for trading, risk, and regulatory reporting. You'll build data pipelines for real-time and batch processing of financial data. This is a long-term contract opportunity. The following skills/experience is essential: Strong Power BI (DAX, Power Query, data modelling), Databricks, Python …
City of London, London, United Kingdom Hybrid / WFH Options
Hunter Bond
Bank client are looking for a Power BI Developer to design and maintain dashboards for trading, risk, and regulatory reporting. You'll build data pipelines for real-time and batch processing of financial data. This is a long-term contract opportunity. The following skills/experience is essential: Strong Power BI (DAX, Power Query, data modelling), Databricks, Python …
integration with AcadiaSoft MarginSphere and optimize system performance. Collaborate with internal teams and vendors to address inefficiencies and implement improvements. Develop and maintain SQL extracts and stored procedures; improve batch processing resilience. Skills & Experience: Proven experience with TLM Collateral Management, particularly in upgrades and support. Strong understanding of collateral processes for OTC derivatives, repos, and stock borrowing/ …
and review code, paired programming, and debugging code-related performance issues, SQL tuning, etc. Experience with AWS services such as S3, RDS, Aurora, NoSQL, MSK (Kafka). Experience with batch processing/ETL using Glue Jobs, AWS Lambda, and Step Functions. Experience with designing bespoke & tailored front-end solutions (GUI-based) using open-source technology such as React …
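Glue jobs and Step Functions workflows routinely wrap flaky steps (API calls, transient I/O) in retries. A minimal exponential-backoff sketch; the attempt count and delays are illustrative assumptions, and in Step Functions the equivalent would be a declarative `Retry` block rather than application code:

```python
import time

def with_retries(fn, attempts=3, base_delay=0.01):
    """Call fn, retrying with exponential backoff on any exception.
    Re-raises the last error once attempts are exhausted."""
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:
            if attempt == attempts - 1:
                raise
            # Delay doubles each attempt: base, 2*base, 4*base, ...
            time.sleep(base_delay * 2 ** attempt)
```

Production code would typically also add jitter and restrict the caught exception types to those known to be transient.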
will deliver proof-of-concept projects, topical workshops, and lead implementation projects. These professional services engagements will focus on key customer solutions such as web applications, enterprise applications, HPC, batch processing and big data, archiving and disaster recovery, education, and government. Professional Services engage in a wide variety of projects for customers and partners, providing collective experience from …
field Experience: Proven experience in ML model lifecycle management ● Core Competencies: Model lifecycle: You’ve got hands-on experience with managing the ML model lifecycle, including both online and batch processes. Statistical Methodology: You have worked with GLMs and other machine learning algorithms and have in-depth knowledge of how they work. Python: You have built and deployed production …
City of London, London, United Kingdom Hybrid / WFH Options
Burns Sheehan
field Experience: Proven experience in ML model lifecycle management ● Core Competencies: Model lifecycle: You’ve got hands-on experience with managing the ML model lifecycle, including both online and batch processes. Statistical Methodology: You have worked with GLMs and other machine learning algorithms and have in-depth knowledge of how they work. Python: You have built and deployed production …
Month Contract. Hybrid Working. Leading London-Based Investment Bank. You will: Design and build foundation components that will underpin our data mesh ecosystem. Build enterprise-class real-time and batch solutions that support mission-critical processes. Build solutions in line with our Digital Principles. Partner with our Product team(s) to create sustainable and resilient solutions. Your experience: Bachelor … more; Azure (preferred). Demonstrated hands-on engineering in large-scale, complex Enterprise(s), ideally in the banking/financial industry. Worked with modern tech: data streaming, real-time & batch processing, and compute clusters. Working knowledge of relational and NoSQL databases, designing and implementing scalable solutions. Experience of working in a continuous-architecture environment, iterating and improving solutions including …
years in the telecom industry (preferably in fiber networks). Proven experience in designing, implementing, and managing complex data architectures for large-scale systems in real-time and batch processing environments. Strong background in telecom data systems, including but not limited to network performance data, customer usage data, and fiber network management systems. Technical Skills: Proficiency with big …