London, England, United Kingdom Hybrid / WFH Options
Aimpoint Digital
Python, SKLearn, XGBoost, SparkML, etc. Experience with deep learning frameworks like TensorFlow or PyTorch. Knowledge of ML model deployment options (e.g., Azure Functions, FastAPI, Kubernetes) for real-time and batch processing. Experience with CI/CD pipelines (e.g., DevOps pipelines, GitHub Actions). Knowledge of infrastructure as code (e.g., Terraform, ARM Templates, Databricks Asset Bundles). Understanding of advanced … machine learning techniques, including graph-based processing, computer vision, natural language processing, and simulation modeling. Experience with generative AI and LLMs, such as LlamaIndex and LangChain. Understanding of MLOps or LLMOps. Familiarity with Agile methodologies, preferably Scrum. We are actively seeking candidates for full-time, remote work within the UK.
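The listing above mentions FastAPI as one option for real-time model deployment. A minimal, illustrative scoring service is sketched below; the model file, feature shape, and endpoint name are hypothetical placeholders rather than anything specified in the advert.

```python
# Minimal FastAPI scoring service; "model.pkl" and the feature layout are
# illustrative placeholders, not from the listing.
from typing import List

import joblib
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()
model = joblib.load("model.pkl")  # hypothetical pre-trained scikit-learn model


class Features(BaseModel):
    values: List[float]


@app.post("/predict")
def predict(features: Features) -> dict:
    # Score a single observation in real time.
    prediction = model.predict([features.values])[0]
    return {"prediction": float(prediction)}
```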
Easter Howgate, Midlothian, United Kingdom Hybrid / WFH Options
Leonardo UK Ltd
raw data into processed data. You will own the data operations infrastructure, manage and optimise performance, reliability, and scalability of the system to meet growing demands on ingestion and processing pipelines. To succeed in this data engineering position, you should have strong problem-solving skills and the ability to combine data from different sources. Data engineer skills also include … structured or unstructured solutions. Design, develop, deploy and support data infrastructure, pipelines and architecture. Implement reliable, scalable, and tested solutions to automate data ingestion. Development of systems to manage batch processing and real-time streaming of data. Evaluate business needs and objectives. Facilitate pipelines which prepare data for prescriptive and predictive modelling. Working with domain teams to scale … the processing of data. Combine raw information from different sources. Manage and maintain automated tools for data quality and reliability. Explore ways to enhance data quality and reliability. Collaborate with data scientists, IT and architects on several projects. What you'll bring: Successful candidates will have previous experience as a data or software engineer in a similar role. Attributes
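As a purely illustrative example of the "combine raw information from different sources" and batch-ingestion duties above, the sketch below merges two hypothetical raw feeds, applies simple quality rules, and writes a processed output. The file paths and the record_id column are assumptions, not taken from the advert.

```python
# Illustrative batch-ingestion step: combine raw records from two hypothetical
# sources, apply basic quality checks, and write a processed output.
import pandas as pd


def ingest_batch(csv_path: str, json_path: str, out_path: str) -> pd.DataFrame:
    # Read raw data from two different source formats.
    csv_df = pd.read_csv(csv_path)
    json_df = pd.read_json(json_path, lines=True)

    combined = pd.concat([csv_df, json_df], ignore_index=True)

    # Simple data-quality rules: drop exact duplicates and rows missing a key field.
    combined = combined.drop_duplicates().dropna(subset=["record_id"])

    combined.to_parquet(out_path, index=False)
    return combined


if __name__ == "__main__":
    ingest_batch("raw/source_a.csv", "raw/source_b.json", "processed/combined.parquet")
```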
Belfast, Northern Ireland, United Kingdom Hybrid / WFH Options
JR United Kingdom
experience with AWS data and analytics services (or equivalent open-source technologies). Expertise in designing and building data lakes, data warehouses, and ETL pipelines. Strong understanding of data processing techniques, including stream and batch processing. Familiarity with data mining, machine learning, and natural language processing is a plus. Ability to travel to client sites when required.
London, England, United Kingdom Hybrid / WFH Options
Jobs via eFinancialCareers
the AcadiaSoft MarginSphere adapter. Work closely with the vendor to identify and implement changes to improve the performance of the system. Create and update in-house SQL extracts. Optimize batch processing to decouple margin types and reduce business/SLA impact when failures occur. Key Requirements: Previous experience upgrading and supporting the TLM Collateral Management application for OTC
Belfast, Northern Ireland, United Kingdom Hybrid / WFH Options
Version 1
S3 to build and deploy cloud-based solutions, ensuring high availability and scalability. Database Management: Design, implement, and maintain database schemas, write complex SQL queries, and optimize database performance. Batch Processing: Develop and manage batch processing systems to handle large volumes of data efficiently. Code Quality: Ensure code quality through code reviews, unit testing, and adherence … APIs. Problem-Solving: Excellent analytical and problem-solving skills. Communication: Strong verbal and written communication skills. Team Player: Ability to work effectively in a collaborative team environment. Preferred Qualifications: Batch Processing experience: Hands-on experience with batch processing frameworks and tools. Python experience: Minimum of 2 years is nice to have. Additional Information: Why Version
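To make the batch-processing requirement above more concrete, here is a minimal, hypothetical sketch of a chunked batch load: a large CSV is streamed through pandas in fixed-size chunks and appended to a SQLite table, so the full dataset never needs to sit in memory. File, table, and column names are invented for illustration.

```python
# Minimal sketch of a chunked batch job: stream a large (hypothetical) CSV file
# through pandas in fixed-size chunks and load it into a SQLite table.
import sqlite3

import pandas as pd

CHUNK_ROWS = 50_000  # tune to available memory


def load_in_batches(csv_path: str, db_path: str, table: str) -> int:
    total = 0
    with sqlite3.connect(db_path) as conn:
        for chunk in pd.read_csv(csv_path, chunksize=CHUNK_ROWS):
            chunk.to_sql(table, conn, if_exists="append", index=False)
            total += len(chunk)
    return total


if __name__ == "__main__":
    rows = load_in_batches("transactions.csv", "warehouse.db", "transactions")
    print(f"Loaded {rows} rows")
```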
London, South East, England, United Kingdom Hybrid / WFH Options
INTEC SELECT LIMITED
Responsibilities: Design, implement, and maintain robust ETL pipelines to integrate data from diverse sources, including APIs like Facebook, Google Analytics, and payment providers. Develop and optimize data models for batch processing and real-time streaming using tools like AWS Redshift, S3, and Kafka. Lead efforts in acquiring, storing, processing, and provisioning data to meet evolving business requirements.
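A small sketch of the kind of Kafka-to-S3 micro-batch step this role describes is shown below. It assumes the kafka-python and boto3 libraries; the topic, broker address, and bucket are hypothetical.

```python
# Illustrative micro-batch ETL step: drain events from a Kafka topic and land
# them in S3 as JSON batches.
import json
import time

import boto3
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "payments-events",                      # hypothetical topic
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    auto_offset_reset="earliest",
)
s3 = boto3.client("s3")


def drain_batch(max_records: int = 1000) -> None:
    # poll() returns whatever is currently available, grouped by partition.
    polled = consumer.poll(timeout_ms=5000, max_records=max_records)
    records = [msg.value for batch in polled.values() for msg in batch]
    if not records:
        return
    key = f"raw/payments/{int(time.time())}.json"
    s3.put_object(Bucket="example-data-lake", Key=key, Body=json.dumps(records))


if __name__ == "__main__":
    drain_batch()
```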
London, England, United Kingdom Hybrid / WFH Options
Citi
they operate. Hands-on experience in Java, Spark, Scala (or Java). Production-scale, hands-on experience writing data pipelines using Spark or any other distributed real-time/batch processing framework. Strong skill set in SQL/databases. Strong understanding of messaging tech like Kafka, Solace, MQ, etc. Writing production-scale applications that use caching technologies. Understanding of
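The advert emphasises Spark pipelines in Java or Scala; for consistency with the other examples on this page, the equivalent batch pattern is sketched in PySpark. Input paths, column names, and the aggregation are hypothetical.

```python
# Minimal PySpark batch pipeline: read raw trade records, aggregate per
# instrument and date, and write the result out for downstream consumers.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily-trade-aggregation").getOrCreate()

trades = spark.read.option("header", True).csv("data/raw/trades/")

daily_volume = (
    trades
    .withColumn("notional", F.col("price").cast("double") * F.col("quantity").cast("double"))
    .groupBy("trade_date", "instrument")
    .agg(
        F.count("*").alias("trade_count"),
        F.sum("notional").alias("total_notional"),
    )
)

# Partition the batch output by date.
daily_volume.write.mode("overwrite").partitionBy("trade_date").parquet("data/curated/daily_volume/")

spark.stop()
```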
City of London, London, United Kingdom Hybrid / WFH Options
Intec Select
Responsibilities: Design, implement, and maintain robust ETL pipelines to integrate data from diverse sources, including APIs like Facebook, Google Analytics, and payment providers. Develop and optimize data models for batch processing and real-time streaming using tools like AWS Redshift, S3, and Kafka. Lead efforts in acquiring, storing, processing, and provisioning data to meet evolving business requirements.
London, England, United Kingdom Hybrid / WFH Options
Neurensic
a high-impact engineering team, you’ll help design and implement fault-tolerant data pipelines and services that operate on massive, time-series datasets and support real-time and batch analytics. This is an opportunity to solve challenging problems at scale in a domain where precision, performance, and reliability are critical. What Will You be Involved With? Design and … build scalable, distributed systems using Java, Python, and Apache Spark. Develop and optimize Spark jobs on AWS Serverless EMR for processing large-scale time-series datasets. Build event-driven and batch processing workflows using Lambda, SNS/SQS, and DynamoDB. Read, process, and transform data from a variety of sources: flat files, APIs, and streaming systems. Participate … ideas clearly to both engineers and non-technical stakeholders. What Will You Bring to the Table? Strong backend software engineering experience, ideally with distributed systems and large-scale data processing. Strong programming skills in Java (multithreading, concurrency, performance tuning). Deep experience with Apache Spark and Spark Streaming. Proficiency with AWS services, including Lambda, DynamoDB, S3, SNS, SQS, and Serverless
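As an illustration of the event-driven Lambda/SQS/DynamoDB workflow mentioned above, here is a minimal handler sketch; the table name and message shape are hypothetical, and boto3 is assumed to be available in the Lambda runtime.

```python
# Illustrative event-driven worker: an AWS Lambda handler that consumes SQS
# messages and writes time-series points to DynamoDB.
import json

import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("timeseries-points")  # hypothetical table


def handler(event, context):
    # Each SQS record carries one JSON-encoded data point.
    records = event.get("Records", [])
    for record in records:
        point = json.loads(record["body"])
        table.put_item(
            Item={
                "series_id": point["series_id"],
                "timestamp": point["timestamp"],
                "value": str(point["value"]),  # DynamoDB rejects float, so store as string
            }
        )
    return {"processed": len(records)}
```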
London, England, United Kingdom Hybrid / WFH Options
DEPOP
company. Responsibilities: Partner with stakeholders across Product, Data, and Engineering to translate strategic goals into platform capabilities. Lead and mentor 1-2 squads responsible for experimentation, analytics event logging, batch data platform, real-time infrastructure, data observability and governance. Collaborate closely with stakeholders to define and drive the technical roadmap for Depop's modern data platform, enabling reliable and … Deep understanding of distributed systems and modern data ecosystems - including experience with Databricks, Apache Spark, Apache Kafka and dbt. Demonstrated success in managing data platforms at scale, including both batch processing and real-time streaming architectures. Deep understanding of data warehousing concepts, ETL/ELT processes, and analytics engineering. Strong programming skills, particularly in Python, Scala or Java
Reigate, Surrey, England, United Kingdom Hybrid / WFH Options
esure Group
proactive mindset to help ensure the stability, efficiency, and resilience of applications across esure. Good working knowledge of Java is needed for this position. What you’ll do: Support batch processing and carry out proactive analysis across squads and services. Review Java code and suggest improvements to enhance quality and performance. Investigate and resolve production incidents, problems, and … change requests. Offer expertise on business processes for projects and production support issues. Maintain core Insurance systems and related infrastructure like Kubernetes and SOLR. Improve batch efficiency and reduce costs, including audit-ready data archiving strategies. Provide data analysis to support business decisions across Operations and Claims. Deliver patching, database fixes, and business-critical data correction services. Qualifications What
London, England, United Kingdom Hybrid / WFH Options
Rollbar, Inc
who is confident, thorough and tenacious. Bonus Points: Hands-on, production experience with K8S. Hands-on, production experience with cloud (AWS, Azure, GCP). Experience with data engineering - streaming and batch processing, Spark, Trino, Iceberg, ClickHouse, Parquet. Are you a Do'er? Be your truest self. Work on your terms. Make a difference. We are home to a global
London, England, United Kingdom Hybrid / WFH Options
ZipRecruiter
scalability, and maintainability. Data Integration & Pipeline Engineering: Design and implement resilient ETL/ELT workflows using technologies such as BigQuery, Dataflow, Informatica, or IBM DataStage, supporting real-time and batch processing. Governance & Data Quality Management: Establish data governance frameworks, including metadata management and quality assurance, using platforms like Unity Catalog, Alation, Profisee, or DQ Pro. Strategic Advisory & Stakeholder Engagement
London, England, United Kingdom Hybrid / WFH Options
Zego
field Experience: Proven experience in ML model lifecycle management. Core Competencies: Model lifecycle: You’ve got hands-on experience with managing the ML model lifecycle, including both online and batch processes. Statistical Methodology: You have worked with GLMs and other machine learning algorithms and have in-depth knowledge of how they work. Python: You have built and deployed production
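Since the role above covers both GLMs and the online/batch model lifecycle, the sketch below shows one hypothetical version of that lifecycle in scikit-learn: fit a Poisson GLM (a common claim-frequency model), persist it, and batch-score a file of new records. Feature names, the target, and paths are invented for illustration.

```python
# Minimal model-lifecycle sketch: train a Poisson GLM, persist it, then
# batch-score new records. All column names and paths are hypothetical.
import joblib
import pandas as pd
from sklearn.linear_model import PoissonRegressor

FEATURES = ["driver_age", "vehicle_age", "annual_mileage"]  # hypothetical features


def train(train_path: str, model_path: str) -> None:
    df = pd.read_csv(train_path)
    model = PoissonRegressor(alpha=1.0)
    model.fit(df[FEATURES], df["claim_count"])
    joblib.dump(model, model_path)


def batch_score(model_path: str, scoring_path: str, out_path: str) -> None:
    model = joblib.load(model_path)
    df = pd.read_csv(scoring_path)
    df["expected_claims"] = model.predict(df[FEATURES])
    df.to_csv(out_path, index=False)


if __name__ == "__main__":
    train("policies_train.csv", "glm_frequency.joblib")
    batch_score("glm_frequency.joblib", "policies_new.csv", "scored.csv")
```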
City of London, London, United Kingdom Hybrid / WFH Options
Hunter Bond
Bank client are looking for a Power BI Developer to design and maintain dashboards for trading, risk, and regulatory reporting. You'll build data pipelines for real-time and batch processing of financial data. This is a long-term contract opportunity. The following skills/experience are essential: Strong Power BI (DAX, Power Query, data modelling) Databricks, Python
and review code, paired programming, and debugging code-related performance issues, SQL tuning, etc. Experience with AWS services such as S3, RDS, Aurora, NoSQL, MSK (Kafka). Experience with batch processing/ETL using Glue Jobs, AWS Lambda, and Step Functions. Experience with designing bespoke & tailored front-end solutions (GUI based) using open-source technology such as React
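For the Glue-based batch/ETL experience mentioned above, a skeleton Glue job is sketched below: it reads raw CSVs from S3, drops records without an order id, and writes Parquet back out. The bucket names and the order_id field are hypothetical, and the awsglue runtime library is only available inside the Glue environment.

```python
# Skeleton of a batch ETL job for the AWS Glue runtime: raw CSV in S3 ->
# simple filter -> Parquet in S3. Paths and field names are illustrative only.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read the raw batch from S3 as a DynamicFrame.
raw = glue_context.create_dynamic_frame.from_options(
    connection_type="s3",
    connection_options={"paths": ["s3://example-raw-bucket/orders/"]},
    format="csv",
    format_options={"withHeader": True},
)

# Minimal transform: drop records with a missing order id.
cleaned = raw.filter(lambda rec: rec["order_id"] not in (None, ""))

glue_context.write_dynamic_frame.from_options(
    frame=cleaned,
    connection_type="s3",
    connection_options={"path": "s3://example-curated-bucket/orders/"},
    format="parquet",
)
job.commit()
```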
London, England, United Kingdom Hybrid / WFH Options
Acord (association For Cooperative Operations Research And Development)
for clients utilizing messaging standards (incl. FIX, SWIFT and custom XML formats) over MQ, APIs, SFTP, TCP/IP and other protocols. Typically, solutions include a combination of messaging, batch processing and data transformation, and are commonly written in Python. This role can be performed in a hybrid model, where you can balance work from home and office
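Since the advert notes these messaging and transformation solutions are commonly written in Python, here is a small, illustrative transformation step: a hypothetical custom XML trade confirmation is parsed and normalised to JSON before onward delivery over MQ/SFTP/API. The tag names and message shape are invented for the sketch.

```python
# Illustrative data-transformation step: parse a custom XML trade confirmation
# and normalise it to JSON. The schema here is hypothetical.
import json
import xml.etree.ElementTree as ET

SAMPLE_MESSAGE = """
<TradeConfirmation>
  <TradeId>T-12345</TradeId>
  <Instrument>GBPUSD</Instrument>
  <Quantity>1000000</Quantity>
  <Price>1.2712</Price>
</TradeConfirmation>
"""


def transform(xml_text: str) -> str:
    root = ET.fromstring(xml_text)
    normalised = {
        "trade_id": root.findtext("TradeId"),
        "instrument": root.findtext("Instrument"),
        "quantity": int(root.findtext("Quantity")),
        "price": float(root.findtext("Price")),
    }
    return json.dumps(normalised)


if __name__ == "__main__":
    print(transform(SAMPLE_MESSAGE))
```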
London, England, United Kingdom Hybrid / WFH Options
TESTQ Technologies Limited
our goals. Key Responsibilities: Architect and Design: Lead the design of reference architectures and applications for multiple patterns in each public cloud provider, e.g., message-driven, simple web applications, batch processing. Mentorship: Provide guidance and mentorship to other engineers, fostering a culture of continuous learning and improvement. Customization and Flexibility: Develop solutions that support flexibility and customization for advanced
City of London, London, United Kingdom Hybrid / WFH Options
Burns Sheehan
field Experience: Proven experience in ML model lifecycle management. Core Competencies: Model lifecycle: You’ve got hands-on experience with managing the ML model lifecycle, including both online and batch processes. Statistical Methodology: You have worked with GLMs and other machine learning algorithms and have in-depth knowledge of how they work. Python: You have built and deployed production