Easter Howgate, Midlothian, United Kingdom Hybrid / WFH Options
Leonardo UK Ltd
raw data into processed data. You will own the data operations infrastructure, managing and optimising the performance, reliability, and scalability of the system to meet growing demands on ingestion and processing pipelines. To succeed in this data engineering position, you should have strong problem-solving skills and the ability to combine data from different sources. Data engineer skills also include … structured or unstructured solutions. Design, develop, deploy and support data infrastructure, pipelines and architecture. Implement reliable, scalable, and tested solutions to automate data ingestion. Develop systems to manage batch processing and real-time streaming of data. Evaluate business needs and objectives. Facilitate pipelines which prepare data for prescriptive and predictive modelling. Work with domain teams to scale … the processing of data. Combine raw information from different sources. Manage and maintain automated tools for data quality and reliability. Explore ways to enhance data quality and reliability. Collaborate with data scientists, IT and architects on several projects. What you'll bring: Successful candidates will have previous experience as a data or software engineer in a similar role. Attributes …
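As a hedged illustration of the "combine raw information from different sources" responsibility, here is a minimal sketch joining a batch file drop with an API extract. The file names, the `customer_id` key, and the parquet target are invented for the example, not taken from the listing:

```python
# Minimal sketch: combine raw data from two different sources into one
# processed dataset. All paths and column names are hypothetical.
import pandas as pd

orders = pd.read_csv("raw/orders.csv")            # batch file drop
customers = pd.read_json("raw/customers.json")    # API extract

# Combine the sources on a shared key, keeping a merge indicator
# so unmatched rows can be detected as a basic quality check.
processed = orders.merge(customers, on="customer_id", how="left", indicator=True)

unmatched = int((processed["_merge"] == "left_only").sum())
assert unmatched == 0, f"{unmatched} orders had no matching customer"

processed.drop(columns="_merge").to_parquet(
    "processed/orders_enriched.parquet", index=False
)
```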
Belfast, Northern Ireland, United Kingdom Hybrid / WFH Options
JR United Kingdom
experience with AWS data and analytics services (or equivalent open-source technologies). Expertise in designing and building data lakes, data warehouses, and ETL pipelines. Strong understanding of data processing techniques, including stream and batch processing. Familiarity with data mining, machine learning, and natural language processing is a plus. Ability to travel to client sites when required.
our capabilities, solving new data problems and challenges every day. Key Responsibilities: Design, Build, and Optimise Real-Time Data Pipelines: Develop and maintain robust and scalable stream and micro-batch data pipelines using Databricks, Spark (PySpark/SQL), and Delta Live Tables. Implement Change Data Capture (CDC): Build efficient CDC mechanisms to capture and process data changes from various … Databricks Lakehouse Platform is essential. Strong Spark and Python/SQL Skills: Proficiency in Spark programming (PySpark and/or Scala) and expert-level SQL skills. Real-Time Data Processing: Demonstrable experience in building and managing stream and micro-batch processing pipelines using technologies like Spark Structured Streaming or Delta Live Tables. Deep understanding of Delta Lake …
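As a hedged illustration of the CDC pattern this role describes, here is a minimal PySpark sketch using Delta Lake's Change Data Feed with Spark Structured Streaming's `foreachBatch`. The table names, `order_id` join key, and checkpoint path are invented for the example:

```python
# Minimal CDC-style micro-batch sketch with Spark Structured Streaming and
# Delta Lake. Table names, key, and paths are hypothetical placeholders.
from pyspark.sql import SparkSession
from delta.tables import DeltaTable

spark = SparkSession.builder.appName("cdc-sketch").getOrCreate()

# Stream row-level changes from a Delta source with Change Data Feed enabled.
changes = (spark.readStream
           .format("delta")
           .option("readChangeFeed", "true")
           .table("source_db.orders"))

def upsert_batch(batch_df, batch_id):
    """Apply one micro-batch of changes to the target table via MERGE."""
    target = DeltaTable.forName(spark, "target_db.orders")
    latest = batch_df.filter("_change_type IN ('insert', 'update_postimage')")
    (target.alias("t")
     .merge(latest.alias("s"), "t.order_id = s.order_id")
     .whenMatchedUpdateAll()
     .whenNotMatchedInsertAll()
     .execute())

(changes.writeStream
 .foreachBatch(upsert_batch)
 .option("checkpointLocation", "/tmp/checkpoints/orders_cdc")
 .start())
```

Delete handling and schema evolution are omitted for brevity; a production pipeline would also process `delete` change types.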
services, code development (C#/Java), containerization (Docker, OpenShift, Kubernetes), web UI (React), and Infrastructure as Code (Terraform). Experience with integration techniques, real-time messaging, API design, and batch processing. Familiarity with DevOps practices, CI/CD, automated deployments, cloud infrastructure, Grid Computing, and cost optimization. Understanding of Agile methodology. Ability to work independently or in small teams …
design patterns, AWS services, programming (C#/Java), containerization (Docker, OpenShift, Kubernetes), web UI (React), Infrastructure as Code (Terraform). Integration techniques including real-time messaging (AMQ), API design, batch processing. DevOps practices, CI/CD, automated deployments, cloud infrastructure, Grid Computing, and cost optimization. Understanding of Agile methodologies. Ability to work independently or in small teams. Strong analytical …
Belfast, Northern Ireland, United Kingdom Hybrid / WFH Options
Version 1
S3 to build and deploy cloud-based solutions, ensuring high availability and scalability. Database Management: Design, implement, and maintain database schemas, write complex SQL queries, and optimize database performance. Batch Processing: Develop and manage batch processing systems to handle large volumes of data efficiently. Code Quality: Ensure code quality through code reviews, unit testing, and adherence … APIs. Problem-Solving: Excellent analytical and problem-solving skills. Communication: Strong verbal and written communication skills. Team Player: Ability to work effectively in a collaborative team environment. Preferred Qualifications: Batch Processing experience: Hands-on experience with batch processing frameworks and tools. Python experience: A minimum of 2 years is nice to have. Additional Information: Why Version …
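For flavour, a minimal sketch of what "batch processing systems to handle large volumes of data efficiently" can look like in miniature: processing a file in fixed-size chunks so memory stays bounded. The file, table, and column names are placeholders rather than anything from the role:

```python
# Hedged sketch of a simple batch job that processes a large extract in
# chunks, keeping memory bounded. Names are illustrative only.
import sqlite3
import pandas as pd

conn = sqlite3.connect("warehouse.db")

# Iterate over the extract in fixed-size chunks rather than loading it whole.
for chunk in pd.read_csv("daily_extract.csv", chunksize=50_000):
    chunk["amount"] = chunk["amount"].fillna(0.0)  # basic cleansing step
    chunk.to_sql("transactions", conn, if_exists="append", index=False)

conn.close()
```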
London, South East, England, United Kingdom Hybrid / WFH Options
INTEC SELECT LIMITED
Responsibilities: Design, implement, and maintain robust ETL pipelines to integrate data from diverse sources, including APIs like Facebook, Google Analytics, and payment providers. Develop and optimize data models for batch processing and real-time streaming using tools like AWS Redshift, S3, and Kafka. Lead efforts in acquiring, storing, processing, and provisioning data to meet evolving business requirements.
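For illustration only, a minimal sketch of the streaming side: consuming events from a Kafka topic and staging them to S3 in micro-batches for a later Redshift load. The topic, broker, and bucket names are hypothetical:

```python
# Hedged sketch: consume Kafka events and stage them to S3 in micro-batches.
# Topic, broker, and bucket are invented for the example.
import json
import boto3
from kafka import KafkaConsumer

s3 = boto3.client("s3")
consumer = KafkaConsumer(
    "payments",                                   # hypothetical topic
    bootstrap_servers="broker:9092",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)

batch = []
for message in consumer:
    batch.append(message.value)
    if len(batch) >= 1000:                        # micro-batch the stream
        s3.put_object(
            Bucket="raw-events",                  # hypothetical bucket
            Key=f"payments/offset={message.offset}.json",
            Body=json.dumps(batch),
        )
        batch.clear()
```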
both internal and external) and join wider cross-functional teams. You'll have some experience building distributed systems both on-prem and in public cloud, CI/CD pipelines, batch compute tooling and developer productivity tooling. You will be excited and committed to providing an excellent developer experience, with a constant eye on continuous improvement. Key responsibilities: Understanding the …/components that benefit multiple Software Engineering teams, as well as providing the mechanism for discovering and consuming shared components, including but not limited to: K8s configuration and usability, batch processing and HPC interactions, and hybrid cloud deployments. Stay informed on open source and 3rd party tooling that we should consider adopting rather than building in-house. Assist … approach to technical problem-solving. Exposure to the following would be beneficial: Developer portals and software catalogues. Infrastructure-as-code tools such as Ansible or Terraform. Experience in using batch compute frameworks and HPC tooling. Company Description: For almost 50 years, Williams Racing has been at the forefront of one of the fastest sports on the planet, being one of …
AWS offerings, coding (C#/Java), microservice tools (Docker, Kubernetes), web UI development (React), and Infrastructure as Code (Terraform). Expertise in integration techniques, including messaging, API design, and batch processing. Familiarity with DevOps practices, CI/CD, automated deployments, cloud infrastructure, and cost optimization. Understanding of Agile methodologies. Ability to work independently or in small teams, with strong …
services, C#/Java, Docker, OpenShift, Kubernetes, React, and Infrastructure as Code (Terraform). Experience with integration techniques such as real-time messaging (AMQ), API design (JSON, Swagger), and batch processing. Familiarity with DevOps practices, CI/CD, automated deployments, cloud infrastructure, grid computing, and cost optimization. Understanding of Agile methodologies. Ability to work independently or in small teams …
Reigate, Surrey, England, United Kingdom Hybrid / WFH Options
esure Group
proactive mindset to help ensure the stability, efficiency, and resilience of applications across esure. Good working knowledge of Java is needed for this position. What you'll do: Support batch processing and carry out proactive analysis across squads and services. Review Java code and suggest improvements to enhance quality and performance. Investigate and resolve production incidents, problems, and … change requests. Offer expertise on business processes for projects and production support issues. Maintain core insurance systems and related infrastructure like Kubernetes and SOLR. Improve batch efficiency and reduce costs, including audit-ready data archiving strategies. Provide data analysis to support business decisions across Operations and Claims. Deliver patching, database fixes, and business-critical data correction services. Qualifications: What …
EC2/Node (ReactJS) using AWS API Gateway and Lambda Functions (Python) for the internal web application. The individual will be able to demonstrate a strong background in backend batch processing (ETL) using Python and SQL, as well as good knowledge of developing web applications using ReactJS/Python. Our technology team uses the following tools for development: GitLab …
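As a hedged sketch of the stack this listing describes (API Gateway in front of a Python Lambda performing a small backend ETL step), the following is illustrative only; the environment variable, SQL, and table names are invented:

```python
# Hedged sketch of an AWS Lambda handler behind API Gateway that runs a
# small ETL step in SQL. Connection details and tables are placeholders.
import json
import os
import psycopg2

def handler(event, context):
    """API Gateway proxy integration entry point."""
    conn = psycopg2.connect(os.environ["DATABASE_URL"])
    try:
        with conn, conn.cursor() as cur:
            # Transform: collapse raw events into a daily aggregate.
            # Assumes a unique constraint on daily_totals(day).
            cur.execute("""
                INSERT INTO daily_totals (day, total)
                SELECT created_at::date, SUM(amount)
                FROM raw_events
                GROUP BY created_at::date
                ON CONFLICT (day) DO UPDATE SET total = EXCLUDED.total
            """)
    finally:
        conn.close()
    return {"statusCode": 200, "body": json.dumps({"status": "ok"})}
```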
background - a proven track record in enterprise data solutions. Experience with ETL processes and data transformation, preferably using Databricks. Strong foundation in data warehousing architectures and dimensional modeling. Familiarity with batch processing from relational database sources. Communication & Collaboration Skills of the Data Engineer: Outstanding stakeholder engagement abilities across technical and business audiences. Strong relationship-building skills with experience managing …
architecture and design patterns, including AWS, C#/Java, Docker, OpenShift, Kubernetes, React, and Infrastructure as Code (Terraform). Experience with integration techniques, real-time messaging, API design, and batch processing. Familiarity with DevOps practices, CI/CD, automated deployments, cloud infrastructure, and cost optimization. Understanding of Agile methodologies and ability to work independently or in small teams. Excellent …
engineering, and stakeholder management in an investment banking environment. Key Responsibilities: Design and maintain Power BI dashboards for trading, risk, and regulatory reporting. Build data pipelines for real-time and batch processing of financial data. Partner with traders, portfolio managers, and risk teams to deliver analytics solutions. Ensure compliance with regulatory reporting requirements. Optimize data models for front office …
physical server provisioning, especially in strategic data centres. Platform Resilience & Observability: Designing for uptime, performance, and root cause analysis. Web Services & APIs: Used for integration with 24+ LBGI systems. Batch Processing: Understanding of batch suite performance and scheduling constraints. RPA & Automation (Batching): Familiarity with robotic process automation. Log Aggregation & Analysis: Tooling for log interrogation and root cause …
environment with multiple competing requirements. A commitment to delivering high-quality, well-tested software. The ability and desire to work across the entire software stack, from server-side and batch processing components to front-end web development. Qualifications & Experience - Essential Skills: Basic understanding of software development practices. Familiarity with Agile methodologies (Scrum or Kanban). Basic understanding of Test …
risk management actions • Develop software for calculations using databases like Snowflake, Sybase IQ and distributed HDFS systems. • Interact with business users for resolving issues with applications. • Design and support batch processes using scheduling infrastructure for calculation and distributing data to other systems. • Oversee junior technical team members in all aspects of Software Development Life Cycle (SDLC) including design, code … and the ability to interpret performance metrics (e.g., CPU, memory, threads, file handles). • Knowledge and experience in distributed computing - parallel computation on a single machine like DASK, distributed processing on public cloud. • Knowledge of SDLC and experience in working through the entire life cycle of the project from start to end. ABOUT GOLDMAN SACHS At Goldman Sachs, we commit …
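The distributed-computing requirement above invites a concrete example. Here is a hedged, minimal Dask sketch of parallel computation on a single machine; the CSV layout and column names are hypothetical:

```python
# Hedged sketch of single-machine parallel computation with Dask.
# The file glob and columns are invented for illustration.
import dask.dataframe as dd

# Dask partitions the input and computes per-partition aggregates in
# parallel across local cores, combining results lazily.
trades = dd.read_csv("trades-*.csv")
exposure = trades.groupby("desk")["notional"].sum()

# Nothing runs until .compute() triggers the task graph.
print(exposure.compute())
```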
Conduct descriptive and exploratory analysis to uncover patterns, trends, and potential applications across a wide range of datasets. Maintain robust and scalable data infrastructure to support real-time and batch processing workflows. Provide domain expertise to help shape financial knowledge representation in LLM outputs. Develop and apply custom quality metrics to assess and guide model performance. Actively participate … Computer Science, Finance, Economics, Statistics, Engineering). 3+ years of professional experience in Data Science or Quantitative roles within finance. Proficiency in Python and SQL with experience in data processing libraries such as pandas/numpy. Experience with machine learning frameworks and statistical analysis tools. Familiarity with financial data sources, APIs, and data modelling techniques. Ability to design, implement …
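As a hedged sketch of the "custom quality metrics" idea, the following scores a dataset on completeness and freshness with pandas/numpy; the weights, 30-day horizon, and `as_of` column are invented for illustration:

```python
# Illustrative custom data-quality metric: blend completeness (non-null
# share) with freshness (recency of timestamps). Thresholds are made up.
import numpy as np
import pandas as pd

def quality_score(df: pd.DataFrame, ts_col: str = "as_of") -> float:
    """Return a 0-1 score combining completeness and freshness."""
    completeness = 1.0 - df.isna().to_numpy().mean()
    age_days = (pd.Timestamp.now(tz="UTC")
                - pd.to_datetime(df[ts_col], utc=True)).dt.days
    freshness = float(np.clip(1.0 - age_days.mean() / 30.0, 0.0, 1.0))
    return round(0.7 * completeness + 0.3 * freshness, 3)
```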
strong plus. Your responsibilities: Architect and implement scalable, secure, and high-performance data platforms using Azure, Databricks, and Microsoft Fabric for real-time and batch processing. Lead integration across structured and unstructured data sources such as SCADA, SAP, APIs, telemetry, and time-series data using modern ETL/ELT patterns. Establish robust data governance …
to develop their skills in a secure, enterprise-grade setting. Key Responsibilities: Deliver 2nd line support across a broad range of IT infrastructure technologies. Monitor alerts and events, manage batch processing, and respond proactively to incidents. Work to meet SLAs through timely resolution of technical issues. Support change requests, patching, and technical escalations. Maintain detailed records using call …
Employment Type: Permanent
Salary: £28,000 - £31,000/annum, Pension, Healthcare + more
Willenhall, West Midlands, United Kingdom Hybrid / WFH Options
Parser Limited
Sr QA Manual Engineer We are looking for a detail-oriented Functional QA Engineer to join a data-centric project focused on real-time processing and system reliability. You will be responsible for validating business-critical workflows, ensuring data accuracy in CSV imports/exports, and maintaining traceability for audit and compliance purposes. The ideal candidate is analytical, structured … testing with sprint goals. Perform regression, integration, and exploratory testing. Manage defect reporting and tracking, while ensuring traceability and timely resolution. Contribute to the testing of real-time and batch processes using CSV and other input formats. Ensure proper documentation of test cases, results, and traceability matrices to support audit and compliance needs. What you'll bring to us …
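To make the CSV-validation responsibility concrete, here is a small pytest-style sketch; the expected schema (column names and types) is hypothetical:

```python
# Hedged sketch of CSV import validation for a QA workflow.
# The expected columns and the amount check are invented for illustration.
import csv
import io

EXPECTED_COLUMNS = ["trade_id", "amount", "currency"]

def validate_csv(stream) -> list[str]:
    """Return a list of human-readable errors found in a CSV file."""
    errors = []
    reader = csv.DictReader(stream)
    if reader.fieldnames != EXPECTED_COLUMNS:
        errors.append(f"unexpected header: {reader.fieldnames}")
    for line_no, row in enumerate(reader, start=2):
        try:
            float(row["amount"])               # numeric sanity check
        except (ValueError, KeyError):
            errors.append(f"line {line_no}: bad amount {row.get('amount')!r}")
    return errors

def test_valid_file_passes():
    sample = io.StringIO("trade_id,amount,currency\n1,10.5,GBP\n")
    assert validate_csv(sample) == []
```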