Belfast, County Antrim, Northern Ireland, United Kingdom Hybrid / WFH Options
VANLOQ LIMITED
excellent opportunity to work on high-performance, data-driven applications within a fast-paced enterprise environment. The successful candidate will play a key role in developing and optimising complex batch processing and integration systems using Java 17, Spring Batch, Spring Integration, and GridGain Database. Key Responsibilities Design, develop, and maintain scalable Java 17 applications for enterprise data … processing. Build and optimise batch processing solutions using Spring Batch. Design and implement integration flows with Spring Integration. Develop and tune SQL queries and stored procedures for relational databases. Work with GridGain Database to deliver distributed, high-performance data processing. Support data migration and workflow optimisation from MongoDB to GridGain. Automate deployment and operational tasks in Unix/… technical and business teams to deliver high-quality, reliable solutions. Skills & Experience Required Strong proficiency in Java 17, including modern Java features and best practices. Proven experience with Spring Batch and Spring Integration. Solid understanding of Unix/Linux scripting and SQL. Hands-on experience with GridGain Distributed Cache/Database. Experience working with MongoDB and migrating data between More ❯
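The chunk-oriented read-process-write pattern that frameworks such as Spring Batch formalise can be sketched language-neutrally; this is a minimal illustration, not the role's actual codebase, and the function names are hypothetical:

```python
from typing import Callable, Iterable, Iterator, List, TypeVar

T = TypeVar("T")
U = TypeVar("U")

def chunks(items: Iterable[T], size: int) -> Iterator[List[T]]:
    """Group an input stream into fixed-size chunks; the final chunk may be short."""
    buf: List[T] = []
    for item in items:
        buf.append(item)
        if len(buf) == size:
            yield buf
            buf = []
    if buf:
        yield buf

def run_batch(
    reader: Iterable[T],
    processor: Callable[[T], U],
    writer: Callable[[List[U]], None],
    chunk_size: int = 100,
) -> int:
    """Read-process-write in chunks, committing one chunk at a time.

    This mirrors the ItemReader/ItemProcessor/ItemWriter split that
    chunk-oriented batch frameworks use, so a failure only loses the
    in-flight chunk rather than the whole run.
    """
    written = 0
    for chunk in chunks(reader, chunk_size):
        writer([processor(item) for item in chunk])
        written += len(chunk)
    return written
```

For example, `run_batch(range(5), lambda x: x * 2, out.extend, chunk_size=2)` writes the doubled values in chunks of two and returns the record count.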
Telford, Shropshire, England, United Kingdom Hybrid / WFH Options
TXP Technology x People
support legacy systems within a Central Government environment. Work with COBOL, ICL VME, and PL/SQL to deliver reliable, high-quality solutions. Apply structured programming principles and manage batch processing tasks. Collaborate with the team to ensure systems meet operational and compliance requirements. Essential Skills and Experience: Strong experience with COBOL development. Hands-on experience with ICL … VME and PL/SQL. Knowledge of structured programming and batch processing in mainframe environments. Ability to work effectively in a remote UK-based team. BPSS minimum security clearance (SC preferred). This is an excellent opportunity for a skilled COBOL developer to contribute to high-profile Central Government programmes, ensuring the reliability and continuity of essential More ❯
South Glamorgan, United Kingdom Hybrid / WFH Options
IntaPeople
of hours incident (e.g. DBA team, Network team) Taking first line corrective action for agreed and documented administrative systems Running & monitoring (not building, but see "Additional Responsibilities" below) traditional batch schedules, including using automated tools such as Control-M, VisualCron etc. Running (not building, but see "Additional Responsibilities" below) weekend release jobs/scripts to release development changes … to ensure datacentre temperature etc. (in addition to watching automatic building management alert systems) All other ad hoc activities reasonably required or expected to ensure smooth operation of the batch processing and datacentre operations as requested by Atradius under the supervision of an Atradius Operations Coordinator Ensure security and access procedures are followed; this includes maintenance of related … develop own skills and competencies and understand changes in market trends where these are relevant to the role. In addition the applicant would ideally possess: A proven background in batch processing and job scheduling. Experience of a job scheduling tool, particularly Control-M, would be an advantage, as would experience of a Micro Focus environment. Good understanding of JCL More ❯
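A batch schedule like the ones run here is essentially a dependency graph of jobs; schedulers such as Control-M resolve the run order much as this stdlib sketch does (the job names are hypothetical, not Atradius jobs):

```python
from graphlib import TopologicalSorter  # Python 3.9+
from typing import Dict, List, Set

def run_order(schedule: Dict[str, Set[str]]) -> List[str]:
    """Return a valid execution order: every job runs after all its predecessors."""
    return list(TopologicalSorter(schedule).static_order())

# Hypothetical nightly schedule: each job maps to the jobs it depends on.
nightly = {
    "extract": set(),
    "transform": {"extract"},
    "load": {"transform"},
    "report": {"load"},
}
```

`TopologicalSorter` also raises `CycleError` on circular dependencies, which is the same class of misconfiguration an operator would need to spot in a real schedule.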
a strong relation to Java backend engineering to join our Graph Delivery team and help build, extend, and maintain customer-facing graph solutions. You will work on large-scale batch processing pipelines while also contributing to APIs and sometimes UIs, ensuring our products remain reliable, efficient, and impactful. The role requires strong problem ownership and clear communication across … features, while long term driving the evolution of our graph products with new capabilities, improved performance, and architectural coherence. Key Responsibilities: Join a team to design, build, and maintain batch-oriented, high-scale data products, along with the APIs and minimalistic UIs that support them. Collaborate with Product, Solution Consulting, and Sales Engineering to deliver solutions for our most More ❯
cross-functional teams including data architects, compliance officers, and cybersecurity specialists to integrate data from various systems such as databases, APIs, and cloud platforms. Your work will directly support batch processing, real-time streaming, and event-driven data pipelines across a variety of use cases. We’re looking for candidates with over 3 years of relevant experience in … the following skills or proven experience: Apache NiFi Expertise: Deep understanding of core NiFi concepts: FlowFiles, Processors, Controller Services, Schedulers, Web UI. Experience designing and optimizing data flows for batch, real-time streaming, and event-driven architectures. Ability to identify and resolve flow issues, optimize performance, and implement error-handling strategies. Optional scripting skills for creating custom NiFi processors. More ❯
Cloud) and big data technologies (e.g., Hadoop, Spark). Strong knowledge of data workflow solutions like Azure Data Factory, Apache NiFi, Apache Airflow etc. Good knowledge of stream and batch processing solutions like Apache Flink and Apache Kafka. Good knowledge of log management, monitoring, and analytics solutions like Splunk, Elastic Stack, New Relic etc. Given that this is just More ❯
bristol, south west england, united kingdom Hybrid / WFH Options
Lloyds Banking Group
a recent focus on developing data and analytics solutions on cloud platforms (e.g. GCP/AWS/Azure). Technical know-how in data engineering techniques covering both batch processing and streaming. Agile Delivery: Significant experience in scoping and development of technical solutions in an Agile environment. Technical Proficiency: Deep technical expertise in software and … data engineering, programming languages (Python, Java etc.). Understanding of orchestration (Composer, DAGs), data processing (Kafka, Flink, Dataflow, dbt), and database capabilities (e.g. BigQuery, CloudSQL, BigTable). Container technologies (Docker, Kubernetes), IaC (Terraform) and experience with cloud platforms such as GCP. CI/CD: Detailed understanding of working automated CI/CD pipelines and experience of working with tools … have Industry Standard GCP Data Engineer/Cloud Architect certifications. Good appreciation of data security and privacy, and the architectural implications these have on application design. Modern progressive technologies, e.g. batch/streaming pipelines, machine learning, artificial intelligence etc. High-level knowledge of QA, data quality, and software quality tools such as SonarQube. About working for us Our ambition More ❯
domain and sharing insights with the team WHAT YOU'LL BRING: Writes clean, maintainable Python code (ideally with type hints, linters, and tests like pytest) Understands data engineering basics: batch processing, schema evolution, and building ETL pipelines Has experience with or is eager to learn Apache Spark for large-scale data processing Is familiar with the AWS More ❯
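The basics this listing names, typed, testable Python for a small ETL step, might look like the following sketch; the schema and field names are illustrative assumptions, not the employer's actual data model:

```python
from typing import Dict, Iterable, List

def transform(rows: Iterable[Dict[str, str]]) -> List[Dict[str, object]]:
    """Normalise raw CSV-style rows: trim whitespace, cast amount, drop incomplete rows."""
    out: List[Dict[str, object]] = []
    for row in rows:
        name = row.get("name", "").strip()
        amount = row.get("amount", "").strip()
        if not name or not amount:
            continue  # skip rows missing required fields
        out.append({"name": name, "amount": float(amount)})
    return out

# A pytest-style test for the step above (discoverable via the test_ prefix):
def test_transform_drops_incomplete_rows() -> None:
    rows = [{"name": " Ada ", "amount": "3.5"}, {"name": "", "amount": "1"}]
    assert transform(rows) == [{"name": "Ada", "amount": 3.5}]
```

Type hints plus a plain-assert pytest function keep the step checkable by both a linter/type checker and the test runner.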
currently recruiting for a Platform Support Engineer to join an industry-leading and ambitious financial services company. This is an exciting opportunity with multifaceted responsibilities, including infrastructure, application, and batch support, as well as developing and maintaining Grafana dashboards. What you’ll do: • Support batch processing and carry out proactive analysis across squads and services. • Support and … resolve production incidents, problems, and change requests. • Offer expertise on business processes for projects and production support issues. • Maintain core systems and related infrastructure like Kubernetes and SOLR. • Improve batch efficiency and reduce costs, including audit-ready data archiving strategies. • Provide data analysis to support business decisions. • Deliver patching, database fixes, and business-critical data correction services. What we More ❯
Senior Data Engineer – SC Cleared We are seeking a hands-on Senior Data Engineer with deep expertise in building and managing streaming and batch data pipelines. The ideal candidate will have strong experience working with large-scale data systems operating on cloud-based platforms such as AWS, Databricks or Snowflake. This role also involves close collaboration with … to evaluate and document Proofs of Concept (PoCs) for modern data platforms, while effectively engaging with senior stakeholders across the organisation. Key Responsibilities: Design, develop, and maintain streaming and batch data pipelines using modern data engineering tools and frameworks. Work with large volumes of structured and unstructured data, ensuring high performance and scalability. Collaborate with cloud providers and data … and Security teams to align data platform initiatives with business and technical strategies. Required Experience & Skills: Proven experience as a Data Engineer with a strong focus on streaming and batch processing. Hands-on experience with cloud-based data platforms such as AWS/Databricks/IBM/Snowflake. Strong programming skills in Python, Scala, or Java More ❯
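The streaming-versus-batch distinction at the heart of this role can be shown with a toy example: batch aggregates once the full dataset is available, while streaming maintains incremental state per event. This is a hedged sketch with illustrative names, not any platform's API:

```python
from collections import defaultdict
from typing import Dict, Iterable, Tuple

def batch_totals(events: Iterable[Tuple[str, float]]) -> Dict[str, float]:
    """Batch: aggregate over the complete dataset in one pass."""
    totals: Dict[str, float] = defaultdict(float)
    for key, value in events:
        totals[key] += value
    return dict(totals)

class StreamingTotals:
    """Streaming: update state incrementally as each event arrives."""

    def __init__(self) -> None:
        self._totals: Dict[str, float] = defaultdict(float)

    def on_event(self, key: str, value: float) -> float:
        """Fold one event into state; return the running total for its key."""
        self._totals[key] += value
        return self._totals[key]

    def snapshot(self) -> Dict[str, float]:
        """Materialise the current state, as a streaming job does at checkpoints."""
        return dict(self._totals)
```

Given the same events, both paths converge on the same totals; the difference is *when* results are available and how state and failure are managed, which is what tools like Flink and Spark handle at scale.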
Reigate, England, United Kingdom Hybrid / WFH Options
esure Group
highly motivated individual to bring a diverse skill set and proactive mindset to help ensure the stability, efficiency, and resilience of applications across esure. What you’ll do: Support batch processing and carry out proactive analysis across squads and services. Review Java code and suggest improvements to enhance quality and performance. Investigate and resolve production incidents, problems, and … change requests. Offer expertise on business processes for projects and production support issues. Maintain core Insurance systems and related infrastructure like Kubernetes and SOLR. Improve batch efficiency and reduce costs, including audit-ready data archiving strategies. Provide data analysis to support business decisions across Operations and Claims. Deliver patching, database fixes, and business-critical data correction services. What we More ❯
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Lloyds Bank plc
CloudSQL, Spanner, Pub/Sub, Cloud Composer and GKE. Data Engineering: Proficient in modern data tools and frameworks for building, operating, and optimising ETL/ELT pipelines, streaming data, and batch processing. SRE & Automation: Strong understanding of SRE principles, CI/CD, infrastructure as code, and operational automation. Data Literacy: Ability to design, develop, and apply data and analytics experience More ❯
Antrim, County Antrim, United Kingdom Hybrid / WFH Options
Vanloq - Workforce Solutions
edge enterprise systems that power critical data-driven processes across the business. You'll play a key role in developing and maintaining high-performance applications using Java 17, Spring Batch, Spring Integration, and GridGain Database, while contributing to complex data workflow and integration projects. Key Responsibilities Design, develop, and maintain scalable Java 17 applications supporting enterprise data operations. Build … optimise, and manage batch processing solutions using Spring Batch. Design and implement integration flows with Spring Integration. Develop efficient SQL queries, stored procedures, and scripts to support data pipelines. Work with GridGain Database to deliver distributed, high-performance data solutions. Support the transition and optimisation of data workflows from MongoDB to GridGain. Automate deployment and operational processes within … cross-functional teams to deliver robust, secure, and scalable solutions. Skills & Experience Required Strong proficiency in Java 17 and familiarity with the latest Java features. Proven experience with Spring Batch and Spring Integration frameworks. Solid understanding of Unix/Linux scripting and SQL development. Hands-on experience with GridGain Distributed Cache/Database. Proficiency with MongoDB and experience in More ❯
real-time, distributed systems Deep expertise with modern streaming tools: Kafka, Flink, Spark, Databricks Strong cloud-native data architecture experience (AWS, GCP, or Azure) Solid understanding of streaming vs. batch processing paradigms Experience defining technical strategy and creating actionable roadmaps Familiarity with broader data ecosystem tools like Airflow, Dagster, dbt Strong skills in Python, Scala, or Java A More ❯
of: Billing Account data and Invoices, Credit memos, Payments, Refunds. Additionally, you’ll work to modernize the RevPro journal outbound flow, evolving it from a slow and fragile monthly batch process to a daily or event-based integration with Workday, improving data availability and reducing reconciliation issues. You will: Collaborate with Finance, Accounting, and RevOps teams to map and More ❯
and implement scalable, automated infrastructure for deploying ML models in AWS, using Terraform as the primary provisioning tool. Manage and optimise existing AWS environments (SageMaker, ECS/EKS, Lambda, Batch, and GPU-backed instances). Build and maintain CI/CD pipelines for ML model delivery and monitoring. Ensure the infrastructure can support both real-time inference and batch processing workloads. Collaborate closely with Data Scientists and Engineers to productionise models efficiently. Monitor system performance and cost, identifying opportunities for optimisation and automation. Maintain infrastructure security, reliability, and compliance with best practices. Essential Skills & Experience Extensive hands-on experience with Terraform, including provisioning and managing complex AWS infrastructure. Strong knowledge of AWS services relevant to MLOps: … SageMaker for model training and deployment; ECS/EKS or Elastic Beanstalk for containerised workloads; Lambda and Batch for inference pipelines; S3, CloudWatch, IAM, Glue, and related orchestration tools. Proven experience deploying GPU-accelerated models in production. Solid understanding of ML model lifecycle management, including versioning, packaging, and scaling. Competence in Python and experience with frameworks such as FastAPI More ❯
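The model lifecycle management this role asks for (versioning, packaging) can be illustrated minimally: package an artifact together with a manifest carrying its version and content hash, so a deployment pipeline can verify exactly what it is shipping. The paths and manifest fields below are hypothetical assumptions, not any team's actual layout:

```python
import hashlib
import json
from pathlib import Path
from typing import Dict

def package_model(artifact: bytes, version: str, out_dir: Path) -> Dict[str, str]:
    """Write a model artifact plus a manifest recording version and SHA-256 hash."""
    out_dir.mkdir(parents=True, exist_ok=True)
    digest = hashlib.sha256(artifact).hexdigest()
    model_path = out_dir / f"model-{version}.bin"
    model_path.write_bytes(artifact)
    manifest = {"version": version, "sha256": digest, "file": model_path.name}
    (out_dir / "manifest.json").write_text(json.dumps(manifest, indent=2))
    return manifest
```

A CI/CD job can recompute the hash at deploy time and refuse to ship an artifact whose digest does not match the manifest, which is the core of reproducible model versioning.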
field Experience: Proven experience in ML model lifecycle management Core Competencies: Model lifecycle: You've got hands-on experience with managing the ML model lifecycle, including both online and batch processes. Statistical Methodology: You have worked with GLMs and other machine learning algorithms and have in-depth knowledge of how they work. Python: You have built and deployed production More ❯