Telford, Shropshire, England, United Kingdom Hybrid / WFH Options
TXP Technology x People
support legacy systems within a Central Government environment. Work with COBOL, ICL VME, and PL/SQL to deliver reliable, high-quality solutions. Apply structured programming principles and manage batch processing tasks. Collaborate with the team to ensure systems meet operational and compliance requirements. Essential Skills and Experience: Strong experience with COBOL development. Hands-on experience with ICL VME and PL/SQL. Knowledge of structured programming and batch processing in mainframe environments. Ability to work effectively in a remote UK-based team. BPSS minimum security clearance (SC preferred). This is an excellent opportunity for a skilled COBOL developer to contribute to high-profile Central Government programmes, ensuring the reliability and continuity of essential …
cross-functional teams including data architects, compliance officers, and cybersecurity specialists to integrate data from various systems such as databases, APIs, and cloud platforms. Your work will directly support batch processing, real-time streaming, and event-driven data pipelines across a variety of use cases. We’re looking for candidates with over 3 years of relevant experience in … the following skills or proven experience: Apache NiFi Expertise: Deep understanding of core NiFi concepts: FlowFiles, Processors, Controller Services, Schedulers, Web UI. Experience designing and optimizing data flows for batch, real-time streaming, and event-driven architectures. Ability to identify and resolve flow issues, optimize performance, and implement error-handling strategies. Optional scripting skills for creating custom NiFi processors. …
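The flow-design skills this listing names (batching, event-driven handling, error routing) can be illustrated with a short, engine-independent sketch. This is not NiFi code: `on_batch` and `on_error` are hypothetical callbacks standing in for a downstream processor and a failure relationship.

```python
def micro_batch(events, batch_size, on_batch, on_error):
    """Group a stream of events into fixed-size batches; route
    failing batches to an error handler instead of dropping them."""
    buffer = []
    for event in events:
        buffer.append(event)
        if len(buffer) == batch_size:
            _flush(buffer, on_batch, on_error)
            buffer = []
    if buffer:  # flush the final partial batch
        _flush(buffer, on_batch, on_error)

def _flush(batch, on_batch, on_error):
    try:
        on_batch(list(batch))  # hand a copy downstream
    except Exception as exc:
        on_error(list(batch), exc)  # dead-letter path: keep the data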
Cloud) and big data technologies (e.g., Hadoop, Spark). Strong knowledge of data workflow solutions such as Azure Data Factory, Apache NiFi, and Apache Airflow. Good knowledge of stream and batch processing solutions such as Apache Flink and Apache Kafka. Good knowledge of log management, monitoring, and analytics solutions such as Splunk, Elastic Stack, and New Relic. Given that this is just …
currently recruiting for a Platform Support Engineer to join an industry-leading and ambitious financial services company. This is an exciting opportunity with multifaceted responsibilities, including infrastructure, application and batch support, as well as developing and maintaining Grafana dashboards. What you’ll do: • Support batch processing and carry out proactive analysis across squads and services. • Support and … resolve production incidents, problems, and change requests. • Offer expertise on business processes for projects and production support issues. • Maintain core systems and related infrastructure like Kubernetes and SOLR. • Improve batch efficiency and reduce costs, including audit-ready data archiving strategies. • Provide data analysis to support business decisions. • Deliver patching, database fixes, and business-critical data correction services. What we …
Bristol, South West England, United Kingdom Hybrid / WFH Options
Lloyds Banking Group
a recent focus on developing data and analytics solutions on cloud platforms (e.g. GCP/AWS/Azure). Technical know-how in data engineering techniques covering both batch processing and/or streaming. Agile Delivery: Significant experience in scoping and development of technical solutions in an Agile environment. Technical Proficiency: Deep technical expertise in software and … data engineering, programming languages (Python, Java, etc.). Understanding of orchestration (Composer, DAGs), data processing (Kafka, Flink, Dataflow, dbt), and database capabilities (e.g. BigQuery, CloudSQL, Bigtable). Container technologies (Docker, Kubernetes), IaC (Terraform) and experience with cloud platforms such as GCP. CI/CD: Detailed understanding of automated CI/CD pipelines and experience of working with tools … had Industry Standard: GCP Data Engineer/Cloud Architect certifications. Good appreciation of data security and privacy, and the architectural implications they have for application design. Modern progressive technologies, e.g. batch/streaming pipelines, machine learning, artificial intelligence etc. High-level knowledge of QA, data quality, and software quality tools such as SonarQube. About working for us: Our ambition …
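The batch vs. streaming distinction that several of these roles call out can be shown in a minimal sketch: a batch job computes over the whole dataset at once, while a streaming job keeps constant state and updates per event. All names here are illustrative, not tied to any of the tools listed.

```python
def batch_mean(values):
    """Batch style: the full dataset is available before computing."""
    values = list(values)
    return sum(values) / len(values)

class StreamingMean:
    """Streaming style: O(1) state, updated incrementally per event."""
    def __init__(self):
        self.count = 0
        self.total = 0.0

    def update(self, value):
        self.count += 1
        self.total += value
        return self.total / self.count  # running mean so far
```

Both arrive at the same answer over the same data; the streaming version never needs the whole dataset in memory, which is the property pipeline engines like Flink or Dataflow exploit.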
Google Cloud) and big data technologies (e.g., Hadoop, Spark). Strong knowledge of data workflow solutions such as Azure Data Factory, Apache NiFi, and Apache Airflow. Good knowledge of stream and batch processing solutions such as Apache Flink and Apache Kafka. Good knowledge of log management, monitoring, and analytics solutions such as Splunk, Elastic Stack, and New Relic. …
Reigate, England, United Kingdom Hybrid / WFH Options
esure Group
highly motivated individual to bring a diverse skill set and proactive mindset to help ensure the stability, efficiency, and resilience of applications across esure. What you’ll do: Support batch processing and carry out proactive analysis across squads and services. Review Java code and suggest improvements to enhance quality and performance. Investigate and resolve production incidents, problems, and … change requests. Offer expertise on business processes for projects and production support issues. Maintain core Insurance systems and related infrastructure like Kubernetes and SOLR. Improve batch efficiency and reduce costs, including audit-ready data archiving strategies. Provide data analysis to support business decisions across Operations and Claims. Deliver patching, database fixes, and business-critical data correction services. What we …
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Lloyds Bank plc
CloudSQL, Spanner, Pub/Sub, Cloud Composer and GKE. Data Engineering: Proficient in modern data tools and frameworks for building, operating, and optimising ETL/ELT pipelines, streaming data, and batch processing. SRE & Automation: Strong understanding of SRE principles, CI/CD, infrastructure as code, and operational automation. Data Literacy: Ability to design, develop, and apply data and analytics experience …
Senior Data Engineer – SC Cleared. We are seeking a hands-on Senior Data Engineer with deep expertise in building and managing streaming and batch data pipelines. The ideal candidate will have strong experience working with large-scale data systems operating on cloud-based platforms such as AWS, Databricks or Snowflake. This role also involves close collaboration with … to evaluate and document Proofs of Concept (PoCs) for modern data platforms, while effectively engaging with senior stakeholders across the organisation. Key Responsibilities: Design, develop, and maintain streaming and batch data pipelines using modern data engineering tools and frameworks. Work with large volumes of structured and unstructured data, ensuring high performance and scalability. Collaborate with cloud providers and data … and Security teams to align data platform initiatives with business and technical strategies. Required Experience & Skills: Proven experience as a Data Engineer with a strong focus on streaming and batch processing. Hands-on experience with cloud-based data platforms such as AWS/Databricks/IBM/Snowflake. Strong programming skills in Python, Scala, or Java. …
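One building block behind the streaming pipelines this role describes is windowed aggregation. Below is a hedged pure-Python sketch of event-time tumbling windows, the construct that engines such as Flink expose; the function name and data shape are illustrative, and no engine APIs are used.

```python
from collections import defaultdict

def tumbling_window_sum(events, window_size):
    """Assign each (timestamp, value) event to a fixed, non-overlapping
    time window and sum values per window. `window_size` is in the same
    units as the event timestamps. Returns {window_start: total}."""
    windows = defaultdict(float)
    for ts, value in events:
        window_start = (ts // window_size) * window_size  # floor to window
        windows[window_start] += value
    return dict(windows)
```

Because windows are keyed by start time rather than arrival order, the same logic works whether events arrive live from a stream or are replayed from a batch store, which is why streaming and batch pipelines can share this kind of core.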
field. Experience: Proven experience in ML model lifecycle management. Core Competencies: Model lifecycle: You’ve got hands-on experience with managing the ML model lifecycle, including both online and batch processes. Statistical Methodology: You have worked with GLMs and other machine learning algorithms and have in-depth knowledge of how they work. Python: You have built and deployed production …
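As a small illustration of the GLM knowledge mentioned above, here is a hedged sketch of logistic regression (a GLM with a logit link) fitted by plain gradient descent on a toy 1-D dataset. `fit_logistic` and its hyperparameters are illustrative, not a production implementation or any particular library's API.

```python
import math

def fit_logistic(xs, ys, lr=0.1, epochs=2000):
    """Fit y ~ sigmoid(w*x + b) by minimising log-loss with
    full-batch gradient descent. Returns fitted (w, b)."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        gw = gb = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(w * x + b)))  # predicted probability
            gw += (p - y) * x  # gradient of log-loss w.r.t. w
            gb += (p - y)      # gradient of log-loss w.r.t. b
        w -= lr * gw / n
        b -= lr * gb / n
    return w, b

def predict(w, b, x):
    """Inverse link: map the linear predictor back to a probability."""
    return 1.0 / (1.0 + math.exp(-(w * x + b)))
```

The same structure (linear predictor, link function, likelihood-based loss) generalises to the other GLM families, which is the "how they work" depth the listing asks for.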
Java, SQL, C#, Python. Connex Gateways for trade booking and market data integration. Real-Time Position Engine (RTPE) and Automated Position Management (APM). Performance tuning and optimization for EOD batch processing. If interested, please apply here or reach out to me on …
will have direct global impact. In this role, you will own the full data stack from ingestion through to analytics, building and maintaining scalable pipelines across API, streaming, and batch processes. You will manage cloud infrastructure on AWS with a DevOps mindset, including CI/CD, infrastructure as code, monitoring, and scaling. Working closely with engineering and product teams …