/09/2025 End Date: 27/03/2026 Rate: £650 per day - PAYE via Umbrella Only Role Overview: We are looking for a seasoned Azure Batch Cloud Architect to lead the design and implementation of scalable, secure, and high-performance batch processing solutions in Microsoft Azure. This role is pivotal in supporting data-intensive workloads and compute-heavy operations critical to financial services, including risk modelling, regulatory reporting, and large-scale data transformation. Key Responsibilities: Architect and implement Azure Batch solutions to support large-scale, parallel compute workloads. Design and optimise cloud-native batch processing pipelines for financial analytics, simulations, and reporting. Collaborate with data engineering, DevOps, and security teams to ensure end-to-end automation, compliance, and observability. Define and enforce governance, cost optimisation, and security best practices for batch workloads in Azure. Lead proof-of-concept initiatives, performance tuning, and capacity planning for compute-intensive applications. Provide architectural guidance on containerised batch jobs using Docker and Azure Kubernetes Service (AKS) where applicable. Maintain documentation and provide training More ❯
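The pool/job/task pattern this role revolves around can be sketched with the azure-batch Python SDK. This is a minimal illustration only, not the client's implementation: the account URL, pool and job IDs, VM image, node count, and the run_model.py command line are all hypothetical placeholders.

```python
# Minimal sketch of a parallel Azure Batch workload using the azure-batch SDK.
# All resource names, sizes, and the command line are illustrative only.
from azure.batch import BatchServiceClient
from azure.batch.batch_auth import SharedKeyCredentials
import azure.batch.models as batchmodels

credentials = SharedKeyCredentials("mybatchaccount", "<account-key>")
client = BatchServiceClient(credentials, "https://mybatchaccount.uksouth.batch.azure.com")

# 1. Provision a pool of Linux nodes sized for the compute-heavy workload.
pool = batchmodels.PoolAddParameter(
    id="risk-model-pool",
    vm_size="STANDARD_D4S_V3",
    target_dedicated_nodes=8,
    virtual_machine_configuration=batchmodels.VirtualMachineConfiguration(
        image_reference=batchmodels.ImageReference(
            publisher="canonical",
            offer="0001-com-ubuntu-server-jammy",
            sku="22_04-lts",
            version="latest",
        ),
        node_agent_sku_id="batch.node.ubuntu 22.04",
    ),
)
client.pool.add(pool)

# 2. Create a job bound to the pool.
job = batchmodels.JobAddParameter(
    id="risk-model-run",
    pool_info=batchmodels.PoolInformation(pool_id="risk-model-pool"),
)
client.job.add(job)

# 3. Fan out one task per data shard; Batch schedules them across the pool.
tasks = [
    batchmodels.TaskAddParameter(
        id=f"shard-{i}",
        command_line=f"/bin/bash -c 'python3 run_model.py --shard {i}'",
    )
    for i in range(100)
]
client.task.add_collection("risk-model-run", tasks)
```

The fan-out in step 3 is what makes Azure Batch a fit for risk modelling and similar embarrassingly parallel workloads: each shard runs as an independent task scheduled across the pool.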
Farnborough, Hampshire, South East, United Kingdom
Peregrine
cross-functional teams including data architects, compliance officers, and cybersecurity specialists to integrate data from various systems such as databases, APIs, and cloud platforms. Your work will directly support batch processing, real-time streaming, and event-driven data pipelines across a variety of use cases. We're looking for candidates with over 3 years of relevant experience in data … the following skills or proven experience: Apache NiFi Expertise: Deep understanding of core NiFi concepts: FlowFiles, Processors, Controller Services, Schedulers, Web UI. Experience designing and optimizing data flows for batch, real-time streaming, and event-driven architectures. Ability to identify and resolve flow issues, optimize performance, and implement error-handling strategies. Optional scripting skills for creating custom NiFi processors. More ❯
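To illustrate the optional scripting skills mentioned above, the snippet below is a small Jython body one might place in NiFi's ExecuteScript processor to add simple error handling to a flow. It is a sketch under assumptions: it relies on the standard ExecuteScript bindings (session, log, REL_SUCCESS, REL_FAILURE) and treats the record.count attribute as an example of upstream metadata.

```python
# Jython body for NiFi's ExecuteScript processor (sketch only).
# Assumes the standard bindings: session, log, REL_SUCCESS, REL_FAILURE.
flow_file = session.get()
if flow_file is not None:
    # 'record.count' is typically set by record-oriented processors upstream;
    # here it is just an example attribute to branch on.
    record_count = flow_file.getAttribute("record.count")
    if record_count is None or int(record_count) == 0:
        log.warn("Empty batch received; routing to failure for review")
        session.transfer(flow_file, REL_FAILURE)
    else:
        flow_file = session.putAttribute(flow_file, "validated", "true")
        session.transfer(flow_file, REL_SUCCESS)
```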
a recent focus on developing data and analytics solutions on cloud platforms (e.g. GCP/AWS/Azure). Technical know-how in data engineering techniques which cover both batch processing and/or streaming. Agile Delivery: Significant experience in scoping and development of technical solutions in an Agile environment. Technical Proficiency: Deep technical expertise in software and data engineering, programming languages (Python, Java etc.). Understanding of orchestration (Composer, DAGs), data processing (Kafka, Flink, DataFlow, dbt), and database capabilities (e.g. BigQuery, CloudSQL, BigTable). Container technologies (Docker, Kubernetes), IaC (Terraform) and experience with cloud platforms such as GCP. CI/CD: Detailed understanding of working with automated CI/CD pipelines and experience of working with tools … had Industry Standard: GCP Data Engineer/Cloud Architect certifications. Good appreciation of data security and privacy, and the architectural implications they have on application design. Modern progressive technologies, e.g. batch/streaming pipelines, machine learning, artificial intelligence etc. High-level knowledge of QA, data quality, and software quality tools such as SonarQube, etc. About working for us Our ambition More ❯
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Lloyds Bank plc
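As a concrete flavour of the orchestration and data-processing stack named in the listing above (Composer/DAGs feeding BigQuery), here is a minimal Airflow DAG sketch. The DAG id, dataset and table names, and the SQL are hypothetical; the operator import assumes the Google provider package that ships with Cloud Composer.

```python
# Sketch of a daily Cloud Composer (Airflow) DAG running a BigQuery
# transformation. DAG id, dataset, table, and SQL are illustrative only.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator

with DAG(
    dag_id="daily_sales_batch",
    schedule_interval="@daily",
    start_date=datetime(2025, 1, 1),
    catchup=False,
) as dag:
    aggregate_sales = BigQueryInsertJobOperator(
        task_id="aggregate_sales",
        configuration={
            "query": {
                "query": """
                    CREATE OR REPLACE TABLE analytics.daily_sales AS
                    SELECT sale_date, SUM(amount) AS total_amount
                    FROM raw.sales
                    GROUP BY sale_date
                """,
                "useLegacySql": False,
            }
        },
    )
```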
systems are written in Elixir, but where necessary, we use small amounts of Python and Java where vendor SDKs require it. Designing, developing and maintaining real-time data streaming and batch processing workloads. Providing on-call support as part of our team-wide rotation. Our on-call rotation is split across US and UK time zones, ensuring coverage whilst keeping … algorithms. Analysis of concurrency and parallelism for speed/space performance tradeoffs. Bonus Experience: Exchange-traded financial instruments. Problem-solving and proof construction. For more information about DRW's processing activities and our use of job applicants' data, please view our Privacy Notice at . More ❯
management, data lineage tracking, and access control to ensure data integrity and compliance. Optimize data pipelines and transformation logic using SAP Datasphere's capabilities to support real-time and batch processing needs. Collaborate with business and technical teams to translate data requirements into robust, reusable data products and services. Monitor and tune system performance, proactively identifying and resolving More ❯
management, data lineage tracking, and access control to ensure data integrity and compliance. * Optimize data pipelines and transformation logic using SAP SAC's capabilities to support real-time and batch processing needs. * Collaborate with business and technical teams to translate data requirements into robust, reusable data products and services. * Monitor and tune system performance, proactively identifying and resolving More ❯
background - proven track record in enterprise data solutions Experience with ETL processes and data transformation, preferably using Databricks Strong foundation in Data Warehousing architectures and dimensional modeling Familiarity with batch processing from relational database sources Communication & Collaboration Skills of the Data Engineer: Outstanding stakeholder engagement abilities across technical and business audiences Strong relationship-building skills with experience managing More ❯
they operate. Hands-on experience in Java, Spark, Scala (or Java). Production-scale hands-on experience writing data pipelines using Spark or any other distributed real-time/batch processing framework. Strong skill set in SQL/databases. Strong understanding of messaging tech like Kafka, Solace, MQ etc. Writing production-scale applications that use caching technologies. Understanding of More ❯
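To make the Spark pipeline requirement concrete, here is a minimal batch pipeline sketch. It is written in PySpark for brevity, whereas the role itself calls for Scala or Java, and the storage paths, columns, and filter logic are hypothetical.

```python
# Minimal PySpark batch pipeline sketch: read raw trades, aggregate, write out.
# Paths and column names are illustrative only.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily-trade-aggregation").getOrCreate()

# Extract: read a day's raw trade files from distributed storage.
trades = spark.read.parquet("s3a://example-bucket/raw/trades/date=2025-09-01/")

# Transform: keep settled trades and aggregate notional per instrument.
daily_notional = (
    trades.filter(F.col("status") == "SETTLED")
          .groupBy("instrument_id")
          .agg(F.sum("notional").alias("total_notional"),
               F.count("*").alias("trade_count"))
)

# Load: write the aggregate back out for downstream SQL consumers.
daily_notional.write.mode("overwrite").parquet(
    "s3a://example-bucket/curated/daily_notional/date=2025-09-01/"
)
spark.stop()
```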
building scalable, reliable, highly available and fault-tolerant systems, broad knowledge of software engineering and site reliability engineering in areas such as Large-Scale Data and Compute Infrastructure, Stream Processing, Kubernetes, High-Performance Networking, Observability and Infrastructure Automation. RESPONSIBILITIES Set the technology strategy for our cloud infrastructure, factoring in the AD/ADAS cloud development needs and our cloud … to automate routine tasks. NICE TO HAVES Master's degree in Computer Science. Experience working as a Software Engineer on data-intensive applications, data platforms, data pipelines, workflow orchestration, batch processing, and/or distributed databases. Previous experience in monitoring, tracking and optimising cloud compute and storage costs Experience working with RPC protocols and their formats, e.g., gRPC More ❯
Data Scientists, Business Analysts Extensive experience using Python, SQL and AWS Define and enforce data architecture across mesh and domain-driven data products. Implement and govern real-time/batch processing (Kafka, Spark, Glue). Ensure strong metadata, cataloguing, and lineage practices across the enterprise. Lead teams of engineers across global hubs, mentoring and supporting high standards. Knowledge More ❯
needs within each department. MAIN RESPONSIBILITIES Database estate administration and monitoring Managing application system Disaster Recovery and high availability Managing backup and recovery strategies/automation of maintenance and batch processing Database performance tuning including T-SQL/procedural performance tuning (Optimizer and Index usage) Database design and architecture, development and administration Setting up best practice database development and More ❯
and review code, paired programming, and debugging code-related performance issues, SQL tuning, etc. Experience with AWS services such as S3, RDS, Aurora, NoSQL, MSK (Kafka). Experience with batch processing/ETL using Glue jobs, AWS Lambda, and Step Functions. Experience with designing bespoke & tailored front-end solutions (GUI based) using open source technology such as React More ❯
/review code/paired programming/debugging code-related performance issues, SQL tuning etc. Experience with AWS services such as S3, RDS, Aurora, NoSQL, MSK (Kafka) Experience with batch processing/ETL using Glue jobs, AWS Lambda and Step Functions. Experience with designing bespoke & tailored front-end solutions (GUI based) using open source technology such as React More ❯
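The Glue/Lambda/Step Functions combination mentioned in the two listings above typically hangs together as an event-driven batch trigger. The sketch below shows one hypothetical shape of that pattern using boto3; the job name, event handling, and argument names are illustrative, not taken from either role.

```python
# Sketch of a Lambda handler that kicks off a Glue ETL job via boto3
# when a new batch file lands in S3. All resource names are illustrative.
import boto3

glue = boto3.client("glue")


def lambda_handler(event, context):
    # S3 put event: extract the object key of the newly arrived batch file.
    record = event["Records"][0]
    key = record["s3"]["object"]["key"]

    # Start the (hypothetical) Glue job, passing the file as a job argument.
    response = glue.start_job_run(
        JobName="nightly-orders-etl",
        Arguments={"--input_key": key},
    )
    return {"JobRunId": response["JobRunId"]}
```

In practice a Step Functions state machine would usually sit between the trigger and the Glue job to handle retries and downstream steps, but the single-handler version keeps the sketch short.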
Intelligence, with 5+ years in a BI leadership role in a global or matrixed organisation. Proven expertise in modern BI architecture (Data Lake, EDW, Streaming, APIs, Real-Time & Batch Processing). Demonstrated experience delivering cloud-based analytics platforms (Azure, AWS, GCP). Strong knowledge of data governance, cataloguing, security, automation, and self-service analytics. Excellent leadership and More ❯
mindset with strong organizational skills and attention to detail. Familiarity with tools like JIRA or ServiceNow. Certifications in ITIL, Linux, or cloud platforms (e.g., AWS, Azure). Experience with batch processing, data interfaces, or monitoring tools. Exposure to automation or scripting. Learn more about the LexisNexis Risk team and how we work We are committed to providing a More ❯
ensure alignment with business goals. Support upgrades, vendor management, and regulatory compliance, including GDPR. Investigate and resolve complex technical issues, ensuring minimal service disruption. Maintain and monitor overnight batch processes and key operational reports using tools like Talend, Skybot, SOA, and Progress. Take ownership of environment refreshes, code deployments, and configuration changes across multiple system instances. Contribute to More ❯
that serve, process and transform large quantities of data in the cloud Minimum Qualifications: Experience with Python (or other object-oriented language) Experience building reliable, distributed applications for Data Processing or similar areas Hands-on experience developing cloud applications (e.g. AWS, GCP, Azure) Experience with technologies like BigQuery and Snowflake Preferred Qualifications: Experience writing testable and modular code Experience working in a fast-paced environment, collaborating across teams and disciplines Experience designing, deploying, and maintaining distributed systems Data pipelines, data platforms, workflow orchestration, batch processing Experience building ML pipelines WHAT WE OFFER We are committed to creating a modern work environment that supports our employees and their loved ones. We offer many options of the best More ❯
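A common building block behind requirements like these — reliable, distributed batch processing against BigQuery — is a managed load job. The sketch below uses the google-cloud-bigquery client; the bucket URI and table id are hypothetical placeholders.

```python
# Sketch: batch-load Parquet files from Cloud Storage into a BigQuery table.
# The URI and table id are illustrative only.
from google.cloud import bigquery

client = bigquery.Client()

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.PARQUET,
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)

load_job = client.load_table_from_uri(
    "gs://example-bucket/exports/events/*.parquet",
    "example_project.analytics.events",
    job_config=job_config,
)
load_job.result()  # Block until the batch load completes.

table = client.get_table("example_project.analytics.events")
print(f"Loaded {table.num_rows} rows")
```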
testable and modular code Experience working in a fast-paced environment, collaborating across teams and disciplines Experience designing, deploying, and maintaining distributed systems Data pipelines, data platforms, workflow orchestration, batch processing WHAT WE OFFER We are committed to creating a modern work environment that supports our employees and their loved ones. We offer many options of the best More ❯