that underpin essential business processes across multiple government departments and local authorities. You will play a key role in maintaining, optimising, and modernising the batch processing estate, ensuring the performance, reliability, and stability of a system that supports thousands of users and vital public services.

Key Responsibilities
- Develop, maintain, and optimise Pro*C, PL/SQL, and shell script … E-Business Suite.
- Analyse and modernise legacy code, working within multi-disciplinary teams to understand requirements, propose technical options, and prototype solution approaches.
- Deliver robust, performant code and conduct performance testing in high-volume, transaction-driven environments.
- Contribute to the ongoing enhancement of a mission-critical platform used across numerous government functions.

Essential Skills & Experience
- Strong hands-on …
- Legacy code analysis
- Working with multidisciplinary teams
- Providing technical input into solution design
- Rapid prototyping of solution approaches
- Ability to deliver highly performant, resilient, and maintainable code
- Experience with performance testing in demanding, high-transaction environments

Desirable Skills
- Experience with Oracle encryption, key management, and DBMS_CRYPTO (illustrated in the sketch after this listing).
- Exposure to containerising workloads and migrating services to AWS EKS.
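As a hedged illustration of the DBMS_CRYPTO skill named in the desirable list above, the sketch below invokes the package server-side from Python via python-oracledb. The connection details, helper name, and key source are illustrative assumptions, not part of the listing; in a real estate the key would come from the key-management setup the advert refers to.

```python
# A minimal sketch, assuming a python-oracledb connection to an Oracle
# instance where the session user has EXECUTE on DBMS_CRYPTO.
import os
import oracledb

def encrypt_with_dbms_crypto(conn, plaintext: bytes, key: bytes) -> bytes:
    """Encrypt a RAW payload server-side with AES-256/CBC/PKCS5 padding."""
    sql = """
        BEGIN
          :ct := DBMS_CRYPTO.ENCRYPT(
                   src => :pt,
                   typ => DBMS_CRYPTO.ENCRYPT_AES256
                          + DBMS_CRYPTO.CHAIN_CBC
                          + DBMS_CRYPTO.PAD_PKCS5,
                   key => :key);
        END;"""
    with conn.cursor() as cur:
        ct = cur.var(oracledb.DB_TYPE_RAW)  # output bind for the ciphertext
        cur.execute(sql, {"ct": ct, "pt": plaintext, "key": key})
        return ct.getvalue()

# Hypothetical connection details; AES-256 needs a 32-byte key, which in
# practice would be fetched from key management, not generated ad hoc.
conn = oracledb.connect(user="batch_app", password=os.environ["DB_PASS"],
                        dsn="dbhost/orclpdb1")
ciphertext = encrypt_with_dbms_crypto(conn, b"sensitive-payload", os.urandom(32))
```

Doing the encryption inside the database keeps plaintext out of application logs and lets the same routine be shared by the Pro*C and PL/SQL batch code the role describes.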
using Control-M and improve margin run performance.
- Implement trade compression and portfolio margining solutions (a toy sketch of trade compression follows this listing).
- Collaborate with cross-functional teams to refine business requirements and deliver scalable solutions.
- Conduct performance testing and capacity analysis.
- Participate in agile ceremonies and act as Scrum Lead when required.

Required Skills & Experience:
- 5+ years of Murex development experience, ideally within FX or …
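The trade-compression responsibility above has a well-defined core even outside Murex: replace offsetting trades with fewer trades carrying the same net exposure. The toy sketch below shows only that bilateral netting step; the Trade model and field names are hypothetical, and production compression (multilateral, tolerance-constrained, margin-aware) is far richer.

```python
# A toy, self-contained sketch of the idea behind trade compression:
# offsetting trades on the same counterparty/instrument pair collapse
# into a single trade carrying the net notional.
from collections import defaultdict
from dataclasses import dataclass

@dataclass(frozen=True)
class Trade:
    counterparty: str
    instrument: str
    notional: float  # signed: positive = bought, negative = sold

def compress(trades: list[Trade]) -> list[Trade]:
    net: dict[tuple[str, str], float] = defaultdict(float)
    for t in trades:
        net[(t.counterparty, t.instrument)] += t.notional
    # One net trade per key; fully offset positions disappear entirely.
    return [Trade(cp, ins, n) for (cp, ins), n in net.items() if n != 0]

book = [
    Trade("BankA", "EURUSD-FWD", 10_000_000),
    Trade("BankA", "EURUSD-FWD", -7_000_000),
    Trade("BankB", "GBPUSD-FWD", 5_000_000),
    Trade("BankB", "GBPUSD-FWD", -5_000_000),  # fully offset, removed
]
print(compress(book))
# [Trade(counterparty='BankA', instrument='EURUSD-FWD', notional=3000000.0)]
```

Fewer line items with the same net exposure is also what makes the margin runs the role mentions cheaper to compute.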
Engineer with strong expertise in designing and optimising large-scale data engineering solutions within the Databricks Data Intelligence Platform. This role is ideal for someone passionate about building high-performance data pipelines and ensuring robust data governance across modern cloud environments.

Key Responsibilities
- Design, build, and maintain scalable data pipelines using Databricks Notebooks, Jobs, and Workflows for both batch … and streaming data.
- Optimise Spark and Delta Lake performance through efficient cluster configuration, adaptive query execution, and caching strategies (see the PySpark sketch after this listing).
- Conduct performance testing and cluster tuning to ensure cost-efficient, high-performing workloads.
- Implement data quality, lineage tracking, and access control policies aligned with Databricks Unity Catalogue and governance best practices.
- Develop PySpark applications for ETL, data transformation … such as Azure Data Lake Storage, Key Vault, and Azure Functions.

What We're Looking For
- Proven experience with Databricks, PySpark, and Delta Lake.
- Strong understanding of workflow orchestration, performance optimisation, and data governance.
- Hands-on experience with Azure cloud services.
- Ability to work in a fast-paced environment and deliver high-quality solutions.
- SC Cleared candidates … If you …
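To make the pipeline and tuning responsibilities above concrete, here is a minimal hedged PySpark/Delta Lake sketch of the batch side: adaptive query execution enabled, a cached intermediate reused by two writes, and Delta output partitioned for pruning. The paths, column names, and tuning flags are illustrative assumptions, not taken from the role.

```python
# A minimal batch-pipeline sketch, assuming a Databricks-style environment
# where Delta Lake is available. Paths and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = (SparkSession.builder
         .appName("orders-pipeline")
         # Adaptive query execution lets Spark re-plan shuffles and joins
         # at runtime, one of the tuning levers the listing mentions.
         .config("spark.sql.adaptive.enabled", "true")
         .config("spark.sql.adaptive.coalescePartitions.enabled", "true")
         .getOrCreate())

raw = spark.read.format("json").load("/mnt/landing/orders/")  # hypothetical path

orders = (raw
          .filter(F.col("status") == "COMPLETE")
          .withColumn("order_date", F.to_date("created_at"))
          .cache())  # reused by both writes below, so materialise once

# Curated detail table: Delta gives ACID writes usable by batch and streaming.
(orders.write.format("delta")
       .mode("append")
       .save("/mnt/curated/orders"))

# Daily aggregate for reporting, partitioned so date filters prune files.
daily = orders.groupBy("order_date").agg(F.sum("amount").alias("revenue"))
(daily.write.format("delta")
      .mode("overwrite")
      .partitionBy("order_date")
      .save("/mnt/curated/daily_revenue"))
```

The same transformations would typically be wrapped in a Databricks Job or Workflow for scheduling, with governance applied through Unity Catalogue rather than raw paths.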