3 of 3 Data Reconciliation Jobs in the South East

Data Migration Engineer

Hiring Organisation
Morson Edge
Location
Rochester, Kent, South East, United Kingdom
Employment Type
Contract
Contract Rate
£31.31 - £42.11 per hour, Inside IR35
Data Migration Engineer
Location: Rochester, on-site
Contract: 12 months
Security Clearance: BPSS only, no nationality restrictions
Rate: £31.31 p/h PAYE (£35.86 p/h inc. holiday rate) OR £42.11 p/h Umbrella, Inside IR35

The Data Migration Engineer is responsible for designing, developing … executing data migration solutions to support ERP data transformation into Oracle Cloud. The role ensures data is migrated accurately, securely, and efficiently, maintaining data integrity, quality, and compliance in accordance with business transformation rules. This role works closely with business stakeholders, data architects, and project ...

Temporary Data Engineer - Banking / Payments

Hiring Organisation
Love Success Recruitment
Location
London, South East, England, United Kingdom
Employment Type
Temporary
Salary
£700 - £1,000 per day
Overview
Our client is seeking an experienced Data Engineer to support a critical data migration programme within a banking environment. This role focuses on high-volume payments systems, requiring deep expertise in IBM DB2 on mainframe, safe data extraction, and end-to-end migration assurance.

Key Responsibilities …
- Deliver data migration activities within banking or payments systems, ensuring zero disruption to live transaction processing
- Design and execute safe data extraction strategies from complex, high-volume, and highly concurrent tables
- Perform data mapping, lineage analysis, and documentation
- Lead data reconciliation and validation of migrated ...

Lead PySpark Engineer

Hiring Organisation
Randstad Technologies
Location
London, South East, England, United Kingdom
Employment Type
Contractor
Contract Rate
£281 - £292 per day
production-ready distributed pipelines for a Tier-1 financial services environment.

Core Responsibilities
- Engineering Leadership: Design and develop complex ETL/ELT pipelines and Data Marts using PySpark, EMR, and Glue.
- Legacy Modernisation: Architect the conversion of SAS Base/Macros into modular, testable Python code using SAS2PY … Spark execution (partitioning, shuffling, caching) to ensure cost-efficient processing of massive financial datasets.
- Quality & Governance: Implement rigorous CI/CD, unit testing, and data reconciliation frameworks to ensure "penny-perfect" accuracy.

Technical Stack
- Engine: PySpark (Expert), Python (Clean Code/SOLID principles).
- AWS: EMR, Glue ...