The Role: As a Data Platform Engineer in a highly regulated environment, you will be responsible for designing, building, and maintaining secure and scalable data infrastructure that supports both cloud and on-premises platforms. You will play a key role in ensuring that all data systems comply with industry regulations and security standards while … enabling efficient access for analytics and operational teams. A strong command of Apache NiFi is essential for this role. You will be expected to design, implement, and maintain data flows using NiFi, ensuring accurate, efficient, and secure data ingestion, transformation, and delivery. You should be adept at identifying and resolving issues within NiFi flows and managing performance … with over 3 years of relevant experience in data engineering, platform engineering, or a related field, and demonstrated hands-on expertise in NiFi and data pipeline design in regulated environments.

Responsibilities:
- Design, develop, and maintain robust and secure data pipelines using NiFi and related big data technologies.
- Troubleshoot and optimize NiFi …
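The troubleshooting side of this role often comes down to spotting backpressure before a flow stalls. Below is a minimal sketch of one way to do that from outside the NiFi UI, polling NiFi's REST API for queued FlowFile counts. It assumes an instance reachable at the placeholder `NIFI_URL` with anonymous access; the endpoint and field names reflect the NiFi 1.x API, and in a regulated environment you would add TLS and authentication.

```python
# Sketch: poll the NiFi REST API and warn when queue depth suggests a stalled flow.
# NIFI_URL, the threshold, and anonymous access are illustrative assumptions.
import requests

NIFI_URL = "http://localhost:8080/nifi-api"  # hypothetical instance


def check_flow_backpressure(queued_threshold: int = 10_000) -> None:
    """Warn when the controller's total queued FlowFile count crosses a threshold."""
    status = requests.get(f"{NIFI_URL}/flow/status", timeout=10).json()
    controller = status["controllerStatus"]
    queued = controller["flowFilesQueued"]
    print(f"Active threads: {controller['activeThreadCount']}, queued FlowFiles: {queued}")
    if queued > queued_threshold:
        print("WARNING: queue depth exceeds threshold; a downstream processor may be stalled.")


if __name__ == "__main__":
    check_flow_backpressure()
```

In practice a check like this would run on a schedule and feed an alerting system rather than print to stdout; per-connection queue stats are also available from the API if the controller-level total is too coarse.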
Contract Data Engineer | Financial Services | 6+ Months | Competitive Day Rate

My client, a global consultancy, is looking for an experienced Data Engineer to join on a contract basis to support major digital transformation projects with Tier 1 banks. You'll help design and build scalable, cloud-based data solutions using Databricks, Python, Spark, and … traffic financial applications.

Key Skills & Experience:
- Strong hands-on experience with Databricks, Delta Lake, Spark Structured Streaming, and Unity Catalog
- Advanced Python/PySpark and big data pipeline development
- Familiarity with event streaming tools (Kafka, Azure Event Hubs)
- Solid understanding of SQL, data modelling, and lakehouse architecture
- Experience deploying via CI/CD tools (e.g. Azure DevOps, GitHub Actions)

Nice to Have:
- Knowledge of Scala/Java
- Understanding of GDPR and handling sensitive data

This is a UK-based contract role offering the chance to work on high-impact projects shaping the future of finance, with some onsite presence required.
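To make the skills list above concrete, here is a minimal sketch of the kind of pipeline it describes: Spark Structured Streaming reading events from Kafka and appending them to a Delta table addressed by a Unity Catalog three-level name. The broker, topic, schema, checkpoint path, and table name are all illustrative placeholders, not the client's actual systems.

```python
# Sketch: Kafka -> Structured Streaming -> Delta table (Unity Catalog name).
# All names below (broker, topic, paths, table) are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import DoubleType, StringType, StructField, StructType

spark = SparkSession.builder.appName("payments-stream").getOrCreate()

# Assumed event shape; a real pipeline would derive this from a schema registry.
schema = StructType([
    StructField("account_id", StringType()),
    StructField("amount", DoubleType()),
    StructField("currency", StringType()),
])

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
    .option("subscribe", "payments")                   # placeholder topic
    .load()
    .select(from_json(col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

query = (
    events.writeStream.format("delta")
    .option("checkpointLocation", "/mnt/checkpoints/payments")  # placeholder path
    .outputMode("append")
    .toTable("main.finance.payments")  # illustrative catalog.schema.table
)
```

The checkpoint location is what gives the stream exactly-once delivery into Delta across restarts, which matters for the high-traffic financial workloads the role mentions.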
Reading, Berkshire, England, United Kingdom Hybrid / WFH Options
Opus Recruitment Solutions Ltd
AWS Data Engineer – Contract
Location: Reading (Hybrid – 1–2 days/month onsite)
Rate: £500–550/day (Inside IR35)
Start Date: ASAP
Duration: 5 months (with potential for extension)

A leading financial services organisation is seeking an experienced AWS Data Engineer to join their Compliance Reporting team. This backend-focused role involves designing and deploying scalable data solutions that support the delivery of regulatory compliance reports across the business. You'll work with a modern AWS stack and infrastructure-as-code tools to build robust data pipelines and applications that process complex datasets from multiple operational systems.

Key Responsibilities:
- Build and maintain AWS-based ETL/ELT pipelines using S3, Glue (PySpark/Python), Lambda, Athena, Redshift, and Step Functions
- Develop backend applications to automate and support compliance reporting
- Process and validate complex data formats including nested JSON, XML, and CSV
- Collaborate with stakeholders to deliver technical solutions aligned with regulatory requirements
- Manage CI/CD workflows using Bitbucket, Terraform, and Atlantis
- Support database management and improve data ingestion
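As a concrete illustration of the first two responsibilities above, here is a minimal sketch of a Glue PySpark job that ingests nested JSON from S3, flattens it, and writes Parquet that Athena can query directly. The bucket names and paths are hypothetical placeholders; a real compliance job would add schema validation, partitioning, and job bookmarks.

```python
# Sketch of a Glue ETL job: nested JSON in S3 -> flattened Parquet for Athena.
# Bucket names and paths are illustrative, not a real organisation's layout.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read raw nested JSON landed by upstream operational systems.
raw = glue_context.create_dynamic_frame.from_options(
    connection_type="s3",
    connection_options={"paths": ["s3://compliance-raw/trades/"]},  # placeholder
    format="json",
)

# Flatten nested structs into dotted top-level columns so the output is tabular.
flat = raw.unnest()

# Write columnar output that Athena (or Redshift Spectrum) can query directly.
glue_context.write_dynamic_frame.from_options(
    frame=flat,
    connection_type="s3",
    connection_options={"path": "s3://compliance-curated/trades/"},  # placeholder
    format="parquet",
)

job.commit()
```

For deeply nested or array-heavy payloads, Glue's `relationalize` transform is the usual alternative to `unnest`, splitting the structure into multiple relational tables instead of one wide one.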