Data Engineer

We are seeking a Data Engineer to deliver secure, scalable, and reliable data pipelines, models, and reporting datasets that support business intelligence, analytics, and AI/ML workloads. The role is hands-on across Microsoft Fabric (including OneLake) and Snowflake, with a strong focus on ETL/ELT engineering, data modelling, performance optimisation, and data governance.

Client Details

The client is a well-established organisation within the financial services sector, operating in a professional, fast-paced, and data-driven environment. They are investing heavily in modern cloud technologies, advanced analytics, and scalable data platforms to support both internal operations and customer-focused insights.

Description

  • Deliver and enhance ETL/ELT pipelines across Microsoft Fabric and Snowflake
  • Configure and maintain ingestion from APIs, files, and other data feeds
  • Implement integration patterns between OneLake and Snowflake
  • Build and operate medallion architecture (bronze / silver / gold) layers
  • Define and maintain data models, schemas, and semantic layers
  • Optimise pipeline performance, query efficiency, and cloud cost
  • Troubleshoot and resolve data quality, reliability, and performance issues
  • Implement role-based access, secure data handling, and privacy-aware processing
  • Apply row-level security and dataset-level client isolation
  • Develop SQL, Python, and PySpark scripts for AI/ML feature and model datasets
  • Deliver work through Jira and collaborate with cross-functional teams
  • Use GitHub for version control and contribute to CI/CD processes
  • Produce documentation on data flows, transformations, and operational procedures

Profile

  • 5+ years of data engineering experience delivering production-grade pipelines
  • Strong understanding of modern cloud data architectures and cost optimisation
  • Advanced SQL and relational data modelling skills
  • Hands-on experience with Snowflake and Microsoft Fabric
  • Proficiency in Python and PySpark, including notebook-based development
  • Experience with workflow orchestration, CI/CD, and GitHub-based delivery
  • Strong knowledge of data security, encryption, and GDPR compliance
  • Experience supporting BI/reporting tools (Power BI preferred)
  • Excellent communication skills and the ability to work proactively and independently

Job Offer

  • Competitive daily rate.
  • 6-month contract with potential to extend.
  • 3 days per week on-site at Liverpool Street, London.
  • Temporary position offering valuable experience in data analytics and engineering.

Job Details

Company
Michael Page Technology
Location
London, South East, England, United Kingdom
Employment Type
Temporary
Salary
Salary negotiable