Senior Data Engineer

Overview:

We are seeking a highly skilled and experienced Senior Data Engineer, with a minimum of 5 years working with Databricks and Lakehouse architecture, to join our team in the consumer finance industry. The successful candidate will play a critical role in mapping requirements and in designing, implementing, and maintaining scalable data solutions, and will ensure seamless integration and operation of CI/CD pipelines.


Key Responsibilities:

  • Design, develop, and optimize robust data architectures using Databricks and Lakehouse principles to support large-scale and complex data analytics needs.
  • Implement and maintain CI/CD pipelines for the continuous integration and delivery of data solutions, safeguarding data quality and operational efficiency.
  • Collaborate with cross-functional teams, particularly Finance, to understand data requirements, map data through distributed systems, and translate these into technical solutions that align with business objectives.
  • Manage and optimize data storage and retrieval systems to ensure performance and cost-effectiveness.
  • Develop, maintain, and document ETL/ELT processes for data ingestion, transformation, and loading using industry best practices.
  • Ensure data security and compliance, particularly within the context of financial data, adhering to relevant regulations and standards.
  • Troubleshoot and resolve any data-related issues, ensuring high availability and reliability of data systems.
  • Evaluate and incorporate new technologies and tools to improve data engineering practices and productivity.
  • Mentor junior data engineers and provide technical guidance to the wider team.
  • Contribute to the strategic planning of data architecture and infrastructure.

Required Qualifications and Experience:

  • Bachelor’s degree in Computer Science, Information Technology, or a related field. A Master’s degree is a plus.
  • Minimum of 5 years of professional experience as a Data Engineer, or in a similar role, within the finance industry, including experience working with consumer finance data models.
  • Proficient in using Databricks for data engineering and analytics.
  • Strong experience with Lakehouse architecture and its optimization.
  • Highly proficient in programming languages such as Python.
  • Demonstrable expertise in implementing and managing CI/CD pipelines for data solutions.
  • Solid experience with cloud platforms (e.g., AWS, Azure, or GCP) and their data services.
  • Deep understanding of data warehousing concepts and technologies (e.g., Snowflake, Redshift).
  • Strong knowledge of ETL/ELT processes and tools.
  • Solid experience using Power BI or similar visualisation tools.
  • Experience working with big data technologies and frameworks (e.g., Spark).
  • Excellent problem-solving skills and a proactive approach to data engineering challenges.
  • Strong communication skills with the ability to articulate complex technical concepts to non-technical stakeholders.


Desirable Skills:

  • Certifications in Databricks or cloud technologies.
  • Experience with machine learning pipelines and model deployment.
  • Knowledge of regulatory requirements in the finance industry, such as GDPR or PCI-DSS.
  • Experience with agile development methodologies, such as Scrum or Kanban.

Company: Glow Services Corp
Location: London, UK
Posted