Data Architect

Location: Central London

Hybrid role (2-3 days per week at the client location)

Job type: Permanent

We are looking for a highly skilled Data Architect (Technical) with expertise in Databricks, PySpark, and modern data engineering practices. The ideal candidate will lead the design, development, and optimisation of scalable data pipelines, while ensuring data accuracy, consistency, and performance across the enterprise Lakehouse platform. This role requires strong leadership, technical depth, and the ability to collaborate with cross-functional teams.

Key Responsibilities

Lead the design, development, and maintenance of scalable, high-performance data pipelines on Databricks.
Architect and implement data ingestion, transformation, and integration workflows using PySpark, SQL, and Delta Lake.
Guide the team in migrating legacy ETL processes to modern cloud-based data pipelines.
Ensure data accuracy, schema consistency, row counts, and KPIs during migration and transformation.
Collaborate with Data Engineering, BI, and Security teams to define data standards, governance, and compliance.
Optimize Spark jobs and Databricks clusters for performance and cost efficiency.
Support real-time and batch data processing for downstream systems (e.g., BI tools, APIs, reporting consumers).
Mentor junior engineers, conduct code reviews, and enforce best practices in coding, testing, and deployment.
Validate SLAs for data processing and reporting, ensuring business requirements are consistently met.
Stay updated with industry trends and emerging technologies in data engineering, cloud platforms, and analytics.

Required Skills & Qualifications

10-12 years of experience in data engineering, with at least 3 years in a technical lead role.
Strong expertise in Databricks, PySpark, Delta Lake, and dbt.
Advanced proficiency in SQL, ETL/ELT pipelines, and data modelling.
Experience with Azure Data Services (ADLS, ADF, Synapse) or other major cloud platforms (AWS/GCP).
Strong knowledge of data warehousing, transformation logic, SLAs, and dependencies.
Hands-on experience with real-time streaming and near-real-time batch processing, and with optimising Databricks, dbt, and Delta Lake workloads, is a plus.
Familiarity with CI/CD pipelines, DevOps practices, and Git-based workflows.
Knowledge of data security, encryption, and compliance frameworks (GDPR, SOC 2, ISO) is good to have.
Excellent problem-solving skills, leadership ability, and communication skills.

Preferred Qualifications

Certifications in Databricks or Azure.
Experience with dbt, APIs, or BI integrations (Qlik, Power BI, Tableau).

Company
Ubique Systems UK Limited
Location
London, United Kingdom
Hybrid / WFH Options
Employment Type
Permanent
Salary
GBP Annual
Posted