Databricks Architect

Job Title: Databricks Architect

Location: London, UK

Work Mode: Onsite

Experience: 10–15 years (minimum 5 years in Databricks)

Employment Type: Full-time

About the Role

We are seeking a highly experienced Databricks Architect to lead the design, development, and optimization of scalable data platforms and advanced analytics solutions. This role requires a deep understanding of Databricks Lakehouse architecture, big data engineering, cloud ecosystems, and enterprise data strategy. The ideal candidate will partner with business and technology leaders to deliver high-performance data systems that support analytics, AI/ML, and real-time insights.

Key Responsibilities

Architecture & Solution Design

  • Architect end-to-end Databricks Lakehouse solutions including data ingestion, ETL/ELT pipelines, Delta Lake, data warehousing, and data governance frameworks.
  • Define and enforce best practices, security standards, and performance optimization across Databricks workloads.
  • Design scalable big data architectures leveraging Spark, Delta Live Tables, Unity Catalog, and MLflow.
  • Lead cloud architecture design on Azure/AWS/GCP (based on organizational need), integrated with Databricks.

Data Engineering & Platform Build

  • Build and optimize large-scale ETL/ELT pipelines using PySpark, SQL, and Delta Lake.
  • Oversee data quality frameworks, metadata management, lineage, and monitoring.

Leadership & Stakeholder Engagement

  • Work closely with business leaders and product teams to translate requirements into robust technical solutions.
  • Guide development teams, perform architecture reviews, and ensure platform engineering excellence.
  • Conduct technical workshops, POCs, and roadmap planning for the Databricks environment.

Performance, Security & Governance

  • Optimize Databricks clusters, query performance, and costs.
  • Implement data governance standards using Unity Catalog, RBAC, and compliance guidelines (GDPR, ISO).
  • Ensure resilience, scalability, and disaster recovery readiness.

Required Skills & Qualifications

  • 10–15 years of overall experience in data engineering or data architecture.
  • 5+ years of hands-on architecture experience specifically in Databricks.
  • Expert in Apache Spark, PySpark, Databricks SQL, Delta Lake, and distributed systems.
  • Strong understanding of Lakehouse architecture, BI ecosystems, and modern data platforms.
  • Cloud expertise in Azure / AWS / GCP (Azure preferred for many UK clients).
  • Experience with CI/CD, Infrastructure-as-Code (Terraform preferred), and DevOps.
  • Proven experience leading teams and delivering enterprise-grade data solutions.

Preferred Qualifications

  • Databricks certifications (e.g., Databricks Certified Data Engineer Professional, Architect).
  • Experience building ML solutions using MLflow or integrating with cloud ML services.
  • Experience in BFSI, Retail, Telecom, or Healthcare data environments.
  • Experience with real-time data processing (Kafka, Azure Event Hubs, Kinesis).

Soft Skills

  • Excellent communication and stakeholder management skills.
  • Strong problem-solving and solution-oriented mindset.
  • Ability to lead in a fast-paced and dynamic environment.
  • Collaborative, proactive, and able to provide strong technical direction.

Why Join Us?

  • Work onsite with a high-impact, cutting-edge data team in London.
  • Opportunity to lead large-scale data transformation initiatives.
  • Exposure to enterprise data strategy, advanced analytics, and AI projects.
  • Competitive compensation and career-growth opportunities.

Job Details

Company: Tredence Inc
Location: London, UK
Employment Type: Full-time