Data Engineer

Role: Senior Databricks Architect

Location: London, UK

Mode: Hybrid (3 days onsite)

Job Description:

We are looking for an experienced Databricks Architect/Data Engineer to design, build, and optimize our Lakehouse architecture on Databricks. You will play a key role in shaping our data strategy, ensuring scalability, performance, and governance while working with Delta Lake, Unity Catalog, and PySpark.

Key Responsibilities:

Databricks Lakehouse Architecture: Design and implement scalable Databricks Lakehouse solutions with Delta Lake for optimized storage and analytics.

Data Governance & Cataloging: Establish data cataloging, lineage, and metadata management for improved discoverability.

Performance Optimization: Tune Spark/PySpark jobs for efficiency in large-scale data processing.

Data Modelling & Quality: Develop dimensional and Data Vault models and enforce data quality checks.

Collaboration: Work with data scientists, analysts, and business teams to enable self-service analytics.

CI/CD & Automation: Implement CI/CD for Databricks workflows and integrate with Azure/AWS/GCP data ecosystems.

Primary Skills (Must-Have):

Databricks – Architecture, Delta Lake, Lakehouse, Unity Catalog

PySpark (optimization, UDFs, Delta operations)

SQL (advanced querying, performance tuning)

Data Lake/Warehouse best practices

Secondary Skills (Nice-to-Have):

Python (for scripting & automation)

Data Modelling (star schema, Kimball, Data Vault)

Data Quality/Validation frameworks

ETL/ELT pipelines

Work Arrangement:

Hybrid: 3 days per week in the office (ideally Tuesday–Thursday) at Paddington, London

Flexible remote work (2 days/week)

Job Details:

Company: Quantum World Technologies Inc

Location: Bradford, UK (Hybrid / Remote Options)

Employment Type: Full-time