Data Architect
Derby, England, United Kingdom
SPG Resourcing
a high-performing and secure environment. The role reports to a project delivery lead and works closely with internal technical teams.

Key Responsibilities:
- Design and implement Databricks Lakehouse architecture (Delta Lake, Unity Catalog, etc.)
- Develop ETL/ELT pipelines using Spark, Python, SQL, and Databricks workflows
- Integrate with Azure services and BI tools (e.g. Power BI)
- Optimise performance … and support CI/CD and MLOps pipelines
- Enable knowledge transfer through code reviews, training, and reusable templates

Key Skills:
- In-depth experience with Databricks (Delta Lake, Unity Catalog, Lakehouse architecture)
- Strong knowledge of Azure services (e.g. Data Lake, Data Factory, Synapse)
- Solid hands-on skills in Spark, Python, PySpark, and SQL
- Understanding of …