Databricks SME

Job Title: Solutions Architect (Databricks Expert)

Rate: Competitive

Location: Europe

Contract Length: 6-12 months

A consultancy client of ours has secured a project requiring a Databricks expert. This is an exciting opportunity to work on cutting-edge AI projects and build cloud-based systems that deliver real impact.

Databricks Solution Architect Key Responsibilities:

  • Architect and optimise scalable, high-performance AI and data solutions leveraging Azure Databricks and modern data platform technologies.
  • Serve as a subject matter expert on Databricks architecture, performance tuning, and best practices to enable advanced analytics and machine learning use cases.
  • Partner with data engineering, BI, analytics, and AI/ML teams to design robust, reusable, and production-grade data pipelines and model deployment frameworks.
  • Champion the adoption of Databricks capabilities including Delta Lake, Unity Catalog, and MLflow, ensuring alignment with enterprise AI strategy.
  • Lead the migration of legacy ETL and data processing workflows to modern, Databricks-native architectures that support AI-driven initiatives.
  • Enforce data quality, governance, lineage, and security standards to maintain a trusted and compliant AI data ecosystem.
  • Mentor and uplift teams, promoting best practices in Databricks usage, scalable data engineering, and MLOps integration.
  • Troubleshoot and resolve complex platform issues, acting as the senior escalation point for Databricks and AI architecture concerns.
  • Continuously improve data platform architecture, tools, and engineering practices to support evolving AI and analytics demands.
  • Collaborate closely with business and technical stakeholders to translate strategic data and AI needs into fit-for-purpose, production-ready solutions.

Databricks Solution Architect Experience and Qualifications Required:

  • Databricks Certified Professional.
  • Working knowledge of MLOps.
  • Extensive experience in a consulting environment, with strong stakeholder engagement capability.
  • Strong programming proficiency in Python, SQL, and Apache Spark, with a proven ability to design, optimise, and scale high-performance data pipelines for AI and analytics applications.
  • Deep understanding of cloud-native architecture within the Azure ecosystem — including Data Lake, Data Factory, and supporting services — to build resilient and scalable AI data platforms.
  • Skilled in data modelling and solution design, applying dimensional modelling principles (e.g., Kimball methodology) to support advanced analytics and machine learning.
  • Demonstrated success delivering enterprise-grade data and AI products in complex, large-scale environments with high reliability and performance requirements.
  • Solid grounding in data governance, security, and compliance frameworks, ensuring solutions meet organisational and regulatory standards.
  • Hands-on experience with CI/CD and MLOps practices, leveraging modern DevOps tooling to enable reliable and automated deployment of data and AI pipelines.
  • Exceptional problem-solving abilities with a track record of diagnosing and resolving complex technical issues in distributed data environments.
  • Strong communication and stakeholder engagement skills, able to bridge technical and business domains effectively.
  • Proven experience mentoring and upskilling data and AI engineers, fostering a culture of technical excellence and continuous learning.

If this sounds like an exciting opportunity, please apply with your CV.

Company
X4 Technology
Location
United Kingdom
Posted