Forward Deployed Engineer - Databricks

Who we are

DecisionForest is a Bronze-Tier Databricks C&SI Partner, specialising in modern and scalable Data & AI platforms.

Mission

As a Forward Deployed Engineer, you will work directly with client teams to design, build and deploy production solutions on Databricks.

You’ll help customers solve real business problems using the Databricks platform, including data engineering, Lakehouse architecture, Unity Catalog, platform governance, workflow orchestration, performance optimisation and AI/data product enablement.

You will operate close to the customer. That means joining discovery sessions, understanding current-state systems, identifying implementation gaps, building technical artefacts, shipping code, documenting patterns and helping customers adopt Databricks technologies.

Responsibilities

  • Embed with customer teams to deliver production Databricks solutions.
  • Build data pipelines using PySpark, SQL, Delta Lake and Databricks Workflows.
  • Design and implement Lakehouse architecture patterns across bronze, silver and gold layers.
  • Implement Unity Catalog structures, including catalogs, schemas, tables, volumes, external locations, permissions, lineage and governance patterns.
  • Support migration from legacy data platforms or fragmented warehouse environments.
  • Build reliable ingestion, transformation and serving layers for analytics, AI and operational use cases.
  • Deliver technical artefacts for customers, including notebooks, jobs, pipelines, deployment templates, reference architectures, documentation and handover materials.
  • Provide deployment support for Databricks in enterprise environments.
  • Diagnose and resolve performance, reliability, cost and data quality issues.
  • Work with customer stakeholders across data engineering, analytics, governance, security, architecture and leadership teams.
  • Translate ambiguous customer requirements into clear technical plans.
  • Identify repeatable implementation patterns and feed them back into DecisionForest delivery standards.
  • Support pre-sales and post-sales activity where deep Databricks expertise is required.
  • Represent DecisionForest as a specialist Databricks partner in customer environments.

What you must bring

  • 4+ years of experience in data engineering, platform engineering, analytics engineering or technical consulting.
  • Demonstrable experience owning at least two end-to-end Databricks implementations in a production environment.
  • Strong production experience with PySpark and SQL.
  • Experience with Databricks Lakeflow, notebooks, repos, SQL Warehouses, and cluster/serverless compute patterns.
  • Solid understanding of distributed computing concepts (partitioning, shuffling, optimisation techniques).
  • Practical experience with cloud data platforms on at least one major provider (ideally both Azure and AWS), including storage, networking and security basics.
  • Good knowledge of data warehousing and Lakehouse patterns.
  • Experience with CI/CD and modern SDLC practices (Git-based workflows, code reviews, deployment pipelines; infrastructure-as-code is a plus).
  • Strong problem-solving skills and the ability to debug complex data issues across multiple systems (data quality, performance, reliability).
  • Confident communicator, able to translate technical detail into business language for non-technical stakeholders.
  • Self-driven, curious and committed to continuous learning in data engineering, cloud and analytics.
  • Familiarity with data governance, security and compliance in regulated environments.

Job Details

Company
DecisionForest
Location
United Kingdom
Posted