Prompt Engineer
Job Description
The Opportunity
A leading global professional services organisation is seeking an experienced Analytics/Prompt Engineer to join a high-performing Data & AI transformation team.
This role sits at the intersection of data engineering and analytics, transforming raw, ingested data into trusted, structured, analytics-ready data products that power business intelligence, reporting, and advanced analytics initiatives across the enterprise.
You will play a key role in building semantic layers, defining KPIs, and ensuring data quality and governance across a modern cloud data platform.
Key Responsibilities
- Design and implement scalable data models within Databricks.
- Transform raw datasets into curated, analytics-ready data products using Delta Lake and Spark.
- Build and maintain semantic layers for business reporting.
- Translate stakeholder requirements into technical specifications.
- Define KPIs and controlled vocabulary across data products.
- Own reconciliation logic and ensure data consistency across teams.
- Optimise SQL queries and workflows for performance.
- Collaborate closely with Data Engineers, Analysts and Data Scientists.
- Maintain documentation and governance standards.
- Support CI/CD processes and version control workflows.
Required Experience
- Strong SQL expertise (data modelling & optimisation).
- Hands-on Databricks experience (Delta Lake + Spark).
- Experience with dbt or similar transformation frameworks.
- Python for data manipulation and automation.
- Azure cloud experience.
- Familiarity with BI tools (Power BI preferred).
- Strong understanding of data warehousing concepts.
- Exposure to Git-based version control & CI/CD pipelines.
- Ability to engage confidently with business stakeholders.
Nice to Have
- Exposure to ML/AI workflows within Databricks.
- Experience building semantic layers from scratch.
- Enterprise-scale governance experience.
Ideal Profile
This role suits someone who:
- Thinks in terms of data products, not just pipelines.
- Enjoys bridging technical delivery with business impact.
- Can operate in a fast-paced, collaborative environment.
- Takes ownership of data quality and KPI consistency.
Why Apply?
- Modern cloud stack (Databricks + Azure).
- Strategic transformation programme.
- Enterprise-scale data challenges.
- High stakeholder visibility.
- Strong likelihood of contract extension.
If you would like to explore further, please apply or get in touch directly for a confidential discussion.
McGregor Boyall is an equal opportunity employer and does not discriminate on any grounds.