Data Engineer - Databricks Contract
A leading UK organisation is undergoing a major customer and data transformation programme and is looking for an experienced Contract Data Engineer to support a large-scale platform and data migration.
This is a high-impact role within a fast-paced environment, suited to a contractor who can operate autonomously, bring structure to complex data challenges, and implement best-practice ingestion and engineering processes across a modern cloud stack.
THE ROLE
You will join a critical transformation programme focused on consolidating multiple legacy platforms into a new customer engagement ecosystem. The role will centre on building and optimising data ingestion pipelines, supporting real-time integrations, and enabling analytics-ready datasets across the business.
This is a hands-on engineering role requiring strong experience across Databricks, Python, and modern cloud data environments, as well as the ability to work across multiple third-party systems and stakeholders.
KEY RESPONSIBILITIES
- Design, build and optimise real-time and batch data ingestion pipelines.
- Implement best-practice data engineering and architecture across Databricks and Azure.
- Support migration and consolidation of data from multiple legacy systems into a new platform ecosystem.
- Cleanse, merge, and deduplicate complex customer and transactional datasets.
- Integrate data across multiple sources, including CRM, loyalty platforms, and third-party systems.
- Develop real-time ingestion solutions using APIs and webhooks.
- Build and maintain analytics-ready datasets and data models.
- Support complex data modelling and transformation logic for reporting and analytics.
- Review and enhance the existing Databricks architecture to ensure scalability and performance.
- Develop API-led solutions to manage data updates and deletions across systems.
- Support ingestion from cloud storage (including Azure Blob) and bespoke data processes.
- Ensure strong data governance practices across ingestion, deletion, and data consistency.
Essential:
- Strong Python experience for data engineering.
- Proven track record building real-time and batch ingestion pipelines.
- Hands-on Databricks experience within production environments.
- Experience working within modern cloud data platforms (Azure preferred).
- Strong experience integrating data via APIs and webhooks.
- Background in complex data migration, consolidation, and cleansing projects.
- Ability to work independently within fast-moving, high-pressure programmes.
- Strong stakeholder engagement and problem-solving capability.
Desirable:
- Experience working with customer, CRM, or loyalty data platforms.
- Exposure to Snowflake or similar cloud data warehouses.
- Experience in large-scale customer or data transformation programmes.
- Experience implementing data engineering best practice in evolving environments.
CONTRACT DETAILS
- Day Rate: Competitive (DOE)
- Location: Midlands-based office (ideally 1 day per week on site; flexible for strong candidates able to commute weekly)
- Start Date: ASAP
- Duration: Initial 4 months with strong extension potential
If you are a hands-on Data Engineer with strong ingestion and Databricks experience looking to join a high-impact transformation programme, please apply to find out more.