Design, build, and maintain scalable data pipelines using Azure Data Factory and Databricks to automate data ingestion, transformation, and processing workflows. Create and maintain dimensional data models and semantic layers that support business intelligence and analytics use cases. Build and optimise data transformation workflows using dbt, SQL, and Python to create clean, well-documented, and version-controlled analytics …
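As a rough illustration of the kind of clean, version-controlled transformation logic this role describes, here is a minimal Python sketch. All record fields and the `transform` function are hypothetical, invented for illustration; a real pipeline would live in dbt/Databricks rather than a standalone script:

```python
# Hypothetical sketch of one small transformation step: raw ingestion
# records are validated and normalised into an analytics-ready shape.
# All field names are invented for illustration.
from datetime import datetime

def transform(raw_rows):
    """Drop incomplete rows, normalise casing, and parse timestamps."""
    clean = []
    for row in raw_rows:
        # Basic data-quality gate: skip records missing required fields.
        if not row.get("customer_id") or not row.get("event_ts"):
            continue
        clean.append({
            "customer_id": row["customer_id"].strip().upper(),
            "event_ts": datetime.fromisoformat(row["event_ts"]),
            "amount": float(row.get("amount", 0)),
        })
    return clean

rows = transform([
    {"customer_id": " ab1 ", "event_ts": "2024-01-05T10:00:00", "amount": "12.30"},
    {"customer_id": "", "event_ts": "2024-01-06T11:00:00"},  # rejected: no id
])
```

In a dbt project the same logic would typically be expressed as a SQL model with tests, so it stays documented and version-controlled alongside the rest of the analytics code.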
Watford, Hertfordshire, East Anglia, United Kingdom
Allwyn UK
validation or QA steps, confirming accuracy of metrics and performance insights. 8. User Satisfaction & Self-Service Promotion: Advocate for self-service capabilities, supporting creation of curated datasets, intuitive dashboards, or semantic layers for non-technical users. Conduct user training or adoption sessions, gathering feedback to refine solutions and measure user satisfaction. 9. Project & Delivery Management: Oversee end-to-end analytics initiatives …
using protocols like OPC UA, MQTT, and REST. Design and manage data lakes, warehouses, and streaming platforms for predictive analytics, digital twins, and operational intelligence. Define and maintain asset hierarchies, semantic models, and metadata frameworks for contextualized industrial data. Implement CI/CD pipelines for data workflows and ensure lineage, observability, and compliance across environments. Collaborate with AI/ML … Apache Airflow, Glue, Kafka, Informatica, EventBridge etc. Industrial Data Integration: Familiarity with OT data schemas originating from OSIsoft PI, SCADA, MES, and Historian systems. Information Modeling: Experience in defining semantic layers, asset hierarchies, and contextual models. Data Governance: Hands-on experience. Data Quality: Ability to implement profiling, cleansing, standardization, and anomaly detection frameworks. Security & Compliance: Knowledge of data privacy …
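The "asset hierarchies and contextual models" responsibility above can be sketched in miniature: raw tag readings (as might arrive from a historian) are enriched with asset metadata before landing in the lake. The hierarchy, asset paths, and field names below are all hypothetical:

```python
# Hypothetical sketch of contextualising industrial tag data against an
# asset hierarchy. All asset paths and metadata fields are invented.
ASSET_HIERARCHY = {
    "site-a/line-1/pump-01": {"site": "site-a", "line": "line-1", "asset_type": "pump"},
    "site-a/line-1/valve-02": {"site": "site-a", "line": "line-1", "asset_type": "valve"},
}

def contextualise(reading):
    """Attach asset metadata to a raw tag reading, if the asset is known."""
    context = ASSET_HIERARCHY.get(reading["asset_path"])
    if context is None:
        return None  # unknown asset: route to a quarantine/review queue
    return {**reading, **context}

enriched = contextualise({"asset_path": "site-a/line-1/pump-01", "value": 7.2})
```

Keeping this mapping as governed metadata (rather than hard-coded lookups) is what lets lineage and observability tooling trace each contextualised record back to its source system.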
Bracknell, South East England, United Kingdom — Hybrid/Remote Options
Akkodis
Mentor two Data Engineers, providing technical guidance and coaching. Design and deploy modern data pipelines using Databricks and Azure. Transform unstructured external data into clean, business-ready models. Build semantic layers and embed Unity Catalog for robust data governance. Guide the team in taking Databricks from development to production. Support Terraform adoption and infrastructure-as-code practices. Why This … days/week onsite. What We're Looking For: Proven technical leadership in data engineering teams. Deep experience with Databricks, Azure, SQL, and ideally Python. Strong understanding of MDM, semantic modeling, and Unity Catalog. Ability to handle messy, unstructured data and build scalable solutions. Comfortable mentoring and guiding a small, agile team. Finance sector experience is a bonus, but not essential. Interview Process: Initial coffee chat to explore team fit and technical challenges. Technical interview covering Databricks, MDM, semantic layers, pipeline design, and governance. This is a rare opportunity to lead, build, and shape the future of data in a growing financial services business. Ready to make your mark? Let's talk. Modis International Ltd acts as an …
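The "semantic layer" idea that recurs across these roles can be shown with a minimal, hypothetical sketch: business-friendly metric names are mapped onto raw column expressions, so non-technical users query metrics rather than tables. All metric and column names below are invented:

```python
# Hypothetical sketch of a tiny semantic layer: named business metrics are
# defined once, over raw columns, and reused by every consumer.
# All metric and column names are invented for illustration.
SEMANTIC_MODEL = {
    "net_revenue": lambda row: row["gross_amount"] - row["discount"],
    "order_count": lambda row: 1,
}

def measure(rows, metric):
    """Aggregate a named metric over raw rows via the semantic model."""
    expr = SEMANTIC_MODEL[metric]
    return sum(expr(r) for r in rows)

raw = [
    {"gross_amount": 100.0, "discount": 10.0},
    {"gross_amount": 50.0, "discount": 0.0},
]
total = measure(raw, "net_revenue")
```

In practice this definition layer would live in a governed tool (a dbt semantic model, a Databricks view under Unity Catalog, or a Datasphere view) so the metric logic is versioned and access-controlled, not duplicated per dashboard.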
class technology powerhouse? Apply now! Responsibilities: Manage and maintain the enterprise data architecture within SAP Datasphere, ensuring scalability, security, and performance. Implement advanced data models using spaces, views, and semantic layers to support operational and analytical use cases. Integrate data from SAP (e.g., S/4HANA, BW/4HANA) and non-SAP sources (e.g., APIs, cloud platforms) to enable … within SAP Datasphere. Requirements: Hands-on experience designing, implementing, and optimizing solutions in SAP Datasphere (formerly SAP Data Warehouse Cloud). Proficiency in building complex data models using spaces, views, semantic layers, and graphical modeling tools. Deep understanding of integrating SAP (S/4HANA, BW/4HANA) and non-SAP sources (cloud platforms, APIs, flat files) into Datasphere. Experience in …