Design, build, and maintain scalable data pipelines using Azure Data Factory and Databricks to automate data ingestion, transformation, and processing workflows. Create and maintain dimensional data models and semantic layers that support business intelligence and analytics use cases. Build and optimise data transformation workflows using dbt, SQL, and Python to create clean, well-documented, and version-controlled analytics …
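For context on the kind of work that snippet describes, a minimal PySpark sketch of deriving a dimension and a fact table from raw ingested data might look like the following. This is illustrative only: all table and column names are hypothetical, and a real Azure Data Factory/Databricks pipeline would read from cloud storage written by the ingestion layer rather than from an in-memory DataFrame.

```python
# Illustrative only: a minimal PySpark sketch of deriving a dimension and a fact
# table from raw ingested data. Table and column names are hypothetical; a real
# pipeline would read from cloud storage written by the ingestion layer.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dimensional-model-sketch").getOrCreate()

# Raw orders as they might land after ingestion.
raw_orders = spark.createDataFrame(
    [
        ("o-1", "c-100", "Widget", "2024-01-05", 19.99),
        ("o-2", "c-101", "Gadget", "2024-01-06", 42.50),
        ("o-3", "c-100", "Widget", "2024-01-07", 19.99),
    ],
    ["order_id", "customer_id", "product_name", "order_date", "amount"],
)

# Dimension: one row per product, with a surrogate key.
dim_product = (
    raw_orders.select("product_name")
    .distinct()
    .withColumn("product_key", F.monotonically_increasing_id())
)

# Fact: measures at order grain, carrying the dimension's surrogate key.
fact_orders = (
    raw_orders.join(dim_product, on="product_name", how="left")
    .select(
        "order_id",
        "customer_id",
        "product_key",
        F.to_date("order_date").alias("order_date"),
        F.col("amount").cast("double").alias("amount"),
    )
)

fact_orders.show()
```

Keeping a surrogate key on the dimension keeps the fact table narrow and lets descriptive attributes change without rewriting historical facts.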
Uxbridge, England, United Kingdom Hybrid/Remote Options
Pepper Advantage
scalable, modular, reusable, and aligned with enterprise architecture and data strategy. Lead the selection and integration of appropriate technologies and platforms (e.g., data lakes, warehouses, real-time processing, APIs, semantic layers). Data Product Development Collaborate with product managers and data teams to translate business needs into technical data product designs. Architect solutions that support data discoverability, governance, lineage …
future of data intelligence? We are looking to onboard a Solution Architect to own the end-to-end architecture for a data platform: a self-serve, AI-native BI layer on top of a financial services data lake/mesh (gold/silver/bronze). You'll unify discovery, composition, and contribution; enable NLQ + chat-driven analytics … and enforce enterprise-grade governance, security, and observability across payments, cards, lending, and partner ecosystems. Roles and Responsibilities Define target & interim architecture: reference diagrams, data contracts, semantic/NLQ models, API/event schemas, and write-back patterns. Conversational BI: design NLQ orchestration, semantic layer, query planner, auto viz/forecasting, and "board mode"; integrate with the … organization's internal GPT platform. Contribution layer: architect governed upload/stream/write-back with schema & quality gates, stewardship workflows, and automatic metadata capture into the dictionary. Data platform: lakehouse (medallion) and curated marts; federation; cost/perf optimization; caching; workload isolation. Governance & observability: lineage, audit trail, prompt/response logging, evaluations and drift monitors for AI-generated …
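As a rough illustration of the medallion layering that listing refers to, the sketch below shows a toy bronze-to-silver-to-gold flow in PySpark. It is a hedged example only: in practice each layer would be a governed Delta table, and the domain names and cleansing rules here are invented.

```python
# Illustrative only: a toy bronze -> silver -> gold flow in PySpark, in the spirit
# of medallion (lakehouse) layering. Real implementations would read and write
# governed Delta tables; names and rules here are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("medallion-sketch").getOrCreate()

# Bronze: raw payment events, kept as delivered.
bronze_payments = spark.createDataFrame(
    [
        ("p-1", "cards", "105.20", "2024-03-01"),
        ("p-2", "lending", None, "2024-03-01"),
        ("p-3", "cards", "9.99", "2024-03-02"),
    ],
    ["payment_id", "product_line", "amount", "event_date"],
)

# Silver: typed, cleaned, and deduplicated records.
silver_payments = (
    bronze_payments.dropDuplicates(["payment_id"])
    .withColumn("amount", F.col("amount").cast("double"))
    .withColumn("event_date", F.to_date("event_date"))
    .filter(F.col("amount").isNotNull())
)

# Gold: a curated mart that a semantic/NLQ layer could safely expose.
gold_daily_by_product = (
    silver_payments.groupBy("event_date", "product_line")
    .agg(
        F.sum("amount").alias("total_amount"),
        F.count("*").alias("payment_count"),
    )
)

gold_daily_by_product.show()
```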
Reigate, England, United Kingdom Hybrid/Remote Options
esure Group
powered by technology Extensive experience designing fact and dimension tables across domains such as policy, quote, claims, pricing, and fraud, ensuring consistency and alignment with business metrics Deep practical knowledge of semantic layer design and implementation using DBT, SQL, and Delta Live Tables Strong background in building high-performance, scalable data models that support self-service BI and regulatory reporting requirements Direct …
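On the Delta Live Tables side, the fact and dimension work described above could be declared roughly as in the sketch below. This is a hedged illustration only: it runs solely inside a Databricks Delta Live Tables pipeline, and the source tables, columns, and quality rules are hypothetical rather than taken from this role.

```python
# Illustrative only: dimension/fact tables with basic quality rules declared via the
# Databricks Delta Live Tables Python API. Runs only inside a DLT pipeline; the
# upstream "silver" tables and all column names are hypothetical.
import dlt
from pyspark.sql import functions as F


@dlt.table(comment="One row per policy, conformed for reporting.")
@dlt.expect_or_drop("valid_policy_id", "policy_id IS NOT NULL")
def dim_policy():
    return (
        dlt.read("silver_policies")
        .select("policy_id", "product_type", "inception_date", "status")
        .dropDuplicates(["policy_id"])
    )


@dlt.table(comment="Claim-level fact with measures aligned to business metrics.")
def fact_claims():
    claims = dlt.read("silver_claims")
    return (
        claims.join(dlt.read("dim_policy").select("policy_id"), "policy_id", "inner")
        .select(
            "claim_id",
            "policy_id",
            F.col("claim_amount").cast("double").alias("claim_amount"),
            F.to_date("notified_date").alias("notified_date"),
        )
    )
```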
ecosystem for scalability and accuracy. Key responsibilities: Lead customer onboarding for analytics platforms, delivering training and technical support. Enrich datasets for AI/LLM applications, including metadata management and semantic layers. Monitor Natural Language Query (NLQ) performance and implement optimisation strategies. Develop and maintain a strategic roadmap for platform scalability and accuracy. Design and develop integrated data models using …
Responsibilities: Design and maintain scalable data platforms using cloud-native technologies such as Databricks and AWS across development, staging, and production environments Build and govern dimensional data models and semantic layers to power consistent and trusted analytics Integrate data from diverse sources including cloud warehouses, APIs, and operational systems Define semantic layers using dbt and Delta Live Tables … complex business processes into scalable and intuitive data models that support analytics and AI Extensive experience designing fact and dimension tables across core business domains Deep practical knowledge of semantic layer design using dbt, SQL, and Delta Live Tables Experience building and maintaining data pipelines across batch and streaming environments Strong understanding of governance frameworks, access controls, and …
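For the dbt side of "Define semantic layers using dbt and Delta Live Tables", a dbt Python model is one way a business-ready fact can be expressed. The sketch below is hypothetical: it assumes a PySpark-backed adapter such as dbt-databricks, and the staging models and columns it references are invented.

```python
# Illustrative only: a dbt Python model file (e.g. models/marts/fct_orders.py).
# dbt Python models are supported on adapters such as Databricks, Snowflake, and
# BigQuery; this sketch assumes a PySpark-backed adapter, and the referenced
# staging models and columns are hypothetical.
def model(dbt, session):
    dbt.config(materialized="table")

    orders = dbt.ref("stg_orders")        # upstream staging model
    customers = dbt.ref("stg_customers")  # upstream staging model

    # Join to produce a business-ready fact that a semantic layer can expose
    # with consistent metric definitions (revenue, order counts, etc.).
    fct = (
        orders.join(customers, orders.customer_id == customers.customer_id, "left")
        .select(
            orders.order_id,
            orders.customer_id,
            customers.customer_segment,
            orders.order_date,
            orders.amount,
        )
    )
    return fct
```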
using protocols like OPC UA, MQTT, REST. Design and manage data lakes, warehouses, and streaming platforms for predictive analytics, digital twins, and operational intelligence. Define and maintain asset hierarchies, semantic models, and metadata frameworks for contextualized industrial data. Implement CI/CD pipelines for data workflows and ensure lineage, observability, and compliance across environments. Collaborate with AI/ML … Apache Airflow, Glue, Kafka, Informatica, EventBridge etc. Industrial Data Integration: Familiarity with OT data schema originating from OSIsoft PI, SCADA, MES, and Historian systems. Information Modeling: Experience in defining semantic layers, asset hierarchies, and contextual models. Data Governance: Hands-on experience Data Quality: Ability to implement profiling, cleansing, standardization, and anomaly detection frameworks. Security & Compliance: Knowledge of data privacy …
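As a small, hedged example of the profiling, cleansing, and anomaly detection mentioned under Data Quality, the pandas sketch below applies a robust median/MAD rule to hypothetical historian-style sensor readings; a production framework would be far more configurable.

```python
# Illustrative only: basic profiling, cleansing, and anomaly flagging with pandas on
# hypothetical historian/sensor readings (asset and tag names are invented).
import pandas as pd

readings = pd.DataFrame({
    "asset_id": ["pump-01"] * 6,
    "tag": ["discharge_pressure"] * 6,
    "value": [4.1, 4.3, 4.2, None, 12.9, 4.2],  # one missing value, one spike
})

# Profiling: completeness and simple distribution statistics per tag.
profile = readings.groupby("tag")["value"].agg(["count", "mean", "std", "min", "max"])
missing_rate = readings["value"].isna().mean()

# Cleansing/standardisation: drop missing values for this simple example.
clean = readings.dropna(subset=["value"]).copy()

# Anomaly detection: robust z-score based on the median and MAD, flagging readings
# that deviate strongly from typical behaviour for the tag.
median = clean["value"].median()
mad = (clean["value"] - median).abs().median()
clean["robust_z"] = 0.6745 * (clean["value"] - median).abs() / mad
clean["is_anomaly"] = clean["robust_z"] > 3.5

print(profile)
print(f"missing rate: {missing_rate:.1%}")
print(clean)
```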
Leeds, England, United Kingdom Hybrid/Remote Options
Fruition Group
learning, and self-service insights. Senior Analytics Engineer/Data Modeller Responsibilities Design, build, and maintain analytical and AI-ready data marts sourced from the enterprise data lakehouse. Develop semantic models and dimensional structures optimised for BI, dashboarding, and machine learning. Ensure documentation, governance, and data consistency across domains. Collaborate with data engineers to support robust ETL/ELT … and hands-on experience with platforms such as Snowflake, Azure Synapse, BigQuery, or Redshift. Deep understanding of Data Vault, Kimball, and dimensional modelling techniques. Experience designing data marts and semantic layers for BI tools (Power BI, Tableau, Looker). Familiarity with analytics engineering tools including dbt, Airflow, and Git for version control. Excellent collaboration and communication skills, with strong …
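To make the dbt-plus-Airflow orchestration mentioned above concrete, here is a minimal sketch of how mart refreshes are often scheduled; the project path, selector, and schedule are hypothetical rather than taken from this role, and older Airflow versions use `schedule_interval` instead of `schedule`.

```python
# Illustrative only: a minimal Airflow DAG that refreshes analytics marts with dbt.
# The dbt project path, selector, and schedule are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="refresh_analytics_marts",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
    tags=["analytics-engineering"],
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run_marts",
        bash_command="dbt run --select marts --project-dir /opt/dbt/analytics",
    )
    dbt_test = BashOperator(
        task_id="dbt_test_marts",
        bash_command="dbt test --select marts --project-dir /opt/dbt/analytics",
    )

    # Only run tests once the models have been rebuilt.
    dbt_run >> dbt_test
```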
Microsoft AI certifications/training (AI-102/AI-900/DW-104). Experience designing and deploying AI-powered dashboards using Power BI. Understanding of data modelling and semantic layer design to support intelligent analytics. Ability to integrate AI outputs into reporting pipelines for real-time or predictive insights. Knowledge of DAX and Power Query in the …
BigQuery architecture, optimization, and cost management Advanced SQL skills and production-level experience using dbt (or similar tools) to build modular, testable transformation pipelines Practical mastery of LookML and semantic layer design within Looker, including Explores, joins, derived tables, and scalability best practices It will also help you to have Experience establishing and enforcing data governance standards through …
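One common BigQuery cost-management tactic implied above is estimating scan volume with a dry run before a query ships. The sketch below assumes the google-cloud-bigquery client with default credentials; the project, dataset, table, and partition column are invented.

```python
# Illustrative only: estimating BigQuery scan cost with a dry run, which reports the
# bytes a query would process without executing it. Requires google-cloud-bigquery
# and valid credentials; the project, dataset, and table names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client()  # uses application default credentials

sql = """
    SELECT order_date, SUM(amount) AS revenue
    FROM `my_project.analytics.fct_orders`
    WHERE order_date >= '2024-01-01'   -- filter on the partition column
    GROUP BY order_date
"""

job_config = bigquery.QueryJobConfig(dry_run=True, use_query_cache=False)
job = client.query(sql, job_config=job_config)

gib = job.total_bytes_processed / 1024 ** 3
print(f"This query would scan approximately {gib:.2f} GiB.")
```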
mentor two Data Engineers, providing technical guidance and coaching. Design and deploy modern data pipelines using Databricks and Azure. Transform unstructured external data into clean, business-ready models. Build semantic layers and embed Unity Catalog for robust data governance. Guide the team in taking Databricks from development to production. Support Terraform adoption and infrastructure-as-code practices. Why This … days/week onsite. What We're Looking For Proven technical leadership in data engineering teams. Deep experience with Databricks, Azure, SQL, and ideally Python. Strong understanding of MDM, semantic modeling, and Unity Catalog. Ability to handle messy, unstructured data and build scalable solutions. Comfortable mentoring and guiding a small, agile team. Finance sector experience is a bonus, but … not essential. Interview Process Initial coffee chat to explore team fit and technical challenges. Technical interview covering Databricks, MDM, semantic layers, pipeline design, and governance. This is a rare opportunity to lead, build, and shape the future of data in a growing financial services business. Ready to make your mark? Let's talk. Modis International Ltd acts as an …
South East, England, United Kingdom Hybrid/Remote Options
Akkodis
Bracknell, South East England, United Kingdom Hybrid/Remote Options
Akkodis
Employment Type: Permanent
Salary: £70000 - £80000/annum Car Allowance + Bonus
Manchester, Lancashire, England, United Kingdom Hybrid/Remote Options
Akkodis
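Since the Akkodis listing above centres on Unity Catalog for data governance, a brief, hedged sketch of what granting an analyst group read access to a curated schema can look like from a Databricks notebook follows; the catalog, schema, table, and group names are all hypothetical.

```python
# Illustrative only: Unity Catalog-style governance grants run from a Databricks
# notebook. Catalog, schema, table, and group names are hypothetical; `spark` is the
# SparkSession that Databricks notebooks provide, and the cluster must be Unity
# Catalog-enabled.
grants = [
    "GRANT USE CATALOG ON CATALOG finance TO `analysts`",
    "GRANT USE SCHEMA ON SCHEMA finance.gold TO `analysts`",
    "GRANT SELECT ON TABLE finance.gold.fct_transactions TO `analysts`",
]

for statement in grants:
    spark.sql(statement)

# Review the resulting grants on the table.
spark.sql("SHOW GRANTS ON TABLE finance.gold.fct_transactions").show()
```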
main brand. As part of this, they're building a central database spanning multiple datacubes across commercial, advertising, and revenue functions. They previously worked with advisory specialists to create semantic layers, but there are gaps in how this translates into clear, usable insight - which is why they need a strong analyst to take ownership of the visualisation and logic …
10+ PMs), crafted outcome-oriented strategies, and delivered internal platforms that scale. Technically fluent: You speak the language of modern data architecture - from data mesh to streaming pipelines and semantic layers - and work seamlessly with engineers. Business-aware: You can translate technical capability into tangible commercial and customer impact. Governance-ready: You understand the fine print - privacy, consent, compliance …
City of London, London, United Kingdom Hybrid/Remote Options
Intec Select
Bonus & Benefits – FinTech Overview: An established global FinTech organisation is seeking a skilled Senior Analytics Engineer/Data Engineer to help define and manage its analytical data models and semantic layers. You will support the development of reliable, well-structured datasets that enable accurate reporting, improved data accessibility, and better decision-making across the business. Fluent Russian language skills … are essential for this role. Role & Responsibilities: Build and maintain scalable semantic/analytics layers to create consistent business metrics and definitions. Work with teams across the business to understand requirements and translate them into reliable models. Develop core data models following modern data warehouse principles. Write high-quality SQL and maintain dbt-based transformations, tests, and documentation. Support … business requirements into logical, scalable data models. Knowledge of cloud data platforms (e.g., Snowflake, Redshift, BigQuery). Strong communication and documentation skills. Structured, detail-oriented mindset. Desirable: Experience with semantic modelling tools (e.g., dbt SL, LookML). Familiarity with workflow orchestration and BI tooling. Version control experience (Git). Python for scripting. Offer Details: Type: Permanent Location: London/ …
You Proven Strategic Leader: Demonstrated success leading large product teams (10+ PMs) and delivering outcomes at scale. Technically Fluent: Strong grasp of modern data architectures (data mesh, streaming platforms, semantic layers) and the ability to partner credibly with technical teams. Commercially Astute: Skilled at translating complex technical initiatives into executive-level priorities with clear business value. Governance-Minded: Experienced …