South East London, England, United Kingdom · Hybrid / WFH Options
Osmii
Azure Synapse) and architecting cloud-native data platforms. Programming Proficiency: Expert-level skills in Python (PySpark) and SQL for data engineering and transformation. Scala is a strong plus. Data Modelling: Strong understanding and practical experience with data warehousing, data lake, and dimensional modelling concepts. ETL/ELT & Data Pipelines: Proven track record of designing, building, and optimizing …
and performance tuning. Solid experience with Python, ideally using PySpark in Azure Databricks. Hands-on experience with Azure Data Lake Storage Gen2. Understanding of data warehouse concepts, dimensional modelling, and data architecture. Experience working with Delta Lake and large-scale data processing. Experience building ETL pipelines in Azure Data Factory or similar orchestration tools. Familiarity …
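Dimensional modelling comes up repeatedly in these listings. As a minimal sketch of the core idea (a fact table joined to dimension tables in a star schema), using an in-memory SQLite database; all table and column names here are invented for illustration:

```python
import sqlite3

# Minimal star schema: one fact table joined to two dimension tables.
# All names are illustrative, not taken from any specific listing.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, iso_date TEXT);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE fact_sales  (date_key INTEGER REFERENCES dim_date,
                          product_key INTEGER REFERENCES dim_product,
                          qty INTEGER, amount REAL);
INSERT INTO dim_date    VALUES (20240101, '2024-01-01');
INSERT INTO dim_product VALUES (1, 'Widget');
INSERT INTO fact_sales  VALUES (20240101, 1, 3, 29.97);
""")

# A typical analytical query: aggregate the fact table, slice by dimensions.
row = conn.execute("""
    SELECT d.iso_date, p.name, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_date d    ON d.date_key = f.date_key
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY d.iso_date, p.name
""").fetchone()
print(row)  # ('2024-01-01', 'Widget', 29.97)
```

The same shape scales up in Databricks or Snowflake; the surrogate keys on the dimensions are what make slicing the fact table cheap.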
Nottingham, England, United Kingdom · Hybrid / WFH Options
Cloud2 Consult
experience working with Fabric-based solutions. Azure Ecosystem: At least 2 years of experience delivering business solutions using: Azure Data Factory/Synapse, Power BI, Azure SQL Server. Data Modelling: Strong knowledge of dimensional modelling (Kimball). Familiarity with other techniques such as Data Vault 2.0. Development Experience: Minimum 3 years in database and/or analytics …
Crawley, England, United Kingdom · Hybrid / WFH Options
McCabe & Barton
design and deliver scalable data products that enable self-service analytics using Snowflake, Power BI, Python, and SQL. What You’ll Do: Build robust ETL/ELT pipelines and dimensional models for analytics. Enforce data quality, security, and governance in a Data Mesh setup. Enable self-service insights through intuitive data models and training. Own data products end-to … continuous improvement. Promote innovation and best practices across the organisation. What We’re Looking For: Strong skills in SQL and Power BI; Python is beneficial. Solid experience with data modelling and ETL pipelines. Knowledge of cloud environments – Azure. Experience with Snowflake and Databricks. Familiarity with data governance in decentralised environments. Excellent communication and stakeholder engagement skills. A proactive, problem …
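The ETL/ELT pipelines with data-quality enforcement described above can be sketched in miniature as three pure-Python stages with a validation gate; field names and the rejection rule are invented for illustration:

```python
# Hedged sketch of an ETL pipeline with a basic data-quality gate.
# Field names and the validation rule are invented for illustration.

def extract():
    # In practice this would read from an API, file, or source database.
    return [{"id": 1, "amount": "10.5"}, {"id": 2, "amount": ""}]

def transform(rows):
    clean, rejected = [], []
    for row in rows:
        try:
            clean.append({"id": row["id"], "amount": float(row["amount"])})
        except ValueError:
            rejected.append(row)  # quarantine rows that fail validation
    return clean, rejected

def load(rows, target):
    target.extend(rows)
    return len(rows)

warehouse = []  # stand-in for the warehouse table
clean, rejected = transform(extract())
loaded = load(clean, warehouse)
print(loaded, len(rejected))  # 1 1
```

Quarantining bad rows rather than silently dropping them is what makes the quality gate auditable downstream.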
South East London, England, United Kingdom · Hybrid / WFH Options
83data
including scheduling, monitoring, and alerting. Collaborate with cross-functional teams (Product, Engineering, Data Science, Compliance) to define data requirements and build reliable data flows. Champion best practices in data modelling, governance, and DevOps for data engineering (CI/CD, IaC). Serve as a key communicator between technical teams and business stakeholders, translating complex data needs into actionable plans. … Snowflake, BigQuery, Redshift). Hands-on experience with Apache Airflow (or similar orchestration tools). Strong proficiency in Python and SQL for pipeline development. Deep understanding of data architecture, dimensional modelling, and metadata management. Experience with cloud platforms (AWS, GCP, or Azure). Familiarity with version control, CI/CD, and Infrastructure-as-Code (Terraform or similar). …
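The orchestration idea behind tools like Apache Airflow, running tasks in dependency order, can be sketched with the standard library alone. This is a toy model, not Airflow's actual API (real DAGs are declared with Airflow operators and a scheduler); the task names are invented:

```python
from graphlib import TopologicalSorter

# Toy orchestration sketch: execute tasks in dependency order.
ran = []

# Each entry maps a task name to (callable, list of upstream dependencies).
tasks = {
    "extract":   (lambda: ran.append("extract"),   []),
    "transform": (lambda: ran.append("transform"), ["extract"]),
    "load":      (lambda: ran.append("load"),      ["transform"]),
    "report":    (lambda: ran.append("report"),    ["load"]),
}

graph = {name: deps for name, (_, deps) in tasks.items()}
for name in TopologicalSorter(graph).static_order():
    tasks[name][0]()

print(ran)  # ['extract', 'transform', 'load', 'report']
```

An orchestrator adds scheduling, retries, and alerting on top of exactly this dependency-ordered execution.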
depth knowledge of Snowflake architecture, features, and best practices. · Experience with CI/CD pipelines using Git and GitHub Actions. · Knowledge of various data modeling techniques, including Star Schema, Dimensional models, and Data Vault. · Hands-on experience with: · Developing data pipelines (Snowflake), writing complex SQL queries. · Building ETL/ELT/data pipelines. · Kubernetes and Linux containers (e.g., Docker …
South East London, England, United Kingdom · Hybrid / WFH Options
KDR Talent Solutions
and emerging technologies. What You’ll Bring ✅ Extensive hands-on experience with Databricks and Microsoft Azure data tools (must-have: Azure Data Factory, Azure Synapse, or Azure SQL). ✅ Dimensional modelling expertise for analytics use cases. ✅ Strong ETL/ELT development skills. ✅ Python scripting experience for data automation. ✅ Experience with CI/CD methodologies for data platforms. ✅ Knowledge …
About the Platform: This greenfield initiative is focused on building a next-gen data ecosystem with a tech stack including: Snowflake for cloud data warehousing, dbt for transformation and modelling, Azure for cloud infrastructure and orchestration, Fivetran for automated data ingestion, Power BI and other modern BI tools for reporting and visualisation. 🧠 What You’ll Do: Design and implement … BI developers to deliver insightful, high-performance dashboards. Work with Data Engineers to optimise data ingestion and orchestration pipelines using Azure Data Factory and Fivetran. Apply best practices in dimensional modelling, layered architecture, and data quality. ✅ What We’re Looking For: Strong experience in data modelling and SQL. Hands-on experience with dbt and cloud data platforms … like Snowflake or Azure Synapse Analytics. Solid understanding of modern data stack principles, including layered modelling and data warehouse design. Excellent communication skills and the ability to work with stakeholders across technical and non-technical teams. Nice to have: Experience with Power BI or similar BI tools. Familiarity with CI/CD practices in data environments. Exposure to data …
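Since dbt transformation and layered modelling feature in several of these roles, here is a minimal sketch of a dbt staging model. All source, model, and column names are invented for illustration; a real project also declares the `raw.orders` source in a YAML file alongside the model:

```sql
-- models/staging/stg_orders.sql (illustrative names throughout)
-- Staging layer: cast types and filter bad rows from the raw source.
select
    order_id,
    customer_id,
    cast(order_total as numeric) as order_total,
    order_date
from {{ source('raw', 'orders') }}
where order_id is not null
```

A downstream mart model would build on this via `{{ ref('stg_orders') }}`, which is how dbt infers the dependency graph and runs the layers in order.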
and Dimensions with accurate end-user reports. Proficiency with reporting tools such as Oracle OAS and Microsoft Power BI. Deep understanding of Data Warehouse design, including Star schema and dimensional modelling. Strong analytical skills and technical aptitude, with the ability to influence system architecture decisions. Experience leading testing disciplines within agile projects. Self-starter with initiative and enthusiasm …
data visualization platforms. Demonstrable experience planning and executing complex reporting & analytics projects across multiple stakeholders. Understanding of data quality frameworks and the importance of availability of reliable data. Knowledge of dimensional modelling and experience. Strong analytical thinking and problem-solving skills with the ability to interpret complex data and provide actionable insights. Curiosity and willingness to explore complex and …
capabilities, they are evolving toward a clearer separation between Data Engineering, Analytics Engineering, and Data Product disciplines. This role will sit firmly in the Analytics Engineering function, focused on modelling and building the semantic layer that powers consistent, reliable insights across the company’s BI and data science platforms. This role will focus on the “middle layer", designing dimensional … other downstream consumers. Work closely with Data Engineers responsible for ingestion (from source systems to raw layers such as S3 or cloud storage), but focus your efforts on the modelling and transformation stage. Collaborate with the Data Product team to ensure the semantic layer serves evolving business and analytical needs. Support best practices in CI/CD (using GitHub … maintaining dbt pipelines. Contribute to a common, reusable data model that serves BI, Data Science, and AI/ML teams alike. Required Skills & Experience: Strong experience with SQL and dimensional modelling in dbt. Proven experience building and maintaining semantic layers in modern data platforms. Familiarity with Medallion architecture, CI/CD processes (GitHub), and version-controlled data workflows. …
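As a toy illustration of the Medallion-style layering this role mentions (in practice the layers live in Spark/Delta or warehouse tables, not Python lists; record shapes and cleaning rules here are invented):

```python
# Hedged sketch of Medallion layering: bronze -> silver -> gold.

bronze = [  # raw, as ingested
    {"user": "A", "amount": "12.0", "ts": "2024-01-01"},
    {"user": "a", "amount": "8.0",  "ts": "2024-01-01"},
    {"user": "B", "amount": "oops", "ts": "2024-01-02"},
]

# Silver: cleaned and conformed (types cast, keys normalised, bad rows dropped).
silver = []
for r in bronze:
    try:
        silver.append({"user": r["user"].upper(),
                       "amount": float(r["amount"]),
                       "ts": r["ts"]})
    except ValueError:
        pass  # in production, quarantine rather than discard

# Gold: business-level aggregate ready for BI (the semantic layer).
gold = {}
for r in silver:
    gold[r["user"]] = gold.get(r["user"], 0.0) + r["amount"]

print(gold)  # {'A': 20.0}
```

The "middle layer" work described above is the bronze-to-silver and silver-to-gold steps: conforming raw data into models that BI and data science teams can consume consistently.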
business applications. Building/optimising data pipelines and integrations across cloud platforms. Strong hands-on SQL development skills including: MS SQL Server, T-SQL, indexing, stored procedures, relational/dimensional modelling, data dashboards. Any cloud platform experience would be helpful, such as Snowflake, Databricks, BigQuery, Azure SQL. Working closely with key stakeholders including data architects, analysts, testers …
developing + maintaining SQL Server database solutions that power core business applications. Strong hands-on SQL Server database development skills including: complex stored procedures, T-SQL, indexing, relational/dimensional modelling, data dashboards. Building/optimising data pipelines and integrations across cloud platforms. Any cloud platform experience would be helpful, such as Snowflake, Databricks, BigQuery, Azure SQL. …
or Data-Warehouse projects, across technologies used in the enterprise space. Software development experience using: Object-oriented languages (e.g., Python, PySpark) and frameworks. Stakeholder Management. Expertise in relational and dimensional modelling, including big data technologies. Exposure across the full SDLC process, including testing and deployment. Expertise in Microsoft Azure is mandatory, including components like Azure Data Factory, Azure …
at Landmarc, you'll need a strong foundation in data engineering, with proven experience as a data professional. You should have a solid background in data warehouse design, particularly dimensional modelling, and be confident working with data mining techniques. A deep understanding of database management systems, OLAP, and ETL frameworks is essential, along with hands-on experience using …
SQL to create scalable, intuitive data solutions that drive business value. Key Responsibilities: Build Data Products: Collaborate with business domains to design and develop ETL/ELT pipelines and dimensional models optimised for Power BI. Drive Governance: Define and enforce data ownership, quality, and security standards within the Data Mesh architecture. Enable Self-Service: Create intuitive data models and … and collaboration. You should bring: Technical Expertise: Proven experience coding ETL/ELT pipelines with Python, SQL, or ETL tools, and proficiency in Power BI, Tableau, or Qlik. Data Modelling Skills: Strong knowledge of dimensional modelling and database principles. Governance Experience: Track record of working in democratized data environments, establishing controls and guardrails. Collaboration & Communication: Ability to …
SQL environments. Work with business and technical teams to gather and document data requirements. Design, implement, update, and support data pipelines in Azure Data Factory (ADF). Maintain data warehouse dimensional model and develop data marts. Develop standards and processes to optimize cost and service delivery. Provide ongoing support, debug issues, and perform root cause analysis. Enforce data management governance …
Cardiff, South Glamorgan, United Kingdom · Hybrid / WFH Options
DGH Recruitment Ltd
SQL environments. Work with business and technical teams to gather and document data requirements. Design, implement, update, and support data pipelines in Azure Data Factory (ADF). Maintain data warehouse dimensional model and develop data marts. Develop standards and processes to optimize cost and service delivery. Provide ongoing support, debug issues, and perform root cause analysis. Enforce data management governance …
Newbury, Berkshire, United Kingdom · Hybrid / WFH Options
Intuita Consulting
PowerBI/Tableau developer, with a solid track record in delivering complex solutions. They will be proficient in desktop and cloud, with expert-level experience in calculated metrics, data modelling, and optimising reports. You will: Lead the design and development of visualisation dashboards that meet complex data requirements and present information in a clear and concise manner. Drive data … design with business users. Ability to perform data blending, data aggregation and complex calculated fields when required. Possess hands-on experience with SQL; knowledge of data warehousing and dimensional modelling will be advantageous. Experience using large data platforms such as Snowflake, Databricks or similar. Exposure to other visualisation platforms is helpful, but not essential. Required Characteristics: Cares …
systems to support changing and evolving business intelligence use cases. Plan, design, implement, update and support data pipelines in ADF. Support the development and maintenance of the data warehouse dimensional model and development of data marts. Ensure the adoption of best practices, develop standards and new processes to optimise cost and service delivery. Provide technical knowledge and expertise on … pipelines. Previous experience in working with data cubes and migrating SSRS reports to PowerBI and tabular model is highly advantageous. Experience with business and technical requirements analysis, business process modelling/mapping and methodology development, and data mapping. General knowledge of database solutions, application services, data architecture and architecture patterns. Experience of working in a legal or professional services …
Data Factory, Data Lake and SQL. Role & Responsibilities: Building end-to-end data pipelines in Azure (Data Factory and SQL). Building workflows in SQL, Spark and DBT. Data and dimensional modelling. Skills & Qualifications: Azure Data Factory, Synapse and SSIS. Python/Spark/PySpark. Ideally Snowflake and DBT. …
Cardiff, South Glamorgan, United Kingdom · Hybrid / WFH Options
RVU Co UK
org-level contributions. Rotate around the business to build relationships and act as a multiplier. What we look for in you: Understand, assess and effectively apply modern data architectures (dimensional model, data mesh, data lake). Experience in applying and using data observability methods effectively. Experience in modern software development practices (agile, CI/CD, DevOps, infrastructure as code …
Cardiff, Wales, United Kingdom · Hybrid / WFH Options
Identify Solutions
You'll have broad career progression opportunities across the Group, which includes several high-profile household names. What you'll bring: Strong Python & TDD skills. Expertise in modern data architecture (dimensional modelling, data mesh, data lake) & best practices (agile, CI/CD, IaC, observability). Experience with Cloud and big data technologies (e.g. Spark/Databricks/Delta Lake …
Southampton, Hampshire, United Kingdom · Hybrid / WFH Options
gen2fund.com
The position requires at least 2 years of experience using QlikView version 11 or higher, with proven expertise in the following areas: Good knowledge of SQL, relational databases, and Dimensional Modeling. Experience working with large data sets and complex data models involving more than 10 tables. Integrating data from multiple sources into QlikView Data Models, including social media content … and API extensions. Use of complex QlikView functions and developing optimal scripts for solutions. Optimizing Dimensional data models for performance. Primary Responsibilities: Creating and providing reporting and dashboard applications using QlikView and NPrinting to facilitate better decision-making. Collaborating with stakeholders to gather requirements, and translating these into system and functional specifications. Creating prototypes and conducting proofs of concept …