Crawley, England, United Kingdom Hybrid / WFH Options
McCabe & Barton
design and deliver scalable data products that enable self-service analytics using Snowflake, Power BI, Python, and SQL. What You'll Do: Build robust ETL/ELT pipelines and dimensional models for analytics; Enforce data quality, security, and governance in a Data Mesh setup; Enable self-service insights through intuitive data models and training; Own data products end-to … continuous improvement; Promote innovation and best practices across the organisation. What We're Looking For: Strong skills in SQL and Power BI (Python is beneficial); Solid experience with data modelling and ETL pipelines; Knowledge of cloud environments – Azure; Experience with Snowflake and Databricks; Familiarity with data governance in decentralised environments; Excellent communication and stakeholder engagement skills; A proactive, problem …
including scheduling, monitoring, and alerting. Collaborate with cross-functional teams (Product, Engineering, Data Science, Compliance) to define data requirements and build reliable data flows. Champion best practices in data modelling, governance, and DevOps for data engineering (CI/CD, IaC). Serve as a key communicator between technical teams and business stakeholders, translating complex data needs into actionable plans. … Snowflake, BigQuery, Redshift). Hands-on experience with Apache Airflow (or similar orchestration tools). Strong proficiency in Python and SQL for pipeline development. Deep understanding of data architecture, dimensional modelling, and metadata management. Experience with cloud platforms (AWS, GCP, or Azure). Familiarity with version control, CI/CD, and Infrastructure-as-Code (Terraform or similar).
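By way of illustration only (not taken from the listing above): a minimal sketch of the kind of Airflow orchestration such a role involves, a daily extract-then-load DAG with basic retry settings. The DAG id, task callables, and schedule are assumptions.

```python
# Hypothetical pipeline: names, schedule, and callables are assumptions, not from the ad.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_orders(**context):
    # Placeholder extract step: pull the day's batch from an assumed source system.
    print("extracting orders for", context["ds"])


def load_to_warehouse(**context):
    # Placeholder load step: write the extracted batch to an assumed staging table.
    print("loading batch into staging.orders")


with DAG(
    dag_id="orders_daily",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",  # scheduling, as the listing mentions
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},  # simple retry hygiene
) as dag:
    extract = PythonOperator(task_id="extract_orders", python_callable=extract_orders)
    load = PythonOperator(task_id="load_to_warehouse", python_callable=load_to_warehouse)

    extract >> load  # run extract, then load
```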
depth knowledge of Snowflake architecture, features, and best practices. · Experience with CI/CD pipelines using Git and Git Actions. · Knowledge of various data modeling techniques, including Star Schema, Dimensional models, and Data Vault. · Hands-on experience with: · Developing data pipelines (Snowflake), writing complex SQL queries. · Building ETL/ELT/data pipelines. · Kubernetes and Linux containers (e.g., Docker …
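Again purely illustrative, not from the ad itself: a small ELT step of the kind described, using the snowflake-connector-python package to upsert a dimension from a staging table. The connection parameters and table names are hypothetical, and credentials would normally come from a secret store.

```python
# Hypothetical connection details and table names; credentials are shown inline
# only to keep the sketch self-contained.
import snowflake.connector

MERGE_DIM_CUSTOMER = """
merge into analytics.dim_customer d
using staging.customers s
  on d.customer_id = s.customer_id
when matched then update set
  d.customer_name = s.customer_name,
  d.region        = s.region
when not matched then insert (customer_id, customer_name, region)
  values (s.customer_id, s.customer_name, s.region)
"""

conn = snowflake.connector.connect(
    account="my_account",      # assumed account identifier
    user="etl_user",           # assumed service user
    password="***",
    warehouse="TRANSFORM_WH",
    database="ANALYTICS_DB",
)
try:
    conn.cursor().execute(MERGE_DIM_CUSTOMER)  # upsert the customer dimension
finally:
    conn.close()
```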
South East London, England, United Kingdom Hybrid / WFH Options
KDR Talent Solutions
and emerging technologies. What You'll Bring: ✅ Extensive hands-on experience with Databricks and Microsoft Azure data tools (must-have: Azure Data Factory, Azure Synapse, or Azure SQL). ✅ Dimensional modelling expertise for analytics use cases. ✅ Strong ETL/ELT development skills. ✅ Python scripting experience for data automation. ✅ Experience with CI/CD methodologies for data platforms. ✅ Knowledge …
capabilities, they are evolving toward a clearer separation between Data Engineering, Analytics Engineering, and Data Product disciplines. This role will sit firmly in the Analytics Engineering function, focused on modelling and building the semantic layer that powers consistent, reliable insights across the company's BI and data science platforms. This role will focus on the "middle layer", designing dimensional … other downstream consumers. Work closely with Data Engineers responsible for ingestion (from source systems to raw layers such as S3 or cloud storage), but focus your efforts on the modelling and transformation stage. Collaborate with the Data Product team to ensure the semantic layer serves evolving business and analytical needs. Support best practices in CI/CD (using GitHub … maintaining dbt pipelines. Contribute to a common, reusable data model that serves BI, Data Science, and AI/ML teams alike. Required Skills & Experience: Strong experience with SQL and dimensional modelling in dbt. Proven experience building and maintaining semantic layers in modern data platforms. Familiarity with Medallion architecture, CI/CD processes (GitHub), and version-controlled data workflows.
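As a hedged sketch of the "middle layer" this listing describes (not part of the ad): the shape of a simple dbt dimensional model joining a staged input to a conformed dimension. Model and column names are invented; the SQL is held in a Python string only to keep these examples in one language, since in practice it would live in its own .sql file and be run with `dbt build`.

```python
# Hypothetical dbt model for a marts-layer fact table; names are invented.
FCT_ORDERS_SQL = """
with orders as (
    select * from {{ ref('stg_orders') }}    -- staged input from the raw layer
),
customers as (
    select * from {{ ref('dim_customer') }}  -- conformed customer dimension
)
select
    orders.order_id,
    orders.order_date,
    customers.customer_key,                  -- surrogate key from the dimension
    orders.order_amount
from orders
join customers
  on orders.customer_id = customers.customer_id
"""

if __name__ == "__main__":
    # Printing keeps the sketch runnable and self-contained.
    print(FCT_ORDERS_SQL)
```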
South East London, England, United Kingdom Hybrid / WFH Options
VirtueTech Recruitment Group
central Data Platform, working closely with the Data Management Team. Analysis of normalised transactional data and analytic data sources. Analysis of semi-structured and file-based data extracts. Data modelling to transform operational data into analytic/reporting structures such as Kimball-style multi-dimensional models. Essential Skills for the Data Analyst: Proven Data Analyst experience of at least 5 years, within a Financial Services environment, with particular emphasis on Risk or cross-asset derivatives areas. Data modelling, cleansing and enrichment, with experience in conceptual, logical and physical data modelling and data cleansing and standardisation. Familiarity with Data warehousing and analytical data structures. Data quality assurance, validation and linkage. Experience creating BI models and dashboards, ideally …
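For orientation only (not from the listing): a toy pandas sketch of the Kimball-style reshaping the ad refers to, splitting a normalised transactional extract into a dimension with a surrogate key and a fact table that references it. All table and column names are invented.

```python
# Illustrative only: invented column and table names.
import pandas as pd

# Toy normalised/transactional extract (in practice: a source-system query result)
trades = pd.DataFrame({
    "trade_id":     [101, 102, 103],
    "trade_date":   ["2024-01-02", "2024-01-02", "2024-01-03"],
    "counterparty": ["Alpha Bank", "Beta Corp", "Alpha Bank"],
    "notional":     [1_000_000, 250_000, 500_000],
})

# Dimension: one row per counterparty, with a generated surrogate key
dim_counterparty = (
    trades[["counterparty"]]
    .drop_duplicates()
    .reset_index(drop=True)
    .rename_axis("counterparty_key")
    .reset_index()
)

# Fact: measures plus the foreign key into the dimension
fct_trades = trades.merge(dim_counterparty, on="counterparty")[
    ["trade_id", "trade_date", "counterparty_key", "notional"]
]

print(dim_counterparty)
print(fct_trades)
```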
Newbury, Berkshire, United Kingdom Hybrid / WFH Options
Intuita Consulting
Power BI/Tableau developer with a solid track record in delivering complex solutions. They will be proficient in desktop and cloud, with expert-level experience in calculated metrics, data modelling, and optimising reports. You will: Lead the design and development of visualisation dashboards that meet complex data requirements and present information in a clear and concise manner; Drive data … design with business users. Ability to perform data blending, data aggregation and complex calculated fields when required; Possess hands-on experience with SQL; knowledge of data warehousing and dimensional modelling will be advantageous; Experience using large data platforms such as Snowflake, Databricks or similar; Exposure to other visualisation platforms is helpful, but not essential. Required Characteristics: Cares …
Southampton, Hampshire, United Kingdom Hybrid / WFH Options
gen2fund.com
The position requires at least 2 years of experience using QlikView version 11 or higher, with proven expertise in the following areas: Good knowledge of SQL, relational databases, and Dimensional Modeling; Experience working with large data sets and complex data models involving more than 10 tables; Integrating data from multiple sources into QlikView Data Models, including social media content and API extensions; Use of complex QlikView functions and developing optimal scripts for solutions; Optimizing Dimensional data models for performance. Primary Responsibilities: Creating and providing reporting and dashboard applications using QlikView and NPrinting to facilitate better decision-making; Collaborating with stakeholders to gather requirements, and translating these into system and functional specifications; Creating prototypes and conducting proofs of concept …
Southampton, Hampshire, United Kingdom Hybrid / WFH Options
Gen II Fund Services
The position requires at least 2 years of experience using QlikView version 11 or higher, with proven experience in the following areas: Good knowledge of SQL, relational databases, and Dimensional Modeling; Working with large data sets and experience with complex data models involving more than 10 tables; Integrating data from multiple data sources into a QlikView Data Model, social media content, API extensions, etc.; Use of complex QlikView functions and developing optimal scripts for a given solution; Optimization of the Dimensional data model for performance. Primary Responsibilities Will Include: Using technologies such as QlikView and NPrinting to create and provide the business areas with new reporting and dashboard applications to enable better decision making; Working with stakeholders, business users, and …
Newbury, Berkshire, United Kingdom Hybrid / WFH Options
Intuita - Vacancies
on SQL experience for basic analysis and investigations. Experience with product management tools such as backlog management, road mapping, and prototyping. Passion for data; knowledge of data warehousing and dimensional modeling is a plus. Required Characteristics: Ownership and accountability for tasks, with problem-solving skills. Collaborative mindset to work with stakeholders and development teams. Effective communication with technical and …
Newbury, Berkshire, United Kingdom Hybrid / WFH Options
Intuita Consulting
A strong understanding of business needs and data maximisation solutions; Experience with digital or data initiatives as a Product Owner, managing the full lifecycle; Familiarity with data architecture, engineering, analytics, modelling, visualisation, and governance; Hands-on SQL experience for analysis and investigations; Experience with product owner tools: backlog management, road mapping, prototyping; Passion for data, with knowledge of warehousing and dimensional modelling as a plus. Required Characteristics: Ownership and problem-solving skills; Team-oriented, stakeholder and developer collaboration; Effective communication with technical and non-technical audiences; Quick understanding of client contexts and business expertise. If you don't fit all these exactly but are interested, we encourage you to get in touch - we hire people, not just job …
observability, and engineering best practice; a role designed for someone who enjoys building, influencing, and mentoring. What you'll do: Design and evolve modern data architecture (data mesh, lakes, dimensional modelling); Build robust, event-driven pipelines with Python, SQL, and cloud-native tooling; Improve the delivery and reliability of data products and observability tooling; Lead by example: mentor …
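To close with one more hedged sketch (not part of any listing): the skeleton of an event-driven ingestion step of the sort mentioned above. The event shape and the sink are assumptions; in a cloud-native setup this handler would typically be attached to a queue or stream trigger.

```python
# The event shape and downstream sink are assumptions.
import json


def handle_order_event(raw_event: str) -> dict:
    """Validate one order event and shape it for an assumed warehouse landing zone."""
    event = json.loads(raw_event)
    record = {
        "order_id": event["order_id"],
        "occurred_at": event["occurred_at"],
        "amount": float(event["amount"]),
    }
    # In a real pipeline this record would be appended to object storage or a
    # streaming table; printing keeps the sketch self-contained.
    print(record)
    return record


if __name__ == "__main__":
    handle_order_event('{"order_id": 42, "occurred_at": "2024-05-01T10:00:00Z", "amount": "99.50"}')
```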