quality rules, lineage and catalogue standards. About You: Solid track record in Data Engineering, BI Development or Analytics Engineering roles. Strong SQL skills and a solid understanding of data-modelling principles (e.g., dimensional modelling, 3NF). Experience designing and building modern data pipelines and Lakehouse architectures. Hands-on experience with at least one enterprise-grade data platform (e.g. …
the mobile/telecoms industry would be a bonus!
Key outputs for the role:
• Design, build, and maintain scalable and trustworthy data models in dbt, making use of Kimball Dimensional and One Big Table (OBT) methodologies.
• Translate business requirements from stakeholders into robust, well-documented and tested dbt models.
• Develop and own workflows within Google Cloud Platform environments, primarily … and implement robust, well-documented and performant dbt models that serve as a single source of truth for business reporting.
• Implement and champion data quality testing, documentation standards, and modelling best practices within dbt projects.
• Troubleshoot and resolve any issues or errors in the data pipelines.
• Stay updated with the latest cloud technologies and industry best practices to continuously … be expecting to see:
• Expert-level proficiency in SQL.
• Deep, practical experience of building and architecting data models with dbt.
• A strong understanding of data warehousing concepts and data modelling techniques (e.g., Kimball, Dimensional Modelling, One Big Table).
• Solid, hands-on experience within the Google Cloud Platform, especially with BigQuery.
• Proven experience working directly with business …
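As a rough sketch of the dbt and Kimball/OBT work described above (all model, table and column names here are invented for illustration), a One Big Table model typically denormalises a fact with its dimensions into one wide BigQuery table:

-- models/marts/obt_orders.sql (hypothetical dbt model)
{{ config(materialized='table') }}

with orders as (
    select * from {{ ref('stg_orders') }}
),
customers as (
    select * from {{ ref('dim_customer') }}
),
products as (
    select * from {{ ref('dim_product') }}
)

select
    o.order_id,
    o.order_date,
    o.quantity,
    o.net_revenue,
    c.customer_segment,
    p.product_category
from orders o
left join customers c on o.customer_id = c.customer_id
left join products p on o.product_id = p.product_id

The data quality testing and documentation mentioned above would normally sit alongside such a model in a schema.yml file, for example not_null and unique tests on order_id.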
s strategic objectives. Provide expert input into data platform design and help evaluate new technologies and tools as the organisation scales. Act as a subject matter expert on data modelling and architectural frameworks across the organisation. About You Requirements: Minimum of 5 years' experience in a Data Architect role. Proven experience designing and implementing enterprise-scale data architecture in … complex environments. Deep understanding of data modelling techniques, including conceptual, logical, and physical modelling. Strong expertise in Kimball methodology and dimensional modelling (e.g. star schema design). Experience with modern cloud data platforms, ideally including Microsoft Azure, Databricks, and associated tools (e.g., Azure Data Factory, Azure SQL, Synapse). Familiarity with modern data engineering practices including the use …
EMR, Redshift, Athena, S3, Lambda, Step Functions, CloudWatch. Proficiency in Python and PySpark for ETL, transformation, and automation. Strong experience in SQL and data modeling (star schema, snowflake schema, dimensional models). Experience building scalable, high-performance data pipelines in cloud environments. Knowledge of data governance, data quality frameworks, and security best practices. Excellent communication and problem-solving skills. Preferred …
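To make the star schema requirement concrete, here is a minimal dimensional design in plain SQL; the tables and columns are hypothetical, and a warehouse such as Redshift would add distribution and sort keys on top of this:

-- Hypothetical star schema: a fact table keyed to two dimensions by surrogate keys
create table dim_customer (
    customer_key   bigint primary key,   -- surrogate key
    customer_id    varchar(32),          -- natural/business key
    customer_name  varchar(255),
    region         varchar(64)
);

create table dim_date (
    date_key   int primary key,          -- e.g. 20240131
    full_date  date,
    year       int,
    month      int
);

create table fact_sales (
    customer_key  bigint references dim_customer (customer_key),
    date_key      int references dim_date (date_key),
    quantity      int,
    net_amount    decimal(18, 2)
);

A snowflake schema would further normalise the dimensions, for example splitting region out of dim_customer into its own table.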
Analytics), and Azure Data Factory (ADF). Proficiency in SQL, data manipulation, analysis, and visualization. Some experience with Python and Machine Learning concepts. Basic understanding of data warehousing concepts, dimensional modeling, and ETL processes. Excellent problem-solving skills and attention to detail with the ability to work independently and collaboratively in a team environment. Effective communication skills with the …
for data management & BI development. Environment: the role includes defining the data platform tech stack. Example tech below; not required to have experience in all. Data Platforms: data warehouse & lake concepts including dimensional modeling & cloud services (S3, AWS Redshift, RDS, Azure Data Lake Storage, Synapse Analytics, BigQuery, Databricks, Snowflake, Informatica); Databases: SQL & relational/non-relational (SQL Server, Oracle, PostgreSQL, MongoDB); BI …
Birmingham, West Midlands, West Midlands (County), United Kingdom Hybrid / WFH Options
Gleeson Recruitment Group
processes and reduce reliance on spreadsheets and legacy tools. Collaborate with stakeholders to define reporting needs, translating complex requirements into actionable technical specifications. Provide expert guidance on data architecture, modelling, and analytics approaches. Contribute to policies on data governance and promote best practices across the organisation. Mentor junior BI developers and train stakeholders in the effective use of BI … skills and experience: Significant expertise in Power BI for data visualisation and reporting. Hands-on experience with Azure services (Azure Data Factory, Azure SQL Server). Strong understanding of dimensional modelling (e.g., Kimball methodology). Proficiency in sourcing, manipulating, and interpreting complex datasets. Strong analytical mindset with excellent attention to detail and accuracy. Experience managing stakeholders, gathering business …
Employment Type: Permanent
Salary: £55000 - £65000/annum Final salary pension
Birmingham, West Midlands, United Kingdom Hybrid / WFH Options
Ingeus
and scalability, including implementing data partitioning, indexing, and compression strategies. Implement data quality and data governance frameworks to ensure data accuracy, consistency, and compliance with industry standards. Perform data modelling and schema design to support data analysis, reporting, and visualisation needs. Monitor and troubleshoot data pipelines, identify and resolve performance bottlenecks, and ensure data integrity and availability. To be … experience as a Data Engineer. Strong proficiency in Python and experience with data manipulation frameworks such as Apache Spark. In-depth knowledge of relational and non-relational databases, data modelling, and SQL. Experience with cloud platforms including Azure Synapse and Fabric. Proficiency in designing and implementing data pipelines using technologies. Solid understanding of data warehousing concepts, dimensional modelling …
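One illustrative way to apply the partitioning and compression strategies mentioned above in a Spark-based environment such as Synapse or Fabric (the table and column names are assumptions, not taken from the role) is to write large tables as date-partitioned Parquet so downstream queries can prune partitions and benefit from columnar compression:

-- Illustrative Spark SQL: partition a large event table by date
create table events_curated
using parquet
partitioned by (event_date)
as
select
    event_id,
    user_id,
    event_type,
    payload,
    cast(event_ts as date) as event_date
from events_raw;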
Manchester, North West, United Kingdom Hybrid / WFH Options
We Are Dcoded Limited
delivering enterprise-level solutions. Essential Skills: 5 years' experience in Data Engineering. Strong expertise in Databricks, Azure Data Factory, Azure SQL, and Azure Synapse/DW. Solid understanding of dimensional modelling (Kimball, Star Schema) and EDW solutions. Experience working with structured and unstructured data. Familiarity with cloud and DevOps practices, i.e. Azure, CI/CD pipelines, scaling, cost …
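As one example of the Kimball-style EDW work this kind of role involves, a dimension table in Databricks is often maintained with a Delta MERGE. The sketch below uses invented table and column names and shows a simple Type 1 overwrite upsert; a Type 2 dimension that preserves history would additionally version new rows and expire superseded ones:

-- Illustrative Databricks SQL: Type 1 upsert into a customer dimension
merge into dim_customer as tgt
using stg_customer_changes as src
  on tgt.customer_id = src.customer_id
when matched and (tgt.customer_name <> src.customer_name or tgt.region <> src.region) then
  update set
    tgt.customer_name = src.customer_name,
    tgt.region        = src.region,
    tgt.updated_at    = current_timestamp()
when not matched then
  insert (customer_id, customer_name, region, updated_at)
  values (src.customer_id, src.customer_name, src.region, current_timestamp());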
London, South East, England, United Kingdom Hybrid / WFH Options
Office Angels
valuable). Experience managing large and complex datasets, with a strong command of SQL for cloud-based environments (Fabric, Snowflake, BigQuery, Redshift, etc.). A solid understanding of data modelling techniques (star schema, data vault, dimensional modelling). Proficiency in Excel-based data workflows for various Agile Retail projects. Hands-on experience with data pipeline orchestration tools …
in an Agile environment. Expert data reporting and visualisation using Power BI & strong Power Query and DAX skills. Experience of working with large data sets in an enterprise environment. Dimensional model design and implementation. Experience in Microsoft data components including: Azure Analysis Services, Databricks, Azure SQL Data Warehouse (Synapse Analytics). Tools and techniques for ensuring data quality, security, validation …
Reigate, Surrey, England, United Kingdom Hybrid / WFH Options
esure Group
a team of data engineers, analytics engineers, data scientists and AI specialists to design and evolve scalable data platforms and modern data products that enable self-service analytics, advanced modelling, and AI-driven decision-making across our insurance business. What you’ll do: Design and manage scalable cloud data platforms (Databricks on AWS) across development, staging, and production environments … ensuring reliable performance and cost efficiency. Integrate and model data from diverse sources – including warehouses, APIs, marketing platforms, and operational systems – using DBT, Delta Live Tables, and dimensional modelling to deliver consistent, trusted analytics. Enable advanced AI and ML use cases by building pipelines for vector search, retrieval-augmented generation (RAG), feature engineering, and model deployment. Ensure security … to architectural decisions, authoring ADRs, and participating in reviews, data councils, and platform enablement initiatives. Qualifications: What we’d love you to bring: Proven, hands-on expertise in data modelling, with a strong track record of designing and implementing complex dimensional models, star and snowflake schemas, and enterprise-wide canonical data models. Proficiency in converting intricate insurance business …
the delivery of a functioning data mart and reporting models, not dashboarding. YOUR SKILLS & EXPERIENCE The successful candidate will have: Strong SQL and DBT experience. Familiarity with Kimball methodology, dimensional modelling, and star schema design. Proven experience with Redshift or Snowflake. Strong background in cloud-based data environments (AWS preferred). Hands-on experience with Airflow for orchestration. …
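As a small, hypothetical example of the SQL and dbt work a data mart build like this involves (table and column names are invented), a singular dbt test is just a SQL file in the tests/ directory that must return zero rows; this one checks that every fact row joins to a customer dimension row:

-- tests/fact_sales_has_valid_customer_key.sql (hypothetical singular dbt test)
-- dbt treats any rows returned by this query as test failures
select f.sales_id
from {{ ref('fact_sales') }} f
left join {{ ref('dim_customer') }} c
  on f.customer_key = c.customer_key
where c.customer_key is null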
into a management. Very strong technical skills that will include: SQL, SSIS, SSRS, SAS; Power BI, Power Platform; Azure Data Factory, Azure Data Lake, Databricks. A good understanding of dimensional modelling techniques, including Kimball's Business Dimensional Lifecycle. Ability to design hybrid data solutions across on-prem and cloud data sources. Expert with data engineering tools and automation …
team development etc). Very strong technical skills that will include: SQL, SSIS, SSRS, SAS, Power BI, Power Platform, Azure Data Factory, Azure Data Lake, Databricks. A good understanding of dimensional modelling techniques, including Kimball's Business Dimensional Lifecycle. Ability to design hybrid data solutions across on-prem and cloud data sources. Expert with data engineering tools and automation …
Sterling, Virginia, United States Hybrid / WFH Options
Progression Inc
practices. Experience building and maintaining high-volume data pipelines. Knowledge of data modeling and ETL best practices. Familiarity with SQL and NoSQL databases. Understanding of data warehouse concepts and dimensional modeling. Bachelor's degree in Computer Science, Engineering, or related technical field. DUTIES: As a Data Engineer, you will: Design and implement scalable, production-grade data pipelines. Optimize existing …
like Informatica, Talend, DataStage, or custom Python/Scala frameworks. Familiarity with or experience in using Rhine for metadata-driven pipeline orchestration. Working knowledge of data warehousing concepts and dimensional modeling. Exposure to cloud platforms (AWS, Azure, or GCP) and tools such as Snowflake, Redshift, or BigQuery is a plus. Experience with version control (e.g., Git) and CI/…
them with robust data pipelines. DESIRABLE LANGUAGES/TOOLS Proficiency in programming languages such as Python, Java, Scala, or SQL for data manipulation and scripting. Strong understanding of data modelling concepts and techniques, including relational and dimensional modelling. Experience in big data technologies and frameworks such as Databricks, Spark, Kafka, and Flink. Experience in using modern data architectures …
Newbury, Berkshire, United Kingdom Hybrid / WFH Options
Intuita Consulting
PowerBI/Tableau developer, with a solid track record in delivering complex solutions. They will be proficient in desktop and cloud, with expert-level experience in calculated metrics, data modelling, and optimising reports. You will: Lead the design and development of visualisation dashboards that meet complex data requirements and present information in a clear and concise manner. Drive data … design with business users. Ability to perform data blending, data aggregation and complex calculated fields when required. Possess hands-on experience with SQL; knowledge of data warehousing and dimensional modelling will be advantageous. Experience using large data platforms such as Snowflake, Databricks or similar. Exposure to other visualisation platforms is helpful, but not essential. Required Characteristics: Cares …
Gloucester, Gloucestershire, England, United Kingdom
IMT Resourcing Solutions
and driving actionable insight. The role: Design, build, and maintain data pipelines and warehouse solutions. Administer and optimise Snowflake in line with best practice. Build data models using Kimball Dimensional Modelling. Leverage dbt, Fivetran, and Power BI for data transformation and reporting. Deliver dashboards and reports up to executive level. Support data governance, quality, and compliance across multiple …
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
existing plans to streamline reporting and automate data workflows. What you'll do: Collaborate with the analytics and engineering teams to set up a data warehouse using modern data modelling techniques, focusing on Kimball methodology and a Medallion architecture (bronze, silver, gold layers). Develop and maintain DBT projects and configure incremental loads with built-in unit testing. Support … with star schema design to power business reporting and dashboards (PowerBI experience a plus). Skills & Experience: Strong SQL expertise and hands-on experience with DBT. Familiarity with Kimball dimensional modelling concepts. Experience working with cloud data warehouses such as Redshift or Snowflake. Knowledge of Airflow for workflow management. Comfortable in AWS environments and data orchestration. Bonus: Python …
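For the incremental loads mentioned above, a common dbt pattern (sketched here with invented model and column names, not taken from the role) is an incremental model in the silver layer that only processes rows newer than what is already in the target table:

-- models/silver/stg_orders.sql (hypothetical dbt incremental model)
{{ config(materialized='incremental', unique_key='order_id') }}

select
    order_id,
    customer_id,
    order_ts,
    amount
from {{ source('bronze', 'orders_raw') }}

{% if is_incremental() %}
  -- on incremental runs, only pick up rows newer than the latest already loaded
  where order_ts > (select max(order_ts) from {{ this }})
{% endif %}

Unit-style checks would typically be layered on with dbt tests, for example unique and not_null tests on order_id, so each incremental run is validated automatically.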