- Bachelor’s or master’s degree in a relevant field
- 5+ years of proven experience as a Data Engineer
- Strong knowledge of SQL and relational databases
- Experience with data modelling, data warehouses, data lakes, and ETL
- Familiarity with cloud-based BI solutions
- Experience with dimensional modelling
- Solid analytical and communication skills
- Fluent in Dutch (written and spoken) …
… data visualization platforms.
- Demonstrable experience planning and executing complex reporting & analytics projects across multiple stakeholders.
- Understanding of data quality frameworks and the importance of reliable, available data
- Knowledge of dimensional modelling and experience …
- Strong analytical thinking and problem-solving skills, with the ability to interpret complex data and provide actionable insights.
- Curiosity and willingness to explore complex and …
… useful but not mandatory – we'll help you get those!
Core:
- SQL Server (T-SQL and DBA)
- Microsoft Fabric
- Power BI, DAX
Supporting:
- Python, PySpark
- Microsoft Purview
- Relational and Dimensional Modelling
- Data Warehouse Theory
- Data platform architecture models such as Lakehouse
- Data Science
- Master Data Management
- Data Governance
We don’t believe hiring is a tick-box exercise …
… using AWS CDK
- Proficiency in ETL/ELT processes and best practices
- Experience with data visualization tools (QuickSight)
Required Skills:
- Strong analytical and problem-solving abilities
- Excellent understanding of dimensional modeling and star schema design (facts, dimensions, SCD type 2), as sketched after this listing
- Experience with agile development methodologies
- Strong communication skills and ability to work with cross-functional teams
- Background in data …
Employment Type: Contract
Rate: £350 - £400/day, plus PTO, pension and national insurance
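For context on the star-schema requirement in the listing above, here is a minimal SQL sketch (Postgres-style) of a fact table joined to a type-2 dimension. All table and column names are illustrative assumptions, not taken from the listing.

    -- Hypothetical type-2 customer dimension: a change closes the old row and inserts a new one.
    CREATE TABLE dim_customer (
        customer_key  INTEGER PRIMARY KEY,      -- surrogate key referenced by fact tables
        customer_id   VARCHAR(20)  NOT NULL,    -- natural/business key
        customer_name VARCHAR(100),
        region        VARCHAR(50),
        valid_from    DATE NOT NULL,
        valid_to      DATE NOT NULL,            -- e.g. '9999-12-31' for the open-ended current row
        is_current    BOOLEAN NOT NULL          -- SCD type 2 keeps full history, flagged
    );

    -- Hypothetical fact table pointing at the dimension via the surrogate key.
    CREATE TABLE fact_sales (
        sale_id      BIGINT PRIMARY KEY,
        customer_key INTEGER NOT NULL REFERENCES dim_customer (customer_key),
        sale_date    DATE NOT NULL,
        quantity     INTEGER,
        amount       DECIMAL(12, 2)
    );

    -- A type-2 change: expire the current row, then insert the new version.
    UPDATE dim_customer
    SET valid_to = CURRENT_DATE, is_current = FALSE
    WHERE customer_id = 'C042' AND is_current = TRUE;

In a Redshift/QuickSight setup like the one implied here, BI queries would typically join facts to the current dimension rows (is_current = TRUE), while historical analyses join on the valid_from/valid_to range.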
… experience; expert level
- 1+ years of dbt experience preferred
- Familiarity with BI tools such as Tableau, Looker, or similar
- Excellent presentation and communication skills
- Experience in Data Architecture for Dimensional Models, Data Lakes, and Data Lakehouses
- Experience working in event-driven architectures and on cloud platforms (AWS, Azure, GCP)
- Experience with Snowflake, Redshift, and ETL tools like Fivetran or Stitch …
… capabilities, they are evolving toward a clearer separation between Data Engineering, Analytics Engineering, and Data Product disciplines. This role will sit firmly in the Analytics Engineering function, focused on modelling and building the semantic layer that powers consistent, reliable insights across the company’s BI and data science platforms. This role will focus on the “middle layer”, designing dimensional … other downstream consumers.
- Work closely with Data Engineers responsible for ingestion (from source systems to raw layers such as S3 or cloud storage), but focus your efforts on the modelling and transformation stage.
- Collaborate with the Data Product team to ensure the semantic layer serves evolving business and analytical needs.
- Support best practices in CI/CD (using GitHub … maintaining dbt pipelines.
- Contribute to a common, reusable data model that serves BI, Data Science, and AI/ML teams alike.
Required Skills & Experience:
- Strong experience with SQL and dimensional modelling in dbt (a sketch follows this listing).
- Proven experience building and maintaining semantic layers in modern data platforms.
- Familiarity with Medallion architecture, CI/CD processes (GitHub), and version-controlled data workflows. …
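To make the “middle layer” idea above concrete, here is a minimal dbt model sketch. The file path and model names (stg_customers, dim_customer) are assumptions for illustration; the actual project layout would differ.

    -- models/marts/dim_customer.sql (hypothetical dbt model)
    -- Middle-layer transformation: reads a staging model over the raw layer and
    -- exposes a conformed dimension for BI, Data Science, and AI/ML consumers.
    {{ config(materialized='table') }}

    with staged as (
        -- ref() lets dbt build the dependency graph; stg_customers would sit
        -- over raw data ingested by the Data Engineering team (e.g. from S3)
        select * from {{ ref('stg_customers') }}
    )

    select
        customer_id,
        initcap(customer_name)      as customer_name,
        coalesce(region, 'UNKNOWN') as region,
        cast(created_at as date)    as signup_date
    from staged

Because the model lives in version control (GitHub), CI/CD can run dbt build and dbt test on every change before the semantic layer is updated.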
… business applications.
- Building/optimising data pipelines and integrations across cloud platforms.
- Strong hands-on SQL development skills, including MS SQL Server, T-SQL, indexing, stored procedures, relational/dimensional modelling, and data dashboards.
- Any cloud platform experience would be helpful, such as Snowflake, Databricks, BigQuery, or Azure SQL.
- Working closely with key stakeholders including data architects, analysts, testers …
… developing and maintaining SQL Server database solutions that power core business applications.
- Strong hands-on SQL Server database development skills, including complex stored procedures, T-SQL, indexing, relational/dimensional modelling, and data dashboards.
- Building/optimising data pipelines and integrations across cloud platforms.
- Any cloud platform experience would be helpful, such as Snowflake, Databricks, BigQuery, or Azure SQL. …
… or Data-Warehouse projects, across technologies used in the enterprise space.
- Software development experience using object-oriented languages (e.g., Python, PySpark) and frameworks
- Stakeholder management
- Expertise in relational and dimensional modelling, including big data technologies
- Exposure across the full SDLC, including testing and deployment
- Expertise in Microsoft Azure is mandatory, including components such as Azure Data Factory, Azure …
Basingstoke, England, United Kingdom Hybrid / WFH Options
Castle Trust Group
… semi-structured data formats, including JSON, XML, Hive tables, and Parquet files
- Cloud platforms (e.g., Azure, AWS, GCP), including deployment, configuration, and integration of database services
- Data warehousing, including dimensional modelling and ETL processes
- Machine learning pipeline awareness
- Experience with data governance and metadata management
What is also important to us is that you are highly motivated and …
… systems to support changing and evolving business intelligence use cases.
- Plan, design, implement, update and support data pipelines in ADF.
- Support the development and maintenance of the data warehouse dimensional model and the development of data marts.
- Ensure the adoption of best practices; develop standards and new processes to optimise cost and service delivery.
- Provide technical knowledge and expertise on … pipelines.
- Previous experience working with data cubes and migrating SSRS reports to Power BI and the tabular model is highly advantageous.
- Experience with business and technical requirements analysis, business process modelling/mapping, methodology development, and data mapping.
- General knowledge of database solutions, application services, data architecture and architecture patterns.
- Experience of working in a legal or professional services …
Leeds, England, United Kingdom Hybrid / WFH Options
Medisoft Limited
… with data extracts, data analysis and data validation as required
The role requires:
- In-depth working knowledge of, and experience with, relational databases
- Background in data warehouse design (e.g. dimensional modelling) and data analysis
- Advanced MS SQL Server skillset (T-SQL, stored procedures, functions, dynamic SQL)
- Proven experience of developing complex reports using SSRS or closely equivalent tools …
… end-to-end BI development projects using the Microsoft BI stack (SSIS, SSAS, SSRS)
- Design and implement complex SQL queries and stored procedures, and carry out performance tuning, as sketched after this listing
- Contribute to data warehouse architecture and dimensional modeling
- Work within OLAP and ETL frameworks to support scalable BI solutions
- Collaborate with cross-functional teams to translate business needs into technical solutions
Required Skills & Experience:
- Proven experience …
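As a small illustration of the stored-procedure and performance-tuning work this listing describes, here is a hedged T-SQL sketch; the procedure, table, and index names are invented for the example.

    -- Hypothetical reporting procedure over a star schema.
    CREATE PROCEDURE dbo.usp_SalesByRegion
        @FromDate DATE,
        @ToDate   DATE
    AS
    BEGIN
        SET NOCOUNT ON;

        SELECT d.Region,
               SUM(f.Amount) AS TotalSales
        FROM dbo.FactSales   AS f
        JOIN dbo.DimCustomer AS d
          ON d.CustomerKey = f.CustomerKey
        WHERE f.SaleDate >= @FromDate
          AND f.SaleDate <  DATEADD(DAY, 1, @ToDate)  -- sargable range predicate keeps any index on SaleDate usable
        GROUP BY d.Region;
    END;
    GO

    -- Supporting index for the date-range filter (part of the performance-tuning work).
    CREATE INDEX IX_FactSales_SaleDate
        ON dbo.FactSales (SaleDate)
        INCLUDE (CustomerKey, Amount);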
… experience.
- Strong background in System Integration, Application Development, or Data-Warehouse projects across enterprise technologies.
- Experience with object-oriented languages (e.g., Python, PySpark) and frameworks.
- Expertise in relational and dimensional modeling, including big data technologies.
- Proficiency in Microsoft Azure components such as Azure Data Factory, Data Lake, SQL, Databricks, HDInsight, and ML Service.
- Good knowledge of Python and Spark.
- Experience …
… Python and Spark.
- Experience of coaching and developing your peers’ knowledge and understanding of modern data engineering principles and practices.
- Experience of data warehousing techniques such as Kimball dimensional modelling.
- Experience of line management or team leadership.
- Excellent communication, organizational and leadership skills.
- Strong problem-solving abilities and attention to detail.
It would be great if you also …
City of London, London, United Kingdom Hybrid / WFH Options
McCabe & Barton
… SQL to create scalable, intuitive data solutions that drive business value.
Key Responsibilities
- Build Data Products: Collaborate with business domains to design and develop ETL/ELT pipelines and dimensional models optimised for Power BI
- Drive Governance: Define and enforce data ownership, quality, and security standards within the Data Mesh architecture
- Enable Self-Service: Create intuitive data models and … and collaboration.
You should bring:
- Technical Expertise: Proven experience coding ETL/ELT pipelines with Python, SQL, or ETL tools, and proficiency in Power BI, Tableau, or Qlik
- Data Modelling Skills: Strong knowledge of dimensional modelling and database principles
- Governance Experience: Track record of working in democratized data environments, establishing controls and guardrails
- Collaboration & Communication: Ability to …