including scheduling, monitoring, and alerting. Collaborate with cross-functional teams (Product, Engineering, Data Science, Compliance) to define data requirements and build reliable data flows. Champion best practices in data modelling, governance, and DevOps for data engineering (CI/CD, IaC). Serve as a key communicator between technical teams and business stakeholders, translating complex data needs into actionable plans. … Snowflake, BigQuery, Redshift). Hands-on experience with Apache Airflow (or similar orchestration tools). Strong proficiency in Python and SQL for pipeline development. Deep understanding of data architecture, dimensional modelling, and metadata management. Experience with cloud platforms (AWS, GCP, or Azure). Familiarity with version control, CI/CD, and Infrastructure-as-Code (Terraform or similar). …
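For readers unfamiliar with the orchestration tools this listing names, the core idea behind a scheduler like Apache Airflow is running tasks in an order that respects declared dependencies. A minimal sketch using only the Python standard library (all task names are invented for illustration, not taken from the posting):

```python
# Minimal sketch of dependency-ordered task execution, the core idea behind
# orchestrators such as Apache Airflow. Task names are hypothetical.
from graphlib import TopologicalSorter

# task -> set of upstream tasks that must finish first
pipeline = {
    "extract_orders": set(),
    "extract_customers": set(),
    "transform_orders": {"extract_orders", "extract_customers"},
    "load_warehouse": {"transform_orders"},
}

def run_pipeline(tasks):
    """Execute tasks in an order that respects their dependencies."""
    order = list(TopologicalSorter(tasks).static_order())
    for task in order:
        print(f"running {task}")
    return order

if __name__ == "__main__":
    run_pipeline(pipeline)
```

Real orchestrators add scheduling, retries, monitoring, and alerting on top of this dependency graph, which is what the listing's "scheduling, monitoring, and alerting" refers to.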
Leeds, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
and governance initiatives—helping create a reliable, enterprise-grade data foundation. Requirements Proven experience in analytics engineering or advanced BI/data roles with a strong focus on data modelling and transformation Proficiency in SQL and cloud data platforms such as Snowflake or Azure Synapse Analytics Hands-on experience with dbt for developing, testing, and documenting transformations Understanding of … modern data stack principles, including layered modelling, modular SQL, and Git-based workflows Familiarity with dimensional modelling, OBT, activity schemas & data warehouse design principles Strong communication skills, with the ability to translate data into business context A proactive, detail-oriented mindset and a drive to build systems that scale Nice to Have Experience using Azure Data Factory …
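The "layered modelling" this listing asks for can be illustrated with a small sketch: raw data is lightly cleaned in a staging layer, and business logic lives in a mart layer built only on staging models—the pattern dbt formalises with `ref()`-chained models. Table and column names here are invented for illustration:

```python
# Sketch of layered modelling: raw -> staging (cleaning) -> mart (business
# logic), shown with plain SQL views in an in-memory SQLite database.
# All table/column names are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE raw_payments (id INTEGER, amount_pence INTEGER, status TEXT);
INSERT INTO raw_payments VALUES (1, 1250, 'ok'), (2, 800, 'FAILED'), (3, 2000, 'ok');

-- staging layer: light cleaning, one model per source table
CREATE VIEW stg_payments AS
SELECT id, amount_pence / 100.0 AS amount, LOWER(status) AS status
FROM raw_payments;

-- mart layer: business logic built only on staging models
CREATE VIEW mart_revenue AS
SELECT SUM(amount) AS total_revenue
FROM stg_payments
WHERE status = 'ok';
""")

total = conn.execute("SELECT total_revenue FROM mart_revenue").fetchone()[0]
print(total)  # 32.5
```

In dbt each view would be its own versioned, tested SQL file, which is what "modular SQL and Git-based workflows" refers to.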
Crawley, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
design and deliver scalable data products that enable self-service analytics using Snowflake, Power BI, Python, and SQL. What You’ll Do: Build robust ETL/ELT pipelines and dimensional models for analytics Enforce data quality, security, and governance in a Data Mesh setup Enable self-service insights through intuitive data models and training Own data products end-to … continuous improvement Promote innovation and best practices across the organisation What We’re Looking For: Strong skills in SQL and Power BI. Python is beneficial Solid experience with data modelling and ETL pipelines Knowledge of cloud environments – Azure Experience with Snowflake and Databricks Familiarity with data governance in decentralised environments Excellent communication and stakeholder engagement skills A proactive, problem …
Sale, England, United Kingdom Hybrid / WFH Options
Verastar Limited
analysis techniques and data usage, and building solutions to enable them. Experience Strong technical background and proven ability in data collection, transformation and integration, evidencing understanding of data warehousing and dimensional modelling principles with a good understanding of data profiling or analysis. Data modelling. You understand the concepts and principles of data modelling and can produce relevant data models. …
City of London, England, United Kingdom Hybrid / WFH Options
Allegheny County Economic Development
Snowflake, Power BI, Python, and SQL. Your work will enable self-service analytics and support data governance across the business. Key Responsibilities: Develop robust ETL/ELT pipelines and dimensional models for BI tools Define and implement data quality, ownership, and security standards Empower business teams with intuitive, self-serve data models Own data products end-to-end, from … design to continuous improvement Promote innovation and best practices in data engineering About You: Strong experience with SQL, Python, and BI tools (e.g., Power BI) Solid understanding of dimensional modelling and data architecture Experience working in governed, decentralised data environments Excellent communication and stakeholder engagement skills Analytical mindset with a focus on delivering business value If you are …
Liverpool, England, United Kingdom Hybrid / WFH Options
Evelyn Partners
Partners, we are expanding our Data Services team, investing in a large-scale modernisation programme to drive innovation and insights. We’re building a new team to enhance data modelling, reporting, AI initiatives, and cloud data platforms like Snowflake. As part of our transformation, we’re growing our analytics capability to help develop the understanding of commercial performance and … critical in the development and support of the new Evelyn Data Platform, which is being engineered on Snowflake, utilising Azure Data Factory pipelines for data integration, dbt for data modelling, Azure BLOB Storage for data storage, and GitHub for version control and collaboration. As a Data Engineer, your responsibilities will include: Design, develop, and implement data warehouse solutions using … ADF) and GitHub, for ETL development. Experience building and maintaining reports using SQL Server Reporting Services (SSRS). Solid understanding of data warehousing concepts, including star/snowflake schemas, dimensional modelling, and Data Vault 2.0, and of the principles of ETL technical design Ability to write efficient and optimized SQL queries for data retrieval and manipulation. Solid understanding …
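The star schema that several of these listings reference is one fact table keyed to surrounding dimension tables, with analytics queries joining the fact to its dimensions. A minimal sketch (schema and data are invented for illustration):

```python
# Minimal star schema: a fact table of sales events keyed to date and
# product dimensions, queried by joining facts to dimensions.
# All names and data are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, year INTEGER);
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE fact_sales (
    date_key INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    quantity INTEGER
);
INSERT INTO dim_date VALUES (20240101, 2024), (20250101, 2025);
INSERT INTO dim_product VALUES (1, 'widget');
INSERT INTO fact_sales VALUES (20240101, 1, 3), (20250101, 1, 5);
""")

# Typical star-schema query: aggregate the fact, sliced by a dimension.
rows = conn.execute("""
    SELECT d.year, SUM(f.quantity)
    FROM fact_sales f JOIN dim_date d ON f.date_key = d.date_key
    GROUP BY d.year ORDER BY d.year
""").fetchall()
print(rows)  # [(2024, 3), (2025, 5)]
```

A snowflake schema is the same shape with dimensions further normalised into sub-dimension tables.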
Database development An understanding of London Market insurance data, processes and terminology Experience working on data projects in the Finance and Actuarial business domains Experience with data transformation and modelling techniques. Thorough understanding and experience with data warehousing concepts and dimensional modelling. Experience with ETL processes and tools (Data Factory, T-SQL, Azure Databricks). Experience with Azure …
and emerging technologies. What You’ll Bring ✅ Extensive hands-on experience with Databricks and Microsoft Azure data tools (must-have: Azure Data Factory, Azure Synapse, or Azure SQL). ✅ Dimensional modelling expertise for analytics use cases. ✅ Strong ETL/ELT development skills. ✅ Python scripting experience for data automation. ✅ Experience with CI/CD methodologies for data platforms. ✅ Knowledge …
Data Lakehouse Medallion architecture Microsoft Azure T-SQL Development (MS SQL Server 2005 onwards) Python, PySpark Experience of the following systems would also be advantageous: Azure DevOps MDS Kimball Dimensional Modelling Methodology Power BI Unity Catalog Microsoft Fabric Experience of the following business areas would be advantageous: Insurance sector (Lloyds Syndicate, Underwriting, Broking) Qualifications: Degree educated in relevant …
SQL Server) Qualifications Desired qualifications and experience: Experience with programming languages such as Python and SQL. Experience building and maintaining data pipelines. Familiarity with medallion architecture. An understanding of dimensional modelling concepts. Proficiency with reporting tools, preferably Power BI. Practical experience working in an Agile Scrum environment. At Solus, we offer a collaborative and innovative work environment where …
DBA) Azure Databricks Azure Data Factory Microsoft Fabric Power BI, DAX Azure Data Lake Supporting: Azure ML Azure AI Services Azure Infrastructure Python, PySpark Microsoft Purview Principles: Relational and Dimensional Modelling Data Warehouse Theory Data Platform Architecture models such as Lakehouse Data Science Master Data Management Data Governance Additional Information We don’t believe hiring is a tick …
London, England, United Kingdom Hybrid / WFH Options
ZipRecruiter
Private Medical Health Pension Scheme And more... Candidate requirements include experience in: Extensive experience with solutions involving Databricks, Azure Data Factory, Synapse Excellent understanding of Microsoft SQL Server, Data Modelling, Data Marts, Dimensional Modelling Understanding of SQL/database management Strong understanding of the wider Azure stack (ADF, Synapse, ADLS) and Data Modelling tools (ERwin, ER …
About the Platform: This greenfield initiative is focused on building a next-gen data ecosystem with a tech stack including: Snowflake for cloud data warehousing dbt for transformation and modelling Azure for cloud infrastructure and orchestration Fivetran for automated data ingestion Power BI and other modern BI tools for reporting and visualisation What You’ll Do: Design and implement … BI developers to deliver insightful, high-performance dashboards Work with Data Engineers to optimise data ingestion and orchestration pipelines using Azure Data Factory and Fivetran Apply best practices in dimensional modelling, layered architecture, and data quality What We’re Looking For: Strong experience in data modelling and SQL Hands-on experience with dbt and cloud data platforms … like Snowflake or Azure Synapse Analytics Solid understanding of modern data stack principles, including layered modelling and data warehouse design Excellent communication skills and the ability to work with stakeholders across technical and non-technical teams Nice to have: Experience with Power BI or similar BI tools Familiarity with CI/CD practices in data environments Exposure to data …
Bachelor’s or master’s degree in a relevant field 5+ years of proven experience as a Data Engineer Strong knowledge of SQL and relational databases Experience with data modelling, data warehouses, data lakes, and ETL Familiarity with cloud-based BI solutions Experience with dimensional modelling Solid analytical and communication skills Fluent in Dutch (written and spoken) …
seamlessly and effectively integrated into the organization's systems and processes. The team has a variety of skills and experience in technical domains such as software development, automation, data-pipeline modelling, IT project management, and data visualisation. We also pride ourselves on our close integration with non-technical/non-data teams, as our success comes from effectively working with … in dbt. Experience with GCP. Experience with Terraform. Strong Python skills. Experience with version control for data models (e.g., dbt testing frameworks, data documentation). Demonstrated experience with data modelling concepts (dimensional modelling, star schemas). Experience writing efficient and optimised SQL for large-scale data transformations. Understanding of data warehouse design principles and best practices. Proactive … Experience working with APIs. Experience working with large-scale spatial datasets (billions of rows) and performing geospatial analysis at scale using BigQuery GIS or similar tools. Experience with advanced analytical modelling techniques, including statistical analysis and predictive modelling, particularly applying these to large-scale datasets to derive actionable insights. Knowledge of data governance and metadata management. Experience with data …
London, England, United Kingdom Hybrid / WFH Options
ZipRecruiter
Databricks, Azure SQL, and Data Factory. Deep technical knowledge of SQL Server, including stored procedures and complex data transformation logic. Proven experience in designing and delivering data warehousing and dimensional modelling solutions. Excellent collaboration skills with a track record of working in Agile teams. Experience with Azure DevOps, Git, and CI/CD pipelines. Comfortable liaising directly with …
Business Intelligence Engineer Hybrid – London Up to £450 a day Inside IR35 6 Months Key Skills: Business Intelligence (data modelling, data warehousing, dashboarding) SQL & Python AWS (S3, Lambda, Glue, Redshift) The Senior Business Intelligence Engineer occupies a unique role at the intersection of technology, marketing, finance, statistics, data mining, and social science. We provide the key insight into customer … using AWS CDK - Proficiency in ETL/ELT processes and best practices - Experience with data visualization tools (QuickSight) Required Skills: - Strong analytical and problem-solving abilities - Excellent understanding of dimensional modelling and star schema design (facts, dimensions, SCD Type 2) - Experience with agile development methodologies - Strong communication skills and ability to work with cross-functional teams - Background in data …
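The "SCD Type 2" this listing asks for refers to a Type 2 slowly changing dimension: when an attribute changes, the current dimension row is closed out and a new row is inserted, so history remains queryable. A minimal sketch (all table and column names are invented for illustration):

```python
# Sketch of a Type 2 slowly changing dimension: changed attributes get a
# new row with validity dates instead of an overwrite. Names are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
CREATE TABLE dim_customer (
    customer_id INTEGER, city TEXT,
    valid_from TEXT, valid_to TEXT, is_current INTEGER
)""")
conn.execute(
    "INSERT INTO dim_customer VALUES (1, 'Leeds', '2024-01-01', '9999-12-31', 1)")

def apply_scd2(conn, customer_id, new_city, change_date):
    """Close the current row and open a new one when the attribute changes."""
    cur = conn.execute(
        "SELECT city FROM dim_customer WHERE customer_id=? AND is_current=1",
        (customer_id,)).fetchone()
    if cur and cur[0] != new_city:
        conn.execute(
            "UPDATE dim_customer SET valid_to=?, is_current=0 "
            "WHERE customer_id=? AND is_current=1",
            (change_date, customer_id))
        conn.execute(
            "INSERT INTO dim_customer VALUES (?, ?, ?, '9999-12-31', 1)",
            (customer_id, new_city, change_date))

apply_scd2(conn, 1, 'London', '2025-06-01')
rows = conn.execute(
    "SELECT city, is_current FROM dim_customer ORDER BY valid_from").fetchall()
print(rows)  # [('Leeds', 0), ('London', 1)]
```

Fact rows then join to the dimension row whose validity window covers the fact's date, which is how historical reports stay accurate.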