Featherstone, Yorkshire and the Humber, United Kingdom Hybrid / WFH Options
GSF Car Parts
as SSIS, Azure Data Factory, or similar. Familiarity with relational databases (e.g., MS SQL Server, MySQL, PostgreSQL). Exposure to data warehousing concepts and dimensional modelling (e.g., star/snowflake schemas). Experience working with cloud platforms (preferably Azure, but AWS or GCP also valuable). Understanding of data governance, data quality, and security best practices. Some experience with More ❯
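Several of the listings above and below ask for familiarity with dimensional modelling and star/snowflake schemas. As an illustrative aside (not part of any posting), the idea reduces to a central fact table joined to descriptive dimension tables. A minimal sketch, with SQLite standing in for a real warehouse and all table/column names invented for the example:

```python
import sqlite3

# Minimal star schema: one fact table keyed to one dimension table.
# Names (dim_product, fact_sales, etc.) are illustrative, not from any posting.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT);
    CREATE TABLE fact_sales (sale_id INTEGER PRIMARY KEY,
                             product_id INTEGER REFERENCES dim_product,
                             amount REAL);
    INSERT INTO dim_product VALUES (1, 'brakes'), (2, 'filters');
    INSERT INTO fact_sales VALUES (10, 1, 99.0), (11, 1, 45.0), (12, 2, 20.0);
""")

# The typical star-schema query shape: join fact to dimension, then aggregate.
rows = conn.execute("""
    SELECT p.category, SUM(f.amount)
    FROM fact_sales f JOIN dim_product p USING (product_id)
    GROUP BY p.category ORDER BY p.category
""").fetchall()
print(rows)  # [('brakes', 144.0), ('filters', 20.0)]
```

A snowflake schema differs only in that dimensions are further normalised (e.g. `dim_product` would reference a separate `dim_category` table).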
M22, Northenden, Manchester, United Kingdom Hybrid / WFH Options
Express Solicitors
Experience: Experience integrating data from external systems via APIs. Knowledge of Python, R, or similar languages for data manipulation and automation. Familiarity with data warehousing concepts, including star/snowflake schema design. Experience working in a professional services or legal sector environment. Understanding of data governance, compliance, and security best practices. Exposure to other Microsoft data tools such More ❯
Sharston, Manchester, Lancashire, England, United Kingdom Hybrid / WFH Options
Express Solicitors
Experience: Experience integrating data from external systems via APIs. Knowledge of Python, R, or similar languages for data manipulation and automation. Familiarity with data warehousing concepts, including star/snowflake schema design. Experience working in a professional services or legal sector environment. Understanding of data governance, compliance, and security best practices. Exposure to other Microsoft data tools such More ❯
IronPython automation, document properties. Conduct detailed analysis of Spotfire markings, filters and visual structures to produce functional specification documents for migration. Define and redesign semantic data models (star/snowflake schemas) suitable for Power BI. Collaborate with data engineers and Power BI developers to align source data, dataflows and model transformations. Work with business stakeholders to define functional parity More ❯
develop end-to-end Azure Data Warehouse solutions Build and maintain robust ETL/ELT pipelines using Azure Data Factory. Implement and maintain efficient data models and star/snowflake schemas. Optimize queries, improve performance, and ensure data quality and integrity. Develop and maintain Power BI dashboards and reports to deliver actionable insights to the business. Automate workflows and More ❯
REST APIs , Power BI Embedded , and programmatic data access patterns Data Engineering & Modelling Strong T-SQL skills for data retrieval and performance tuning Knowledge of dimensional modelling , star/snowflake schemas , and data warehouse best practices Preferred Qualifications Microsoft certifications such as DA-100 , DP-500 , or MCSE: BI Familiarity with CI/CD for BI assets (e.g. Git More ❯
ETL/ELT pipelines using SQL and Python Integrate internal/external data sources via APIs and platform connectors Model and structure data for scalable analytics (e.g., star/snowflake schemas) Administer Microsoft Fabric Lakehouse and Azure services Optimise performance across queries, datasets, and pipelines Apply data validation, cleansing, and standardisation rules Document pipeline logic and contribute to business More ❯
impactful dashboards Advanced SQL (T-SQL preferred) for managing and querying relational databases Experience with ETL tools (SSIS, Azure Data Factory, or similar) Strong data modelling skills (star/snowflake schema) Familiarity with Azure SQL, Synapse Analytics, or other cloud platforms is a plus Domain Knowledge Solid experience in the Lloyd’s/London Market insurance environment Strong More ❯
London (City of London), South East England, United Kingdom
i3
impactful dashboards Advanced SQL (T-SQL preferred) for managing and querying relational databases Experience with ETL tools (SSIS, Azure Data Factory, or similar) Strong data modelling skills (star/snowflake schema) Familiarity with Azure SQL, Synapse Analytics, or other cloud platforms is a plus Domain Knowledge Solid experience in the Lloyd’s/London Market insurance environment Strong More ❯
EXPERIENCE: A successful Analytics Engineer will bring: Strong SQL skills and hands-on experience with dbt (or similar tools) Experience designing as well as building data models Exposure to Snowflake and/or data pipeline tools Understanding of testing, CI/CD, and data quality frameworks THE BENEFITS: You will receive a salary dependent on experience, up to More ❯
London, South East, England, United Kingdom Hybrid / WFH Options
Method Resourcing
dashboards, DAX, Power Query, and complex data modeling Strong SQL skills for data extraction, transformation, and performance optimisation (essential) Solid understanding of data warehousing principles such as star and snowflake schemas, as well as ETL processes Experience in designing and implementing semantic and tabular models for reporting solutions Excellent communication abilities with proven experience collaborating with clients (essential) Contract More ❯
and help deliver actionable business insights to stakeholders across the organisation. Key Responsibilities *Design and develop Power BI datasets, dashboards, and reports *Build efficient, scalable data models (star/snowflake schemas) *Implement advanced DAX calculations and KPIs *Gather and interpret business reporting requirements *Ensure data accuracy, compliance, and security in all reporting *Liaise between business, IT, and analytics teams More ❯
into a more engineering-focused position, someone who enjoys understanding the business context just as much as building the data solutions behind it. You'll work extensively with Python , Snowflake , SQL , and dbt to design, build, and maintain scalable, high-quality data pipelines and models that support decision-making across the business. This is a hands-on, collaborative role … s confident communicating with data, product, and engineering teams, not a 'heads-down coder' type. Top 4 Core Skills Python - workflow automation, data processing, and ETL/ELT development. Snowflake - scalable data architecture, performance optimisation, and governance. SQL - expert-level query writing and optimisation for analytics and transformations. dbt (Data Build Tool) - modular data modelling, testing, documentation, and version … build, and maintain dbt models and SQL transformations to support analytical and operational use cases. Develop and maintain Python workflows for data ingestion, transformation, and automation. Engineer scalable, performant Snowflake pipelines and data models aligned with business and product needs. Partner closely with analysts, product managers, and engineers to translate complex business requirements into data-driven solutions. Write production More ❯
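The role above centres on dbt's layered approach: staging models that clean raw sources, then mart models that encode business logic, each with tests. As a hedged sketch (not this employer's codebase), the layering can be expressed as plain SQL views, with SQLite standing in for Snowflake and all names (`raw_orders`, `stg_orders`, `mart_revenue`) invented for illustration:

```python
import sqlite3

# Sketch of the staging -> mart layering that dbt formalises. In real dbt
# these would be separate .sql model files compiled with ref(); here they
# are views so the idea is runnable. All table/view names are illustrative.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_orders (id INTEGER, status TEXT, amount_pennies INTEGER);
    INSERT INTO raw_orders VALUES (1, 'complete', 1000), (2, 'cancelled', 500),
                                  (3, 'complete', 250);

    -- Staging model: rename and standardise columns, one view per source.
    CREATE VIEW stg_orders AS
        SELECT id AS order_id, status, amount_pennies / 100.0 AS amount
        FROM raw_orders;

    -- Mart model: business logic built on staging, never on raw tables.
    CREATE VIEW mart_revenue AS
        SELECT SUM(amount) AS revenue FROM stg_orders WHERE status = 'complete';
""")
revenue = conn.execute("SELECT revenue FROM mart_revenue").fetchone()[0]
print(revenue)  # 12.5

# A dbt-style "not_null" schema test reduces to an assertion query:
nulls = conn.execute(
    "SELECT COUNT(*) FROM stg_orders WHERE order_id IS NULL").fetchone()[0]
assert nulls == 0
```

The payoff of the convention is that downstream models and tests depend only on the staging layer, so source-system quirks are fixed in exactly one place.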
a Business Data Analyst to join their expanding data team during an exciting cloud data transformation programme and a Workday (HR & Finance) implementation. The business is building a new Snowflake-based data warehouse , and this role will play a pivotal part in shaping how data is structured, accessed, and used across the organisation. You will act as the key … interface between technical teams and business stakeholders, ensuring requirements are translated into high-quality Snowflake data models, pipelines, and reports that deliver measurable business value. Key Responsibilities Snowflake Data Analysis & Modelling Collaborate with stakeholders to capture, refine and document business and technical requirements for Snowflake-based data warehouses, reports, and dashboards . Design and maintain optimised Snowflake … star schemas) to ensure scalable, reliable reporting and analytics. Support Strategic Data Initiatives & Reporting Optimisation Evaluate existing reports and conduct technical analyses to identify and close data gaps within Snowflake's data model. Ensure alignment of data models with evolving business requirements and governance standards. Development, Testing & Operational Support Investigate and resolve data integrity or reporting issues within Snowflake. More ❯
products. Data Engineering Consultant, key responsibilities: Work cross-functionally with non-technical stakeholders to understand data requirements Expert knowledge of SQL to query, analysis and model data Experience with Snowflake Using DBT for data transforms and modelling Data ingestion using Python Build foundations for business functions to be able to access the correct data and insights Experience working in More ❯
London, South East, England, United Kingdom Hybrid / WFH Options
Reed.co.uk
Looker solutions that support sales performance and revenue growth. Develop, document, and distribute Looker reports, ensuring consistency, accuracy, and usability. Build and maintain robust data models in DBT and Snowflake, ensuring clean, trustworthy data flows into reporting. Support data integrity within Salesforce and other commercial data sources, enhancing quality and enabling consistent reporting. Automate ingestion and transformation of financial … tool such as Looker, Tableau, Power BI or similar Good SQL skills, you know your way around a database. Some exposure to a cloud-based data warehouse such as Snowflake, Redshift, BigQuery or similar. Some knowledge of ETL/ELT and data automation. A desire to work with complex Salesforce detail and master that data. Excellent problem-solving More ❯
to modern finance workflows. Key Responsibilities Design, develop, and maintain robust ETL/ELT pipelines to ingest, transform, and securely store data from NetSuite and other finance systems into Snowflake, ensuring data integrity, compliance, and security best practices (e.g., encryption, access controls, and auditing). Collaborate with finance and data teams to define data models, schemas, and governance policies … 5+ years of experience as a data engineer or analytics engineer, with a proven track record in full stack data development (from ingestion to visualization). Strong expertise in Snowflake, including data modeling, warehousing, and performance optimization. Hands-on experience with ETL tools (e.g., Apache Airflow, dbt, Fivetran) and integrating data from ERP systems like NetSuite. Proficiency in SQL More ❯
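The listing above describes the classic extract -> transform -> load shape: pull records from a finance system, validate and standardise them, then land them in the warehouse. A minimal sketch under stated assumptions: the source records and table name are invented, SQLite stands in for Snowflake, and a real pipeline would use the Snowflake connector plus an orchestrator such as Airflow rather than inline calls:

```python
import sqlite3

def extract():
    # Stand-in for pulling journal lines from an ERP such as NetSuite.
    # These records are fabricated for the example.
    return [
        {"account": "4000", "amount": "125.50"},
        {"account": "4000", "amount": "74.50"},
        {"account": None,   "amount": "10.00"},  # bad record, no account
    ]

def transform(records):
    # Validation and standardisation: drop rows with no account, cast amounts.
    return [(r["account"], float(r["amount"])) for r in records if r["account"]]

def load(rows, conn):
    # SQLite stands in for Snowflake; table name gl_lines is illustrative.
    conn.execute("CREATE TABLE IF NOT EXISTS gl_lines (account TEXT, amount REAL)")
    conn.executemany("INSERT INTO gl_lines VALUES (?, ?)", rows)

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
total = conn.execute("SELECT SUM(amount) FROM gl_lines").fetchone()[0]
print(total)  # 200.0
```

Keeping the three stages as separate functions is what makes the "data integrity" requirement testable: the transform step can be unit-tested against bad records without touching any source system or warehouse.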
value of technology and build a more sustainable, more inclusive world. Your Role Capgemini Financial Services is seeking a Data Engineer with deep expertise in DBT (Data Build Tool), Snowflake, and PL/SQL to join our growing data team. The successful candidate will be responsible for designing, developing, and maintaining robust data transformation pipelines that support business intelligence, analytics, and …/SQL code for complex data processing and transformation tasks. Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements and deliver high-quality solutions. Optimize Snowflake performance through query tuning, clustering, and resource management. Ensure data quality, integrity, and governance through testing, documentation, and monitoring. Participate in code reviews, architecture discussions, and continuous improvement initiatives. … DBT projects. Required Qualifications: 5+ years of experience in data engineering or a related field. Strong hands-on experience with DBT (modular SQL development, testing, documentation). Proficiency in Snowflake (data warehousing, performance tuning, security). Advanced knowledge of PL/SQL and experience with stored procedures, functions, and packages. Solid understanding of data modeling concepts (star/snowflake More ❯