Foundational understanding of Big Data architecture (Data Lakes, Data Warehouses) and distributed processing concepts (e.g., MapReduce). ETL/ELT: Basic knowledge of ETL principles and data modeling (star schema, snowflake schema). Version Control: Practical experience with Git (branching, merging, pull requests). Preferred Qualifications (A Plus): Experience with a distributed computing framework like Apache Spark …
Watford, Hertfordshire, East Anglia, United Kingdom
Akkodis
… tools such as MuleSoft, Boomi, Informatica, Talend, SSIS, or custom scripting languages (Python, PySpark, SQL) for data extraction and transformation. Prior experience with data warehousing and data modelling (Star Schema or Snowflake Schema). Skilled in security frameworks such as GDPR, HIPAA, ISO 27001, NIST, SOX, and PII, with expertise in IAM, KMS, and RBAC implementation. Cloud …
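For readers unfamiliar with the dimensional modelling these listings keep asking for, the following is a minimal, hypothetical star schema sketch in SQL; every table and column name is invented for illustration, and real warehouses would add surrogate-key management, constraints and history handling.

```sql
-- Illustrative star schema: one fact table keyed to two dimension tables.
-- All names are hypothetical.
CREATE TABLE dim_customer (
    customer_key   INT PRIMARY KEY,   -- surrogate key
    customer_name  VARCHAR(200),
    region         VARCHAR(100)
);

CREATE TABLE dim_date (
    date_key       INT PRIMARY KEY,   -- e.g. 20240131
    calendar_date  DATE,
    fiscal_quarter VARCHAR(10)
);

CREATE TABLE fact_sales (
    sale_id      BIGINT,
    customer_key INT REFERENCES dim_customer (customer_key),
    date_key     INT REFERENCES dim_date (date_key),
    quantity     INT,
    net_amount   DECIMAL(18, 2)
);

-- A snowflake schema would normalise further, e.g. moving region out of
-- dim_customer into its own dim_region table referenced by a foreign key.
```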
We are seeking a highly skilled Data Engineer/ETL Developer with strong experience in data warehousing, ETL development, and Snowflake cloud data platforms. The ideal candidate will have advanced proficiency in SQL and Unix scripting to design, develop, and maintain scalable and efficient data pipelines that support business intelligence, analytics, and reporting needs.

Key Responsibilities: Design, develop, and maintain data warehouse solutions to support analytics and reporting. Build and optimize ETL pipelines for data ingestion, transformation, and loading across multiple systems. Work extensively with Snowflake for data modeling, performance tuning, and data sharing. Develop and maintain Unix shell scripts for automation, job scheduling, and system monitoring. Write complex and optimized SQL queries, stored procedures, and performance-tuned …

3-7 years of hands-on experience in data warehousing and ETL development. Strong proficiency in SQL (query optimization, complex joins, CTEs, window functions). Practical experience with Snowflake (data loading, virtual warehouses, security, cost optimization). Solid experience with Unix/Linux scripting and automation tools. Familiarity with ETL tools such as Informatica, Talend, DataStage, SSIS, or …
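As a rough illustration of the "complex joins, CTEs, window functions" requirement above, the hypothetical query below (table and column names are invented) keeps each customer's three largest orders using a CTE and ROW_NUMBER():

```sql
-- Keep each customer's three largest orders, using a CTE and a window function.
WITH ranked_orders AS (
    SELECT
        customer_id,
        order_id,
        order_total,
        ROW_NUMBER() OVER (
            PARTITION BY customer_id
            ORDER BY order_total DESC
        ) AS order_rank
    FROM orders
)
SELECT customer_id, order_id, order_total
FROM ranked_orders
WHERE order_rank <= 3;
```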
Havant, England, United Kingdom Hybrid / WFH Options
FatFace
… SSAS, data modelling, and BI pipelines, who thrives in a collaborative environment and is eager to drive innovation in data infrastructure. Role Responsibilities: Apply best practice to Star/Snowflake schema data modelling, with consideration given to challenging and verifying data quality and accuracy. Lead the addition of new data sources to the Data Lake from 3rd-party sources. …
London (City of London), South East England, United Kingdom
Sanderson
Implement CI/CD processes using Azure DevOps for secure, reliable deployment. Technical Skills: strong expertise in Power BI and paginated reporting; SQL and data architecture; dimensional modelling (star schema, snowflake, denormalised structures, SCD handling); DAX, Visual Studio and data transformation logic; Azure Fabric, Azure Data Factory, Synapse, Data Lakes and Lakehouse/Warehouse technologies; ETL/ELT …
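The "SCD handling" mentioned above refers to slowly changing dimensions. A minimal Type 2 sketch follows, with hypothetical table and column names and PostgreSQL-style UPDATE ... FROM syntax (T-SQL and Snowflake use slightly different forms): changed rows are expired first, then new current rows are inserted.

```sql
-- Step 1: expire current dimension rows whose tracked attribute changed in staging.
UPDATE dim_customer d
SET    is_current = FALSE,
       valid_to   = CURRENT_DATE
FROM   stg_customer s
WHERE  d.customer_id = s.customer_id
  AND  d.is_current  = TRUE
  AND  d.region     <> s.region;

-- Step 2: insert a new current row for customers with no current record
-- (covers both brand-new customers and those expired in step 1).
INSERT INTO dim_customer (customer_id, customer_name, region, valid_from, valid_to, is_current)
SELECT s.customer_id, s.customer_name, s.region, CURRENT_DATE, NULL, TRUE
FROM   stg_customer s
LEFT JOIN dim_customer d
       ON d.customer_id = s.customer_id
      AND d.is_current  = TRUE
WHERE  d.customer_id IS NULL;
```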
M22, Northenden, Manchester, United Kingdom Hybrid / WFH Options
Express Solicitors
Experience: Experience integrating data from external systems via APIs. Knowledge of Python, R, or similar languages for data manipulation and automation. Familiarity with data warehousing concepts, including star/snowflake schema design. Experience working in a professional services or legal sector environment. Understanding of data governance, compliance, and security best practices. Exposure to other Microsoft data tools such as …
City of London, London, United Kingdom Hybrid / WFH Options
Staffworx
… IronPython automation, document properties. Conduct detailed analysis of Spotfire markings, filters and visual structures to produce functional specification documents for migration. Define and redesign semantic data models (star/snowflake schemas) suitable for Power BI. Collaborate with data engineers and Power BI developers to align source data, dataflows and model transformations. Work with business stakeholders to define functional parity …
… REST APIs, Power BI Embedded, and programmatic data access patterns. Data Engineering & Modelling: strong T-SQL skills for data retrieval and performance tuning; knowledge of dimensional modelling, star/snowflake schemas, and data warehouse best practices. Preferred Qualifications: Microsoft certifications such as DA-100, DP-500, or MCSE: BI; familiarity with CI/CD for BI assets (e.g. Git …)
Role: Snowflake Engineer/Developer
Location: Manchester, UK
Work Mode: Permanent - Hybrid - 4 Days/Week (Mandatory)

Job Description (Mandatory): Good knowledge of cloud computing, Snowflake, DBT and Airflow. Very good working knowledge of data models, viz. Dimensional Data Model, ER Data Model and Data Vault. Very good working knowledge of writing SQL queries. Very good working knowledge … of Snowflake architecture. Very good working knowledge of Snowflake internals such as Snowflake roles, dynamic tables, streams and tasks, policies, etc. Very good working experience in data-related projects or applications. Good working knowledge of GitLab. Good working knowledge of Python. Good working knowledge of data science and machine learning. Good working knowledge of data management and data … or both. Good working knowledge of investment banking and finance. Good working knowledge of statistics. Good working knowledge of Power BI. Ability to work on multiple projects.

Mandatory Skills: Snowflake, ANSI-SQL, Dimensional Data Modelling, Snowpark Container Services, Snowflake-Data Science …
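For context on the Snowflake internals this listing names (streams, tasks, dynamic tables), here is a hedged sketch with hypothetical object and warehouse names; exact schedules, lags and privileges depend on the account setup.

```sql
-- A stream captures change records on a source table.
CREATE OR REPLACE STREAM raw_orders_stream ON TABLE raw_orders;

-- A scheduled task consumes the stream only when it has data.
CREATE OR REPLACE TASK load_orders_task
  WAREHOUSE = transform_wh
  SCHEDULE  = '15 MINUTE'
WHEN SYSTEM$STREAM_HAS_DATA('RAW_ORDERS_STREAM')
AS
  INSERT INTO curated_orders (order_id, customer_id, order_date, order_total)
  SELECT order_id, customer_id, order_date, order_total
  FROM raw_orders_stream
  WHERE METADATA$ACTION = 'INSERT';

ALTER TASK load_orders_task RESUME;  -- tasks are created suspended

-- A dynamic table declares the desired result; Snowflake manages the refresh.
CREATE OR REPLACE DYNAMIC TABLE daily_order_totals
  TARGET_LAG = '30 minutes'
  WAREHOUSE  = transform_wh
AS
  SELECT customer_id,
         DATE_TRUNC('day', order_date) AS order_day,
         SUM(order_total)              AS total_order_value
  FROM curated_orders
  GROUP BY customer_id, DATE_TRUNC('day', order_date);
```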
City of London, London, United Kingdom Hybrid / WFH Options
Robert Half
Data Engineer (Azure, Snowflake, DBT) | Insurance | London (Hybrid)

Robert Half International (an S&P 500 global staffing provider) is supporting a global consulting firm in sourcing an experienced Data Engineer to join a major insurance client engagement. The role focuses on building out a Snowflake Data Warehouse established last year and scaling it to support multiple new data … Tax deducted at source, unlike umbrella companies, and no umbrella company admin fees).

Role Overview: You’ll be working within a growing data engineering function, focused on scaling a Snowflake + DBT platform to support multiple analytical and operational use cases. The team is looking for an experienced engineer with strong technical depth and an insurance background, capable of owning and extending core pipelines across the Azure and Snowflake stack.

Key Skills & Experience: Strong hands-on experience with Snowflake Cloud Data Warehouse (schemas, RBAC, performance tuning, ELT best practices). Proven commercial experience with DBT for modular data modelling, testing, documentation, and CI/CD integration. Skilled in Azure Data Factory, Synapse, and Databricks for end-to-end …
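As a rough illustration of the "DBT for modular data modelling" requirement, a dbt model is simply a SQL file with Jinja references; the sketch below is entirely hypothetical (file name, refs and columns are invented) and only shows how {{ ref() }} wires models together so lineage, testing and documentation can follow.

```sql
-- models/marts/fct_policy_premiums.sql  (hypothetical dbt model)
{{ config(materialized='table') }}

SELECT
    p.policy_id,
    p.policyholder_id,
    p.inception_date,
    SUM(t.premium_amount) AS written_premium
FROM {{ ref('stg_policies') }}     AS p
JOIN {{ ref('stg_transactions') }} AS t
  ON t.policy_id = p.policy_id
GROUP BY p.policy_id, p.policyholder_id, p.inception_date
```

Tests and documentation for such a model would typically live in an accompanying schema.yml file, and dbt's dependency graph ensures the referenced staging models are built first.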