Implement CI/CD processes using Azure DevOps for secure, reliable deployment.
Technical Skills: Strong expertise in:
* Power BI and paginated reporting
* SQL and data architecture
* Dimensional modelling (star schema, snowflake, denormalised structures, SCD handling)
* DAX, Visual Studio and data transformation logic
* Azure Fabric, Azure Data Factory, Synapse, Data Lakes and Lakehouse/Warehouse technologies
* ETL/ELT …
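The SCD handling called out in the skills list above usually means Type 2 versioning of dimension rows. As a purely illustrative example (not taken from the listing), the sketch below shows one way that logic can be expressed in Python with pandas; the key and column names (valid_from, valid_to, is_current) and the helper itself are assumptions.

```python
# Minimal sketch of Slowly Changing Dimension (Type 2) handling with pandas.
# Column names (valid_from, valid_to, is_current) are illustrative assumptions.
import pandas as pd
from datetime import date

def apply_scd2(dim: pd.DataFrame, incoming: pd.DataFrame, key: str, tracked: list[str]) -> pd.DataFrame:
    """Expire changed rows in the current dimension and append new versions.
    Brand-new keys are ignored here to keep the sketch short."""
    today = date.today().isoformat()
    current = dim[dim["is_current"]]

    # Compare current dimension rows against incoming records on the tracked attributes.
    merged = current.merge(incoming, on=key, suffixes=("_old", "_new"))
    changed_keys = merged.loc[
        (merged[[f"{c}_old" for c in tracked]].values
         != merged[[f"{c}_new" for c in tracked]].values).any(axis=1),
        key,
    ]

    # Close off the superseded versions.
    dim.loc[dim[key].isin(changed_keys) & dim["is_current"], ["valid_to", "is_current"]] = [today, False]

    # Append the new versions as current rows.
    new_rows = incoming[incoming[key].isin(changed_keys)].copy()
    new_rows["valid_from"], new_rows["valid_to"], new_rows["is_current"] = today, None, True
    return pd.concat([dim, new_rows], ignore_index=True)
```

A warehouse-native implementation would typically express the same row-versioning idea with a MERGE statement, but the logic is identical.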
… develop end-to-end Azure Data Warehouse solutions.
* Build and maintain robust ETL/ELT pipelines using Azure Data Factory.
* Implement and maintain efficient data models and star/snowflake schemas.
* Optimize queries, improve performance, and ensure data quality and integrity.
* Develop and maintain Power BI dashboards and reports to deliver actionable insights to the business.
* Automate workflows and …
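Where pipelines are built in Azure Data Factory, workflow automation often includes triggering and monitoring runs programmatically. The following is a hypothetical sketch using the azure-identity and azure-mgmt-datafactory Python packages; the subscription, resource group, factory, pipeline name, and parameters are all placeholders, not details from the listing.

```python
# Hypothetical sketch: triggering and polling an Azure Data Factory pipeline from Python.
# All resource names below are placeholders.
import time
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

SUBSCRIPTION_ID = "<subscription-id>"
RESOURCE_GROUP = "rg-dataplatform"      # placeholder
FACTORY_NAME = "adf-dw-loads"           # placeholder
PIPELINE_NAME = "pl_load_sales_fact"    # placeholder

client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Kick off the pipeline, passing any runtime parameters it expects.
run = client.pipelines.create_run(
    RESOURCE_GROUP, FACTORY_NAME, PIPELINE_NAME, parameters={"load_date": "2024-01-31"}
)

# Poll until the run reaches a terminal state.
while True:
    status = client.pipeline_runs.get(RESOURCE_GROUP, FACTORY_NAME, run.run_id).status
    if status not in ("Queued", "InProgress"):
        break
    time.sleep(30)

print(f"Pipeline {PIPELINE_NAME} finished with status: {status}")
```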
London, South East, England, United Kingdom Hybrid / WFH Options
Method Resourcing
… dashboards, DAX, Power Query, and complex data modelling.
* Strong SQL skills for data extraction, transformation, and performance optimisation (essential)
* Solid understanding of data warehousing principles such as star and snowflake schemas, as well as ETL processes
* Experience in designing and implementing semantic and tabular models for reporting solutions
* Excellent communication abilities with proven experience collaborating with clients (essential)
Contract …
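To make the star-schema and semantic-model requirements above concrete, here is a small illustrative sketch of one common building block: generating a date dimension with pandas. The column names and date range are assumptions, not details from the listing.

```python
# Illustrative sketch: generating a simple date dimension for a star schema with pandas.
import pandas as pd

dates = pd.date_range("2020-01-01", "2030-12-31", freq="D")

dim_date = pd.DataFrame({
    "date_key": dates.strftime("%Y%m%d").astype(int),   # surrogate key, e.g. 20240131
    "date": dates,
    "year": dates.year,
    "quarter": dates.quarter,
    "month": dates.month,
    "month_name": dates.strftime("%B"),
    "day_of_week": dates.dayofweek + 1,                  # 1 = Monday
    "is_weekend": dates.dayofweek >= 5,
})

# Fact tables then reference dim_date via date_key rather than raw dates,
# which keeps joins simple and supports time-intelligence measures in DAX.
print(dim_date.head())
```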
… products.
Data Engineering Consultant, key responsibilities:
* Work cross-functionally with non-technical stakeholders to understand data requirements
* Expert knowledge of SQL to query, analyse and model data
* Experience with Snowflake
* Using DBT for data transforms and modelling
* Data ingestion using Python
* Build foundations for business functions to be able to access the correct data and insights
* Experience working in …
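For the "data ingestion using Python" item, a typical pattern is bulk-loading a DataFrame into Snowflake with the snowflake-connector-python package. The sketch below assumes that package (with its pandas extras) and a pre-existing target table; every connection detail, file name, and table name is a placeholder.

```python
# Hedged sketch: ingesting a CSV extract into Snowflake from Python using
# snowflake-connector-python. Connection details and names are placeholders.
import pandas as pd
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

df = pd.read_csv("daily_orders.csv")          # placeholder source file
df.columns = [c.upper() for c in df.columns]  # Snowflake identifiers default to upper case

conn = snowflake.connector.connect(
    account="my_account",      # placeholder
    user="loader_user",        # placeholder
    password="***",
    warehouse="LOAD_WH",
    database="RAW",
    schema="ORDERS",
)

try:
    # write_pandas bulk-loads the DataFrame via an internal stage and COPY INTO;
    # the target table RAW.ORDERS.DAILY_ORDERS is assumed to already exist.
    success, n_chunks, n_rows, _ = write_pandas(conn, df, table_name="DAILY_ORDERS")
    print(f"Loaded {n_rows} rows in {n_chunks} chunk(s); success={success}")
finally:
    conn.close()
```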
London, South East, England, United Kingdom Hybrid / WFH Options
Robert Half
… stakeholders to fulfil their data requirements and support their work.
* Collect, process, and transform datasets from various sources, ensuring data quality and integrity.
* Optimise and maintain the organisation's Snowflake infrastructure, providing performance tuning, enhancements, and regular "MOT" checks.
* Develop and maintain DBT pipelines, mainly on Snowflake, to enable efficient transformation and modelling of data.
* Implement efficient data …
… Keep abreast of industry trends and emerging technologies in data engineering and cloud infrastructure, continuously improving skills and knowledge.
Profile
* The Data Engineer will have proven experience working with Snowflake infrastructure, including optimisation and maintenance.
* Experience with DBT for data transformation and modelling, ideally in a Snowflake environment.
* Proficient in SQL and experienced in database design and administration. …
London, South East, England, United Kingdom Hybrid / WFH Options
Robert Half
… with data scientists, analysts, and engineers to deliver clean, structured, and reliable data.
* Develop robust data transformations in Python and SQL, ensuring performance and accuracy.
* Work hands-on with Snowflake to model, optimise, and manage data flows.
* Continuously improve data engineering practices - from automation to observability.
* Bring ideas to the table: help shape how data is collected, processed, and …
… years of experience in data engineering or a similar role.
* Strong Python skills for data processing and pipeline development.
* Proven experience writing complex and efficient SQL.
* Hands-on Snowflake experience (data modelling, performance tuning, pipelines).
* Familiarity with orchestration tools (e.g., Airflow, dbt) is a plus.
* A solid understanding of data best practices, version control (Git), and CI/ …
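Since orchestration tools such as Airflow and dbt are mentioned, the following hedged sketch shows how a daily dbt run plus a Python data-quality check might be wired up as an Airflow 2.x DAG. The schedule, project path, and the check itself are assumptions, not details from the listing.

```python
# Sketch of a possible orchestration setup with Airflow 2.x:
# a dbt run followed by a lightweight Python data-quality check.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator

def row_count_check():
    # Placeholder check; in practice this might query Snowflake and fail
    # the task if a freshly built model is empty.
    print("running data-quality checks")

with DAG(
    dag_id="daily_snowflake_transforms",
    start_date=datetime(2024, 1, 1),
    schedule_interval="0 6 * * *",   # every day at 06:00
    catchup=False,
) as dag:
    run_dbt = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/analytics",  # placeholder path
    )
    quality_check = PythonOperator(
        task_id="quality_check",
        python_callable=row_count_check,
    )

    run_dbt >> quality_check
```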