Implement CI/CD processes using Azure DevOps for secure, reliable deployment.
Technical Skills: Strong expertise in:
- Power BI and paginated reporting
- SQL and data architecture
- Dimensional modelling (star schema, snowflake, denormalised structures, SCD handling)
- DAX, Visual Studio and data transformation logic
- Azure Fabric, Azure Data Factory, Synapse, Data Lakes and Lakehouse/Warehouse technologies
- ETL/ELT …
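The dimensional-modelling skills above include SCD handling. As a minimal, hedged sketch (the table layout, column names, and `apply_scd2` helper are invented for illustration), a Type 2 slowly changing dimension update expires the current row and appends a new version whenever a tracked attribute changes:

```python
from datetime import date

def apply_scd2(dim_rows, incoming, business_key, tracked_cols, effective=date(2024, 1, 1)):
    """Type 2 SCD sketch: expire the current row and append a new version
    whenever a tracked attribute changes. Rows are plain dicts here; in a
    warehouse this would be a MERGE against the dimension table."""
    current = {r[business_key]: r for r in dim_rows if r["is_current"]}
    for row in incoming:
        existing = current.get(row[business_key])
        if existing is None:
            # New member: open a fresh current row.
            dim_rows.append({**row, "valid_from": effective, "valid_to": None, "is_current": True})
        elif any(existing[c] != row[c] for c in tracked_cols):
            # Tracked attribute changed: close the old version, add a new one.
            existing["valid_to"] = effective
            existing["is_current"] = False
            dim_rows.append({**row, "valid_from": effective, "valid_to": None, "is_current": True})
    return dim_rows

# Example: a customer moves city, producing two versions of the row.
dim = [{"customer_id": 1, "city": "London", "valid_from": date(2023, 1, 1),
        "valid_to": None, "is_current": True}]
dim = apply_scd2(dim, [{"customer_id": 1, "city": "Leeds"}], "customer_id", ["city"])
```

The key design point is that history is never overwritten: downstream queries can join facts to the dimension version that was current at the fact's date.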
IronPython automation, document properties. Conduct detailed analysis of Spotfire markings, filters and visual structures to produce functional specification documents for migration. Define and redesign semantic data models (star/snowflake schemas) suitable for Power BI. Collaborate with data engineers and Power BI developers to align source data, dataflows and model transformations. Work with business stakeholders to define functional parity …
develop end-to-end Azure Data Warehouse solutions. Build and maintain robust ETL/ELT pipelines using Azure Data Factory. Implement and maintain efficient data models and star/snowflake schemas. Optimize queries, improve performance, and ensure data quality and integrity. Develop and maintain Power BI dashboards and reports to deliver actionable insights to the business. Automate workflows and …
London, South East, England, United Kingdom Hybrid / WFH Options
Method Resourcing
dashboards, DAX, Power Query, and complex data modeling
- Strong SQL skills for data extraction, transformation, and performance optimisation (essential)
- Solid understanding of data warehousing principles such as star and snowflake schemas, as well as ETL processes
- Experience in designing and implementing semantic and tabular models for reporting solutions
- Excellent communication abilities with proven experience collaborating with clients (essential)
Contract …
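The star-schema principle mentioned in several of these listings can be sketched concretely: a central fact table at one grain, joined to small dimension tables, with reports aggregating the fact by dimension attributes. The example below is a hedged illustration using an in-memory SQLite database; all table and column names (`fact_sales`, `dim_date`, `dim_product`) are invented for the demo.

```python
import sqlite3

# Minimal star schema: one fact table joined to two dimension tables.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, year INTEGER);
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, category TEXT);
    CREATE TABLE fact_sales  (date_key INTEGER, product_key INTEGER, amount REAL);
    INSERT INTO dim_date    VALUES (20240101, 2024), (20230101, 2023);
    INSERT INTO dim_product VALUES (1, 'Books'), (2, 'Games');
    INSERT INTO fact_sales  VALUES (20240101, 1, 10.0), (20240101, 2, 5.0), (20230101, 1, 7.5);
""")

# Typical star-schema query: aggregate the fact grain by dimension attributes.
rows = con.execute("""
    SELECT d.year, p.category, SUM(f.amount) AS total
    FROM fact_sales f
    JOIN dim_date d    ON d.date_key = f.date_key
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY d.year, p.category
    ORDER BY d.year, p.category
""").fetchall()
# rows → [(2023, 'Books', 7.5), (2024, 'Books', 10.0), (2024, 'Games', 5.0)]
```

A snowflake schema differs only in that the dimensions themselves are further normalised (e.g. `dim_product` split into product and category tables), trading wider joins for less redundancy.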
products. Data Engineering Consultant, key responsibilities:
- Work cross-functionally with non-technical stakeholders to understand data requirements
- Expert knowledge of SQL to query, analyse and model data
- Experience with Snowflake
- Using DBT for data transforms and modelling
- Data ingestion using Python
- Build foundations for business functions to be able to access the correct data and insights
- Experience working in …
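"Data ingestion using Python" in practice usually means parsing a raw feed, coercing types, and filtering out records that fail basic quality checks before loading. A minimal sketch, with invented column names (`id`, `amount`) and a simple skip-on-error policy standing in for a real quarantine step:

```python
import csv
import io

def ingest_csv(raw_text):
    """Parse CSV text, coerce types, and drop rows that fail basic
    quality checks. Column names are invented for the example."""
    rows = []
    for rec in csv.DictReader(io.StringIO(raw_text)):
        try:
            rows.append({"id": int(rec["id"]), "amount": float(rec["amount"])})
        except (KeyError, ValueError):
            # A real pipeline would route bad records to a quarantine table.
            continue
    return rows

sample = "id,amount\n1,9.99\n2,not-a-number\n3,4.50\n"
clean = ingest_csv(sample)
# clean → [{'id': 1, 'amount': 9.99}, {'id': 3, 'amount': 4.5}]
```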
London, South East, England, United Kingdom Hybrid / WFH Options
Robert Half
stakeholders to fulfil their data requirements and support their work.
- Collect, process, and transform datasets from various sources, ensuring data quality and integrity.
- Optimise and maintain the organisation's Snowflake infrastructure, providing performance tuning, enhancements, and regular "MOT" checks.
- Develop and maintain DBT pipelines, mainly on Snowflake, to enable efficient transformation and modelling of data.
- Implement efficient data …
- Keep abreast of industry trends and emerging technologies in data engineering and cloud infrastructure, continuously improving skills and knowledge.
Profile
- The Data Engineer will have proven experience working with Snowflake infrastructure, including optimisation and maintenance.
- Experience with DBT for data transformation and modelling, ideally in a Snowflake environment.
- Proficient in SQL and experienced in database design and administration. …
Galway, Galway County, Republic of Ireland Hybrid / WFH Options
CompuStaff
high-growth, innovation-led team shaping the next generation of cloud security solutions.
What You’ll Do
- Design, build, and maintain scalable data pipelines and models using dbt and Snowflake.
- Develop automated workflows and ETL processes in Python to ensure high reliability and performance.
- Collaborate with analytics and engineering teams to turn business needs into robust data solutions.
- … a best-in-class data culture.
What You’ll Bring
- Proven experience as a Data Engineer in a fast-moving tech or SaaS environment.
- Strong expertise in dbt Core, Snowflake, Python, and SQL.
- Experience with ETL/ELT pipelines, workflow orchestration (ideally Prefect), and AWS (S3, ECS).
- Familiarity with BI governance, compliance, and CI/CD practices. …
of 10 years of commercial experience working in a data-centric environment with a proven track record in financial services. Understanding of data modelling principles (e.g. relationships, normalisation, star/snowflake schemas). You will have experience with enterprise data modelling tools (e.g. ER/Studio, Erwin). Deep knowledge of capital markets and/or investment banking products (bonds, repos …
Snowflake Data Modeller
Location: Remote (mostly remote with occasional travel to Hemel Hempstead)
Contract: Outside IR35
Day rate: Up to £550 per day
Duration: 6 months
Start date: ASAP
Key skills: Snowflake, DBT, SQL, Python, AWS and Kimball
The Client, who are in the process of migrating to Snowflake, therefore require extra support. As a result … opportunity to be at the cutting edge of data engineering.
YOUR SKILLS AND EXPERIENCE
A successful Senior Data Engineer here will have experience in the following:
- Advanced SQL knowledge
- Snowflake (ideally certified)
- Python development
- AWS cloud experience essential, relating to data tooling and development
- Working knowledge of Data Build Tool (DBT)
  - Develop staging, intermediate and marts in …
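The staging/intermediate/marts split mentioned for DBT is a layering convention: staging models rename and type the raw source one-to-one, intermediate models apply business logic, and marts expose aggregated, analysis-ready tables. A hedged plain-Python analogue (in dbt each function below would be a SQL model; all names and data are invented for the example):

```python
# dbt-style layering sketch: staging -> intermediate -> mart.
raw_orders = [
    {"ORDER_ID": "1", "STATUS": "complete",  "VALUE": "10.0"},
    {"ORDER_ID": "2", "STATUS": "cancelled", "VALUE": "3.0"},
    {"ORDER_ID": "3", "STATUS": "complete",  "VALUE": "5.5"},
]

def stg_orders(source):
    # Staging: standardise names and cast types, one row per source row.
    return [{"order_id": int(r["ORDER_ID"]),
             "status": r["STATUS"],
             "value": float(r["VALUE"])} for r in source]

def int_completed_orders(stg):
    # Intermediate: apply business logic (keep completed orders only).
    return [r for r in stg if r["status"] == "complete"]

def mart_revenue(intermediate):
    # Mart: the aggregated, analysis-ready output consumed by BI tools.
    return {"orders": len(intermediate),
            "revenue": sum(r["value"] for r in intermediate)}

report = mart_revenue(int_completed_orders(stg_orders(raw_orders)))
# report → {'orders': 2, 'revenue': 15.5}
```

Keeping the layers separate means a change to source naming touches only staging, and business-rule changes touch only the intermediate layer.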
London, South East, England, United Kingdom Hybrid / WFH Options
Robert Half
with data scientists, analysts, and engineers to deliver clean, structured, and reliable data.
- Develop robust data transformations in Python and SQL, ensuring performance and accuracy.
- Work hands-on with Snowflake to model, optimise, and manage data flows.
- Continuously improve data engineering practices, from automation to observability.
- Bring ideas to the table: help shape how data is collected, processed, and …
- … years of experience in data engineering or a similar role.
- Strong Python skills for data processing and pipeline development.
- Proven experience writing complex and efficient SQL.
- Hands-on Snowflake experience (data modelling, performance tuning, pipelines).
- Familiarity with orchestration tools (e.g., Airflow, dbt) is a plus.
- A solid understanding of data best practices, version control (Git), and CI/ …
Role/Job Title: AWS Solution Architect
Work Location: London, 250 Bishopsgate (Onsite)
The Role
As an AWS Solution Architect, you will be working closely with the technical team on development and implementation journeys that transform traditional banking infrastructure and …