London (City of London), South East England, United Kingdom
Capgemini
value of technology and build a more sustainable, more inclusive world.

Your Role: Capgemini Financial Services is seeking a Data Engineer with deep expertise in DBT (Data Build Tool), Snowflake, and PL/SQL to join our growing data team. The successful candidate will be responsible for designing, developing, and maintaining robust data transformation pipelines that support business intelligence, analytics, and …/SQL code for complex data processing and transformation tasks. Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements and deliver high-quality solutions. Optimize Snowflake performance through query tuning, clustering, and resource management. Ensure data quality, integrity, and governance through testing, documentation, and monitoring. Participate in code reviews, architecture discussions, and continuous improvement initiatives. … DBT projects.

Required Qualifications: 5+ years of experience in data engineering or a related field. Strong hands-on experience with DBT (modular SQL development, testing, documentation). Proficiency in Snowflake (data warehousing, performance tuning, security). Advanced knowledge of PL/SQL and experience with stored procedures, functions, and packages. Solid understanding of data modeling concepts (star/snowflake schemas). …
Role: Snowflake Data Architect Location: Hove, UK Type: Permanent Role Work Mode: Hybrid Role & Responsibilities: Define and implement the end-to-end architecture of a data warehouse on Snowflake. Create and maintain conceptual, logical, and physical data models in Snowflake. Design data pipelines and ingestion frameworks using Snowflake native tools. Collaborate with Data Governance teams to establish data lineage …
Edinburgh, Midlothian, United Kingdom Hybrid / WFH Options
Aberdeen
We are seeking a Senior Engineer to join our Data & Analytics team. This role is instrumental in delivering clean, modern, and efficient data solutions across cloud-native platforms.

Key Responsibilities: Develop solutions across Snowflake, Azure, and DBT platforms. Lead migration and optimisation of applications using Azure cloud-native services. Write clean, testable, and maintainable code following industry standards. Implement CI/CD pipelines … deliver user-centric solutions.

About the Candidate: The ideal candidate will possess the following: Strong understanding of data warehousing, ELT/ETL processes, and data modelling. Proficiency in Azure, Snowflake, and DBT. Experience in application modernisation and migration. Ability to produce clean, testable, maintainable code. CI/CD pipeline implementation and test automation. Familiarity with AI-powered development tools. …
London, South East, England, United Kingdom Hybrid / WFH Options
Robert Half
with data scientists, analysts, and engineers to deliver clean, structured, and reliable data. Develop robust data transformations in Python and SQL, ensuring performance and accuracy. Work hands-on with Snowflake to model, optimise, and manage data flows. Continuously improve data engineering practices, from automation to observability. Bring ideas to the table: help shape how data is collected, processed, and … years of experience in data engineering or a similar role. Strong Python skills for data processing and pipeline development. Proven experience writing complex and efficient SQL. Hands-on Snowflake experience (data modelling, performance tuning, pipelines). Familiarity with orchestration tools (e.g., Airflow, dbt) is a plus. A solid understanding of data best practices, version control (Git), and CI/CD. …
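The "data transformations in Python and SQL" work this role describes can be illustrated with a small, self-contained sketch. The stdlib `sqlite3` module stands in here for Snowflake (Snowflake itself needs an account and the `snowflake-connector-python` package), and the table and column names are invented for illustration.

```python
import sqlite3

# Local stand-in for a warehouse transformation: load raw rows,
# then aggregate them with SQL. Table/column names are hypothetical;
# sqlite3 substitutes for Snowflake so the example runs anywhere.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE raw_orders (order_id INTEGER, amount REAL, status TEXT)"
)
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?, ?)",
    [(1, 120.0, "complete"), (2, 80.0, "complete"), (3, 50.0, "cancelled")],
)

# Transformation step: total revenue from completed orders only.
(total,) = conn.execute(
    "SELECT SUM(amount) FROM raw_orders WHERE status = 'complete'"
).fetchone()
print(total)  # 200.0
conn.close()
```

Against Snowflake the SQL would be the same shape; only the connection object and dialect details change.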
bridge between business teams and technical teams to ensure alignment on KPIs, metrics, and data logic.

Data Exploration and Identification
o Identify and extract the most relevant data from Snowflake to support business questions and performance tracking.
o Conduct exploratory data analysis to uncover trends, anomalies, and opportunities for improvement in logistics operations.

Data Modelling and Metric Logic
o … stakeholders in understanding and using analytical outputs effectively.

Continuous Improvement and Innovation
o Continuously seek opportunities to improve data models, metric logic, and analytical processes.
o Stay current with Snowflake capabilities, data analysis best practices, and logistics trends to enhance analytical value.

Cross-Functional Collaboration
o Partner with Qlik Developers, Data Engineers, and business units to ensure seamless integration …

• … Alignment: Strong analytical skills to interpret complex data and translate business requirements into actionable insights. Ability to validate KPIs and metrics aligned with operational goals across departments.
• Data Modeling & Snowflake Expertise: Proficiency in designing scalable and efficient data models using Snowflake. Experience with SQL for querying, transforming, and preparing data for analysis and visualization. Understanding of data warehousing principles …
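The KPI and metric-logic work described above can be made concrete with a small example. The on-time-delivery metric and the field names below are hypothetical illustrations of logistics KPI logic, not metrics named in the listing.

```python
# Hypothetical logistics KPI: on-time delivery rate.
# A delivery counts as "on time" when delivered_day <= promised_day.

def on_time_rate(deliveries):
    """Fraction of deliveries made on or before their promised day."""
    if not deliveries:
        return 0.0
    on_time = sum(
        1 for d in deliveries if d["delivered_day"] <= d["promised_day"]
    )
    return on_time / len(deliveries)

shipments = [
    {"promised_day": 3, "delivered_day": 2},
    {"promised_day": 5, "delivered_day": 5},
    {"promised_day": 2, "delivered_day": 4},  # late delivery
    {"promised_day": 1, "delivered_day": 1},
]
print(on_time_rate(shipments))  # 0.75
```

In practice this logic would typically live as a SQL expression over a Snowflake deliveries table, with the Python (or Qlik) layer consuming the resulting metric.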
London, South East, England, United Kingdom Hybrid / WFH Options
Salt Search
landscapes (MDM, CRM, ERP, Cloud DWH) through pragmatic and scalable architectural blueprints.

Cloud Data Platform Leadership: Design and implement high-performance cloud data platforms (AWS, Azure, Google Cloud, Databricks, Snowflake), overseeing data modelling, integration, transformation, and DevOps pipelines.

Integrated Solution Architecture: Design seamless integrations between cloud data platforms, AI/GenAI platforms, and business-critical systems (e.g., MDM, CRM … as a Data Architect, leading design and implementation of complex cloud-based data ecosystems. Solid engineering background with hands-on data platform implementation experience (AWS, Azure, GCP, Databricks, or Snowflake). Proven ability to evaluate data architecture decisions, influence business and IT stakeholders, and define strategic data direction. Strong understanding of coding best practices, code quality tools (e.g., SonarQube) …