be great if you have: Experience with relevant cloud services within AWS, Azure or GCP. Experience working in an Agile environment. Experience working with common vendor products such as Snowflake or Databricks. Experience working with CI/CD tooling. What you'll get in return: 25 days' annual leave, rising to 30 days with each year of service.
Job Title: Snowflake Centre of Excellence Lead Location: Central London (Hybrid - 2 to 3 days on site per week) Employment Type: Permanent Salary: up to £120,000 per annum + benefits About the Role: We are working with a prestigious client based in London who are seeking a Snowflake Lead to play a pivotal role in establishing and scaling their Snowflake capability. This is a unique opportunity for a seasoned Snowflake professional to build a Centre of Excellence from the ground up within a fast-paced, high-impact environment. As the Snowflake CoE Lead, you will be instrumental in shaping the organisation's Snowflake strategy, architecture, and delivery model. You'll bring your deep technical expertise, leadership experience, and direct engagement with Snowflake to build a best-in-class data platform offering. Key Responsibilities: Lead the design, setup, and growth of a Snowflake practice, including establishing a Centre of Excellence. Architect, implement, and maintain scalable data solutions using Snowflake. Collaborate closely with stakeholders across the organisation and with Snowflake directly to influence strategic direction. Mentor and lead a team of …
London, South East, England, United Kingdom Hybrid / WFH Options
Robert Half
problem-solving skills and attention to detail. Excellent communication skills and a collaborative mindset. Nice To Have: Experience with Python for data transformation, scripting, or automation tasks. Familiarity with Snowflake or other modern cloud data warehouses. Exposure to investment management, financial services, or regulated environments. Understanding of data modelling, governance, and best practices in data architecture. Company: Market-leading financial …
Cambridge, Cambridgeshire, England, United Kingdom Hybrid / WFH Options
Oscar Technology
training materials for clients Present findings and recommendations to senior client stakeholders Tech Stack You'll Work With BI Tools: Power BI, Tableau, Looker, Google Data Studio Data Warehousing: Snowflake, BigQuery, Redshift, SQL Server Languages: SQL, DAX, Python (Pandas, NumPy, Scikit-learn - desirable) CRM Systems: Salesforce, HubSpot, Dynamics 365, Zoho ETL Tools: dbt, Fivetran, Talend, Alteryx AI/ML Frameworks …
Oxford, Oxfordshire, England, United Kingdom Hybrid / WFH Options
Oscar Technology
training materials for clients Present findings and recommendations to senior client stakeholders Tech Stack You'll Work With BI Tools: Power BI, Tableau, Looker, Google Data Studio Data Warehousing: Snowflake, BigQuery, Redshift, SQL Server Languages: SQL, DAX, Python (Pandas, NumPy, Scikit-learn - desirable) CRM Systems: Salesforce, HubSpot, Dynamics 365, Zoho ETL Tools: dbt, Fivetran, Talend, Alteryx AI/ML Frameworks …
Bristol, Avon, England, United Kingdom Hybrid / WFH Options
Oscar Technology
training materials for clients Present findings and recommendations to senior client stakeholders Tech Stack You'll Work With BI Tools: Power BI, Tableau, Looker, Google Data Studio Data Warehousing: Snowflake, BigQuery, Redshift, SQL Server Languages: SQL, DAX, Python (Pandas, NumPy, Scikit-learn - desirable) CRM Systems: Salesforce, HubSpot, Dynamics 365, Zoho ETL Tools: dbt, Fivetran, Talend, Alteryx AI/ML Frameworks …
Milton Keynes, Buckinghamshire, England, United Kingdom Hybrid / WFH Options
Oscar Technology
training materials for clients Present findings and recommendations to senior client stakeholders Tech Stack You'll Work With BI Tools: Power BI, Tableau, Looker, Google Data Studio Data Warehousing: Snowflake, BigQuery, Redshift, SQL Server Languages: SQL, DAX, Python (Pandas, NumPy, Scikit-learn - desirable) CRM Systems: Salesforce, HubSpot, Dynamics 365, Zoho ETL Tools: dbt, Fivetran, Talend, Alteryx AI/ML Frameworks …
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
Dept Holding B.V
with cloud platforms (GCP, AWS, Azure) and their data-specific services Proficiency in Python, SQL, and data orchestration tools (e.g., Airflow, dbt) Experience with modern data warehouse technologies (BigQuery, Snowflake, Redshift, etc.) Strong understanding of data modeling, data governance, and data quality principles Excellent communication skills with the ability to translate complex technical concepts for business stakeholders Strategic thinking with …
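Several of the listings above ask for experience with data orchestration tools such as Airflow or Dagster. At their core, these tools run tasks in dependency order over a DAG; a minimal stdlib-only sketch of that idea (all task names here are illustrative, not from any real posting's codebase — real orchestrators add scheduling, retries, and observability on top):

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

# Hypothetical pipeline steps standing in for real extract/transform/load jobs.
def extract():   return "raw rows"
def transform(): return "clean rows"
def load():      return "loaded"

# Each task is mapped to the set of tasks it depends on, as in an Airflow DAG.
deps = {"transform": {"extract"}, "load": {"transform"}}
tasks = {"extract": extract, "transform": transform, "load": load}

def run_pipeline(deps, tasks):
    """Run tasks in dependency order, like a minimal DAG scheduler."""
    order = list(TopologicalSorter(deps).static_order())
    return [(name, tasks[name]()) for name in order]

results = run_pipeline(deps, tasks)
print([name for name, _ in results])  # extract runs before transform, transform before load
```

The dependency dict is the whole interface: adding a new task is adding a node and its edges, which is the same mental model Airflow's `>>` operator or Dagster's asset dependencies express.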
on building scalable data solutions. Experience with data pipeline orchestration tools such as Dagster or similar. Familiarity with cloud platforms (e.g. AWS) and their data services (e.g., S3, Redshift, Snowflake). Understanding of data warehousing concepts and experience with modern warehousing solutions. Experience with GitHub Actions (or similar) and implementing CI/CD pipelines for data workflows and version-controlled …
client requirements into solution blueprints and supporting proposal development. Key Responsibilities Architect and oversee the delivery of enterprise-scale data platforms (data lakes, lakehouses, warehouses) using tools like Databricks, Snowflake, Synapse, and Azure Fabric Define and execute cloud migration strategies, leveraging CI/CD pipelines, Terraform, Azure DevOps, and GitHub Support RFP/RFI responses, conduct client workshops, and shape …
City of London, London, United Kingdom Hybrid / WFH Options
McCabe & Barton
/remote working and a salary between £50,000 and £70,000. As part of the data engineering team, you'll design and deliver scalable data products using technologies like Snowflake, Power BI, Python, and SQL. Your work will enable self-service analytics and support data governance across the business. Key Responsibilities: Develop robust ETL/ELT pipelines and dimensional models …
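The "ETL/ELT pipelines and dimensional models" requirement above can be made concrete with a small, self-contained sketch: loading a fact table keyed to a dimension table. In-memory SQLite stands in for a warehouse such as Snowflake, and all table and column names are hypothetical:

```python
import sqlite3

# In-memory SQLite as a stand-in for a real warehouse (e.g. Snowflake).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE fact_sales (
        sale_id INTEGER PRIMARY KEY,
        product_id INTEGER REFERENCES dim_product(product_id),
        amount REAL
    );
""")

# "Extract": raw rows as they might arrive from a source system.
raw = [("Widget", 9.99), ("Widget", 19.98), ("Gadget", 5.00)]

# "Transform + load": look up or create the dimension row, then insert the fact.
for name, amount in raw:
    row = conn.execute(
        "SELECT product_id FROM dim_product WHERE name = ?", (name,)
    ).fetchone()
    pid = row[0] if row else conn.execute(
        "INSERT INTO dim_product (name) VALUES (?)", (name,)
    ).lastrowid
    conn.execute(
        "INSERT INTO fact_sales (product_id, amount) VALUES (?, ?)", (pid, amount)
    )

# A typical dimensional query: revenue per product across the star schema.
totals = dict(conn.execute("""
    SELECT d.name, SUM(f.amount)
    FROM fact_sales f JOIN dim_product d USING (product_id)
    GROUP BY d.name
""").fetchall())
print(totals)
```

Separating descriptive attributes (the dimension) from measurable events (the fact) is what keeps the final `GROUP BY` query simple — the same design principle applies at warehouse scale.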
Azure DevOps. Excellent communication skills and stakeholder engagement capabilities. Nice to Have: Familiarity with data visualization tools (e.g., Power BI, Tableau). Exposure to cloud platforms (e.g., Databricks, Snowflake). Understanding of data governance, lineage, and metadata management. Experience with data cataloguing and quality frameworks.
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
Engineer or in a similar data engineering/BI role Advanced SQL skills with hands-on experience using dbt for data modeling Strong familiarity with modern data platforms like Snowflake, Looker, AWS/GCP, Tableau, or Airflow Experience with version control tools (e.g., Git) Ability to design, build, and document scalable, reliable data models Comfortable gathering business requirements and translating …
Modeling data for a civil service department replacing a legacy HR system Experience and qualifications Technical 3+ years' experience in data or software engineering Knowledge of Python, SQL, Databricks, Snowflake, and major cloud platforms (AWS/Azure/GCP) Ability to learn quickly and adapt to new technologies and sectors Understanding of data engineering best practices and system design Strong …
Cambridge, Cambridgeshire, United Kingdom Hybrid / WFH Options
Softwire
Modeling data for a civil service department replacing a legacy HR system Experience and qualifications Technical 3+ years' experience in data or software engineering Knowledge of Python, SQL, Databricks, Snowflake, and major cloud platforms (AWS/Azure/GCP) Ability to learn quickly and adapt to new technologies and sectors Understanding of data engineering best practices and system design Strong …
Cardiff, South Glamorgan, Wales, United Kingdom Hybrid / WFH Options
Octad Recruitment Consultants (Octad Ltd )
engineering experience (IaaS → PaaS), including Infrastructure as Code. Strong SQL skills and proficiency in Python or PySpark. Built or maintained data lakes/warehouses using Synapse, Fabric, Databricks, Snowflake, or Redshift. Experience hardening cloud environments (NSGs, identity, Defender). Demonstrated automation of backups, CI/CD deployments, or DR workflows. Nice-to-Haves: Experience with Azure OpenAI, vector …
Cardiff, South Glamorgan, Wales, United Kingdom Hybrid / WFH Options
Octad Recruitment Ltd
engineering experience (IaaS → PaaS), including Infrastructure as Code. Strong SQL skills and proficiency in Python or PySpark. Built or maintained data lakes/warehouses using Synapse, Fabric, Databricks, Snowflake, or Redshift. Experience hardening cloud environments (NSGs, identity, Defender). Demonstrated automation of backups, CI/CD deployments, or DR workflows. Nice-to-Haves: Experience with Azure OpenAI, vector …
Spark Streaming, Kinesis) Familiarity with schema design and semi-structured data formats Exposure to containerisation, graph databases, or machine learning concepts Proficiency with cloud-native data tools (BigQuery, Redshift, Snowflake) Enthusiasm for learning and experimenting with new technologies Why Join Capco Deliver high-impact technology solutions for Tier 1 financial institutions Work in a collaborative, flat, and entrepreneurial consulting culture …
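The streaming skills mentioned above (Spark Streaming, Kinesis) ultimately come down to aggregating unbounded event data over time windows. A minimal sketch of a tumbling-window count in plain Python, with an illustrative toy event stream standing in for what a real system would consume from Kinesis or Kafka:

```python
from collections import defaultdict

# Toy event stream: (epoch_seconds, user_id) pairs; names are illustrative.
events = [(0, "a"), (3, "b"), (7, "a"), (11, "c"), (14, "a")]

def tumbling_window_counts(events, window_seconds):
    """Count events per fixed (tumbling) time window — the basic
    aggregation that engines like Spark Streaming provide at scale."""
    counts = defaultdict(int)
    for ts, _user in events:
        # Bucket each event by the start of its window.
        window_start = (ts // window_seconds) * window_seconds
        counts[window_start] += 1
    return dict(counts)

print(tumbling_window_counts(events, 5))  # counts for windows starting at 0, 5, 10
```

Real engines extend this with watermarks for late-arriving events and sliding or session windows, but the bucketing step is the same.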
tools. Understanding of Agile methodologies. Additional Skills: Experience mentoring or supporting team development. Knowledge of Azure SQL DB, Data Factory, Data Lake, Logic Apps, Databricks (Spark SQL), and Snowflake is advantageous.
Technical Skills: Proven expertise in designing, building, and operating data pipelines, warehouses, and scalable data architectures. Deep hands-on experience with modern data stacks. Our tech includes Python, SQL, Snowflake, Apache Iceberg, AWS S3, PostgreSQL, Airflow, dbt, and Apache Spark, deployed via AWS, Docker, and Terraform. Experience with similar technologies is essential. Coaching & Growth Mindset: Passion for developing others through …
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
use of machine learning. Key Focus Areas Own and execute enterprise data strategy Build and lead a multi-disciplinary data & AI team Drive modern data platform development (dbt, Airflow, Snowflake, Looker/Power BI) Deliver business-critical analytics and reporting Support responsible AI/ML initiatives Define data governance, privacy, and compliance frameworks What We're Looking For Proven data …
Bromsgrove, Worcestershire, United Kingdom Hybrid / WFH Options
Reed Technology
deliverables Producing and maintaining high-quality technical documentation Championing data engineering best practices and standards across the business Technical skills Cloud data platforms - Azure, AWS, or GCP (Azure preferred) Snowflake - Deep knowledge and hands-on experience Matillion - Expertise in ETL orchestration Data warehousing and advanced analytics Dimensional modelling and data vault methodologies Stakeholder engagement and cross-functional collaboration Flexible hybrid …
Birmingham, West Midlands, United Kingdom Hybrid / WFH Options
Talent Hero Ltd
Optimise data storage and retrieval for performance Work with batch and real-time processing frameworks Implement and manage ETL processes Use tools like Python, SQL, Spark, Airflow, Kafka, dbt, Snowflake, Redshift, BigQuery Requirements: Bachelor's degree in Computer Science, Engineering, or a related field 1+ year in a Data Engineer or similar role Proficiency in SQL and Python (or another scripting …