Snowflake DBA - Database Administration, Snowflake Admin £Market Rate (Inside IR35) London/Hybrid 6 months We are currently working with a client who urgently requires a Snowflake DBA/Snowflake Administrator with Snowflake expertise as well as excellent Database, DevOps and Automation processes. Key Requirements: · Proven expertise in Snowflake Database Administration · Solid understanding of Snowflake Administration/Snowflake Networking/… Snowflake Identity processes/Snowflake AI (Cortex)/Snowflake Container Services · Strong understanding of Databases · Excellent command of Development and Automation · Strong working knowledge of Azure-based Data Platforms · Familiar with Security and Data Protection · Excellent communication skills and stakeholder engagement Nice to have: · Immediate availability · Awareness of operational challenges · Good awareness of the DevOps approach Hays Specialist Recruitment Limited
Snowflake Engineer - Database, Automation, DevOps £Market Rate (Inside IR35) London/Hybrid 6 months We are currently working with a client who urgently requires a Snowflake Engineer with Snowflake expertise as well as excellent Database, DevOps and Automation processes. Key Requirements: · Proven expertise in Snowflake Engineering (can be as a DBA/Consultant/DevOps Engineer) · Strong understanding of Databases · Excellent … based Data Platforms · Familiar with Security and Data Protection · Good awareness of the DevOps approach · Excellent communication skills and stakeholder engagement Nice to have: · Immediate availability · Solid understanding of Snowflake Administration/Snowflake Networking/Snowflake Identity processes/Snowflake AI (Cortex)/Snowflake Container Services · Awareness of operational challenges. If you're interested in this role, click 'apply now'.
data engineering role and graduate degree in Computer Science, Statistics, Informatics, Information Systems or another quantitative field ● Proficiency in writing SQL queries and knowledge of cloud-based databases like Snowflake, Redshift, BigQuery or other big data solutions ● Experience in data modelling and tools such as dbt, ETL processes, and data warehousing ● Experience with at least one of the programming languages
City of London, London, United Kingdom Hybrid/Remote Options
Billigence
4+ years of experience across data engineering, cloud computing, or data warehousing Minimum 2 years in hands-on development capacity Expertise in one or more modern cloud data platforms: Snowflake, Databricks, AWS Redshift, Microsoft Fabric, or similar Understanding of data modelling principles, dimensional modelling, and database design Proficiency in SQL and query optimization Comprehensive knowledge of ETL/ELT processes
CI/CD pipelines for data solutions Solid experience with cloud platforms (e.g., AWS, Azure, or GCP), and their data services. Deep understanding of data warehousing concepts and technologies (e.g., Snowflake, Redshift). Strong knowledge of ETL/ELT processes and tools. Solid experience utilising Power BI or similar visualisation tools Experience working with big data technologies and frameworks (e.g., Spark
environments (AWS, GCP, or Azure), infrastructure-as-code practices, and ideally container orchestration with Kubernetes. Familiarity with SQL and NoSQL databases (Cassandra, Postgres), ideally combined with data collaboration platforms (Snowflake, Databricks) Strong DevOps mindset with experience in CI/CD pipelines, monitoring, and observability tools (Grafana or equivalent). Exposure to analytics, reporting, and BI tools such as Apache Superset
Snowflake Data Engineer Location: Central London office space by Mansion House/Cannon Street/Monument (EC4V) Transform lives through data and AI Who we are: We deliver powerful data and AI solutions that minimise operational costs, strengthen resilience against risk, and uncover revenue opportunities. Clients can retain expert teams to drive lasting adoption while futureproofing their workforce with exceptional … including financial services, insurance, asset management, pharmaceuticals, energy and natural resources, retail, healthcare, manufacturing, and mobility. As a preferred partner of today’s leading technology providers - such as Databricks, Snowflake and Collibra - delivery is accelerated and revolutionary solutions are co-created. The lives of consultants, clients, and their customers are transformed through data and AI. The Role: We are seeking a Snowflake Data Engineer to lead the design, development and deployment of cloud data platforms and analytics solutions using Snowflake. The role combines hands-on engineering, solution architecture and technical leadership. You will translate business requirements into scalable Snowflake solutions, guide project teams on best practices for data modelling and ELT design, and ensure that outcomes deliver measurable value for our clients.
analytics teams with reliable, well-modelled data Maintain documentation, contribute to architectural decisions, and drive best practices Essential Requirements: Strong SQL skills and experience with modern cloud data warehouses (Snowflake preferred) ELT pipeline development and modern data stack understanding Solid software engineering practices (Git, testing, CI/CD) Own pipeline health, monitoring, and optimisations API integration experience (REST APIs, webhooks)
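The ELT pattern this role calls for (land raw data in the warehouse first, then transform it with SQL) can be illustrated with a short sketch. This is not the employer's actual pipeline: the API URL, credentials and the RAW/MARTS table names below are invented placeholders, and snowflake-connector-python is just one reasonable way to script the load step.

```python
# Minimal ELT sketch: pull records from a REST API, land them raw in Snowflake,
# then model them with SQL inside the warehouse. All names are placeholders.
import json
import requests
import snowflake.connector

def extract(url: str) -> list[dict]:
    """Pull a batch of records from a REST endpoint (hypothetical API)."""
    resp = requests.get(url, timeout=30)
    resp.raise_for_status()
    return resp.json()

def load_and_transform(records: list[dict]) -> None:
    conn = snowflake.connector.connect(
        account="my_account",   # placeholder
        user="etl_user",        # placeholder
        password="***",         # use a secrets manager in practice
        warehouse="TRANSFORM_WH",
        database="ANALYTICS",
        schema="RAW",
    )
    try:
        cur = conn.cursor()
        # Load: land the raw payloads untouched (the "EL" of ELT).
        cur.execute(
            "CREATE TABLE IF NOT EXISTS RAW.EVENTS "
            "(PAYLOAD VARIANT, LOADED_AT TIMESTAMP_NTZ DEFAULT CURRENT_TIMESTAMP())"
        )
        cur.executemany(
            "INSERT INTO RAW.EVENTS (PAYLOAD) SELECT PARSE_JSON(%s)",
            [(json.dumps(r),) for r in records],
        )
        # Transform: build the modelled layer with SQL in the warehouse (the "T").
        cur.execute("""
            CREATE OR REPLACE TABLE ANALYTICS.MARTS.DAILY_EVENTS AS
            SELECT PAYLOAD:id::STRING   AS event_id,
                   PAYLOAD:type::STRING AS event_type,
                   LOADED_AT
            FROM RAW.EVENTS
        """)
    finally:
        conn.close()

if __name__ == "__main__":
    load_and_transform(extract("https://api.example.com/v1/events"))
```

In production the transform step would usually live in a tool such as dbt rather than inline SQL, with Git, tests and CI/CD around it, which is what the "modern data stack" wording above points at.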
in analytics engineering, data engineering, or closely related roles Strong proficiency in SQL, Python, and dbt (strongly preferred) Hands-on experience with Azure Databricks and cloud-based data platforms (Snowflake experience also valued) Solid understanding of dimensional modelling, lakehouse/warehouse design, and modern data stack Familiarity with Git, CI/CD, and software engineering best practices Experience with Power
London, South East, England, United Kingdom Hybrid/Remote Options
CV TECHNICAL LTD
Kafka, Airflow, or dbt. Strong SQL skills and experience with cloud platforms (Azure preferred). Solid programming background in Python, Scala, or Java. Knowledge of data warehousing solutions (e.g. Snowflake, BigQuery, Redshift). Strong understanding of data governance, security, and compliance (experience within financial services is a plus). Leadership experience, with the ability to mentor, influence, and set technical
Stockport, England, United Kingdom Hybrid/Remote Options
Gravitas Recruitment Group (Global) Ltd
ensure smooth releases and integration. Key Skills Data Modelling Python & SQL AWS/Redshift 3–5+ years of experience in data engineering Nice to Have Airflow, Tableau, Power BI, Snowflake, Databricks Data governance/data quality tooling Degree preferred Atlassian/Jira, CI/CD, Terraform Why Join? Career Growth: Clear progression to Tech Lead. Variety: Exposure to multiple squads
sources, including APIs, databases, and flat files. Assist in the setup and optimisation of data infrastructure and storage solutions, such as data warehouses and data lakes, using platforms like Snowflake, BigQuery, or Azure Synapse. Help manage cloud-based infrastructure and services across AWS, Azure, or GCP environments. Learn to monitor data quality and integrity across systems and gain an understanding
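As a rough illustration of the "monitor data quality and integrity" duty mentioned above, the sketch below runs three common checks (row count, null business keys, freshness) through a generic DB-API cursor. The table and column names are hypothetical, and the freshness query uses Snowflake-style DATEDIFF, which would need adjusting for BigQuery or Azure Synapse.

```python
# Simple data-quality checks runnable against any DB-API compatible warehouse
# connection (Snowflake, BigQuery and Synapse all ship Python connectors).
# Table/column names are hypothetical examples, not a real schema.

def run_quality_checks(cursor, table: str, key_column: str, updated_column: str) -> dict:
    checks = {}

    # 1. The table should not be empty.
    cursor.execute(f"SELECT COUNT(*) FROM {table}")
    checks["row_count"] = cursor.fetchone()[0]

    # 2. The business key should never be null.
    cursor.execute(f"SELECT COUNT(*) FROM {table} WHERE {key_column} IS NULL")
    checks["null_keys"] = cursor.fetchone()[0]

    # 3. Freshness: days since the table last received data
    #    (Snowflake syntax; other warehouses use different date functions).
    cursor.execute(
        f"SELECT DATEDIFF('day', MAX({updated_column}), CURRENT_TIMESTAMP()) FROM {table}"
    )
    checks["days_since_last_load"] = cursor.fetchone()[0]

    checks["passed"] = (
        checks["row_count"] > 0
        and checks["null_keys"] == 0
        and checks["days_since_last_load"] <= 1
    )
    return checks

# Usage (hypothetical): run_quality_checks(cur, "RAW.ORDERS", "ORDER_ID", "LOADED_AT")
```

Checks like these are often expressed declaratively instead, for example as dbt tests or Great Expectations suites, but the underlying queries are the same.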
engineering teams delivering end-to-end data solutions across cloud and on-premise environments. Solution Architecture – Design modern data lakes, warehouses, and lakehouses using Databricks , with additional exposure to Snowflake, Azure Synapse, or Fabric being a bonus. Data Modelling – Build and optimise enterprise-grade data models across varied data layers. ETL/ELT Engineering – Use tooling such as Databricks, SSIS
stakeholders, including business teams. Build relationships across the bank, establishing a strong peer network and helping to strengthen collaboration. Skills and Experience: Essential Advanced proficiency in databases - SQL Server or Snowflake Advanced experience with a low-code/no-code data engineering/ETL tool, preferably Markit EDM (v19.2 or above); however, similar tools such as Informatica Power Centre may be acceptable
flows through everything they build. What You’ll Do Build and own data pipelines that connect product, analytics, and operations Design scalable architectures using tools like dbt , Airflow , and Snowflake/BigQuery Work with engineers and product teams to make data easily accessible and actionable Help evolve their data warehouse and ensure high data quality and reliability Experiment with automation
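The dbt/Airflow/warehouse stack named above usually hangs together roughly as in the sketch below: an Airflow DAG lands raw data, then triggers dbt to build the modelled layer in Snowflake or BigQuery. The DAG id, callable and dbt project path are invented for illustration and are not details of this employer's pipeline.

```python
# Sketch of a daily pipeline: extract/load first, then dbt transformations.
# Works on Airflow 2.4+ (use schedule_interval instead of schedule on older 2.x).
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator

def extract_and_load():
    """Placeholder: pull from product/operations sources and land raw tables."""
    ...

with DAG(
    dag_id="product_analytics_pipeline",   # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    land_raw_data = PythonOperator(
        task_id="extract_and_load",
        python_callable=extract_and_load,
    )

    # Transform in the warehouse with dbt once the raw layer has landed.
    run_dbt_models = BashOperator(
        task_id="run_dbt_models",
        bash_command="dbt run --project-dir /opt/analytics/dbt",  # hypothetical path
    )

    land_raw_data >> run_dbt_models
```

Data quality and reliability, also mentioned above, would typically be handled by adding dbt tests (or a separate validation task) downstream of run_dbt_models.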
Glasgow, Scotland, United Kingdom Hybrid/Remote Options
NLB Services
as Pandas, NumPy, PySpark, etc. · 3+ years hands-on experience with cloud services, especially Databricks, for building and managing scalable data pipelines · 3+ years of proficiency in working with Snowflake or similar cloud-based data warehousing solutions · 3+ years of experience in data development and solutions in highly complex data environments with large data volumes. · Solid understanding of ETL principles
version control (Git), testing, and code review practices Familiarity with cloud-based data environments (e.g. AWS, GCP, or Azure) Exposure to modern data tools such as Airflow, dbt, or Snowflake Experience or strong interest in streaming technologies like Apache Kafka Interest in MLOps and modern data engineering best practices Why join: You’ll be part of a company with a
support governance and compliance requirements. Skills & Experience Advanced technical proficiency in SQL, Python, and modern data transformation tools (dbt strongly preferred), with experience in cloud data platforms (Azure Databricks, Snowflake, or similar). Proven experience designing and implementing scalable data architectures, including dimensional modelling, data lakehouse/warehouse concepts, and modern data stack technologies. Strong software engineering practices including version
and SQL. Hands-on experience with data orchestration tools (Airflow, dbt, Dagster, or Prefect). Solid understanding of cloud data platforms (AWS, GCP, or Azure) and data warehousing (Snowflake, BigQuery, Redshift). Experience with streaming technologies (Kafka, Kinesis, or similar). Strong knowledge of data modelling, governance, and architecture best practices. Excellent leadership, communication, and stakeholder management skills.