into a more engineering-focused position, someone who enjoys understanding the business context just as much as building the data solutions behind it. You'll work extensively with Python, Snowflake, SQL, and dbt to design, build, and maintain scalable, high-quality data pipelines and models that support decision-making across the business. This is a hands-on, collaborative role, suited … s confident communicating with data, product, and engineering teams, not a "heads-down coder" type. Top 4 Core Skills: Python - workflow automation, data processing, and ETL/ELT development. Snowflake - scalable data architecture, performance optimisation, and governance. SQL - expert-level query writing and optimisation for analytics and transformations. dbt (Data Build Tool) - modular data modelling, testing, documentation, and version control. … build, and maintain dbt models and SQL transformations to support analytical and operational use cases. Develop and maintain Python workflows for data ingestion, transformation, and automation. Engineer scalable, performant Snowflake pipelines and data models aligned with business and product needs. Partner closely with analysts, product managers, and engineers to translate complex business requirements into data-driven solutions. Write production-grade …
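The dbt modelling and testing duties listed above can be illustrated with a minimal sketch. This is not dbt itself, just a hypothetical stand-alone version in plain Python of the kind of `not_null`/`unique` column checks dbt's built-in tests generate for a model; the table and column names are invented for illustration:

```python
# Minimal sketch of the column checks dbt's built-in tests perform.
# Table and column names (orders, order_id, customer_id) are hypothetical.

def not_null(rows, column):
    """Return rows where `column` is missing; a not_null test fails on any hit."""
    return [r for r in rows if r.get(column) is None]

def unique(rows, column):
    """Return values of `column` that appear more than once."""
    seen, dupes = set(), set()
    for r in rows:
        v = r.get(column)
        if v in seen:
            dupes.add(v)
        seen.add(v)
    return sorted(dupes)

orders = [
    {"order_id": 1, "customer_id": 10},
    {"order_id": 2, "customer_id": None},
    {"order_id": 2, "customer_id": 11},
]

null_violations = not_null(orders, "customer_id")
dupe_violations = unique(orders, "order_id")
print(len(null_violations), dupe_violations)  # expected: 1 [2]
```

In dbt proper these checks would be declared in a model's YAML schema file rather than written by hand, but the pass/fail semantics are the same.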
Robert Half have partnered with one of our pharmaceutical manufacturing organisations to help them recruit an experienced Snowflake Data Engineer on an initial 6-month contract based out of their London office. You will play a key role in designing and managing our Snowflake data warehouse and leveraging dbt (Data Build Tool) to transform raw data into reliable, analysis … data pipelines to support manufacturing, quality, and supply chain data workflows. Implement and manage data transformation models using dbt to standardise and validate datasets. Optimise and monitor performance of Snowflake data warehouse environments. Collaborate with cross-functional teams (Data Scientists, Quality, Manufacturing, IT) to define data requirements and deliver reliable data solutions. Develop and maintain ETL/ELT workflows using … Information Systems, or related field. 3-5+ years of experience as a Data Engineer, ideally in a pharmaceutical, biotech, or regulated manufacturing environment. Strong hands-on experience with: Snowflake (architecture, performance tuning, cost optimisation), dbt (model development, testing, documentation, deployment), SQL (advanced query optimisation and debugging). Experience with data integration and orchestration tools (Airflow, Dagster, Prefect, or similar). …
robust, scalable data pipelines and infrastructure that power our analytics, machine learning, and business intelligence initiatives. You'll work with cutting-edge technologies like Python, PySpark, AWS EMR, and Snowflake, and collaborate across teams to ensure data is clean, reliable, and actionable. Responsibilities: - Build and maintain scalable ETL pipelines using Python and PySpark to support data ingestion, transformation, and integration … Develop and optimize distributed data workflows on AWS EMR for high-performance processing of large datasets - Design, implement, and tune Snowflake data warehouses to support analytical workloads and reporting needs - Partner with data scientists, analysts, and product teams to deliver reliable, well-documented datasets - Ensure data integrity, consistency, and accuracy across multiple sources and systems - Automate data workflows and processes … of experience in data engineering or software development - Strong proficiency in Python and PySpark - Hands-on experience with AWS services, especially EMR, S3, Lambda, and Glue - Deep understanding of Snowflake architecture and performance tuning - Solid grasp of data modeling, warehousing concepts, and SQL optimization - Familiarity with CI/CD tools (e.g., Jenkins, GitHub Actions) and infrastructure-as-code - Experience with …
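The ingest → transform → validate flow this role describes can be sketched in miniature. In the role itself this logic would live in PySpark on EMR; the following is a pure-Python stand-in, and the record fields (`ts`, `amount`, `source`) are invented placeholders:

```python
# Toy ETL step: normalise raw event records before loading to a warehouse.
# Field names (ts, amount, source) are hypothetical placeholders.
from datetime import datetime, timezone

def transform(raw_records):
    clean = []
    for rec in raw_records:
        try:
            clean.append({
                "event_time": datetime.fromtimestamp(rec["ts"], tz=timezone.utc).isoformat(),
                "amount": round(float(rec["amount"]), 2),
                "source": rec.get("source", "unknown"),
            })
        except (KeyError, ValueError, TypeError):
            # Quarantine malformed rows instead of failing the whole batch --
            # a common pattern for keeping pipelines resilient to bad input.
            continue
    return clean

raw = [
    {"ts": 0, "amount": "10.5", "source": "api"},
    {"ts": 0, "amount": "not-a-number"},  # malformed, dropped
]
print(transform(raw))
```

In a PySpark version the same shape appears as a `DataFrame` transformation with a separate sink for rejected rows, but the per-record normalise-or-quarantine decision is identical.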
Sheffield, South Yorkshire, England, United Kingdom Hybrid / WFH Options
Vivedia Ltd
storage. Drive innovation and continuous improvement Explore and integrate emerging tools in cloud, automation, and data architecture. Lead or support migrations to modern platforms such as AWS, Azure, GCP, Snowflake, or Databricks. Proactively identify opportunities to streamline and optimize performance. What You’ll Bring Experience: Proven hands-on experience as a Data Engineer or in a similar data-focused role. … Proficiency in SQL and Python . Strong grasp of ETL/ELT pipelines , data modeling , and data warehousing . Experience with cloud platforms (AWS, Azure, GCP) and tools like Snowflake, Databricks, or BigQuery . Familiarity with streaming technologies (Kafka, Spark Streaming, Flink) is a plus. Tools & Frameworks: Airflow, dbt, Prefect, CI/CD pipelines, Terraform. Mindset: Curious, data-obsessed, and …
value of technology and build a more sustainable, more inclusive world. Your Role Capgemini Financial Services is seeking a Data Engineer with deep expertise in DBT (Data Build Tool), Snowflake, and PL/SQL to join our growing data team. The successful candidate will be responsible for designing, developing, and maintaining robust data transformation pipelines that support business intelligence, analytics, and data …/SQL code for complex data processing and transformation tasks. Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements and deliver high-quality solutions. Optimize Snowflake performance through query tuning, clustering, and resource management. Ensure data quality, integrity, and governance through testing, documentation, and monitoring. Participate in code reviews, architecture discussions, and continuous improvement initiatives. Maintain … DBT projects. Required Qualifications: 5+ years of experience in data engineering or a related field. Strong hands-on experience with DBT (modular SQL development, testing, documentation). Proficiency in Snowflake (data warehousing, performance tuning, security). Advanced knowledge of PL/SQL and experience with stored procedures, functions, and packages. Solid understanding of data modeling concepts (star/snowflake schemas …
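The star/snowflake schema concepts this role asks for can be sketched with a toy example: a fact table referencing a dimension by surrogate key, rolled up by a dimension attribute. All table and column names here are invented for illustration, and in practice this aggregation would be a SQL join in Snowflake rather than Python:

```python
# Toy star schema: a fact table references a dimension by surrogate key.
# All names (dim_customer, fact_sales) are hypothetical.

dim_customer = {
    1: {"customer_name": "Acme", "region": "EMEA"},
    2: {"customer_name": "Globex", "region": "NA"},
}

fact_sales = [
    {"customer_sk": 1, "amount": 100.0},
    {"customer_sk": 2, "amount": 250.0},
    {"customer_sk": 1, "amount": 50.0},
]

def sales_by_region(facts, dim):
    """Join facts to the dimension via surrogate key, then aggregate."""
    totals = {}
    for f in facts:
        region = dim[f["customer_sk"]]["region"]
        totals[region] = totals.get(region, 0.0) + f["amount"]
    return totals

print(sales_by_region(fact_sales, dim_customer))  # {'EMEA': 150.0, 'NA': 250.0}
```

The design choice the schema encodes: facts stay narrow (keys and measures), while descriptive attributes live once in the dimension, so attribute changes never rewrite fact rows.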
Glasgow, Lanarkshire, Scotland, United Kingdom Hybrid / WFH Options
KBC Technologies UK LTD
looking for a Data Engineer for the Glasgow location. Mode of work: hybrid. Databricks being (primarily) a managed Spark engine, strong Spark experience is a must-have. Databricks (Big Data/Spark) and Snowflake specialists, plus general data engineering skills: RDBMS fundamentals, SQL, ETL.
One of our international customers now has a new opportunity for a Snowflake Developer/Architect. The initial contract will be for 6 months with a possible extension. Job Title: Snowflake Developer/Architect. Location: Manchester, 4 days onsite & 1 day remote. Job Type: 6-month contract with possible extension. Language: English speaking. Contract Type: Inside IR35. Role Mandatory … good working knowledge of data models, viz. Dimensional, ER, and Data Vault. Very good working knowledge of writing SQL queries. Very good working knowledge of Snowflake architecture. Very good working knowledge of Snowflake internals such as roles, dynamic tables, streams and tasks, policies, etc. Mandatory Skills: Snowflake, ANSI SQL, Dimensional Data Modeling, Snowpark Container Services, Snowflake Data Science, DBT, Airflow. If you are interested, please contact hrajendran@redglobal.com or apply here.
collaboration with internal development teams to ensure stability and performance of critical business systems. Key Skills Provide user support, monitor system performance, and ensure smooth application operations. Experience in Snowflake, Power Platform and Azure integration Triage, investigate, and resolve technical issues related to in-house and third-party applications, integrations, and data feeds. Proficiency in Excel or SQL for data …
City of London, London, United Kingdom Hybrid / WFH Options
Omnis Partners
Senior Snowflake Developer Location: Hybrid – London, 2 days per week increasing in the New Year. Day Rate: around £500/day (Inside IR35). Starting ASAP for 6 months, likely to extend. About the Role We’re partnering with a financial client that’s undertaking a large-scale data transformation, modernising legacy systems and building a unified analytics platform in the cloud. … As part of this initiative, you’ll play a key role in developing and optimising a Snowflake-based data warehouse and delivering high-quality visualisations and reporting through Power BI. The role offers the chance to work on greenfield architecture, migrate legacy SQL Server solutions and collaborate closely with senior stakeholders on complex, high-impact data initiatives. What You’ll … Be Doing Design, develop and maintain data pipelines and models using Snowflake, dbt and Python. Enhance and support Power BI dashboards and data models, ensuring performance, accuracy and user accessibility. Contribute to the migration of legacy SQL Server (SSIS/SSAS/SSRS) systems into a modern Snowflake/Power BI environment. Work collaboratively with business users to understand requirements …
embedding them across teams and functions. About you: Proven experience leading large, cross-functional data teams in complex organisations. Deep understanding of modern data architectures, platforms, and tools (e.g. Snowflake, cloud ecosystems, AI/ML) Skilled in delivering enterprise-scale data strategies that drive measurable impact Strong leadership, stakeholder engagement, and communication skills.
be responsible for troubleshooting, DevOps, and facilitating cross-platform integration in a hybrid working environment. Key Responsibilities: System Integration & Architecture Design and implement solution integrations across Databricks, ServiceNow, SAP, Snowflake, and Mulesoft . Architect containerized applications with secure connectivity to backend services. Develop and test APIs for integration across cloud and on-premises environments. Application Development & Deployment Oversee the build …
re looking for: Proven experience in Azure cloud engineering and infrastructure automation. Strong knowledge of CI/CD, Kubernetes, Helm, and API management. Experience supporting data platforms (Databricks, dbt, Snowflake). Comfortable working in Agile teams and collaborating with stakeholders. Certifications like Azure Solutions Architect Expert or Terraform Associate are a plus. Why join? You’ll be part of a …
Manchester Area, United Kingdom Hybrid / WFH Options
Morson Edge
and orchestrating workflows using tools such as AWS Glue, Azure Data Factory or Google Cloud Dataflow Working with leading data platforms like Amazon S3, Azure Synapse, Google BigQuery and Snowflake Implementing Lakehouse architectures using Databricks or Snowflake Collaborating with engineers, analysts and client teams to deliver value-focused data solutions What We’re Looking For: Strong experience with Python, SQL …
Manchester, England, United Kingdom Hybrid / WFH Options
MRJ Recruitment
GitHub). Containerization: Familiarity with Docker and Kubernetes. Core Fundamentals: Solid understanding of networking, IAM, and platform security. Database Acumen: Experience with SQL, NoSQL, and data warehouses (MySQL, DynamoDB, Snowflake). Scripting Versatility: Proficient in Bash, Python, or Go. Mentorship: Proven experience mentoring individuals and fostering team growth. Collaboration: Excellent communication and teamwork. Ready to lead critical infrastructure rebuilds and …
environments Proven ability to deliver AI capabilities in production, including model training, deployment, monitoring, and lifecycle management Strong technical background across machine learning, data pipelines, and cloud platforms (e.g. Snowflake, Databricks, Azure) Exceptional communication and influencing skills, with the ability to align executive stakeholders around AI strategy A pragmatic, delivery-focused mindset and passion for scalable, secure, and privacy-conscious …
ll Be Doing Own campaign strategy and execution across paid media, webinars, email nurture, SEO, and content syndication. Design and run ecosystem co-marketing campaigns with our technology (Microsoft, Snowflake, Databricks, etc.) and channel partners. Develop and optimise multi-channel lead journeys that drive measurable lift in MQL→SQL conversions. Build regional and global programs that scale pipeline coverage. Partner …
4+ years’ experience in data architecture, automation, or AI solution development. Expertise with Microsoft Power Platform , Power Automate , and Copilot Studio . Strong understanding of cloud-based environments (Azure, Snowflake, or Databricks). Excellent stakeholder engagement skills and the ability to communicate complex concepts clearly. Automation | AI | Data Analytics | Machine Learning | Power BI | Power Automate | Copilot Studio | Microsoft Power Platform … Data Architecture | Data Engineering | Azure | Snowflake | Databricks | SQL | Process Automation | Predictive Analytics | Cloud Solutions | Digital Transformation | Industrial Automation | PLC | HMI | SCADA | IoT | Data Modelling | Data Visualisation | Business Intelligence | Agile Delivery | Continuous Improvement | Technical Leadership | Automation Strategy | Process Optimisation | Power Apps | Python | Data Governance | Data Integration | Data Warehousing | Robotics | IT Systems | Manufacturing Systems | Automation Projects | Artificial Intelligence
Expect to work across diverse client projects, collaborating with both internal engineers and client tech teams. Key responsibilities: Build and maintain ETL pipelines and cloud data warehouses (AWS, GCP, Snowflake) Develop custom data connectors (e.g. Salesforce, SAP) Automate data cleansing and transformation workflows Support analytics and AI teams with clean, production-ready data About you: 3+ years in Analytics/… Data Engineering Strong SQL and database design skills Proficient in Python or R for automation Hands-on with cloud platforms (AWS/GCP/Azure/Snowflake) Passion for data quality, scalability, and collaboration Nice to have: Experience with SaaS products, analytics tooling, or modern data stack tools (dbt, Airflow). Perks: EMI share options • Training budget • Private healthcare • Pension