London (City of London), South East England, United Kingdom
Luxoft
… robust, scalable data pipelines and infrastructure that power our analytics, machine learning, and business intelligence initiatives. You'll work with cutting-edge technologies like Python, PySpark, AWS EMR, and Snowflake, and collaborate across teams to ensure data is clean, reliable, and actionable.
Responsibilities:
- Build and maintain scalable ETL pipelines using Python and PySpark to support data ingestion, transformation, and integration …
- Develop and optimize distributed data workflows on AWS EMR for high-performance processing of large datasets
- Design, implement, and tune Snowflake data warehouses to support analytical workloads and reporting needs
- Partner with data scientists, analysts, and product teams to deliver reliable, well-documented datasets
- Ensure data integrity, consistency, and accuracy across multiple sources and systems
- Automate data workflows and processes …
- … of experience in data engineering or software development
- Strong proficiency in Python and PySpark
- Hands-on experience with AWS services, especially EMR, S3, Lambda, and Glue
- Deep understanding of Snowflake architecture and performance tuning
- Solid grasp of data modeling, warehousing concepts, and SQL optimization
- Familiarity with CI/CD tools (e.g., Jenkins, GitHub Actions) and infrastructure-as-code
- Experience with …
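As a rough illustration of the pipeline work this listing describes, the sketch below shows a minimal PySpark job that ingests raw JSON events from S3, cleans them, and writes partitioned Parquet ready to be staged into Snowflake. The bucket paths, column names, and transformation steps are hypothetical, invented purely for the example rather than taken from the employer's actual stack.

```python
# Minimal PySpark ETL sketch: ingest raw JSON events from S3, clean them,
# and write partitioned Parquet for a later load into Snowflake.
# Bucket names, paths, and column names are illustrative placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("events-etl").getOrCreate()

# Ingest: raw events landed by an upstream process (hypothetical path).
raw = spark.read.json("s3://example-raw-bucket/events/2024/")

# Transform: drop malformed rows, normalise types, derive a date column,
# and de-duplicate on the event identifier.
clean = (
    raw.dropna(subset=["event_id", "event_ts"])
       .withColumn("event_ts", F.to_timestamp("event_ts"))
       .withColumn("event_date", F.to_date("event_ts"))
       .dropDuplicates(["event_id"])
)

# Load: write partitioned Parquet; a COPY INTO from an external stage would
# then pull this into Snowflake (not shown here).
(clean.write
      .mode("overwrite")
      .partitionBy("event_date")
      .parquet("s3://example-curated-bucket/events/"))

spark.stop()
```

On EMR a job like this would typically be submitted as a cluster step via spark-submit; orchestration and the subsequent Snowflake load are assumed to happen elsewhere.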
Sheffield, South Yorkshire, England, United Kingdom (Hybrid / WFH Options)
Vivedia Ltd
… storage.
Drive innovation and continuous improvement: explore and integrate emerging tools in cloud, automation, and data architecture. Lead or support migrations to modern platforms such as AWS, Azure, GCP, Snowflake, or Databricks. Proactively identify opportunities to streamline and optimize performance.
What You’ll Bring
Experience: Proven hands-on experience as a Data Engineer or in a similar data-focused role. … Proficiency in SQL and Python. Strong grasp of ETL/ELT pipelines, data modeling, and data warehousing. Experience with cloud platforms (AWS, Azure, GCP) and tools like Snowflake, Databricks, or BigQuery. Familiarity with streaming technologies (Kafka, Spark Streaming, Flink) is a plus.
Tools & Frameworks: Airflow, dbt, Prefect, CI/CD pipelines, Terraform.
Mindset: Curious, data-obsessed, and …
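As context for the orchestration tools named in this listing (Airflow, dbt, Prefect), here is a minimal Airflow DAG sketch chaining a Python extract step with a dbt run. It assumes a recent Airflow 2.x release; the DAG id, schedule, dbt project path, and model selection are placeholders, not details of this team's pipelines.

```python
# Minimal Airflow DAG sketch: extract data, then run dbt transformations.
# Paths, schedule, and task logic are illustrative placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator


def extract_orders(**context):
    # Placeholder for an extraction step, e.g. pulling from an API or a
    # source database and landing files in cloud storage.
    print("Extracting orders for", context["ds"])


with DAG(
    dag_id="orders_elt",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_orders", python_callable=extract_orders)

    # Hypothetical dbt project living alongside the DAGs.
    run_dbt = BashOperator(
        task_id="run_dbt_models",
        bash_command="cd /opt/dbt/analytics && dbt run --select orders",
    )

    extract >> run_dbt
```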
london (city of london), south east england, united kingdom
Capgemini
… value of technology and build a more sustainable, more inclusive world.
Your Role
Capgemini Financial Services is seeking a Data Engineer with deep expertise in DBT (Data Build Tool), Snowflake, and PL/SQL to join our growing data team. The person will be responsible for designing, developing, and maintaining robust data transformation pipelines that support business intelligence, analytics, and data …
… /SQL code for complex data processing and transformation tasks. Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements and deliver high-quality solutions. Optimize Snowflake performance through query tuning, clustering, and resource management. Ensure data quality, integrity, and governance through testing, documentation, and monitoring. Participate in code reviews, architecture discussions, and continuous improvement initiatives. Maintain … DBT projects.
Required Qualifications:
- 5+ years of experience in data engineering or a related field.
- Strong hands-on experience with DBT (modular SQL development, testing, documentation).
- Proficiency in Snowflake (data warehousing, performance tuning, security).
- Advanced knowledge of PL/SQL and experience with stored procedures, functions, and packages.
- Solid understanding of data modeling concepts (star/snowflake schemas …
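To make the Snowflake tuning duties above a little more concrete, the following sketch uses the snowflake-connector-python package to list the slowest recent queries and then apply a clustering key to a large fact table. The connection parameters, table name, and clustering column are hypothetical; in a real engagement both the diagnosis and the chosen fix would follow from the client's actual workload.

```python
# Sketch: inspect slow queries and add a clustering key in Snowflake.
# Uses snowflake-connector-python; all identifiers below are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="example_account",
    user="example_user",
    password="example_password",   # in practice, key-pair auth or SSO
    warehouse="ANALYTICS_WH",
    database="ANALYTICS",
    schema="MARTS",
)

try:
    cur = conn.cursor()

    # Longest-running queries over the last day, from the
    # INFORMATION_SCHEMA query history table function.
    cur.execute(
        """
        SELECT query_text, total_elapsed_time / 1000 AS seconds
        FROM TABLE(INFORMATION_SCHEMA.QUERY_HISTORY(
            END_TIME_RANGE_START => DATEADD('day', -1, CURRENT_TIMESTAMP())))
        ORDER BY total_elapsed_time DESC
        LIMIT 10
        """
    )
    for query_text, seconds in cur.fetchall():
        print(f"{seconds:>8.1f}s  {query_text[:80]}")

    # Example tuning action: cluster a large fact table on its date column
    # so date-filtered queries prune micro-partitions effectively.
    cur.execute("ALTER TABLE FCT_TRANSACTIONS CLUSTER BY (TRANSACTION_DATE)")
finally:
    conn.close()
```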
Bolton, Greater Manchester, North West England, United Kingdom
RED Global
One of our international customers now has a new opportunity for a Snowflake Developer/Architect. The initial contract will be for 6 months with a possible extension.
Job Title: Snowflake Developer/Architect
Location: Manchester, 4 days onsite & 1 day remote
Job Type: 6-month contract with possible extension
Language: English speaking
Contract Type: Inside IR35
Role Mandatory …
- … good working knowledge of data models, viz. Dimensional Data Model, ER Data Model, and Data Vault
- Very good working knowledge of writing SQL queries
- Very good working knowledge of Snowflake architecture
- Very good working knowledge of Snowflake internals such as Snowflake roles, dynamic tables, streams and tasks, policies, etc.
Mandatory Skills: Snowflake, ANSI-SQL, Dimensional Data Modeling, Snowpark Container Services … Snowflake Data Science, DBT, Airflow
If you are interested, please contact hrajendran@redglobal.com or apply here.
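As a brief illustration of the "streams and tasks" internals this role asks for, the sketch below creates a stream on a staging table and a scheduled task that merges pending changes into a target table. The statements are issued through the Python connector so the example is self-contained; every identifier, the warehouse, and the schedule are invented for illustration and not taken from the customer's environment.

```python
# Sketch: change-data-capture with a Snowflake stream and a scheduled task.
# Statements are executed via the Python connector; all names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="example_account",
    user="example_user",
    password="example_password",
    warehouse="ETL_WH",
    database="RAW",
    schema="SALES",
)

statements = [
    # Stream records inserts/updates/deletes on the staging table.
    "CREATE OR REPLACE STREAM ORDERS_STG_STREAM ON TABLE ORDERS_STG",

    # Task wakes up every 15 minutes and applies pending changes.
    """
    CREATE OR REPLACE TASK MERGE_ORDERS_TASK
      WAREHOUSE = ETL_WH
      SCHEDULE = '15 MINUTE'
    AS
      MERGE INTO CURATED.SALES.ORDERS AS tgt
      USING ORDERS_STG_STREAM AS src
        ON tgt.ORDER_ID = src.ORDER_ID
      WHEN MATCHED THEN UPDATE SET tgt.STATUS = src.STATUS
      WHEN NOT MATCHED THEN INSERT (ORDER_ID, STATUS) VALUES (src.ORDER_ID, src.STATUS)
    """,

    # Tasks are created suspended; resume to start the schedule.
    "ALTER TASK MERGE_ORDERS_TASK RESUME",
]

try:
    cur = conn.cursor()
    for stmt in statements:
        cur.execute(stmt)
finally:
    conn.close()
```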
London (City of London), South East England, United Kingdom
Bonhill Partners
… collaboration with internal development teams to ensure stability and performance of critical business systems.
Key Skills
- Provide user support, monitor system performance, and ensure smooth application operations.
- Experience in Snowflake, Power Platform and Azure integration.
- Triage, investigate, and resolve technical issues related to in-house and third-party applications, integrations, and data feeds.
- Proficiency in Excel or SQL for data …
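For the data-feed triage side of this role, a common first check is reconciling what a feed delivered against what actually landed in the warehouse. The sketch below, using pandas and the Snowflake Python connector, is one hypothetical way to do that; the file name, table, and key column are placeholders rather than details of the client's systems.

```python
# Sketch: reconcile a delivered feed file against the Snowflake table it loads.
# A first triage step when a feed issue is reported; all names are placeholders.
import pandas as pd
import snowflake.connector

feed = pd.read_csv("daily_positions_feed.csv")       # file delivered by the vendor
feed_ids = set(feed["position_id"])

conn = snowflake.connector.connect(
    account="example_account",
    user="example_user",
    password="example_password",
    warehouse="SUPPORT_WH",
    database="FINANCE",
    schema="POSITIONS",
)
try:
    cur = conn.cursor()
    cur.execute(
        "SELECT position_id FROM DAILY_POSITIONS WHERE load_date = CURRENT_DATE()"
    )
    loaded_ids = {row[0] for row in cur.fetchall()}
finally:
    conn.close()

missing = feed_ids - loaded_ids   # rows in the feed that never landed
extra = loaded_ids - feed_ids     # rows in the table not present in the feed

print(f"feed rows: {len(feed_ids)}, loaded rows: {len(loaded_ids)}")
print(f"missing from warehouse: {len(missing)}, unexpected in warehouse: {len(extra)}")
```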
City of London, London, United Kingdom (Hybrid / WFH Options)
Omnis Partners
Senior Snowflake Developer
Location: Hybrid – London, 2 days per week, increasing in the New Year
Day Rate: Inside, around £500/day
Starting ASAP for 6 months, likely to extend
About the Role
We’re partnering with a financial client that’s undertaking a large-scale data transformation, modernising legacy systems and building a unified analytics platform in the cloud. … As part of this initiative, you’ll play a key role in developing and optimising a Snowflake-based data warehouse and delivering high-quality visualisations and reporting through Power BI. The role offers the chance to work on greenfield architecture, migrate legacy SQL Server solutions and collaborate closely with senior stakeholders on complex, high-impact data initiatives.
What You’ll Be Doing
- Design, develop and maintain data pipelines and models using Snowflake, dbt and Python.
- Enhance and support Power BI dashboards and data models, ensuring performance, accuracy and user accessibility.
- Contribute to the migration of legacy SQL Server (SSIS/SSAS/SSRS) systems into a modern Snowflake/Power BI environment.
- Work collaboratively with business users to understand requirements …
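Because the role combines Snowflake, dbt and Python, one relevant pattern is dbt's Python model support on Snowflake, where a model is a function that receives a Snowpark session and returns a DataFrame that dbt materialises. The sketch below is a hypothetical daily-sales model feeding a Power BI dataset; the upstream model name and columns are invented for illustration and are not taken from this client's project.

```python
# models/marts/daily_sales.py -- hypothetical dbt Python model on Snowflake.
# dbt materialises the returned Snowpark DataFrame as a table; names are placeholders.
import snowflake.snowpark.functions as F


def model(dbt, session):
    # Materialise as a table so Power BI can query it directly.
    dbt.config(materialized="table")

    # Reference an upstream dbt model (assumed to exist in the project).
    orders = dbt.ref("stg_orders")

    # Aggregate to one row per day and region for reporting.
    daily_sales = (
        orders.group_by("order_date", "region")
              .agg(
                  F.sum("net_amount").alias("total_sales"),
                  F.count_distinct("order_id").alias("order_count"),
              )
    )
    return daily_sales
```

In practice a model like this sits alongside ordinary SQL dbt models, and whether a given transformation is written in SQL or Python is usually a team convention rather than a technical necessity.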