Snowflake Architect - ETL, Airflow, AWS, SQL, Python, and ETL tools (StreamSets, DBT), RDBMS. A Snowflake Architect is required for a long-term project with a fast-growing company. Responsibilities: Design, develop, and maintain robust data pipelines and ETL processes using Snowflake on AWS. Implement data warehousing solutions, ensuring efficient storage, retrieval, and transformation of large datasets. Collaborate with data analysts, scientists, and other stakeholders to define and fulfil data requirements. Optimize performance and scalability of the Snowflake data warehouse, ensuring high availability and reliability. Develop and maintain data integration solutions, ensuring seamless data flow between various sources and Snowflake. Monitor, troubleshoot, and resolve data pipeline issues, ensuring data quality and integrity. Stay up to date with the latest trends and best practices in data engineering and cloud technologies, including cloud services such as AWS. Qualifications: Bachelor's degree in Computer Science, Engineering, or a related field. 5+ years of experience in data engineering, with a strong focus on Snowflake and AWS. Proficiency in SQL, Python, and ETL tools (StreamSets, DBT, etc.). Hands-on experience with Oracle RDBMS. Data migration experience to Snowflake. Experience with AWS services such as S3, Lambda, Redshift, and Glue.
City of London, London, United Kingdom Hybrid / WFH Options
I3 Resourcing Limited
Senior Data Engineer. MUST HAVE SNOWFLAKE, AWS, SNOWPRO CORE. Salary: £70-80k with 15% bonus. Hybrid working (a couple of days per week in the office), City of London. We are looking for: a good understanding of data engineering principles; a good technical grasp of Snowflake, automating it and transforming complex datasets; SnowPro Core certification; an AWS skillset; delivery experience; experience building solutions in Snowflake; insurance experience (advantageous but not necessary). Key Responsibilities: Lead the design and implementation of Snowflake [and Redshift] based data warehousing solutions within an AWS environment. Mentor team members through code reviews and pair programming. Build and support new AWS-native cloud data warehouse solutions. Develop and optimize ETL processes using AWS services … architecture, ensuring optimal health and efficiency. Skills and Experience: Bachelor's degree or higher in a technical discipline. Proven experience as a data engineer with a strong focus on Snowflake and AWS services in large-scale enterprise environments. Extensive experience with AWS services, e.g. EC2, S3, RDS, DynamoDB, Redshift, Lambda, API Gateway. Strong SQL skills for complex data queries and …
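Several of the AWS-focused duties above (ETL over AWS services feeding a Snowflake warehouse) typically involve small pieces of event-driven glue code. Below is a minimal, hypothetical sketch of one such piece: an AWS Lambda handler that copies newly landed S3 files into a staging prefix for a downstream Snowflake load. The bucket layout, prefix, and event wiring are illustrative assumptions, not details from this posting.

```python
import urllib.parse

import boto3  # AWS SDK for Python, available by default in the Lambda runtime

s3 = boto3.client("s3")

# Hypothetical staging prefix; a real pipeline would take this from configuration.
STAGING_PREFIX = "staging/"


def handler(event, context):
    """Triggered by an S3 ObjectCreated event; copies each new file into a
    staging prefix so a scheduled Snowflake COPY INTO (or Snowpipe) can pick it up."""
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])

        # Skip files already under the staging prefix to avoid re-processing.
        if key.startswith(STAGING_PREFIX):
            continue

        s3.copy_object(
            Bucket=bucket,
            CopySource={"Bucket": bucket, "Key": key},
            Key=STAGING_PREFIX + key,
        )
    return {"status": "ok"}
```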
Inside IR35. Location: Basildon, UK - Onsite. Qualifications: Bachelor's degree in Computer Science, Engineering, or a related field. 5+ years of experience in data engineering, with strong expertise in Snowflake and AWS. Proficiency in SQL, Python, and ETL tools such as StreamSets and DBT. Hands-on experience with Oracle RDBMS. Proven data migration experience to Snowflake.
Snowflake Architect at Test Yantra. Bachelor's degree in Computer Science, Engineering, or a related field. 5+ years of experience in data engineering, with strong expertise in Snowflake and AWS. Proficiency in SQL, Python, and ETL tools such as StreamSets and DBT. Hands-on experience with Oracle RDBMS. Proven data migration experience to Snowflake. Experience with AWS services including S3, Lambda, Redshift, and Glue. Solid understanding of data warehousing concepts and data modeling. Excellent problem-solving and communication skills focused on delivering high …
Location: Basildon, UK (work from client office 5 days/week). Hiring Type: Contract/Permanent. Responsibilities: Design, develop, and maintain robust data pipelines and ETL processes using Snowflake on AWS. Implement data warehousing solutions, ensuring efficient storage, retrieval, and transformation of large datasets. Collaborate with data analysts, scientists, and other stakeholders to define and fulfil data requirements. Optimize performance and scalability of the Snowflake data warehouse, ensuring high availability and reliability. Develop and maintain data integration solutions, ensuring seamless data flow between various sources and Snowflake. Monitor, troubleshoot, and resolve data pipeline issues, ensuring data quality and integrity. Stay up to date with the latest trends and best practices in data engineering and cloud technologies, including cloud services such as AWS. Qualifications: Bachelor's degree in Computer Science, Engineering, or a related field. 5+ years of experience in data engineering, with a strong focus on Snowflake and AWS. Proficiency in SQL, Python, and ETL tools (StreamSets, DBT, etc.). Hands-on experience with Oracle RDBMS. Data migration experience to Snowflake. Experience with AWS services such as S3, Lambda, Redshift, and Glue.
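As an entirely illustrative sketch of the "Snowflake on AWS" pipeline work described above, the snippet below loads staged S3 files into a Snowflake table with COPY INTO via the snowflake-connector-python package. The stage, table, and connection parameters are hypothetical placeholders.

```python
import os

import snowflake.connector  # pip install snowflake-connector-python

# Connection details are hypothetical; real values would come from a secrets manager.
conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="RAW",
)

try:
    cur = conn.cursor()
    # RAW.ORDERS and the external stage S3_STAGE (pointing at an S3 bucket)
    # are assumed to exist already; COPY INTO skips files it has already
    # ingested, so re-running this does not double-load data.
    cur.execute(
        """
        COPY INTO RAW.ORDERS
        FROM @S3_STAGE/orders/
        FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
        ON_ERROR = 'ABORT_STATEMENT'
        """
    )
    for row in cur.fetchall():
        print(row)  # per-file load status returned by COPY INTO
finally:
    conn.close()
```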
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Snap Analytics
Principal Data Consultant (Snowflake/Matillion). Application Deadline: 18 June 2025. Department: Data Engineering. Employment Type: Full Time. Location: Bristol, UK. Compensation: £85,000 - £100,000/year. Description: As a Principal Data Consultant at Snap, you'll be at the forefront of our most strategic initiatives. Your role will involve leading client engagements, managing large-scale projects, and shaping … as a thought leader by authoring blogs, presenting at webinars, and engaging in external speaking opportunities. Technical Delivery Excellence: You'll design and optimise cloud-based data architectures (e.g. Snowflake, Databricks, Redshift, Google BigQuery, Azure Synapse) to support advanced analytics. You'll build and automate scalable, secure, and high-performance data pipelines handling diverse data sources. You'll work closely with our clients to design the correct data models to support their analytic requirements, following best practices such as Kimball star schemas and snowflake schemas and ensuring performance and ease of use for the client. You'll manage the delivery of ETL/ELT processes using tools like Matillion, Informatica, or Talend, adhering to data engineering best practices and incorporating things …
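The Kimball star schemas mentioned above pair a central fact table with dimension tables joined on surrogate keys. As a minimal illustration (table and column names are invented, not from the posting), the sketch below creates a tiny dimension/fact pair in Snowflake and runs a typical star-join query through the Python connector.

```python
import os

import snowflake.connector  # pip install snowflake-connector-python

# Hypothetical connection; credentials from the environment as in the earlier example.
conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="DEV_WH",
    database="DEMO",
    schema="MART",
)
cur = conn.cursor()

# A dimension table: one row per customer, keyed by a surrogate key.
cur.execute("""
    CREATE OR REPLACE TABLE dim_customer (
        customer_sk INTEGER AUTOINCREMENT,
        customer_id STRING,
        customer_name STRING,
        region STRING
    )
""")

# A fact table: one row per order line, referencing the dimension's surrogate key.
cur.execute("""
    CREATE OR REPLACE TABLE fact_orders (
        order_id STRING,
        customer_sk INTEGER,   -- foreign key to dim_customer
        order_date DATE,
        amount NUMBER(12, 2)
    )
""")

# A typical star-join query: filter/group on dimension attributes, aggregate the fact.
cur.execute("""
    SELECT d.region, SUM(f.amount) AS total_sales
    FROM fact_orders f
    JOIN dim_customer d ON f.customer_sk = d.customer_sk
    GROUP BY d.region
    ORDER BY total_sales DESC
""")
print(cur.fetchall())
conn.close()
```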
As a Senior Data Engineer, you will be responsible for the development of complex data sources and pipelines into our data platform (i.e. Snowflake) along with other data applications (e.g. Azure, Airflow) and automation. The Senior Data Engineer will work closely with data architects, business analysts, and data stewards to integrate and align the requirements, specifications and constraints of each … system reliability. Utilize data quality tools like Great Expectations or Soda to ensure the accuracy, reliability, and integrity of data throughout its lifecycle. Create and maintain data pipelines using Airflow and Snowflake as primary tools. Create SQL stored procedures to perform complex transformations. Understand data requirements and design optimal pipelines to fulfil the use cases. Create logical and physical data models to ensure … features. Excellent communication skills. Strong knowledge of Python. Familiarity with Azure services such as Blobs, Functions, Azure Data Factory, Service Principal, Containers, Key Vault, etc. In-depth knowledge of Snowflake architecture, features, and best practices. Experience with CI/CD pipelines using Git and GitHub Actions. Knowledge of various data modeling techniques, including Star Schema, Dimensional models, and Data Vault.
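To make the Airflow-plus-Snowflake duties above concrete, here is a minimal, hypothetical DAG sketch: one task calls a Snowflake stored procedure for the heavy transformation, and a second runs a simple data quality gate standing in for tools like Great Expectations or Soda. The DAG id, procedure name, and connection handling are illustrative assumptions (Airflow 2.4+ TaskFlow style; older versions use schedule_interval).

```python
import os
from datetime import datetime

import snowflake.connector  # pip install snowflake-connector-python
from airflow.decorators import dag, task


def _connect():
    """Hypothetical helper; production code would use an Airflow connection/hook."""
    return snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="ETL_WH",
        database="ANALYTICS",
        schema="STAGING",
    )


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def orders_pipeline():
    @task
    def run_transformation():
        # Call a (hypothetical) stored procedure that performs the complex
        # transformation inside Snowflake, keeping compute close to the data.
        with _connect() as conn:
            conn.cursor().execute("CALL staging.transform_orders()")

    @task
    def check_quality():
        # Minimal data quality gate: fail the task (and alert) if the
        # transformed table contains rows with NULL business keys.
        with _connect() as conn:
            (bad_rows,) = conn.cursor().execute(
                "SELECT COUNT(*) FROM staging.orders_clean WHERE order_id IS NULL"
            ).fetchone()
        if bad_rows:
            raise ValueError(f"{bad_rows} rows failed the NULL-key check")

    run_transformation() >> check_quality()


orders_pipeline()
```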
City of London, London, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
… to expand their data engineering practice this year. They are now on the lookout for an experienced Senior Data Engineer to join the team, bringing technical expertise in Snowflake, AWS, DBT, and Terraform. Salary and Benefits: competitive salary of up to £100k; up to 10% bonus; up to 14% pension contribution; hybrid working from the London office (2 in office … Duties will include product-based work as well as migration tasks. Due to your seniority you will also be tasked with mentoring junior engineers. Work extensively and proficiently with Snowflake, AWS, DBT, Terraform, Airflow, SQL, and Python. Requirements: 3+ years' data engineering experience; Snowflake experience; proficiency across an AWS tech stack; DBT expertise; Terraform experience; expert SQL and Python; data …
Substantial experience in data engineering roles, particularly with large-scale data environments. Proficiency in programming languages such as Python and TypeScript. Strong experience with data warehousing solutions, such as Snowflake or similar. Expert knowledge of ETL/ELT tools like Apache Airflow, DBT, or similar. Deep understanding of SQL and NoSQL databases, such as Snowflake, MongoDB, or similar. Experience with …
London, England, United Kingdom Hybrid / WFH Options
Winston Fox
The technology stack is similarly varied, including a range of legacy and modern systems across on-premises and cloud infrastructure, with technologies and tooling such as Python, dbt, KDB+, Snowflake, SQL, and interfacing with market data vendors such as Bloomberg, Refinitiv, Factset and MorningStar. This is an exciting time to join the team as they consolidate their technology estate … and/or working with Quant Trading Technology. Expertise in Python and SQL and familiarity with relational and time-series databases. Exposure to Airflow and dbt, as well as Snowflake, Databricks or other cloud data warehouses preferred. Experience implementing data pipelines from major financial market data vendors (Bloomberg, Refinitiv, Factset…). SDLC and DevOps: Git, Docker, Jenkins/TeamCity, monitoring, testing …
London, England, United Kingdom Hybrid / WFH Options
Artefact
… and a proven track record of leading data projects in a fast-paced environment. Key Responsibilities: Design, build, and maintain scalable and robust data pipelines using SQL, Python, Databricks, Snowflake, Azure Data Factory, AWS Glue, Apache Airflow and PySpark. Lead the integration of complex data systems and ensure consistency and accuracy of data across multiple platforms. Implement continuous integration and … communication and interpersonal skills. Excellent understanding of data architecture, including data mesh, data lake and data warehouse. Preferred Qualifications: Certifications in Azure, AWS, or similar technologies. Certifications in Databricks, Snowflake or similar technologies. Experience leading large-scale data engineering projects. Working Conditions: This position may require occasional travel. Hybrid work arrangement: two days per week working from the …
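Since this role names PySpark alongside the orchestration tools, here is a minimal, hypothetical PySpark batch job of the kind such pipelines are assembled from: read raw CSV, apply a cleansing transformation, and write partitioned Parquet. The paths and column names are invented for illustration (and the s3a:// paths assume the usual hadoop-aws configuration).

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_batch").getOrCreate()

# Read raw landing-zone data; schema inference kept simple for the sketch.
orders = (
    spark.read.option("header", True)
    .option("inferSchema", True)
    .csv("s3a://example-bucket/landing/orders/")  # hypothetical path
)

# Typical cleansing/enrichment step: normalise types, derive a partition column,
# and drop rows that fail a basic integrity rule.
cleaned = (
    orders.withColumn("order_date", F.to_date("order_date"))
    .withColumn("order_month", F.date_format("order_date", "yyyy-MM"))
    .filter(F.col("order_id").isNotNull())
)

# Write as partitioned Parquet for downstream consumers (e.g. a warehouse load).
(
    cleaned.write.mode("overwrite")
    .partitionBy("order_month")
    .parquet("s3a://example-bucket/curated/orders/")
)

spark.stop()
```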
London, England, United Kingdom Hybrid / WFH Options
Whitehall Resources Ltd
… engineer and modernize existing ETL processes from legacy systems into scalable cloud-native solutions • Contribute to the development and optimization of a cloud-based data platform, leveraging tools like Snowflake, AWS and Airflow • Work closely with data architects, analysts and other engineers to deliver high-quality, production-ready code • Participate in code reviews, ensuring adherence to best practices and high … techniques • Hands-on exposure to cloud platforms, especially AWS • Experience working in agile teams and using version control and CI/CD practices. Desirable skills and experience: • Experience with Snowflake or other cloud-native data warehouse technologies • Familiarity with GraphQL and its use in data-driven APIs • Exposure to data governance, data quality and metadata management tools • Interest or experience …
… stakeholders • Partner with engineering teams to develop scalable business processes and data pipelines to support our clients • Develop data models, analytics and reporting using capabilities such as Legend Studio, Snowflake, Alteryx, SQL, Tableau, R, Python • Manage prioritization and stakeholder engagement to maximize delivery towards established business goals. BASIC QUALIFICATIONS: • Minimum of 4 years of experience in Business Intelligence, Data Engineering … Graduate or Undergraduate degree in Computer Science, Statistics, Math, or Engineering • Experience in financial services, operations fields • Experience in gathering and documenting requirements with full testing traceability • Experience in Snowflake, Databricks, Legend Studio platforms • Data governance and modelling experience. ABOUT GOLDMAN SACHS: At Goldman Sachs, we commit our people, capital and ideas to help our clients, shareholders and the communities …
… and architectural direction across entire programmes and client estates. Key Responsibilities (Senior Data Architects): Lead the design and delivery of cloud-native data solutions using modern platforms (e.g. Databricks, Snowflake, Kafka, Confluent). Architect data lakes, lakehouses, streaming pipelines, and event-driven architectures. Oversee engineering teams and collaborate with analysts and QA functions. Translate complex requirements into scalable, robust data products … Proven track record in data architecture, either from a delivery or enterprise strategy perspective. Deep experience with cloud platforms (Azure, AWS, GCP) and modern data ecosystems (Spark, Databricks, Kafka, Snowflake). Strong understanding of Data Mesh, Data Fabric, and data product-led approaches. Data modelling expertise (relational, dimensional) and familiarity with tools like Erwin, Sparx, Archi. Experience with ETL/ELT …
London, England, United Kingdom Hybrid / WFH Options
Endava
… business objectives. Key Responsibilities: Data Pipeline Development: Architect, implement and maintain real-time and batch data pipelines to handle large datasets efficiently. Employ frameworks such as Apache Spark, Databricks, Snowflake or Airflow to automate ingestion, transformation, and delivery. Data Integration & Transformation: Work with Data Analysts to understand source-to-target mappings and quality requirements. Build ETL/ELT workflows, validation … and ensure regulatory compliance (GDPR). Document data lineage and recommend improvements for data ownership and stewardship. Qualifications: Programming: Python, SQL, Scala, Java. Big Data: Apache Spark, Hadoop, Databricks, Snowflake, etc. Cloud: AWS (Glue, Redshift), Azure (Synapse, Data Factory, Fabric), GCP (BigQuery, Dataflow). Data Modelling & Storage: Relational (PostgreSQL, SQL Server), NoSQL (MongoDB, Cassandra), dimensional modelling. DevOps & Automation: Docker, Kubernetes …
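The validation step in ETL/ELT workflows like those above usually reduces to assertions over the loaded data. As a minimal, hypothetical stand-in for a full validation framework, the sketch below runs a few such checks with pandas; the column names and rules are invented.

```python
import pandas as pd


def validate_orders(df: pd.DataFrame) -> list[str]:
    """Return a list of human-readable failures; an empty list means the batch passes."""
    failures = []

    # Completeness: the business key must never be null.
    null_keys = int(df["order_id"].isna().sum())
    if null_keys:
        failures.append(f"{null_keys} rows with NULL order_id")

    # Uniqueness: one row per order line is expected here.
    dupes = int(df["order_id"].duplicated().sum())
    if dupes:
        failures.append(f"{dupes} duplicate order_id values")

    # Range check: negative amounts indicate an upstream mapping problem.
    negatives = int((df["amount"] < 0).sum())
    if negatives:
        failures.append(f"{negatives} rows with negative amount")

    return failures


if __name__ == "__main__":
    batch = pd.DataFrame(
        {"order_id": ["A1", "A2", None], "amount": [10.0, -5.0, 7.5]}
    )
    for problem in validate_orders(batch):
        print("FAIL:", problem)
```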
… BI ecosystem, including: Data Integration: Fivetran, HVR, Databricks, Apache Kafka, Google BigQuery, Google Analytics 4. Data Lake & Storage: Databricks Delta Lake, Amazon S3. Data Transformation: dbt Cloud. Data Warehouse: Snowflake. Analytics & Reporting: Power BI, Excel, Snowflake SQL REST API. Advanced Analytics: Databricks (AI & Machine Learning). Governance & Infrastructure: Centralised Data Catalogue & Access Control (Okta). Job Scheduling & Monitoring (AWS, Splunk). Agile Data …