Strong analytical and troubleshooting skills, with the ability to resolve complex data-related issues. Nice to Have: Familiarity with cloud platforms (AWS, GCP, Azure) and cloud-based database services (Snowflake). Knowledge of data warehousing, orchestration and pipeline technologies (Apache Airflow/Kafka, Azure Data Factory etc.). Experience with DBT for modelling. Server administration and networking fundamentals
Bristol, England, United Kingdom Hybrid / WFH Options
LAGOFIRE SRL
needs. Qualifications Strong experience as a Product Owner or Data Product Owner in Agile environments. Experience managing data products or platforms, ideally customer data platforms (CDPs), data warehouses (e.g. Snowflake, BigQuery), or data lakes. Familiarity with data modelling, data pipelines, ETL/ELT processes, and APIs. Knowledge of GDPR, data ethics, and governance practices. Strong understanding of data quality frameworks
London, England, United Kingdom Hybrid / WFH Options
Leonardo
Role: Snowflake Developer Location: New York (Hybrid) Position Type: Contract on W2 Only Experience Required: 8+ Years of Experience Eligible Visas: USC/GC/H4-EAD We are seeking an experienced Senior Snowflake Developer who will be responsible for designing, developing, and maintaining our data warehouse solutions on the Snowflake platform. The ideal candidate should also be proficient in … SQL and have a strong background in SQL Server Integration Services (SSIS) for data integration and ETL processes. Key Responsibilities: Design, develop, and optimize Snowflake-based data warehouse solutions to meet business requirements. Collaborate with stakeholders to gather and analyze data requirements, translating them into technical specifications. Develop and maintain ETL processes using SQL and SSIS to ensure efficient data … integration and migration. Implement data pipelines and workflows in Snowflake to support data analytics and reporting needs. Monitor and tune Snowflake performance to ensure optimal data processing and query execution. Ensure data quality, consistency, and integrity across the data warehouse. Provide technical support and troubleshooting for Snowflake and SSIS-related issues. Stay current with Snowflake features, best practices, and industry
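The incremental ETL work this posting describes — loading staged data into Snowflake warehouse tables — typically centres on idempotent MERGE statements. As a minimal sketch (table and column names here are hypothetical, not from the posting), such a statement can be generated like this:

```python
def build_merge_sql(target, staging, key_cols, update_cols):
    """Build an idempotent Snowflake MERGE for an incremental load.

    Rows already present (matched on key_cols) are updated; new rows
    are inserted. Re-running the same load produces the same result.
    """
    on = " AND ".join(f"t.{c} = s.{c}" for c in key_cols)
    sets = ", ".join(f"t.{c} = s.{c}" for c in update_cols)
    cols = ", ".join(key_cols + update_cols)
    vals = ", ".join(f"s.{c}" for c in key_cols + update_cols)
    return (
        f"MERGE INTO {target} t USING {staging} s ON {on} "
        f"WHEN MATCHED THEN UPDATE SET {sets} "
        f"WHEN NOT MATCHED THEN INSERT ({cols}) VALUES ({vals})"
    )

# Hypothetical target/staging tables for illustration only.
sql = build_merge_sql("dw.orders", "stg.orders", ["order_id"], ["status", "amount"])
```

In practice the generated statement would be executed via a Snowflake session (for example through the Snowflake connector or an SSIS-orchestrated step); the builder itself just keeps the load idempotent and reviewable.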
clients, are now looking for a hands-on Data Engineer to join their talented delivery team. This is a fantastic opportunity to work with a modern data stack - including Snowflake, dbt, AWS, Python, and SQL - on impactful projects that power reporting, automation, predictive analytics and Artificial Intelligence. This role is fully remote, and is therefore open to candidates across the … who wants to be part of a company that is growing and maturing whilst continually learning. Key responsibilities: Designing and building robust data pipelines using SQL, dbt, and Python within Snowflake Developing scalable, testable, and maintainable code Collaborating with analytics, product, and client teams to deliver high-quality data solutions Supporting the development of data products like dashboards, APIs, and predictive … and operational efficiency through data tooling Contributing to internal best practices and agile delivery processes Experience required: Proven experience as a Data Engineer or Analytics Engineer Strong experience with Snowflake Hands-on experience with dbt Proficiency in Python and SQL Solid understanding of Git and development lifecycle best practices Experience integrating APIs or working with event/log data streams
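The "scalable, testable, and maintainable code" this listing asks for usually means pipeline steps written as small pure functions that can be unit-tested away from the warehouse. A minimal sketch, with illustrative field names (order_id, amount, order_date) that are assumptions rather than anything from the posting:

```python
from datetime import date

def clean_orders(rows):
    """Deduplicate raw order records and normalise their types.

    Keeps the first occurrence of each order_id, casts amount to a
    rounded float and order_date to a real date object.
    """
    seen, out = set(), []
    for row in rows:
        if row["order_id"] in seen:
            continue  # drop exact re-deliveries of the same order
        seen.add(row["order_id"])
        out.append({
            "order_id": int(row["order_id"]),
            "amount": round(float(row["amount"]), 2),
            "order_date": date.fromisoformat(row["order_date"]),
        })
    return out

raw = [
    {"order_id": "1", "amount": "19.999", "order_date": "2024-05-01"},
    {"order_id": "1", "amount": "19.999", "order_date": "2024-05-01"},  # duplicate
    {"order_id": "2", "amount": "5", "order_date": "2024-05-02"},
]
cleaned = clean_orders(raw)
```

Because the step is a pure function over plain records, the same logic is straightforward to port into a dbt model or a Python task without rewriting the tests.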
London, England, United Kingdom Hybrid / WFH Options
P&I Insurance Services
Salary - £70-80k with 15% bonus Hybrid working - couple of days in the office City of London We Are Looking For Good understanding of data engineering principles A good technical grasp of Snowflake and automating it, and transforming complex datasets SnowPro core certification. AWS Skillset Delivery experience Building solutions in Snowflake Insurance experience - advantageous but not necessary Key Responsibilities Lead the design and implementation of Snowflake [and Redshift … architecture, ensuring optimal health and efficiency. Skills And Experience Bachelor's degree or higher in a technical discipline Proven experience as a data engineer with a strong focus on Snowflake and AWS services in large-scale enterprise environments Extensive experience in AWS services, e.g. EC2, S3, RDS, DynamoDB, Redshift, Lambda, API Gateway Strong SQL skills for complex data queries and
London, England, United Kingdom Hybrid / WFH Options
Whitehall Resources Ltd
engineer and modernize existing ETL processes from legacy systems into scalable cloud-native solutions • Contribute to the development and optimization of a cloud-based data platform, leveraging tools like Snowflake, AWS and Airflow • Work closely with data architects, analysts and other engineers to deliver high-quality, production-ready code • Participate in code reviews, ensuring adherence to best practices and high … techniques • Hands-on exposure to cloud platforms, especially AWS • Experience working in agile teams and using version control and CI/CD practices Desirable skills and experience: • Experience with Snowflake or other cloud-native data warehouse technologies • Familiarity with GraphQL and its use in data-driven APIs • Exposure to data governance, data quality and metadata management tools • Interest or experience
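The orchestration side of this role — wiring legacy ETL steps into an Airflow-style DAG — reduces to declaring task dependencies and executing them in topological order. A minimal sketch using the standard library (the task names are hypothetical, and a real Airflow DAG would declare operators rather than run tasks directly):

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

# Each task maps to the upstream tasks it depends on, mirroring how
# an Airflow DAG wires dependencies with >> / set_upstream.
deps = {
    "extract_legacy": [],
    "load_staging": ["extract_legacy"],
    "transform_core": ["load_staging"],
    "publish_marts": ["transform_core"],
    "refresh_dashboards": ["publish_marts", "transform_core"],
}

# static_order() yields a valid execution order (and raises CycleError
# if the dependency graph is not actually a DAG).
run_order = list(TopologicalSorter(deps).static_order())
```

An orchestrator adds scheduling, retries, and parallelism on top of exactly this dependency resolution, which is why expressing legacy job chains as an explicit graph is usually the first modernization step.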
London, England, United Kingdom Hybrid / WFH Options
Substance Global
best practices. Take initiative to improve and optimise analytics engineering workflows and platforms Key Responsibilities Design, develop, and maintain scalable data pipelines on GCP (or other cloud services, e.g. Snowflake) using services such as BigQuery and Cloud Functions. Designing and building data models for analytics, reporting, and data science applications in Looker/Metabase (visualisation solutions) using (explores, views, etc … which will enable users across the organization to self-serve analytics. Strong Python (Pandas/SQLAlchemy) & SQL (BigQuery/Snowflake) skills for maintaining ETL/ELT pipelines. Developing tools and processes to ensure the quality and observability of data Monitoring and improving the performance and efficiency of the data stack Producing excellent documentation on all of the above Working … warehousing skills demonstrated in data environments Excellent Python & SQL and data transformation skills (e.g. ideally proficient in dbt or similar) Familiarity with at least one of these Cloud technologies: Snowflake, AWS, Google Cloud, Microsoft Azure Good attention to detail to highlight and address data quality issues Excellent time management and proactive problem-solving skills What’s in it for you
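"Quality and observability of data", as mentioned above, usually starts with simple profiling metrics such as per-column null rates checked against a threshold. A minimal sketch with illustrative column names and an assumed 20% alert threshold:

```python
def null_rates(rows, columns):
    """Return the fraction of missing (None) values for each column."""
    total = len(rows)
    return {
        col: sum(1 for r in rows if r.get(col) is None) / total
        for col in columns
    }

# Tiny illustrative dataset; in practice rows would come from a
# warehouse query (e.g. BigQuery or Snowflake) rather than a literal.
rows = [
    {"user_id": 1, "country": "GB"},
    {"user_id": 2, "country": None},
    {"user_id": None, "country": "FR"},
    {"user_id": 4, "country": "DE"},
]
rates = null_rates(rows, ["user_id", "country"])
alerts = [col for col, rate in rates.items() if rate > 0.2]
```

Tools like dbt tests or monitoring platforms formalise exactly these checks, but the underlying metric is this simple, which makes it easy to embed directly in a pipeline.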
London, England, United Kingdom Hybrid / WFH Options
ScanmarQED
Professional Experience: 3–5 years in Data Engineering, Data Warehousing, or programming within a dynamic (software) project environment. Data Infrastructure and Engineering Foundations: Data Warehousing: Knowledge of tools like Snowflake, Databricks, ClickHouse and traditional platforms like PostgreSQL or SQL Server. ETL/ELT Development: Expertise in building pipelines using tools like Apache Airflow, dbt, Dagster. Cloud providers: Proficiency in Microsoft … Azure or AWS. Programming and Scripting: Programming Languages: Strong skills in Python and SQL. Data Modeling and Query Optimization: Data Modeling: Designing star/snowflake schemas and understanding normalization and denormalization. SQL Expertise: Writing efficient queries and optimizing for performance. DevOps and CI/CD: Version Control: Using Git and platforms like GitHub, GitLab, or Bitbucket. Data Governance and Security
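The star-schema design mentioned above means splitting denormalised rows into dimension tables (with surrogate keys) and a fact table that references them. A minimal sketch, with an illustrative product dimension (all column names are assumptions for the example):

```python
def to_star_schema(rows):
    """Split denormalised sales rows into a product dimension and a fact table.

    Each distinct product gets a surrogate key; facts carry only that
    key plus the measures (quantity, revenue).
    """
    dim_product, facts = {}, []
    for row in rows:
        name = row["product_name"]
        if name not in dim_product:
            dim_product[name] = {
                "product_key": len(dim_product) + 1,  # surrogate key
                "product_name": name,
                "category": row["category"],
            }
        facts.append({
            "product_key": dim_product[name]["product_key"],
            "quantity": row["quantity"],
            "revenue": row["revenue"],
        })
    return list(dim_product.values()), facts

rows = [
    {"product_name": "Widget", "category": "Tools", "quantity": 2, "revenue": 10.0},
    {"product_name": "Gadget", "category": "Toys", "quantity": 1, "revenue": 4.5},
    {"product_name": "Widget", "category": "Tools", "quantity": 3, "revenue": 15.0},
]
dims, facts = to_star_schema(rows)
```

A snowflake schema would go one step further and normalise the category out of the product dimension into its own table; the star form shown here trades that normalisation for simpler joins.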
Automating and orchestrating workflows using tools like AWS Glue, Azure Data Factory, or Google Cloud Dataflow Working with platform services such as Amazon S3, Azure Synapse, Google BigQuery, and Snowflake Implementing Lakehouse architectures using tools like Databricks or Snowflake Collaborating closely with engineers, analysts, and client teams to deliver value-focused data solutions We'd love to talk to you
London, England, United Kingdom Hybrid / WFH Options
Endava
business objectives. Key Responsibilities Data Pipeline Development Architect, implement and maintain real-time and batch data pipelines to handle large datasets efficiently. Employ frameworks such as Apache Spark, Databricks, Snowflake or Airflow to automate ingestion, transformation, and delivery. Data Integration & Transformation Work with Data Analysts to understand source-to-target mappings and quality requirements. Build ETL/ELT workflows, validation … and ensure regulatory compliance (GDPR). Document data lineage and recommend improvements for data ownership and stewardship. Qualifications Programming: Python, SQL, Scala, Java. Big Data: Apache Spark, Hadoop, Databricks, Snowflake, etc. Cloud: AWS (Glue, Redshift), Azure (Synapse, Data Factory, Fabric), GCP (BigQuery, Dataflow). Data Modelling & Storage: Relational (PostgreSQL, SQL Server), NoSQL (MongoDB, Cassandra), Dimensional modelling. DevOps & Automation: Docker, Kubernetes
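The "ingestion, transformation, and delivery" stages this posting names, together with validation that routes bad rows aside rather than failing the batch, can be sketched as three small composable functions (the source data and field names are purely illustrative):

```python
def extract():
    """Hypothetical batch source; in practice this would read from S3,
    a database, or a message stream rather than a literal."""
    return [
        {"id": 1, "value": " 42 "},
        {"id": 2, "value": "oops"},   # invalid: not an integer
        {"id": 3, "value": "7"},
    ]

def transform(records):
    """Validate and cast each record; invalid rows go to a reject list
    so the rest of the batch can still be delivered."""
    good, rejected = [], []
    for r in records:
        try:
            good.append({"id": r["id"], "value": int(r["value"].strip())})
        except ValueError:
            rejected.append(r)
    return good, rejected

def load(records, sink):
    """Deliver validated records to a sink (here a plain list)."""
    sink.extend(records)
    return len(records)

sink = []
good, rejected = transform(extract())
loaded = load(good, sink)
```

Frameworks like Spark or Airflow scale and schedule this shape of pipeline, but keeping each stage a small pure function is what makes the source-to-target mapping and the reject handling auditable.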
London, England, United Kingdom Hybrid / WFH Options
Marshmallow
than others. Meaningful Work: Your efforts will positively impact people who are often financially disadvantaged when they arrive in the UK. Innovative Tech Stack: Work with modern tools like Snowflake, DBT, AWS Kinesis, Step Functions, Airflow, Datahub, and Looker. What You’ll Be Doing Platform Ownership: As the subject matter expert for our data platform, you'll manage and enhance … core components like DBT, Snowflake, AWS (S3, Kinesis, DMS, Step Functions, Dynamo CDC, Backup), Airbyte, Transcend, Segment and Looker. Holistic Understanding: Develop a deep understanding of our entire data platform to safely and effectively work across the data stack. User Collaboration: Work closely with platform users to ensure optimal usability and performance, driving long-term strategies to maximise value for
London, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
solid grasp of data governance, data modeling, and business intelligence best practices. Knowledge of Agile, DevOps, Git, APIs, microservices, and data pipeline development. Familiarity with Spark, Kafka, or Snowflake is a plus. Desirable Certifications: Microsoft Certified: Fabric Analytics Engineer Associate Why Join Us? Competitive salary up to £70,000 per year Opportunities for growth, training, and development Collaborative, innovative
Leeds, England, United Kingdom Hybrid / WFH Options
Jet2
modelling stages of data warehouse projects and be comfortable with conceptual, logical and physical data modelling techniques as well as dimensional modelling techniques. Experience using cloud data warehouse technology - Snowflake (preferred), Google BigQuery, AWS Redshift or Azure Synapse and experience working with key services on either GCP, AWS or Azure. Key services include cloud storage, containerisation, event-driven services, orchestration
Edinburgh, Scotland, United Kingdom Hybrid / WFH Options
Scott Logic
be great if you have: Experience of relevant cloud services within AWS, Azure or GCP. Experience working in an Agile environment. Experience working with common vendor products such as Snowflake or Databricks. Experience working with CI/CD tooling. What you’ll get in return is: 25 days’ annual leave, rising to 30 days with each year of service.
Cardiff, Wales, United Kingdom Hybrid / WFH Options
Creditsafe
all backgrounds and encourage applications from those with transferable experience. You’ll ideally bring: Experience building ETL pipelines in Python. Familiarity with analytical data warehouses (e.g. Redshift preferred, or Snowflake/BigQuery). Understanding of orchestration tools such as AWS Step Functions, Airflow, or AWS Batch. Experience working with Agile methodologies. Awareness of automated testing and delivery pipelines. Understanding of
Belfast, Northern Ireland, United Kingdom Hybrid / WFH Options
Enso Recruitment
role in developing and optimising data pipelines and infrastructure to ensure efficient data flow across the business. Key Responsibilities Develop, support and optimise robust data solutions using tools like Snowflake, dbt, Fivetran, and Azure Cloud services Collaborate with cross-functional teams to translate business needs into actionable data architecture Design and manage data pipelines and integration workflows, ensuring performance and
London, England, United Kingdom Hybrid / WFH Options
Noir
experience as a Data Engineer (3-5 years), preferably in the energy sector. Right to work in the UK. Strong proficiency in SQL and database technologies (e.g., MS SQL, Snowflake). Hands-on experience with ETL/ELT tools such as Azure Data Factory, DBT, AWS Glue, etc. Proficiency in Power BI and Advanced Analytics for insightful data visualisation. Strong
Gloucester, England, United Kingdom Hybrid / WFH Options
Benefact Group
optimization of data pipelines and resources. Knowledge, skills and experience Essential: Cloud Platforms: Experience with Azure, AWS, or Google Cloud for data engineering. Cloud Data Tools: Expertise in Databricks, Snowflake, BigQuery, or Synapse Analytics. Programming & Scripting: Strong knowledge of Python, SQL, Spark, or similar. Data Modelling & Warehousing: Experience with cloud-based data architecture. CI/CD & DevOps: Knowledge of Terraform