London, England, United Kingdom Hybrid / WFH Options
Thehealthylivingstore
play a key part in designing, building, and optimising scalable data pipelines, ensuring high-quality, actionable data is readily available to drive decision-making. Working with modern tools like Snowflake and dbt, you’ll build and maintain our data infrastructure and collaborate with cross-functional teams, including analysts and stakeholders, to support reporting and insights. You’ll also have the … organisation.
Key Responsibilities
Design, develop, and maintain scalable and reliable ETL pipelines to extract, transform, and load data from a variety of sources.
Build and optimise data models using Snowflake and dbt to ensure accuracy, integrity, and accessibility across the organisation.
Write and optimise complex SQL queries for data transformation, reporting, and analysis.
Collaborate with analysts and stakeholders to understand …
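The responsibilities above follow the standard extract, transform, load shape: pull rows from a source, clean and reshape them in SQL or code, and land them in a warehouse for reporting. A minimal illustrative sketch in Python, with an in-memory SQLite database standing in for Snowflake; the table and column names are invented for the example and are not from the listing:

```python
import sqlite3

def extract():
    # Stand-in for pulling rows from a source system (API, file, OLTP DB).
    return [
        {"order_id": 1, "amount_pence": 1250, "status": "complete"},
        {"order_id": 2, "amount_pence": 300, "status": "cancelled"},
        {"order_id": 3, "amount_pence": 980, "status": "complete"},
    ]

def transform(rows):
    # Keep completed orders only and convert pence to pounds.
    return [
        (r["order_id"], r["amount_pence"] / 100)
        for r in rows
        if r["status"] == "complete"
    ]

def load(conn, rows):
    # Land the transformed rows in the warehouse stand-in.
    conn.execute("CREATE TABLE orders (order_id INTEGER, amount_gbp REAL)")
    conn.executemany("INSERT INTO orders VALUES (?, ?)", rows)

conn = sqlite3.connect(":memory:")
load(conn, transform(extract()))
total = conn.execute("SELECT SUM(amount_gbp) FROM orders").fetchone()[0]
print(total)  # ~22.3
```

In a Snowflake and dbt stack, the `transform` step would typically live in version-controlled SQL models rather than application code; the shape of the work is the same.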
real-world business problems, eliminate technical roadblocks, and create measurable value. This role requires deep familiarity with modern data ecosystems. You'll leverage cloud data warehouses (CDWs) such as Snowflake, Databricks, and others to architect robust, scalable solutions. You’ll help customers harness the power of AI and large language models (LLMs)—including Domo’s Agent Catalyst framework—to build … AI agents, and prompt engineering for enhanced insights and automation; Working knowledge of database technologies (relational, columnar, NoSQL), e.g., MySQL, Oracle, MongoDB; Experience with modern cloud data warehouses (e.g., Snowflake, Databricks, BigQuery); Excellent organizational and multitasking skills across multiple sales cycles; Agile and adaptable to evolving customer needs and priorities; Creative problem-solver with a strategic, growth-oriented mindset; Strong …
London, England, United Kingdom Hybrid / WFH Options
DELIVEROO
your role will be to provide clean, tested, well-documented and well-modelled data sets, that will enable and empower data scientists and business users alike, via tools like Snowflake and/or Looker. You'll work with product engineering teams to ensure modelling of source data meets downstream requirements. You will maintain and develop SQL data transformation scripts, and …
Analytics Engineering/Data Engineering/BI Engineering experience
Understanding of data warehousing, data modelling concepts and structuring new data tables
Knowledge of cloud-based MPP data warehousing (e.g. Snowflake, BigQuery, Redshift)
Nice to have
Experience developing in a BI tool (Looker or similar)
Good practical understanding of version control
SQL ETL/ELT knowledge, experience with DAGs to manage …
City of London, London, United Kingdom Hybrid / WFH Options
Cognify Search
identify needs. An understanding of data in relation to dynamic pricing, revenue operations & management, demand forecasting, competitor analysis etc. would be useful! SQL, Python, Cloud (GCP, Azure, AWS or Snowflake), Airflow/dbt etc. Interested in hearing more? Apply.
South East London, England, United Kingdom Hybrid / WFH Options
Cognify Search
identify needs. An understanding of data in relation to dynamic pricing, revenue operations & management, demand forecasting, competitor analysis etc. would be useful! SQL, Python, Cloud (GCP, Azure, AWS or Snowflake), Airflow/dbt etc. Interested in hearing more? Apply.
data migration projects and cloud transformation
Deep understanding of enterprise architecture frameworks and methodologies
Experience with financial risk systems and analytics platforms
Strong technical background in database platforms (Oracle, Snowflake)
Understanding of cloud architecture principles (AWS preferred)
Excellent communication and stakeholder management skills
Ability to translate complex technical concepts for non-technical stakeholders
PREFERRED CERTIFICATIONS
Cloud platform certifications (AWS Certified … DAMA)
Project/Program Management certifications (PMP, Prince2, Agile/Scrum)
AI/ML certifications or specialized training
TECHNICAL SKILLS
Enterprise systems architecture
AWS and cloud technologies
Oracle and Snowflake platforms
Data modeling and design
Data lineage and governance tools
ETL/ELT processes
Big data and analytics platforms
Financial risk systems and methodologies
Programming knowledge (Python, SQL, etc.)
DevOps …
BI ecosystem, including:
Data Integration: Fivetran, HVR, Databricks, Apache Kafka, Google BigQuery, Google Analytics 4
Data Lake & Storage: Databricks Delta Lake, Amazon S3
Data Transformation: dbt
Cloud Data Warehouse: Snowflake
Analytics & Reporting: Power BI, Excel, Snowflake SQL REST API
Advanced Analytics: Databricks (AI & Machine Learning)
Governance & Infrastructure: Centralised Data Catalogue & Access Control (Okta)
Job Scheduling & Monitoring (AWS, Splunk)
Agile Data …
Edinburgh, Scotland, United Kingdom Hybrid / WFH Options
Motability Operations Ltd
feeds and related applications
Writing, testing and peer review of ETL code in Oracle ODI
Working with business users to design and configure self-serve data environments within our Snowflake data lake
Analysing, developing, delivering, and managing BI reports
Assisting in the design of the data processes, including data quality, reconciliation, testing, and governance
Contributing to technical process improvement initiatives … in overnight support rota
You’ll need all of these.
Experience of building a data warehouse using an ETL/ELT tool, preferably Oracle ODI
Significant database experience in Snowflake or Oracle
Good knowledge of standard data formats (XML, JSON, CSV, etc.)
Proven experience of delivering BI solutions for business requirements
Experience of developing using an Agile development approach
Proven …
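The "standard data formats (XML, JSON, csv)" requirement above is about routine feed conversions between interchange formats. A small illustrative sketch using only Python's standard library; the feed content and field names are invented for the example, not taken from any real feed:

```python
import csv
import io
import json

# Hypothetical JSON feed payload (invented for illustration).
feed = '[{"id": 1, "vehicle": "car"}, {"id": 2, "vehicle": "scooter"}]'
records = json.loads(feed)

# Flatten the JSON records into CSV with a header row.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["id", "vehicle"])
writer.writeheader()
writer.writerows(records)

print(buf.getvalue())
```

Real feed handling adds schema validation and reconciliation on top of the conversion, per the data quality responsibilities listed above.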
pandas to name a few of the libraries we use extensively. We implement the systems that require the highest data throughput in Java. Within Data Engineering we use Dataiku, Snowflake, Prometheus, and ArcticDB heavily. We use Kafka for data pipelines, Apache Beam for ETL, Bitbucket for source control, Jenkins for continuous integration, Grafana + Prometheus for metrics collection, ELK for … for ETL, data engineering and stream processing
Proficient on Linux platforms with knowledge of various scripting languages
Working knowledge of one or more relevant database technologies, e.g. MongoDB, PostgreSQL, Snowflake, Oracle
Proficient with a range of open source frameworks and development tools, e.g. NumPy/SciPy/Pandas, Spark, Jupyter
Advantageous
Prior experience of working with financial market data or …
SQL, Python, and cloud-based data platforms (AWS/Azure/GCP)
Experience with ETL processes, data lakes, and data warehouse architectures
Knowledge of big data technologies (Spark, Kafka, Snowflake, etc.)
Excellent stakeholder management and ability to translate business needs into scalable solutions
What They Offer:
Up to ~£95K DOE
Hybrid flexibility
Career growth - Join an ambitious, forward-thinking company …
our product integrates and interacts with external systems, partners, and platforms.
Our Tech Stack:
Apache Airflow
Python
Django
AWS (S3, RDS with PostgreSQL, ElastiCache, MSK, EC2, ECS, Fargate, Lambda, etc.)
Snowflake
Terraform
CircleCI
Your mission
Design and develop data pipelines, orchestrating key activities such as data ingestion, extraction and transformation, task automation and serving of data to applications and dashboards.
Work on … This includes reviewing and validating data from various sources, such as spreadsheets, to ensure accuracy and consistency.
Contribute to the development and maintenance of our data warehouse solution on Snowflake
Collaborate with our product manager and stakeholders to collect and refine data requirements.
Optimise data storage, infrastructure performance and cost
Practice and promote excellent data and cloud engineering best practices
Work …
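Pipelines of the shape this listing describes (ingestion, then transformation, then serving) are expressed as dependency graphs, which is what Apache Airflow orchestrates. The sketch below illustrates only the DAG-ordering idea using Python's standard library, not Airflow's actual API, and all task names are hypothetical:

```python
from graphlib import TopologicalSorter

results = []

# Stand-in task bodies; in Airflow these would be operators or @task functions.
def ingest():    results.append("ingest")
def transform(): results.append("transform")
def serve():     results.append("serve")

tasks = {"ingest": ingest, "transform": transform, "serve": serve}

# Each task maps to the set of tasks it depends on.
deps = {"transform": {"ingest"}, "serve": {"transform"}}

# static_order() yields tasks so every dependency runs before its dependents.
for name in TopologicalSorter(deps).static_order():
    tasks[name]()

print(results)  # ['ingest', 'transform', 'serve']
```

Airflow adds scheduling, retries, and monitoring on top of this ordering guarantee; the dependency-graph model underneath is the same.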
Manchester, England, United Kingdom Hybrid / WFH Options
Michael Page (UK)
processes for model development, validation, implementation and documentation.
The Successful Applicant
The successful Data Engineer should have:
Proficiency in Big Data modelling, ETL and data warehousing
Proficiency in SQL, Snowflake and Tableau
Understanding of cloud service providers
Excellent problem-solving abilities and communication skills
An understanding of Python and Java would be advantageous but not essential.
What's on Offer
An …
London, England, United Kingdom Hybrid / WFH Options
IPS Group
Data Platform data pipeline patterns.
Experience required:
Insurance/financial services domain knowledge (preferred)
Expertise in at least two of these key technologies: Azure, Azure Data Lake, Databricks, Matillion, Snowflake and Power BI (must have)
Analytical system data architecture and ETL design experience
Hands-on experience in designing and implementing a data warehouse
Comprehension of dimensional data modelling
Comprehension of …
analytics platforms (Tableau, Power BI, Looker) to inform product decisions.
About You
Experienced Product Owner with strong exposure to data products and platforms.
Delivery experience with enterprise data warehouses (Snowflake, BigQuery), lakes, and CDPs.
Understanding of data pipelines, contracts, lineage, APIs, and data governance frameworks.
Comfortable navigating GDPR, CCPA and other compliance requirements.
Familiar with dbt, Airflow, and cloud platforms …
use cases
About you:
Strong background in software and data engineering leadership
Proficient in Python, SQL, and modern ELT practices (e.g. dbt, Fivetran, Airflow)
Deep knowledge of data warehousing (Snowflake), AWS services (e.g. Lambda, Kinesis, S3), and IaC (Terraform)
Experienced in building data platforms with a focus on governance, reliability, and business value
Comfortable driving architectural conversations and mentoring engineers …
London, England, United Kingdom Hybrid / WFH Options
Revybe IT Recruitment Ltd
technical stakeholders
A collaborative mindset and genuine curiosity around solving data challenges
Tech Stack Snapshot
Python, SQL, Airflow
AWS, Azure & GCP, depending on the project
Bonus: experience with dbt, Snowflake, or Looker would be amazing, as this business uses these tools.
If you're ready to step into a role where you can lead, innovate, and make a tangible difference …
the source to building the complex mappings.
Strong experience of data migration and reporting using ETL tools (Informatica, Boomi, DataStage, Matillion, etc.) and reporting tools (Power BI, Tableau)
Experience with Snowflake storage and databases
Thorough experience writing complex SQL
Thorough experience translating business requirements to technical requirements and vice versa
Experience with data quality tools
Background in engineering, ideally with …