expertise to grow top-line revenue and guide commercial initiatives from our data. You'll own the analysis of the end-to-end customer journey, using our data stack (BigQuery, dbt, Hex) to create data models, data products, and metrics, and find insights that fuel our growth. You'll work closely with other engineers, marketers, product teams, and commercial teams …
power key business initiatives across the company. What You’ll Do Data Platform & Infrastructure: Architect, build, and maintain scalable and performant ELT pipelines and infrastructure using tools like GCP, BigQuery, dbt, and third-party ingestion tools. Lead data ingestion and transformation from a wide range of first- and third-party systems (e.g., GA4, Bloomreach CDP, our Inventory and Finance platforms) ensuring … analysts, marketers, and operators in better leveraging the data platform. What You’ll Bring Proven experience designing and maintaining modern data pipelines in cloud-native environments (preferably GCP and BigQuery). Strong command of SQL and Python; experience with ELT tools (dbt) and reverse-ETL tools like Census. Hands-on experience modelling event-based datasets (e.g. GA4 in BigQuery) …
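The listing above describes ELT-style transformation work with dbt on BigQuery. As a purely illustrative sketch (all table and column names here are invented, and sqlite3 stands in for a cloud warehouse), a dbt model is essentially a SELECT statement materialised as a table or view on top of raw ingested data:

```python
import sqlite3

# Stand-in warehouse: sqlite3 replaces BigQuery for this illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_events (user_id TEXT, event TEXT, revenue REAL)")
conn.executemany(
    "INSERT INTO raw_events VALUES (?, ?, ?)",
    [("u1", "purchase", 20.0), ("u1", "page_view", 0.0), ("u2", "purchase", 5.0)],
)

# The "T" of ELT: a dbt model is, at its core, a SELECT that the tool
# materialises inside the warehouse as a table or view.
conn.execute("""
    CREATE TABLE user_revenue AS
    SELECT user_id, SUM(revenue) AS total_revenue
    FROM raw_events
    WHERE event = 'purchase'
    GROUP BY user_id
""")
rows = conn.execute("SELECT * FROM user_revenue ORDER BY user_id").fetchall()
print(rows)  # [('u1', 20.0), ('u2', 5.0)]
```

In a real dbt project the SELECT would live in a `.sql` model file and dbt would handle materialisation, dependencies, and testing; the warehouse-side shape of the work is the same.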
to vendor surveillance products Core Skills & Experience Proven hands-on experience with Google Cloud Platform (GCP) and components such as Dataflow, Pub/Sub, Cloud Storage Deep expertise in BigQuery and Big Data processing Strong programming skills in Python or Java used extensively in surveillance data integration Solid SQL expertise for querying, transforming, and troubleshooting data Experience building robust …
London, England, United Kingdom Hybrid / WFH Options
DataAnalystJobs.io
management. Data Warehousing (highly desirable): Experience working on a data warehouse solution irrespective of underlying technology. Experience using cloud data warehouse technology would also be beneficial - Snowflake (preferred), Google BigQuery, AWS Redshift or Azure Synapse. Data Pipeline (highly desirable): Demonstrable experience working with data from a wide variety of data sources including different database platforms, flat files, APIs …
event-driven architecture with Kafka Experience with Scala for data pipelines Experience with Python and SQL for data pipelines Experience with modern cloud data warehouses (like AWS Redshift, GCP BigQuery, Azure Synapse or Snowflake) Strong communication skills and fluency in English Experience with Apache Spark (in both batch and streaming) Experience with a job orchestrator (Airflow, Google Cloud Composer, …
and marketing data. Deliver end-to-end data modelling projects, connecting multiple sources and creating metrics/KPIs. Work primarily with SQL, dbt, and cloud data warehouses (Snowflake, BigQuery, Redshift). Technical Requirements Strong proficiency in SQL with the ability to write complex queries, optimise performance, and manipulate large datasets efficiently. This includes expertise in database management, data … extraction, transformation, and analysis, ensuring seamless data workflows for all stakeholders. Working with modern cloud data warehouses such as Snowflake, BigQuery, or Redshift. You should be comfortable creating robust data models, building scalable pipelines, and ensuring data quality within these cloud environments. Experience with dbt (Data Build Tool), particularly in managing and automating data transformation. Solid understanding of …
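The metrics/KPI work this listing describes usually boils down to aggregate SQL over event or session data. As a hedged sketch (the schema and KPI below are invented for illustration, with sqlite3 standing in for Snowflake/BigQuery/Redshift), a daily conversion-rate metric might look like:

```python
import sqlite3

# Toy session data; in practice this lives in the warehouse.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sessions (day TEXT, user_id TEXT, converted INTEGER)")
conn.executemany(
    "INSERT INTO sessions VALUES (?, ?, ?)",
    [("2024-01-01", "u1", 1), ("2024-01-01", "u2", 0),
     ("2024-01-01", "u3", 0), ("2024-01-02", "u1", 1)],
)

# KPI: daily conversion rate = converted sessions / total sessions per day.
# 1.0 * forces float division; ROUND keeps the metric presentation-ready.
rate = conn.execute("""
    SELECT day, ROUND(1.0 * SUM(converted) / COUNT(*), 2) AS conversion_rate
    FROM sessions
    GROUP BY day
    ORDER BY day
""").fetchall()
print(rate)  # [('2024-01-01', 0.33), ('2024-01-02', 1.0)]
```

The same query pattern, wrapped as a dbt model, is how a metric like this is typically versioned and exposed to stakeholders.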
London, England, United Kingdom Hybrid / WFH Options
ZipRecruiter
including Gen AI. Your role will include: Enterprise Architecture & Delivery: Spearhead design and implementation of scalable data ecosystems, spanning data lakes, lakehouses, and warehouses, using platforms such as GCP, BigQuery, Snowflake, Synapse, or Microsoft Fabric. Cloud Transformation Strategy: Lead cloud adoption and modernization initiatives across GCP, utilizing infrastructure-as-code (e.g., Terraform), CI/CD automation (e.g., Azure DevOps … or ER/Studio, ensuring models are optimized for performance, scalability, and maintainability. Data Integration & Pipeline Engineering: Design and implement resilient ETL/ELT workflows using technologies such as BigQuery, Dataflow, Informatica, or IBM DataStage, supporting real-time and batch processing. Governance & Data Quality Management: Establish data governance frameworks, including metadata management and quality assurance, using platforms like Unity …
South West, England, United Kingdom Hybrid / WFH Options
Interquest
infrastructure (design and lead implementation of enterprise-grade ETL and data pipeline solutions) - Take ownership of the data warehouse and related infrastructure - Experience working with cloud platforms (such as BigQuery, Snowflake, Azure) - Embed data governance strategies - Must be highly skilled in SQL and confident using other tools (such as Python, R, JavaScript) InterQuest Group is acting as an employment …
distributed systems Experience with modern data tools such as Airflow, dbt, Spark, or similar Familiarity with cloud platforms (AWS, Azure, or GCP) Understanding of data warehousing principles (e.g. Snowflake, BigQuery, Redshift) Strong communication and documentation skills in Dutch Ability to work independently and manage project deadlines Based in Belgium, or able to be on-site in Ghent several days …
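The job orchestrators mentioned above (Airflow and similar) schedule pipeline tasks as a directed acyclic graph. As a minimal illustration of the underlying idea only (the task names below are invented, and a real orchestrator adds scheduling, retries, and monitoring on top), Python's standard library can compute a valid execution order for such a graph:

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline DAG: each task maps to the set of tasks it
# depends on. Extracts have no dependencies; the transform needs both;
# the publish step runs last.
dag = {
    "extract_orders": set(),
    "extract_customers": set(),
    "transform_sales": {"extract_orders", "extract_customers"},
    "publish_dashboard": {"transform_sales"},
}

# static_order() yields a dependency-respecting execution order.
order = list(TopologicalSorter(dag).static_order())
print(order)
```

An Airflow DAG file expresses the same structure with operators and `>>` dependencies; the topological ordering is what guarantees a transform never runs before its source extracts.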
platforms (Tableau, Power BI, Looker) to inform product decisions. About You Experienced Product Owner with strong exposure to data products and platforms. Delivery experience with enterprise data warehouses (Snowflake, BigQuery), lakes, and CDPs. Understanding of data pipelines, contracts, lineage, APIs, and data governance frameworks. Comfortable navigating GDPR, CCPA and other compliance requirements. Familiar with dbt, Airflow, and cloud platforms …
data-driven culture within the organisation Essential Criteria - Bachelor's degree in Statistics, Mathematics, Computer Science, or a related quantitative discipline - 7+ years of experience with advanced SQL (Snowflake, BigQuery, Redshift, Oracle, PostgreSQL, MSSQL, etc.) - 5+ years of experience with reporting/visualization tools (Looker, Tableau, Power BI, etc.) - Strong knowledge of Looker/LookML highly desirable - Deep understanding …
Newark on Trent, Nottinghamshire, United Kingdom Hybrid / WFH Options
Future Prospects Group Ltd
data modelling techniques. Strong analytical and problem-solving skills with an ability to work independently in an agile development environment. Experience with data warehouse platforms (e.g., Snowflake, Azure Synapse, Redshift, BigQuery, or similar). Ability to work independently and manage multiple projects simultaneously. Excellent communication and collaboration skills. THE BENEFITS As a Data Warehouse Engineer, you will receive the following …
experience in a similar role. Ability to lead and mentor the architects. Required Skills: Mandatory Skills [at least 2 Hyperscalers]: GCP, AWS, Azure, Big Data, Apache Spark, Apache Beam on BigQuery/Redshift/Synapse, Pub/Sub/Kinesis/MQ/Event Hubs, Kafka, Dataflow/Airflow/ADF. Preferred Skills: Designing Databricks-based solutions for Azure/AWS …
years’ experience in a similar role. Ability to lead and mentor the architects. Mandatory Skills [at least 2 Hyperscalers]: GCP, AWS, Azure, Big Data, Apache Spark, Apache Beam on BigQuery/Redshift/Synapse, Pub/Sub/Kinesis/MQ/Event Hubs, Kafka, Dataflow/Airflow/ADF. Desirable Skills: Designing Databricks-based solutions for Azure/AWS, Jenkins …
infrastructure as code (IaC), automation, CI/CD pipelines, and application modernization on GCP. Serve as a subject matter expert on various GCP services, including Compute Engine, Kubernetes Engine, BigQuery, Cloud Storage, IAM, and more. Troubleshoot and resolve complex technical issues, providing expert guidance to project teams. Quality Assurance and Best Practices: Conduct regular reviews of project deliverables to … Solid understanding of networking principles and their application within GCP. Experience with containerization technologies (Docker, Kubernetes/GKE). Familiarity with data management and analytics services on GCP (e.g., BigQuery, Cloud Dataflow, Cloud Storage). Strong understanding of identity and access management (IAM) within GCP. Familiarity with other cloud platforms such as AWS and Azure is nice to have. Consulting and …