Senior Data Engineer London (Hybrid) We are Manufacturing the Future! Geomiq is revolutionizing traditional manufacturing by providing engineers worldwide with instant access to reliable production methods through our digital platform. As the UK's leading Digital Manufacturing Marketplace, we offer …
Experience as a Data Product Owner or Product Owner for data/analytics platforms. Experience managing data products or platforms, ideally customer data platforms (CDPs), data warehouses (e.g. Snowflake, BigQuery), or data lakes. Familiarity with data engineering workflows, data tools (dbt, Airflow), and cloud data platforms (AWS, GCP, Azure). Familiarity with data modelling, data pipelines, ETL/ELT …
Newark on Trent, Nottinghamshire, United Kingdom Hybrid / WFH Options
Future Prospects Group Ltd
data modelling techniques. Strong analytical and problem-solving skills, with the ability to work independently in an agile development environment. Experience with data warehouse platforms (e.g., Snowflake, Azure Synapse, Redshift, BigQuery, or similar). Ability to work independently and manage multiple projects simultaneously. Excellent communication and collaboration skills. THE BENEFITS As a Data Warehouse Engineer, you will receive the following …
years’ experience in a similar role. Ability to lead and mentor architects. Mandatory Skills [at least 2 Hyperscalers]: GCP, AWS, Azure, Big Data, Apache Spark, Beam on BigQuery/Redshift/Synapse, Pub/Sub/Kinesis/MQ/Event Hubs, Kafka, Dataflow/Airflow/ADF. Desirable Skills: Designing Databricks-based solutions for Azure/AWS, Jenkins …
You Strong experience as a Product Owner or Data Product Owner in Agile environments. Experience managing data products or platforms, ideally customer data platforms (CDPs), data warehouses (e.g. Snowflake, BigQuery), or data lakes. Familiarity with data modelling, data pipelines, ETL/ELT processes, and APIs. Knowledge of GDPR, data ethics, and governance practices. Strong understanding of data quality frameworks …
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Motability Operations Limited
About You Strong experience as a Product Owner or Data Product Owner in Agile environments. Ideally, experience managing Finance and data products or platforms, in particular data warehouses (e.g. Snowflake, BigQuery) or data lakes. Familiarity with data modelling, data pipelines, ETL/ELT processes, and APIs. Strong understanding of data quality frameworks, data contracts, and lineage. Proficient in using analytics …
Bristol, Avon, South West, United Kingdom Hybrid / WFH Options
Motability Operations
needs. Qualifications Strong experience as a Product Owner or Data Product Owner in Agile environments. Ideally, experience managing Finance and data products or platforms, in particular data warehouses (e.g. Snowflake, BigQuery) or data lakes. Familiarity with data modelling, data pipelines, ETL/ELT processes, and APIs. Strong understanding of data quality frameworks, data contracts, and lineage. Proficient in using analytics …
Employment Type: Permanent, Part Time, Work From Home
Bristol, Avon, South West, United Kingdom Hybrid / WFH Options
Motability Operations
Qualifications Strong experience as a Product Owner or Data Product Owner in Agile environments. Experience managing data products or platforms, ideally customer data platforms (CDPs), data warehouses (e.g. Snowflake, BigQuery), or data lakes. Familiarity with data modelling, data pipelines, ETL/ELT processes, and APIs. Knowledge of GDPR, data ethics, and governance practices. Strong understanding of data quality frameworks …
Employment Type: Permanent, Part Time, Work From Home
expertise to grow top-line revenues and guide commercial initiatives from our data. You'll own the analysis of the end-to-end customer journey, using our data stack (BigQuery, dbt, Hex) to create data models, data products, and metrics, and to find insights that fuel our growth. You'll work closely with other engineers, marketers, product teams, and commercial teams …
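For context on what day-to-day analysis against such a stack can look like, here is a minimal, illustrative Python sketch (not part of the advert) that queries a hypothetical BigQuery events table to compute a simple signup-to-purchase conversion rate; the project, dataset, table, and column names are assumed placeholders.

```python
# Minimal sketch: query a hypothetical BigQuery events table and compute a
# simple signup-to-purchase conversion rate. Project, dataset, table and
# column names are illustrative placeholders, not taken from the listing.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # assumes default application credentials

query = """
    SELECT
        COUNT(DISTINCT IF(event_name = 'signup', user_id, NULL))   AS signups,
        COUNT(DISTINCT IF(event_name = 'purchase', user_id, NULL)) AS purchasers
    FROM `my-project.analytics.events`
    WHERE event_date BETWEEN '2024-01-01' AND '2024-01-31'
"""

row = list(client.query(query).result())[0]
conversion = row.purchasers / row.signups if row.signups else 0.0
print(f"Signup -> purchase conversion: {conversion:.1%}")
```

In a stack like the one described, the SQL would typically live in dbt models rather than application code, with a notebook tool such as Hex sitting on top for exploration and sharing.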
management practices, system development life cycle management, IT services management, agile and lean methodologies, infrastructure and operations, and EA and ITIL frameworks. Proficiency with data warehousing solutions (e.g., Google BigQuery, Snowflake). Expertise in data modeling tools and techniques (e.g., SAP PowerDesigner, Sparx EA). Strong knowledge of SQL and NoSQL databases (e.g., MongoDB, Cassandra). Familiarity with cloud …
South West, England, United Kingdom Hybrid / WFH Options
Interquest
infrastructure (design and lead implementation of enterprise-grade ETL and data pipeline solutions) - Take ownership of the data warehouse and related infrastructure - Experience working with cloud platforms (such as BigQuery, Snowflake, Azure) - Embed data governance strategies - Must be highly skilled in SQL and confident using other tools (such as Python, R, JavaScript) InterQuest Group is acting as an employment …
Streams)? Which statement best describes your hands-on responsibility for architecting and tuning cloud-native data lake/warehouse solutions (e.g., AWS S3 + Glue/Redshift, GCP BigQuery, Azure Synapse)? What best reflects your experience building ETL/ELT workflows with Apache Airflow (or similar) and integrating them into containerised CI/CD pipelines (Docker, GitHub …
Leicester, Leicestershire, United Kingdom Hybrid / WFH Options
Effect
and optimisation efforts with detailed recommendations. Establish and maintain data governance practices to ensure data quality and security. Identify opportunities to integrate and streamline data using tools like Google BigQuery and Snowflake. New Business Development: Provide insights and data support for new business pitches and case study development. Collaborate with teams to analyse user behaviour and campaign performance for … Proficiency in BI tools like Looker Studio and Power BI. Strong knowledge of dataLayer customisation for advanced tracking needs. Experience with ETL processes and data pipelines. Familiarity with Google BigQuery, SQL, and performance marketing analytics. Ability to analyse user journeys and identify optimisation opportunities. What we can offer you: Flexible Working: Work from home …
Central London, London, United Kingdom Hybrid / WFH Options
Singular Recruitment
ETL/ELT Pipelines: Proven experience designing, building, and maintaining production-grade data pipelines using Google Cloud Dataflow (Apache Beam) or similar technologies. GCP Stack: Hands-on expertise with BigQuery, Cloud Storage, Pub/Sub, and orchestrating workflows with Composer or Vertex Pipelines. Data Architecture & Modelling: Ability to translate diverse business requirements into scalable data models and architect a …
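To make the stack named above concrete, the following is a minimal, illustrative Apache Beam (Dataflow) pipeline sketch rather than code from the employer: it streams JSON messages from a Pub/Sub subscription into an existing BigQuery table. The project, subscription, table, and field names are assumed placeholders.

```python
# Illustrative Apache Beam (Dataflow) pipeline: read JSON events from a
# Pub/Sub subscription, parse them, and append rows to an existing BigQuery
# table. All resource names below are hypothetical placeholders.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_event(message: bytes) -> dict:
    """Decode a Pub/Sub payload into a flat dict matching the target schema."""
    event = json.loads(message.decode("utf-8"))
    return {
        "user_id": event.get("user_id"),
        "event_type": event.get("event_type"),
        "event_ts": event.get("event_ts"),
    }


options = PipelineOptions(streaming=True)  # runner/project/region passed as CLI flags

with beam.Pipeline(options=options) as pipeline:
    (
        pipeline
        | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
            subscription="projects/my-project/subscriptions/events-sub")
        | "ParseJSON" >> beam.Map(parse_event)
        | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
            "my-project:analytics.events",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,  # table assumed to exist
        )
    )
```

On Dataflow the same script is typically launched with the DataflowRunner via command-line flags, with Composer (Airflow) or Vertex Pipelines handling scheduling and orchestration around it.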
Luton, Bedfordshire, South East, United Kingdom Hybrid / WFH Options
Anson Mccade
each use case. Support knowledge sharing and technical best practice. Essential Experience: Proven expertise in building data warehouses and ensuring data quality on GCP. Strong hands-on experience with BigQuery, Dataproc, Dataform, Composer, Pub/Sub. Skilled in PySpark, Python and SQL. Solid understanding of ETL/ELT processes. Clear communication skills and ability to document processes effectively. Desirable …
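As a purely illustrative sketch of the kind of PySpark job such a role might run on Dataproc (not taken from the advert), the snippet below reads raw CSV files from Cloud Storage, applies light cleaning, and appends the result to a BigQuery table. The bucket, table, and column names are assumed placeholders, and the spark-bigquery connector is assumed to be available on the cluster.

```python
# Illustrative PySpark job (e.g., on Dataproc): read raw CSV from Cloud
# Storage, clean it, and append to BigQuery. Bucket, table and column names
# are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-elt").getOrCreate()

# Read raw files from a hypothetical landing bucket.
raw = spark.read.option("header", True).csv("gs://my-raw-bucket/orders/*.csv")

# Light cleaning: type casting and de-duplication on the business key.
cleaned = (
    raw.withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("amount", F.col("amount").cast("double"))
       .dropDuplicates(["order_id"])
)

# Requires the spark-bigquery connector on the cluster classpath.
(cleaned.write.format("bigquery")
    .option("table", "my-project.analytics.orders")
    .option("temporaryGcsBucket", "my-temp-bucket")
    .mode("append")
    .save())
```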
marketing analytics for Marketing Mix Modelling, Forecasting, and Predictive Analysis. Experience with data transformation and parsing using SQL queries, Python, or R. Experience with cloud data warehouses (e.g., Snowflake, BigQuery), visualization tools (e.g., Tableau, Looker), and web analytics platforms. If you believe you have the relevant experience, please reply to this advert or email your CV to …
focus on backend technologies and building distributed services. Proficiency in one or more programming languages including Java, Python, Scala or Golang. Experience with columnar, analytical cloud data warehouses (e.g., BigQuery, Snowflake, Redshift) and data processing frameworks like Apache Spark is essential. Experience with cloud platforms like AWS, Azure, or Google Cloud. Strong proficiency in designing, developing, and deploying microservices …
advance measurement methodologies. Proficiency in common data science coding languages such as SQL, Python and/or R. Practical experience with Google Cloud Platform and services such as BigQuery, Looker, and Dataproc. Additional Information: The Power of One starts with our people! To do powerful things, we offer powerful resources. Our best-in-class wellness and benefits offerings …
data solutions. Migrating from Azure to GCP. Automate data lifecycle processes and optimise cloud resource usage. Support analytics and self-service reporting initiatives. Skills & Experience: Strong experience with GCP, BigQuery, Dataflow. Proficiency in Python and Terraform. Skilled in ETL/ELT, data modelling, and metadata management. Familiarity with CI/CD, DevOps, and Infrastructure-as-Code. Knowledge of insurance …