Central London, London, United Kingdom Hybrid / WFH Options
Singular Recruitment
ETL/ELT Pipelines: Proven experience designing, building, and maintaining production-grade data pipelines using Google Cloud Dataflow (Apache Beam) or similar technologies. GCP Stack: Hands-on expertise with BigQuery, Cloud Storage, Pub/Sub, and orchestrating workflows with Composer or Vertex Pipelines. Data Architecture & Modelling: Ability to translate diverse business requirements into scalable data models and architect a …
changes using offline and online methods. Navigating ambiguity and turning complex, messy data into actionable insights. Proficiency in SQL and in navigating large-scale data lakes and warehouses (e.g. Google BigQuery, Redshift). Experience with cloud platforms such as AWS or GCP is a plus! At JET, this is on the menu: Our teams forge connections internally and work with some …
output—focusing on clarity, documentation, scalability, and reproducibility. Embed insights into key product and tech forums, influencing strategic direction and performance outcomes. Leverage a modern data stack including Redshift, BigQuery, Matillion, and Retool to drive efficiency and depth in analytics. The Person What We're Looking For Leadership: Proven ability to lead and develop analytics teams, fostering a culture …
London, South East, England, United Kingdom Hybrid / WFH Options
Holland & Barrett International Limited
cycles. Champion the use of data across the commercial org, raising data literacy and coaching stakeholders on how to use insights. Work with a modern data stack, including Redshift, BigQuery, Matillion, and Retool. Location: This role is based in Nuneaton and requires occasional travel to other locations of H&B. We support flexibility and productivity of our …
London, South East, England, United Kingdom Hybrid / WFH Options
Holland & Barrett International Limited
deliver end-to-end data products. Mentor mid-level analysts and contribute to capability-building across the Core Business Analytics team. Work with a modern data stack, including Redshift, BigQuery, Matillion, and Retool. Location: This is a hybrid role, with 2 days per week expected in either our London or Nuneaton office. The Person Core Skills & Behaviours SQL expertise …
What We’re Looking For Strong technical foundation with proficiency in Python (Pandas, NumPy, Scikit-learn), SQL, and cloud platforms (GCP or AWS). Experience with modern data warehouses (BigQuery, Snowflake, Redshift). Proven experience in deploying machine learning models or optimisation algorithms into production. Solid understanding of digital marketing concepts, platforms (e.g., Google Ads, Meta), and analytics tools. …
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Pontoon
ability to deliver interlocking benefits across teams and platforms. Strong statistical grounding, including expertise in forecasting, clustering, optimization, and predictive modeling. Proficiency in Python, SQL, and cloud platforms (especially BigQuery). Commercial acumen in leveraging data to shape strategy and unlock business value. Exceptional data storytelling skills, translating complex models into engaging narratives. Experience in scaling proof-of-concept …
engineering or a similar role. Strong expertise in SQL with the ability to write efficient, complex queries. Proficiency in dbt for data modeling and transformation. Hands-on experience with BigQuery and other GCP data services. Solid understanding of data warehousing principles and best practices. Basic to intermediate skills in Python for scripting and automation. Familiarity with version control systems …
with dbt for data transformations. Hands-on experience with Apache Airflow or Prefect. Proficiency with GitHub, GitHub Actions, and CI/CD pipelines. Nice to have: Experience with GCP (BigQuery, Dataflow, Composer, Pub/Sub) or AWS. Familiarity with Snowflake. Knowledge of object-oriented programming (Kotlin, Java, or C#). Retail data experience (sales, inventory, or related datasets). …
Nuneaton, Warwickshire, England, United Kingdom Hybrid / WFH Options
Holland & Barrett International Limited
to deliver end-to-end data products. Mentor mid-level analysts and help strengthen capability across the Commercial Ops analytics team. Work with a modern data stack, including Redshift, BigQuery, Matillion, and Retool. Location: This is a hybrid role based in Nuneaton, with the requirement to be in the office at least two days per week. The Person What …
ability to translate ambiguous business problems into scalable, production-ready data science solutions Hands-on experience with cloud platforms (AWS, GCP, Azure) and tools such as Databricks, Snowflake, and BigQuery Exceptional communication and stakeholder management skills, with the ability to influence decision-making across technical and non-technical teams Desirable skills Familiarity with real-time model serving, monitoring, and …
Stevenage, Hertfordshire, United Kingdom Hybrid / WFH Options
GlaxoSmithKline
data solutions on cloud platforms. Key Placement Learnings: Design and build data pipelines using Python, R, Nextflow; work on GCP to automate data flows. Interact with databases (e.g., PostgreSQL, BigQuery) to manage data and create views. Create analysis tools and dashboards to communicate results for decision-making. Work in multidisciplinary teams and develop presentation and communication skills. Role …
experience in cybersecurity engineering, cloud security, or IT security. Hands-on experience with: Google Admin Console, Security Center, Google Vault, Cloud Identity GCP services: Compute Engine, VPC, IAM, GKE, BigQuery Proficiency with infrastructure-as-code tools: Terraform, Ansible, Deployment Manager. Familiarity with CI/CD tools: GitLab CI, Jenkins, Cloud Build. Strong background in Linux systems, networking, and containerization …
in technology delivery, with 3+ years leading cloud data projects (GCP preferred). Must have experience delivering data platforms in banking or financial services. Strong knowledge of GCP services (BigQuery, Dataflow, Pub/Sub). Familiarity with ETL/ELT, data lakehouse architectures, and cloud integration. Excellent leadership, communication, and stakeholder management skills. Understanding of data governance frameworks (GDPR …
and existing diseases, and a pattern of continuous learning and development is mandatory. Key Responsibilities Build data pipelines using modern data engineering tools on Google Cloud: Python, Spark, SQL, BigQuery, Cloud Storage. Ensure data pipelines meet the specific scientific needs of data consuming applications. Responsible for high quality software implementations according to best practices, including automated test suites and …
will too. What you'll need: Confirmed experience as a Data Solution Architect with hands-on expertise in GCP and/or Azure Strong proficiency in GCP services including BigQuery, Dataflow, Pub/Sub, Cloud Composer and Looker Experience migrating data from legacy platforms to cloud-native environments Expertise in data modelling, pipeline development and real-time data streaming …
with a multi-person engineering team, leveraging modern practices like code reviews, CI/CD, and trunk-based development. Experience in cloud ecosystems, particularly Google Cloud Platform services like BigQuery, Pub/Sub, or Cloud Functions. Required Skills/Abilities Proficiency in software development using modern programming languages like Python, Java, Go, or similar tools. Strong understanding of microservices …
applications Hands-on experience with cloud data platforms (AWS/GCP/Azure) Proficiency with big data technologies (Spark, Kafka, or similar streaming platforms) Experience with data warehouses (Snowflake, BigQuery, Redshift) and data lakes Knowledge of containerization (Docker/Kubernetes) and infrastructure as code Preferred Experience Experience building web applications with modern frameworks (React, Vue, or Angular) API development …
where relevant. Tech you'll likely use LLM frameworks: LangChain, LlamaIndex (or similar) Cloud & Dev: Azure/AWS/GCP, Docker, REST APIs, GitHub Actions/CI Data & MLOps: BigQuery/Snowflake, MLflow/DVC, dbt/Airflow (nice to have) Front ends (for internal tools): Streamlit/Gradio/basic React Must-have experience 7+ years in Data …
with the ability to effectively present findings to both technical and non-technical stakeholders. Experience of A/B testing and of running retail-based tool analytics. Experience of BigQuery is desirable. Benefits Pension company contribution = 3% Incentive scheme up to 10% of annual salary, based on company performance. Your wellbeing is paramount so you can get away and …