Python libraries for data wrangling, such as Pandas, NumPy, and SQLAlchemy. Experience working with traditional SQL databases (e.g. PostgreSQL, MySQL, SQL Server) and cloud data warehouses (e.g. Snowflake, Databricks, BigQuery, Redshift). Experience with time-series data, and/or implementing data pipelines from major financial market data vendors (Bloomberg, Refinitiv, FactSet…). SDLC and DevOps: Git, Docker, Jenkins/…
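As a rough illustration of the Pandas/NumPy/SQLAlchemy data wrangling this listing describes, here is a minimal sketch that pulls a time-series price table from PostgreSQL and derives returns and rolling volatility. The connection string, table, and column names are assumptions made for illustration, not anything specified by the employer.

```python
# Hypothetical sketch: load daily close prices from a PostgreSQL table into
# pandas via SQLAlchemy and compute simple time-series features.
# Connection string, table, and column names are illustrative assumptions.
import numpy as np
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("postgresql+psycopg2://user:password@localhost:5432/marketdata")

# Read a time-series table; parse the date column while loading.
prices = pd.read_sql(
    "SELECT trade_date, ticker, close_price FROM daily_prices",
    engine,
    parse_dates=["trade_date"],
)

# Pivot to one column per ticker and compute daily log returns.
wide = prices.pivot(index="trade_date", columns="ticker", values="close_price").sort_index()
log_returns = np.log(wide / wide.shift(1)).dropna(how="all")

# 21-day rolling volatility, annualised (assumes roughly 252 trading days).
rolling_vol = log_returns.rolling(window=21).std() * np.sqrt(252)
print(rolling_vol.tail())
```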
London (City of London), South East England, United Kingdom
Winston Fox
reinforcement learning, LLM orchestration, RAG systems, etc. Familiarity with cloud services and MLOps tooling to deploy and scale data and ML workloads cost-effectively. Familiarity with data warehouses (Redshift, BigQuery, Snowflake, etc.) and best practices around data pipeline tools. Strong proficiency in SQL. Comfort with ambiguity and short feedback loops. A passion for building products that make a real…
London, South East, England, United Kingdom Hybrid / WFH Options
Holland & Barrett International Limited
Analysts, and cross-functional squads to land scalable data solutions. Mentor mid-level analysts on analytics delivery, tooling, and stakeholder engagement. Work with a modern data stack, including Redshift, BigQuery, Matillion, and Retool. Location: This role can be based in London or Nuneaton, and may occasionally be required to travel to any other location of H&B.
experience with data modeling tools such as Erwin, ER/Studio, SqlDBM, or SQL DB-native modelling tools. Hands-on experience with cloud data platforms (Azure Synapse, Snowflake, GCP BigQuery) and modern data storage solutions. Strong SQL skills and good knowledge of data warehousing, lakehouses, and OLTP/OLAP systems. Demonstrable interest in and awareness of emerging technologies. What we…
Excellent written and verbal communication skills; presentation skills preferred. Proficiency in Collibra APIs, Java, Groovy, and REST integrations. Strong SQL knowledge and familiarity with common databases (Oracle, PostgreSQL, DB2, BigQuery, etc.). Experience with ETL/ELT tools (Informatica, DataStage, etc.) and BI platforms (Qlik, Power BI, etc.). Understanding of metadata standards, data lineage, reference data, and business…
Handsworth, Yorkshire and the Humber, United Kingdom
Vallum Associates
of classical and modern ML techniques, A/B testing methodologies, and experiment design. Solid background in ranking, recommendation, and retrieval systems. Familiarity with large-scale data tools (Hadoop, BigQuery, Amazon EMR, etc.). Experience with BI tools and visualization platforms such as Tableau, Qlik, or MicroStrategy. Bonus: Experience with geospatial data and advanced analytics platforms.
London (City of London), South East England, United Kingdom
oryxsearch.io
Strong SQL skills and mastery of Dataform, with experience designing clean, performant, and modular data models supporting attribution and funnel analysis. Deep experience with Google Cloud Platform (GCP), particularly BigQuery, including cost- and performance-optimized schema design. Advanced knowledge of Customer Data Platforms (Segment.io), including event stream management, identity resolution, and building a "Golden Profile." Hands-on experience designing…
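By way of a hedged illustration of the funnel analysis this listing mentions, the sketch below runs a simple three-step funnel aggregation against BigQuery from Python using the google-cloud-bigquery client. The project, dataset, table, and event names are assumptions; in practice the underlying models would be built and maintained in Dataform rather than ad-hoc queries.

```python
# Hypothetical sketch: a three-step funnel computed over an event table in
# BigQuery, run from Python. Project, dataset, table, and event names are
# illustrative assumptions.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

sql = """
SELECT
  COUNT(DISTINCT IF(event_name = 'page_view', user_id, NULL))   AS viewed,
  COUNT(DISTINCT IF(event_name = 'add_to_cart', user_id, NULL)) AS added_to_cart,
  COUNT(DISTINCT IF(event_name = 'purchase', user_id, NULL))    AS purchased
FROM `example-project.analytics.events`
WHERE event_date BETWEEN '2024-01-01' AND '2024-01-31'
"""

row = list(client.query(sql).result())[0]
print(
    f"viewed={row.viewed} added_to_cart={row.added_to_cart} purchased={row.purchased}",
    f"view->cart={row.added_to_cart / row.viewed:.1%}",
    f"cart->purchase={row.purchased / row.added_to_cart:.1%}",
)
```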
Position requires at least three (3) years of experience in each of the following skills: 1. Utilize knowledge of relational database management systems, advanced SQL skills, and expertise in BigQuery to process datasets, generate insights, and power data-driven solutions for daily operations. 2. Utilize knowledge of Big Data and Apache Spark (PySpark) to process massive datasets, enabling efficient…
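As a minimal, hedged sketch of the PySpark-style batch processing the second item refers to, the example below aggregates a large partitioned event dataset; the bucket paths and column names are illustrative assumptions only.

```python
# Hypothetical sketch: use PySpark to aggregate a large event dataset.
# Paths and column names are illustrative assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily-ops-aggregation").getOrCreate()

# Read partitioned Parquet data; Spark parallelises the scan across executors.
events = spark.read.parquet("gs://example-bucket/events/")

# Aggregate per customer per day, then write the results back out.
daily = (
    events
    .withColumn("event_date", F.to_date("event_timestamp"))
    .groupBy("customer_id", "event_date")
    .agg(
        F.count("*").alias("event_count"),
        F.sum("order_value").alias("total_order_value"),
    )
)

daily.write.mode("overwrite").parquet("gs://example-bucket/aggregates/daily/")
```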
strategy. You can tackle loosely defined problems and come up with relevant analytical approaches and impactful insights. You have proficiency with Python or similar programming languages, experience with Google BigQuery, and expertise in SQL. You have hands-on knowledge of A/B testing methodologies and experimentation at scale, including application of this knowledge to digital products. You enjoy sharpening…
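As a small illustration of the A/B testing methodology this listing asks for, the sketch below applies a two-proportion z-test to conversion counts of the kind that might come out of a warehouse query; the figures are made up purely for illustration.

```python
# Hypothetical sketch: two-proportion z-test for a simple A/B experiment.
# The conversion and exposure counts below are made-up illustrative values.
import numpy as np
from scipy import stats

conversions = np.array([1860, 1945])   # converted users: control, variant
exposures = np.array([42000, 41800])   # users exposed to each variant

rates = conversions / exposures
pooled = conversions.sum() / exposures.sum()
se = np.sqrt(pooled * (1 - pooled) * (1 / exposures[0] + 1 / exposures[1]))

z = (rates[1] - rates[0]) / se
p_value = 2 * stats.norm.sf(abs(z))    # two-sided p-value

print(f"control={rates[0]:.4%} variant={rates[1]:.4%} z={z:.2f} p={p_value:.4f}")
```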
you're going to need: Deep, hands-on experience designing and building data warehouses with a strong command of dimensional modeling (e.g., Kimball methodology). Expertise in Google Cloud Platform, especially BigQuery architecture, optimization, and cost management. Advanced SQL skills and production-level experience using dbt (or similar tools) to build modular, testable transformation pipelines. Practical mastery of LookML and semantic…
tooling to get bootstrapped quickly is a must. Frontend: TypeScript + React, App Router with React Server Components, Shadcn UI, Radix UI & Tailwind CSS (utility-first CSS framework). Data: BigQuery (cloud data warehouse), Dataform (SQL transformation framework), semantic layer/analytics API, Superset (presentation layer), Algolia/Vertex AI Search (search), PostgreSQL (app data). Infrastructure & DevOps: Google Cloud Platform…
and analysis. Strong stakeholder management and communication skills, with the ability to translate technical outputs into business-friendly insights. Experience with Snowflake (or alternative cloud data warehouses such as BigQuery or Redshift).
Nuneaton, Warwickshire, England, United Kingdom Hybrid / WFH Options
Holland & Barrett International Limited
Promotions teams to deliver data products. Contribute to sprint planning and backlog shaping as part of the Commercial Ops analytics roadmap. Work with a modern data stack, including Redshift, BigQuery, Matillion, and Retool. Location: This is a hybrid role, with 2 days per week expected in either our Nuneaton office (CV10 7RH) or… The Person: What We're Looking For…
London, South East, England, United Kingdom Hybrid / WFH Options
Holland & Barrett International Limited
Requirements / About You: Proficient in SQL, with experience querying and transforming large datasets across marketing, sales, and customer data sources. Comfortable working with modern data stacks, ideally including Redshift, BigQuery, Matillion, Metabase, and Retool. Strong grasp of performance marketing metrics such as ROAS, CAC, CTR, CPM, and attribution models. Experienced in building dashboards and visualisations that enable data-driven…
with middleware tools like Tray, Zapier, Make, n8n, and SuperBlocks, and extensive experience with rETL platforms and architecture (Polytomic, Hightouch, RudderStack, etc.). Familiarity with data warehouse platforms like Snowflake, BigQuery, Databricks, etc. Bonus If You: Have experience with other types of data storage, ranging from cache tools like Redis to S3. Have built systems integrating data from a variety of tools and…
London, South East, England, United Kingdom Hybrid / WFH Options
Holland & Barrett International Limited
with the ability to effectively present findings to both technical and non-technical stakeholders. Experience of A/B testing and experience in running retail-based tool analytics. Experience of BigQuery is desirable. Benefits: Pension company contribution = 3%. Incentive scheme up to 10% of annual salary, based on company performance. Your wellbeing is paramount, so you can get away and…
who are willing to relocate) GCP Data Engineer - GCP Dataflow and Apache Beam (key skills). Primary skills: PySpark, Spark, Python, Big Data, GCP, Apache Beam, Dataflow, Airflow, Kafka and BigQuery; GFO, Google Analytics; JavaScript is a must. Strong experience with Dataflow and BigQuery. The person should have led a team, or been responsible for a team, as a lead or senior … Cloud Platforms (preferably GCP) provided Big Data technologies. Hands-on experience with real-time streaming processing as well as high-volume batch processing, and skilled in advanced SQL, GCP BigQuery, Apache Kafka, data lakes, etc. Hands-on experience in Big Data technologies - Hadoop, Hive, and Spark - and an enterprise-scale Customer Data Platform (CDP). Experience in at least one … programming language (Python strongly preferred), cloud computing platforms (e.g., GCP), big data tools such as Spark/PySpark, columnar datastores (BigQuery preferred), DevOps processes/tooling (CI/CD, GitHub Actions), infrastructure as code frameworks (Terraform), BI tools (e.g. DOMO, Tableau, Looker), pipeline orchestration (e.g. Airflow). Fluency in data science/machine learning basics (model types, data prep, training…
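To make the Dataflow/Apache Beam requirement concrete, below is a minimal, hedged sketch of a Beam batch pipeline in Python that could run on Dataflow: it reads newline-delimited JSON events from GCS, counts events per user, and writes the result to BigQuery. The project, bucket, table, and field names are assumptions, not the employer's actual pipeline.

```python
# Hypothetical sketch: a minimal Apache Beam batch pipeline runnable on
# Dataflow. Bucket, project, table, and field names are illustrative
# assumptions only.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_event(line):
    """Parse one JSON event line into a (user_id, 1) pair for counting."""
    event = json.loads(line)
    return (event["user_id"], 1)


options = PipelineOptions(
    runner="DataflowRunner",          # use "DirectRunner" for local testing
    project="example-project",
    region="europe-west2",
    temp_location="gs://example-bucket/tmp",
)

with beam.Pipeline(options=options) as p:
    (
        p
        | "ReadEvents" >> beam.io.ReadFromText("gs://example-bucket/events/*.json")
        | "ParseJson" >> beam.Map(parse_event)
        | "CountPerUser" >> beam.CombinePerKey(sum)
        | "ToRow" >> beam.Map(lambda kv: {"user_id": kv[0], "event_count": kv[1]})
        | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
            "example-project:analytics.user_event_counts",
            schema="user_id:STRING,event_count:INTEGER",
            write_disposition=beam.io.BigQueryDisposition.WRITE_TRUNCATE,
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
        )
    )
```

The same pipeline structure extends to streaming by swapping the text source for a Kafka or Pub/Sub read and adding windowing, which is the real-time processing the listing also mentions.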