similar (for real-time data streaming). Experience using data tools in at least one cloud service - AWS, Azure or GCP (e.g. S3, EMR, Redshift, Glue, Azure Data Factory, Databricks, BigQuery, Dataflow, Dataproc). Would you like to join us as we work hard, have fun and make history?
promised outcomes. Drive high client value and broaden relationships at senior levels with current and prospective clients. Our Tech Stack - Cloud: Azure, sometimes GCP & AWS; Data Platform: Databricks, Snowflake, BigQuery; Data Engineering tools: PySpark, Polars, DuckDB, Malloy, SQL; Infrastructure-as-code: Terraform, Pulumi; Data Management and Orchestration: Airflow, dbt; Databases and Data Warehouses: SQL Server, PostgreSQL, MongoDB, Qdrant, Pinecone
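For context on how two of the listed engines fit together, below is a minimal sketch (not from the posting) combining Polars and DuckDB in a single transformation step; the file, table, and column names are illustrative assumptions.

```python
# Minimal sketch, assuming a local Parquet extract; all names are illustrative.
import duckdb
import polars as pl

# Load a hypothetical orders extract into a Polars DataFrame.
orders = pl.read_parquet("orders.parquet")

# DuckDB can reference the Polars DataFrame above by its Python variable name,
# so the SQL step runs directly over the in-memory Arrow data.
daily_revenue = duckdb.sql(
    """
    SELECT order_date, SUM(amount) AS revenue
    FROM orders
    GROUP BY order_date
    ORDER BY order_date
    """
).pl()  # return the result as a Polars DataFrame

print(daily_revenue.head())
```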
City of London, London, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
to shape commercially viable and technically sound solutions. Enterprise Solution Design: Architect and oversee the delivery of scalable data platforms (data lakes, lakehouses, warehouses) using GCP technologies such as BigQuery, Cloud Storage, Databricks, and Snowflake. Cloud Data Strategy: Lead cloud migration and modernisation strategies using GCP and tools like Terraform, CI/CD pipelines, Azure DevOps, and GitHub. Data
modern data lake architectures. Advanced proficiency in Python (including PySpark) and SQL, with experience building scalable data pipelines and analytics workflows. Strong background in cloud-native data infrastructure (e.g., BigQuery, Redshift, Snowflake, Databricks). Demonstrated ability to lead teams, set technical direction, and collaborate effectively across business and technology functions. Desirable skills: Familiarity with machine learning pipelines and MLOps practices
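As a hedged illustration of what "scalable data pipelines" typically means with PySpark (an assumption for this listing, not its actual codebase), a minimal batch aggregation with a partitioned output; the bucket paths and column names are hypothetical.

```python
# Minimal PySpark batch step; paths and columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("events_daily_agg").getOrCreate()

# Read raw event data from a hypothetical object-store location.
events = spark.read.parquet("s3://example-bucket/raw/events/")

daily = (
    events
    .withColumn("event_date", F.to_date("event_ts"))
    .groupBy("event_date", "event_type")
    .agg(F.count("*").alias("event_count"))
)

# A partitioned write keeps downstream scans cheap as data volumes grow.
daily.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3://example-bucket/curated/events_daily/"
)
```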
London, South East, England, United Kingdom Hybrid / WFH Options
Awin
equivalent (expired is acceptable). Working knowledge of SQL and data modelling concepts. Experience with BI tools (e.g., Power BI, Looker, Tableau). Familiarity with cloud data platforms such as Snowflake, BigQuery, or AWS Redshift. Understanding of modern data architecture and APIs. Our Offer - Flexi-Week and Work-Life Balance: We prioritise your mental health and wellbeing, offering you a flexible
data accuracy, availability, security, and compliance. Partner with leadership to shape the company's tech and data strategy. Experience: Proven experience in data engineering, analytics, and reporting (SQL, Google BigQuery, Power BI, Looker, Tableau). Strong capability with API integrations and low/no-code tools (n8n, Zapier, Airtable, Make). Broad technical problem-solving skills, able to implement
London, City of London, United Kingdom Hybrid / WFH Options
Datatech
technical topics for non-technical audiences. Team player - Collaborative, curious, and happy to work in a self-organising, fast-paced environment. Nice-to-Have Skills: Experience with GCP/BigQuery or other cloud data warehouses (e.g., Snowflake, Redshift). Familiarity with data orchestration tools (e.g., Airflow). Experience with data visualisation platforms (such as Preset.io/Apache Superset or
City of London, London, United Kingdom Hybrid / WFH Options
McGregor Boyall Associates Limited
We're Looking For: Hands-on Palantir Foundry expertise or transferable, client-facing data engineering experience with large-scale data platforms such as Snowflake, Databricks, AWS Glue/Redshift, and Google BigQuery. Software engineering skills in Python, Java, or TypeScript/React. Strong data modelling, pipeline development, and API design experience. Excellent problem-solving and communication skills. Why This Role Stands
London, South East, England, United Kingdom Hybrid / WFH Options
Office Angels
cloud platforms, preferably Azure (AWS, GCP experience is also valuable). Experience managing large and complex datasets, with a strong command of SQL for cloud-based environments (Fabric, Snowflake, BigQuery, Redshift, etc.). A solid understanding of data modelling techniques (star schema, data vault, dimensional modelling). Proficiency in Excel-based data workflows for various Agile Retail projects. Hands
Central London, London, United Kingdom Hybrid / WFH Options
Singular Recruitment
ETL/ELT Pipelines: Proven experience designing, building, and maintaining production-grade data pipelines using Google Cloud Dataflow (Apache Beam) or similar technologies. GCP Stack: Hands-on expertise with BigQuery, Cloud Storage, Pub/Sub, and orchestrating workflows with Composer or Vertex Pipelines. Data Architecture & Modelling: Ability to translate diverse business requirements into scalable data models and architect a
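For readers unfamiliar with Dataflow, a hedged sketch of the kind of Apache Beam pipeline the posting describes: reading JSON events from Cloud Storage and loading them into BigQuery. The project, bucket, dataset, and schema are assumptions for illustration, not the client's actual setup.

```python
# Illustrative Beam pipeline; launch with --runner=DataflowRunner, --project, --region,
# and --temp_location to run it on Dataflow. All resource names are hypothetical.
import json
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def parse_event(line: str) -> dict:
    """Parse one JSON line into a row matching the BigQuery schema below."""
    record = json.loads(line)
    return {"user_id": record["user_id"], "event_type": record["event_type"]}

options = PipelineOptions()

with beam.Pipeline(options=options) as p:
    (
        p
        | "ReadFromGCS" >> beam.io.ReadFromText("gs://example-bucket/raw/events/*.json")
        | "Parse" >> beam.Map(parse_event)
        | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
            "example-project:analytics.events",
            schema="user_id:STRING,event_type:STRING",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
        )
    )
```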
Understanding of key SaaS metrics and customer lifecycle analytics. Excellent communication skills with the ability to translate complex data into clear business recommendations. Experience working with big data platforms (BigQuery, Snowflake, Redshift). Experience working with dbt. Familiarity with product management tools (Jira, Confluence). Knowledge of statistical methods for hypothesis testing and forecasting. Benefits: Hybrid working (2 days in London
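As a small worked example of the hypothesis-testing skill mentioned above (not part of the posting), a chi-squared test comparing the conversion rates of two hypothetical onboarding variants.

```python
# Hypothetical A/B conversion counts; illustrative only.
from scipy import stats

conversions = [342, 389]   # trial-to-paid conversions per variant
exposures = [5000, 5000]   # users exposed to each variant

# 2x2 contingency table: converted vs. not converted, per variant.
table = [
    [conversions[0], exposures[0] - conversions[0]],
    [conversions[1], exposures[1] - conversions[1]],
]
chi2, p_value, dof, _ = stats.chi2_contingency(table)
print(f"chi2={chi2:.2f}, p={p_value:.3f}")  # p < 0.05 would suggest a real difference
```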
large-scale datasets. Excellent communication skills and ability to present complex ideas to non-technical stakeholders. Experience with cloud platforms (AWS, GCP, Azure) and tools like Databricks, Snowflake, or BigQuery. Desirable Skills: Familiarity with MLOps, model monitoring, and production deployment. Experience in a specific domain (e.g., marketing, operations, fraud, personalization, NLP) is a plus. Hands-on experience with LLM
output, focusing on clarity, documentation, scalability, and reproducibility. Embed insights into key product and tech forums, influencing strategic direction and performance outcomes. Leverage a modern data stack including Redshift, BigQuery, Matillion, and Retool to drive efficiency and depth in analytics. The Person - What We're Looking For - Leadership: Proven ability to lead and develop analytics teams, fostering a culture
London, South East, England, United Kingdom Hybrid / WFH Options
Holland & Barrett International Limited
deliver end-to-end data products. Mentor mid-level analysts and contribute to capability-building across the Core Business Analytics team. Work with a modern data stack, including Redshift, BigQuery, Matillion, and Retool. Location: This is a hybrid role, with 2 days per week expected in either our London or Nuneaton office. The Person - Core Skills & Behaviours: SQL expertise
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
channels. What you'll be doing: Build dashboards & automate reporting across Paid Search, Social, Display, Email & Organic. Define KPIs and translate data into clear business insights. Use SQL, Python, BigQuery, Tableau (or similar BI tools) to drive scalable solutions. Partner with marketing, sales, and finance to optimise performance. Contribute to advanced analytics projects: media mix modelling, lead scoring, attribution
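To make the SQL/Python/BigQuery combination above concrete, a hedged sketch of a reporting query run through the google-cloud-bigquery client; the project, dataset, and column names are hypothetical, and the result could feed a Tableau extract or dashboard layer.

```python
# Illustrative channel-level KPI pull; all resource and column names are hypothetical.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

query = """
    SELECT channel,
           DATE(session_ts) AS day,
           COUNT(*) AS sessions,
           SUM(conversion) AS conversions
    FROM `example-project.marketing.sessions`
    WHERE session_ts >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 28 DAY)
    GROUP BY channel, day
    ORDER BY day, channel
"""

df = client.query(query).to_dataframe()  # pandas DataFrame for downstream reporting
print(df.head())
```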
London, South East, England, United Kingdom Hybrid / WFH Options
Holland & Barrett International Limited
Analysts, and cross-functional squads to land scalable data solutions. Mentor mid-level analysts on analytics delivery, tooling, and stakeholder engagement. Work with a modern data stack, including Redshift, BigQuery, Matillion, and Retool. Location: This role can be based in London or Nuneaton, and may occasionally require travel to any other H&B location.
with dbt for data transformations. Hands-on experience with Apache Airflow or Prefect. Proficiency with GitHub, GitHub Actions, and CI/CD pipelines. Nice to have: Experience with GCP (BigQuery, Dataflow, Composer, Pub/Sub) or AWS. Familiarity with Snowflake. Knowledge of object-oriented programming (Kotlin, Java, or C#). Retail data experience (sales, inventory, or related datasets).
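A minimal sketch of the Airflow-plus-dbt pattern this listing implies, assuming Airflow 2.4+ and a dbt project at a hypothetical path; it is illustrative only, not the employer's actual DAG.

```python
# Illustrative DAG: run an ingest script, then refresh dbt models. Paths are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_dbt_refresh",
    start_date=datetime(2024, 1, 1),
    schedule="0 6 * * *",  # daily at 06:00; `schedule` requires Airflow 2.4+
    catchup=False,
) as dag:
    ingest = BashOperator(
        task_id="ingest_raw_data",
        bash_command="python /opt/pipelines/ingest.py",
    )
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="cd /opt/dbt_project && dbt run --profiles-dir .",
    )
    ingest >> dbt_run
```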
ability to translate ambiguous business problems into scalable, production-ready data science solutions. Hands-on experience with cloud platforms (AWS, GCP, Azure) and tools such as Databricks, Snowflake, and BigQuery. Exceptional communication and stakeholder management skills, with the ability to influence decision-making across technical and non-technical teams. Desirable skills: Familiarity with real-time model serving, monitoring, and
and existing diseases, and a pattern of continuous learning and development is mandatory. Key Responsibilities: Build data pipelines using modern data engineering tools on Google Cloud: Python, Spark, SQL, BigQuery, Cloud Storage. Ensure data pipelines meet the specific scientific needs of data-consuming applications. Responsible for high-quality software implementations according to best practices, including automated test suites and
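As a hedged illustration of "best practices, including automated test suites", a small PySpark transformation kept free of I/O so it can be unit-tested with pytest; the column names and threshold are assumptions, not taken from the role.

```python
# Keep transformation logic separate from I/O so it stays unit-testable; names are illustrative.
from pyspark.sql import DataFrame, SparkSession, functions as F

def flag_qc_failures(readings: DataFrame, threshold: float = 0.95) -> DataFrame:
    """Add a boolean qc_pass column based on a hypothetical quality score."""
    return readings.withColumn("qc_pass", F.col("quality_score") >= threshold)

def test_flag_qc_failures():
    # Runnable with pytest against a local Spark session.
    spark = SparkSession.builder.master("local[1]").getOrCreate()
    df = spark.createDataFrame(
        [("s1", 0.99), ("s2", 0.90)], ["sample_id", "quality_score"]
    )
    result = {r["sample_id"]: r["qc_pass"] for r in flag_qc_failures(df).collect()}
    assert result == {"s1": True, "s2": False}
```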
where relevant. Tech you'll likely use - LLM frameworks: LangChain, LlamaIndex (or similar); Cloud & Dev: Azure/AWS/GCP, Docker, REST APIs, GitHub Actions/CI; Data & MLOps: BigQuery/Snowflake, MLflow/DVC, dbt/Airflow (nice to have); Front ends (for internal tools): Streamlit/Gradio/basic React. Must-have experience: 7+ years in Data