You'll be given the space to lead your domain, backed by support and guidance when needed, but never micromanaged. What You'll Bring Solid technical foundation, especially in SQL, BigQuery, and data pipelines. A curious mindset with the ability to translate messy data into clean insight — ideally from a background in marketing analytics, data science, or MarTech. More ❯
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
Lloyds Banking Group
Microsoft Graph API) IN ADDITION, THE BELOW WOULD BE NICE TO HAVE (DEPENDING ON THE TEAM) Lab 1: Cloud Enterprise and Computer Security Data & Analytics (KQL/SQL or BigQuery for GCP) Power Platform and PowerShell Lab 2: Security Operations SIEM management Advanced logging DLP technical policy development Ability to build and train machine learning models to address business More ❯
you're going to need Deep, hands-on experience designing and building data warehouses with strong command of dimensional modeling (e.g., Kimball methodology) Expertise in Google Cloud Platform, especially BigQuery architecture, optimization, and cost management Advanced SQL skills and production-level experience using dbt (or similar tools) to build modular, testable transformation pipelines Practical mastery of LookML and semantic More ❯
use of PubSub and Cloud Storage React frontend, using ShadCN for components, TailwindCSS for styling, React Query for state management Posthog for frontend analytics (events, sessions, feature flags, experiments) BigQuery as our data warehouse, with Metabase for data visualization. Production data and Posthog events both stream into it so data is in one place. Sentry and Google Cloud Logging More ❯
Salford, Manchester, United Kingdom Hybrid / WFH Options
Lloyds Bank plc
Microsoft Graph API) IN ADDITION, THE BELOW WOULD BE NICE TO HAVE (DEPENDING ON THE TEAM) Lab 1: Cloud Enterprise and Computer Security Data & Analytics (KQL/SQL or BigQuery for GCP) Kubernetes (K8s) Power Platform and PowerShell Lab 2: Security Operations SIEM management Advanced logging Cyber Defence Centre tooling DLP technical policy development Ability to build and train More ❯
Maidenhead, Berkshire, United Kingdom Hybrid / WFH Options
Squared Up
in B2B SaaS/Tech/software industry) Demonstrable passion for data analytics, particularly marketing and product data analysis Ideally you would have experience using analytical tools such as SQL (BigQuery), GA4, Search Console, Google Ads, Amplitude, Salesforce; data visualization tools such as Looker If you don't have the experience but can demonstrate the passion and knowledge in More ❯
company strategy and steer key decisions through product and user analytics Requirements Essential: 1+ year as a Data Analyst (or a similar role) Strong understanding of SQL and BigQuery, or similar relational database languages Strong analytical skills Experience working with internal customers and understanding their needs Strong awareness of commercial and strategic drivers to translate data into real More ❯
/product managers. You will have several years' direct experience with Looker Core and strong SQL skills, ideally working with large datasets in a cloud tooling context (Snowflake/BigQuery). Exposure to commercial banking would be advantageous, particularly the standards of data governance and security that are expected in financial services. More ❯
skills for building and optimising data pipelines Experience working with cloud platforms (e.g., AWS, GCP, or Azure) Familiarity with modern data stack tools (e.g., dbt, Airflow, Snowflake, Redshift, or BigQuery) Understanding of data modelling and warehousing principles Experience working with large datasets and distributed systems What's in it for you? Up to £70k Hybrid working More ❯
equivalent experience. Expertise in Azure Databases (SQL DB, Cosmos DB, PostgreSQL), from migration and modernization to creating new AI applications. Expertise in Azure Analytics (Fabric, Azure Databricks, Purview) and competitors (BigQuery, Redshift, Snowflake) in data warehouse, data lake, big data, analytics, real-time intelligence, and reporting with integrated Data Security & Governance. Proven ability to lead technical engagements (e.g., hackathons, PoCs More ❯
London, South East, England, United Kingdom Hybrid / WFH Options
Pertemps Cambridge
Studio, Tableau, Power BI, Trevor.io) Strong communication skills – able to explain data to non-technical audiences Highly self-motivated and comfortable managing your own workload Bonus Skills Experience with BigQuery or similar cloud data warehouses Familiarity with Looker (full version) Basic forecasting/modelling skills Exposure to product, marketing, or operational data Why Apply? High-ownership role with exposure More ❯
containerized microservices using Docker, running in ECS Fargate or similar cloud-native environments.
- Work across AWS and GCP, leveraging services like Lambda, Kinesis, SQS, EventBridge, AWS Batch, Spark, and BigQuery to power cross-cloud data products.
- Build and integrate with RESTful APIs to expose data services and connect systems.
- Contribute to CI/CD pipelines using Terraform, Docker, and More ❯
in cloud-based data platforms and infrastructure (e.g., AWS, GCP), ensuring scalability and security for large volumes of streaming and batch data. Had exposure to data warehouses such as BigQuery or Snowflake. Adept in Python and/or Java for developing data services and integrating APIs to bring in diverse sources of media data. Strong interpersonal and communication skills … and Infrastructure Building: Build and manage the infrastructure necessary for optimal ETL or ELT of data using Python, SQL, and Google Cloud Platform (GCP) big data technologies, such as BigQuery, Dataflow, Dataproc and Cloud Storage. Business Intelligence Enablement: Prepare and transform pipeline data to support downstream analytics and feed BI tools (DOMO), enabling data-driven decision-making across the … SQL and Database Expertise: Strong working knowledge of SQL with hands-on experience querying and managing relational databases, alongside familiarity with a variety of database technologies (e.g., PostgreSQL, MySQL, BigQuery). Big Data Engineering: Exposure to designing, building, and optimizing ‘big data’ pipelines, architectures, and datasets, enabling efficient data processing at scale. Analytical Problem Solving: Ideally has performed root More ❯
and loading processes for various data sources (e.g., databases, APIs, cloud storage). Ensure data quality, consistency, and security throughout the data pipeline. Leverage GCP services (e.g., Dataflow, Dataproc, BigQuery, Cloud Storage) to build and maintain cloud-native data solutions. Implement infrastructure as code (IaC) principles using Terraform to automate provisioning and configuration. Manage and optimize cloud resources to … data engineering best practices and standards. Collaborate with cross-functional teams to deliver complex data projects. This role will be a great fit if: Expert in GCP services including BigQuery, Dataflow, Pub/Sub, Cloud Composer, Cloud Storage, and Cloud Functions. GCP Professional Data Engineer Certification is highly favourable. Advanced knowledge of SQL for complex data transformation and query More ❯
governance and privacy, data visualisation and analysis, activation and agents Familiarity with the Google Marketing Platform and Google Cloud toolset, specifically Google Analytics, Google Tag Manager, Looker Studio and BigQuery An insatiable appetite for AI with evidence of early adoption and a hands-on builder attitude A strong commercial mindset with an understanding of agency and consulting operating models More ❯
movement of data across enterprise environments Desirable Knowledge of data storage systems, including RDBMS (e.g., MySQL, PostgreSQL), NoSQL (e.g., MongoDB, Cassandra), and data warehouse solutions (e.g., Amazon Redshift, Google BigQuery) Familiarity with data exchange technologies and protocols such as APIs, ETL processes, and data replication Understanding of cloud storage services offered by AWS, Microsoft Azure, and Google Cloud Platform More ❯
City of London, London, United Kingdom Hybrid / WFH Options
Digital Gurus
informed, data-led choices Ideal Candidate: Hands-on experience with GA4 and Google Tag Manager implementation Confident working with large datasets; experience with cloud-based analytics tools such as BigQuery is valuable Proven ability to deliver analytical solutions in ecommerce or retail-focused environments Comfortable presenting data findings to both technical and non-technical audiences Understands digital marketing performance More ❯
teams Provide design support for our data models to answer multiple business needs through collaboration with data engineers Nice to haves Experience with Looker/ThoughtSpot Experience with Google BigQuery Experience with Mobile Measurement Platforms Why join Muzz? We're a profitable Consumer Tech startup, backed by Y Combinator (S17) and based in London. Join our fast growing More ❯