and Build: Design and implement a robust, cloud-native data analytics platform spanning AWS, GCP, and other emerging cloud environments. You'll leverage services like S3/GCS, Glue, BigQuery, Pub/Sub, SQS/SNS, MWAA/Composer, and more to create a seamless data experience. (Required) Data Lake, Data Zone, Data Governance: Design, build, and manage data …
the team. Required skills and experience: Advanced Google Standard SQL and MySQL or similar relational database languages Fluency in Python and its application in data analysis Familiarity with Google BigQuery or other data warehouse solutions Experience with Tableau; creating & managing data sources and converting complex insights into a digestible format Working knowledge of Google Analytics or other web-based …
New Malden, Surrey, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
will have the following skills and experience: Strong hands-on background in data engineering, with 5+ years working on modern data platforms Experience leading cloud data migrations - GCP and BigQuery strongly preferred Proficiency in SQL, Python, dbt, Airflow, Terraform and other modern tooling Excellent understanding of data architecture, governance, and DevOps best practices Proven leadership or team management experience …
Statistics, Data Analytics, Computer Science) Experience in data analysis tools (e.g., SQL, Python, etc) Ability to work effectively in a team environment Responsibilities Collect, clean, and optimize datasets in BigQuery or other SQL databases for analysis and reporting Use SQL to query large datasets and uncover trends that support business decisions Design, build, and maintain Looker dashboards, including custom …
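The listing above describes using SQL to query large datasets and uncover trends that support business decisions. As a minimal, self-contained sketch of that kind of aggregation, the example below uses Python's stdlib sqlite3 as a stand-in for BigQuery; the table and column names (orders, order_day, revenue) are hypothetical.

```python
import sqlite3

# Tiny in-memory table standing in for a warehouse dataset (names hypothetical).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_day TEXT, revenue REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("2024-01-01", 120.0), ("2024-01-01", 80.0), ("2024-01-02", 50.0)],
)

# Aggregate daily revenue to surface a trend, as the role describes.
rows = conn.execute(
    "SELECT order_day, SUM(revenue) FROM orders GROUP BY order_day ORDER BY order_day"
).fetchall()
print(rows)  # [('2024-01-01', 200.0), ('2024-01-02', 50.0)]
```

In BigQuery the same GROUP BY query would run in GoogleSQL against a managed table; only the client connection differs.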
across the business What You Bring 3+ years in analytics engineering, data engineering, or BI roles Deep experience with SQL (Python a bonus) Familiarity with modern data stacks: Fabric, BigQuery, Fivetran/Stitch, dbt, Looker/Hex/Metabase etc. Strong understanding of data modelling principles (dimensional models, slowly changing dimensions etc.) Track record of collaborating across functions and …
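Several listings in this section ask for familiarity with slowly changing dimensions. A minimal sketch of a Type 2 SCD update in plain Python (all field names are hypothetical): instead of overwriting an attribute, the current row is closed with a validity date and a new current row is appended, so history is preserved.

```python
from datetime import date

def scd2_update(dim_rows, key, new_attrs, today):
    """Type 2 update: close the current row for `key`, append a new current row."""
    updated = []
    for row in dim_rows:
        if row["key"] == key and row["valid_to"] is None:
            row = {**row, "valid_to": today}  # close out the old version
        updated.append(row)
    updated.append({"key": key, **new_attrs, "valid_from": today, "valid_to": None})
    return updated

# A customer moves city; both the old and new rows remain queryable by date.
dim = [{"key": "C1", "city": "Leeds", "valid_from": date(2023, 1, 1), "valid_to": None}]
dim = scd2_update(dim, "C1", {"city": "London"}, date(2024, 6, 1))
# dim now holds two rows: the closed Leeds row and the current London row.
```

In practice dbt's snapshot feature implements this pattern over warehouse tables; the sketch only shows the row-versioning logic itself.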
automating models and advancing our engineering practices. You're familiar with cloud technologies. You have experience working with data in a cloud data warehouse (Redshift, Snowflake, Databricks, or BigQuery) Experience with a modern data modeling technology (dbt) You document and communicate clearly. Some experience with technical content writing would be a plus You are excited to work …
to the business. What You'll Be Doing: Designing and developing scalable ETL pipelines to process and deliver large volumes of data. Working hands-on with GCP services including BigQuery, Pub/Sub, and Dataflow. Automating infrastructure using Terraform, Ansible, and CI/CD tooling. Writing clean, efficient code in Python, Go, and Bash. Supporting and maintaining …
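As a rough illustration of the ETL shape this role describes, here is a minimal extract, transform, load pipeline in plain Python. In a real GCP deployment the extract stage would pull from a Pub/Sub subscription and the load stage would write to BigQuery; the in-memory lists and field names below are stand-ins, not any particular employer's schema.

```python
def extract(source):
    # Stand-in for pulling messages from a Pub/Sub subscription.
    yield from source

def transform(records):
    # Drop malformed records and normalise the amount field.
    for rec in records:
        if rec.get("amount") is not None:
            yield {**rec, "amount": round(float(rec["amount"]), 2)}

def load(records, sink):
    # Stand-in for a BigQuery insert; here we append to an in-memory list.
    sink.extend(records)
    return sink

raw = [{"id": 1, "amount": "10.503"}, {"id": 2, "amount": None}]
warehouse = load(transform(extract(raw)), [])
print(warehouse)  # [{'id': 1, 'amount': 10.5}]
```

Generators keep each stage streaming, so records flow through one at a time rather than being materialised between stages, which is the same property a Dataflow pipeline provides at scale.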
ML models, ideally in NLP or computer vision domains. Expert-level Python and SQL, with solid experience using libraries like Pandas, Scikit-Learn, TensorFlow, etc. Proven experience working with BigQuery and big data pipelines on GCP. Deep understanding of statistics, machine learning algorithms, and data modelling. Strong analytical mindset with a knack for turning data into actionable business …
you're going to need Deep, hands-on experience designing and building data warehouses with strong command of dimensional modeling (e.g., Kimball methodology) Expertise in Google Cloud Platform, especially BigQuery architecture, optimization, and cost management Advanced SQL skills and production-level experience using dbt (or similar tools) to build modular, testable transformation pipelines Practical mastery of LookML and semantic …
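The Kimball-style dimensional modeling this listing asks for can be shown with a tiny star schema: a fact table joined to a dimension table and aggregated by a dimension attribute. The sketch below uses Python's stdlib sqlite3 in place of BigQuery, and all table and column names are hypothetical.

```python
import sqlite3

# A minimal star schema: one dimension (dim_product) and one fact (fact_sales).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, category TEXT);
    CREATE TABLE fact_sales (product_key INTEGER, qty INTEGER);
    INSERT INTO dim_product VALUES (1, 'books'), (2, 'games');
    INSERT INTO fact_sales VALUES (1, 3), (1, 2), (2, 7);
""")

# The canonical dimensional query: join facts to a dimension and roll up
# by a descriptive attribute rather than by the surrogate key.
result = conn.execute("""
    SELECT d.category, SUM(f.qty)
    FROM fact_sales f JOIN dim_product d USING (product_key)
    GROUP BY d.category ORDER BY d.category
""").fetchall()
print(result)  # [('books', 5), ('games', 7)]
```

In a dbt project each of these tables would typically be a model, with the join expressed via `ref()` so the dependency graph stays explicit.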
use of Pub/Sub and Cloud Storage React frontend, using ShadCN for components, TailwindCSS for styling, React Query for state management Posthog for frontend analytics (events, sessions, feature flags, experiments) BigQuery as our data warehouse, with Metabase for data visualization. Production data and Posthog events both stream into it so data is in one place. Sentry and Google Cloud Logging …
company strategy and steer key decisions through product and user analytics Requirements Essential: 1+ year as a Data Analyst (or a similar role) Strong understanding of SQL and BigQuery, or similar relational database languages Strong analytical skills Experience working with internal customers and understanding their needs Strong awareness of commercial and strategic drivers to translate data into real …
skills for building and optimising data pipelines Experience working with cloud platforms (e.g., AWS, GCP, or Azure) Familiarity with modern data stack tools (e.g., dbt, Airflow, Snowflake, Redshift, or BigQuery) Understanding of data modelling and warehousing principles Experience working with large datasets and distributed systems What's in it for you? Up to £70k Hybrid working …
equivalent experience. Expertise in Azure Databases (SQL DB, Cosmos DB, PostgreSQL) from migration & modernization and creating new AI applications. Expertise in Azure Analytics (Fabric, Azure Databricks, Purview) and competitors (BigQuery, Redshift, Snowflake) in data warehouse, data lake, big data, analytics, real-time intelligence, and reporting with integrated Data Security & Governance. Proven ability to lead technical engagements (e.g., hackathons, PoCs …
London, South East, England, United Kingdom Hybrid / WFH Options
Pertemps Cambridge
Studio, Tableau, Power BI, Trevor.io) Strong communication skills – able to explain data to non-technical audiences Highly self-motivated and comfortable managing your own workload Bonus Skills Experience with BigQuery or similar cloud data warehouses Familiarity with Looker (full version) Basic forecasting/modelling skills Exposure to product, marketing, or operational data Why Apply? High-ownership role with exposure …
containerized microservices using Docker, running in ECS Fargate or similar cloud-native environments. - Work across AWS and GCP, leveraging services like Lambda, Kinesis, SQS, EventBridge, AWS Batch, Spark, and BigQuery to power cross-cloud data products. - Build and integrate with RESTful APIs to expose data services and connect systems. - Contribute to CI/CD pipelines using Terraform, Docker, and …
in cloud-based data platforms and infrastructure (e.g., AWS, GCP), ensuring scalability and security for large volumes of streaming and batch data. Had exposure to data warehouses such as BigQuery or Snowflake. Adept in Python and/or Java for developing data services and integrating APIs to bring in diverse sources of media data. Strong interpersonal and communication skills … and Infrastructure Building: Build and manage the infrastructure necessary for optimal ETL or ELT of data using Python, SQL, and Google Cloud Platform (GCP) big data technologies, such as BigQuery, Dataflow, Dataproc and Cloud Storage. Business Intelligence Enablement: Prepare and transform pipeline data to support downstream analytics and feed BI tools (DOMO), enabling data-driven decision-making across the … SQL and Database Expertise: Strong working knowledge of SQL with hands-on experience querying and managing relational databases, alongside familiarity with a variety of database technologies (e.g., PostgreSQL, MySQL, BigQuery). Big Data Engineering: Exposure to designing, building, and optimizing ‘big data’ pipelines, architectures, and datasets, enabling efficient data processing at scale. Analytical Problem Solving: Ideally has performed root …
City of London, London, United Kingdom Hybrid / WFH Options
Digital Gurus
informed, data-led choices Ideal Candidate: Hands-on experience with GA4 and Google Tag Manager implementation Confident working with large datasets; experience with cloud-based analytics tools such as BigQuery is valuable Proven ability to deliver analytical solutions in ecommerce or retail-focused environments Comfortable presenting data findings to both technical and non-technical audiences Understands digital marketing performance …
teams Provide design support for our data models to answer multiple business needs through collaboration with data engineers Nice to haves Experience with Looker/ThoughtSpot Experience with Google BigQuery Experience with Mobile Measurement Platforms Why join Muzz? We're a profitable Consumer Tech startup, backed by Y Combinator (S17) and based in London. Join our fast-growing …
Nice to have Experience supporting client projects in a research or consultancy context. Familiarity with survey methodology, MRP, or causal inference techniques. Experience working with cloud-based tools (e.g. BigQuery, GCP). Interest in public opinion, politics, or social science research. An eye for design and clarity in visualising data and outputs. Solid working knowledge of Python (e.g. pandas …