solve real-world problems in venture capital. What We're Looking For Technical Excellence: Deep expertise in Python, React, and modern data technologies (e.g., BigQuery/GCP, dbt, Prefect, OpenAI, LookML/Looker, Glean). Proven Track Record: A history of building complex data products that drive business value …
and project management. Some of us had a technical background before joining, with experience in tools like Salesforce, HubSpot, Zapier, or data warehouses (Snowflake, BigQuery); but many of us transitioned from commercial roles, wanted to get deeper into products, and are self-taught technically. Competencies We are not …
least Somewhat Technical - You will need to write SQL and be able to operate in Metabase and tools like Statsig. Our data is in BigQuery, Metabase, and Statsig. You don't need to have specific experience with these tools, but you need a good data foundation and the ability …
Emerson Plantweb/AMS, GE/Meridium APM, AVEVA, Bentley, and OSIsoft PI Familiarity with relevant technology, such as Big Data (Hadoop, Spark, Hive, BigQuery); Data Warehouses; Business Intelligence; and Machine Learning Savvy at helping customers create business cases with quantified ROI to justify new investments Experience with enterprise …
Strong project management skills - you're able to build structure, coordinate stakeholders, and keep work moving at pace. Technical fluency: APIs, data platforms (SQL, BigQuery, Looker), product mechanics. Exceptional communication skills - written, visual, and verbal - with a calm, client-friendly approach. Comfortable working cross-functionally and juggling multiple complex …
London, England, United Kingdom Hybrid / WFH Options
Focus on SAP
Key skills: 5+ years of experience in data engineering, with at least 1–2 years hands-on with Google Cloud Platform. Strong experience with BigQuery, Cloud Storage, Pub/Sub, and Dataflow. Proficient in SQL, Python, and Apache Beam. Familiarity with DevOps and CI/CD pipelines in cloud … Design, build, and maintain scalable and reliable data pipelines on Google Cloud Platform (GCP) Develop ETL processes using tools like Cloud Dataflow, Apache Beam, BigQuery, and Cloud Composer Collaborate with data analysts, scientists, and business stakeholders to understand data requirements Optimize performance and cost-efficiency of GCP data services …
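The pipeline duties above (develop ETL processes with Dataflow/Beam, land results in BigQuery) all follow the same extract–transform–load shape. The sketch below models that shape with plain Python functions so it is self-contained; the function names, sample records, and the per-user aggregation are illustrative assumptions, standing in for Beam transforms and a BigQuery write rather than showing the real APIs.

```python
# Minimal ETL sketch: the extract/transform/load stages a Dataflow or
# Beam pipeline would run, modeled as plain functions. All names and
# the sample data are illustrative assumptions, not from the posting.

def extract(rows):
    """Simulate reading raw events (e.g. from Cloud Storage or Pub/Sub)."""
    for row in rows:
        yield row

def transform(events):
    """Parse and filter records, as a Beam ParDo/Filter step would."""
    for event in events:
        if event.get("amount") is not None:
            yield {"user": event["user"], "amount": float(event["amount"])}

def load(records):
    """Aggregate per user, standing in for a write to a BigQuery table."""
    totals = {}
    for rec in records:
        totals[rec["user"]] = totals.get(rec["user"], 0.0) + rec["amount"]
    return totals

raw = [
    {"user": "a", "amount": "10.5"},
    {"user": "b", "amount": None},  # dropped by the transform step
    {"user": "a", "amount": "4.5"},
]
result = load(transform(extract(raw)))
print(result)  # {'a': 15.0}
```

Because each stage is a generator, records stream through one at a time, which is the same property that makes Beam pipelines scale to unbounded inputs.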
allows you to work and take decisions independently Experience in working with data visualization tools Experience in GCP tools – Cloud Functions, Dataflow, Dataproc and BigQuery Experience in data processing frameworks – Beam, Spark, Hive, Flink GCP data engineering certification is a merit Hands-on experience in analytical tools such … the development and deployment of data mesh architecture, ensuring federated governance, discoverability, and self-serve capabilities. Design and build scalable data pipelines using GCP (BigQuery, Dataflow, Cloud Composer, Cloud Functions) and orchestrate transformations using DBT. Develop modular, reusable DBT models for core retail metrics such as inventory accuracy …
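"Inventory accuracy" is named above as a core retail metric a DBT model would compute. A common definition is the share of SKUs whose recorded stock matches a physical count; the sketch below shows that calculation in plain Python. The field names and the exact-match rule are assumptions — a real DBT model would express the same logic in SQL over warehouse tables.

```python
# Inventory accuracy: fraction of SKUs whose recorded stock equals the
# counted stock. Field names and the matching rule are assumptions.

def inventory_accuracy(snapshots):
    """snapshots: iterable of {'sku', 'recorded_qty', 'counted_qty'}."""
    total = matches = 0
    for snap in snapshots:
        total += 1
        if snap["recorded_qty"] == snap["counted_qty"]:
            matches += 1
    return matches / total if total else 0.0

stock = [
    {"sku": "A1", "recorded_qty": 10, "counted_qty": 10},
    {"sku": "B2", "recorded_qty": 5, "counted_qty": 4},
    {"sku": "C3", "recorded_qty": 7, "counted_qty": 7},
]
print(inventory_accuracy(stock))  # ~0.67: 2 of 3 SKUs match
```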
ability to build relationships with technical peers - Experience in setting up, maintaining, and querying different types of databases using SQL/NoSQL (familiarity with BigQuery and Dataform is a plus) Optimisation of data, analysis of information, and organisation of new sources of data into databases Experience with testing strategies …
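The requirement above — setting up, maintaining, and querying SQL databases — can be illustrated end to end with Python's built-in `sqlite3` module. SQLite stands in here for the warehouse engines the listing names (BigQuery, etc.) only so the sketch is self-contained; the table and values are invented for the demo, but the DDL/DML/aggregate pattern is the same against any SQL engine.

```python
import sqlite3

# Set up, populate, and query a relational table: the same pattern used
# against BigQuery or any other SQL engine. Schema and data are made up.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (source TEXT, value REAL)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?)",
    [("api", 1.5), ("api", 2.5), ("batch", 4.0)],
)
rows = conn.execute(
    "SELECT source, SUM(value) FROM events GROUP BY source ORDER BY source"
).fetchall()
print(rows)  # [('api', 4.0), ('batch', 4.0)]
conn.close()
```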
London, South East England, United Kingdom Hybrid / WFH Options
Alevio Consulting
As a BA, you will be responsible for leading and coordinating requirements gathering from internal stakeholders across the business for our technology department. Your primary focus will be on working with key non-technical stakeholders in the business …
documentation & maintainability, ensuring a clean & scalable codebase ⚙️ Tech you will work with: Kotlin, Grails, Vert.x, gRPC, microservices, event-driven architecture, GCP, Kubernetes, Terraform, Kafka, BigQuery, PostgreSQL Ideal Candidates will have: 3+ years' commercial experience building complex systems using a JVM language (Java, Kotlin, Scala) Experience building scalable microservices with …
who loves turning data into business intelligence. You’ll be instrumental in developing dashboards and data models that support strategic decision-making. Working with BigQuery and Looker Studio, you’ll ensure data accuracy and clarity while collaborating with teams across the business to identify opportunities for improvement and innovation. … Build, maintain, and optimise data models in BigQuery to support business reporting Design and enhance dashboards in Looker Studio, ensuring usability and accuracy Translate stakeholder needs into scalable and effective BI solutions Analyse large datasets to surface trends, performance insights, and growth opportunities Conduct data quality checks and enforce … use dashboards and access insights effectively SKILLS AND EXPERIENCE: Minimum of 3 years’ experience in analytics or business intelligence Advanced SQL skills and strong BigQuery expertise Proficiency with Looker Studio (formerly Google Data Studio) A proactive, detail-oriented, and solution-focused approach Team-first mindset with a collaborative working …
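"Conduct data quality checks" in the listing above typically means null checks and freshness checks run before a dashboard refresh. The sketch below shows that idea in plain Python; the field names, thresholds, and sample rows are illustrative assumptions (in practice these checks would run as SQL against BigQuery).

```python
from datetime import date, timedelta

# Two common pre-dashboard data-quality checks: count null metric values
# and verify the newest row is recent enough. All names are assumptions.

def quality_report(rows, today, max_age_days=2):
    nulls = sum(1 for r in rows if r["revenue"] is None)
    latest = max(r["day"] for r in rows)
    fresh = (today - latest) <= timedelta(days=max_age_days)
    return {"null_revenue_rows": nulls, "is_fresh": fresh}

rows = [
    {"day": date(2024, 6, 1), "revenue": 120.0},
    {"day": date(2024, 6, 2), "revenue": None},
    {"day": date(2024, 6, 3), "revenue": 95.5},
]
print(quality_report(rows, today=date(2024, 6, 4)))
# {'null_revenue_rows': 1, 'is_fresh': True}
```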
methodologies). Strong analytic skills related to working with unstructured datasets. Engineering best practices and standards. Experience with data warehouse software (e.g. Snowflake, Google BigQuery, Amazon Redshift). Experience building and optimising pipelines and ETL/ELT workflows. Working knowledge of data warehouse technologies (Snowflake, BigQuery). Code …
Crawley, Sussex, United Kingdom Hybrid / WFH Options
Rentokil Pest Control South Africa
refining prompts to improve LLM performance in structured data querying and other business-specific applications. Integrate LLMs with structured data systems (e.g., SQL databases, BigQuery, GCS) to enable natural language querying and advanced analytics. Implement MLOps/LLMOps pipelines for deploying LLMs in production, monitoring their performance, and ensuring … such as LangChain and LangGraph, including prompt engineering, fine-tuning, and workflow orchestration. Skilled in integrating LLMs with structured data systems (e.g., SQL databases, BigQuery) to enable natural language querying and advanced analytics. MLOps/LLMOps Proficient in designing and implementing MLOps/LLMOps pipelines for model deployment, monitoring … maintenance using tools like Vertex AI Pipelines. Cloud Computing (Google Cloud Platform - GCP Preferred) Hands-on experience with GCP services such as Vertex AI, BigQuery, Cloud SQL, and Google Cloud Storage (GCS) for AI/ML applications. Skilled in containerization (Docker) and orchestration (Kubernetes, GKE), with a solid understanding …
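The "natural language querying over structured data" flow described above has two steps: an LLM turns a question into SQL, and the SQL runs against the database. The sketch below shows only that control flow; the model call is a hard-coded stub (in the stack the listing names, it would be a LangChain or Vertex AI call), SQLite stands in for BigQuery, and every name here is an illustrative assumption.

```python
import sqlite3

# NL -> SQL -> result flow. The LLM step is stubbed out so the sketch
# runs standalone; a real system would also validate the generated SQL
# before executing it. All names and data are illustrative assumptions.

def fake_llm_to_sql(question):
    """Stand-in for an LLM prompt -> SQL step (hard-coded for the demo)."""
    if "total" in question.lower():
        return "SELECT SUM(amount) FROM orders"
    return "SELECT COUNT(*) FROM orders"

def answer(question, conn):
    sql = fake_llm_to_sql(question)         # prompt-engineering step
    return conn.execute(sql).fetchone()[0]  # execution step

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?)", [(10.0,), (15.0,)])
print(answer("What is the total order value?", conn))  # 25.0
```

In production, the execution step is exactly where guardrails live (read-only connections, allow-listed tables), since the SQL comes from a model rather than a developer.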
Data Engineer - AlbionVC AlbionVC is looking for a dynamic Data Engineer to join our team, to continue to develop our data platform, software stack, and more generally help develop AlbionVC’s data strategy. Having experienced success with the first phase …
London, South East England, United Kingdom Hybrid / WFH Options
Forsyth Barnes
Role: Senior Data Analyst Location: London (Hybrid - 3 days in-office) Salary: £60,000–£80,000 Industry: Retail Overview: This is an exciting hands-on opportunity well-suited to a versatile individual who thrives in dynamic, fast-paced environments. The …
London, England, United Kingdom Hybrid / WFH Options
Focus on SAP
experience in data engineering or data architecture roles 3+ years of hands-on experience architecting data solutions on Google Cloud Platform Strong knowledge of BigQuery, Dataflow, Cloud Composer, Pub/Sub, and Cloud Functions Proficient in data modeling, data warehousing, and ETL/ELT architectures Experience with Infrastructure as … written) Right to work in the UK is a must (no sponsorship available) Responsibilities: Design and implement enterprise-scale data architectures using GCP services like BigQuery, Dataflow, Pub/Sub, Cloud Storage, and Dataproc Define data modeling standards, integration patterns, and governance frameworks Collaborate with stakeholders across engineering, analytics, product …