expect to get involved in a variety of projects in the cloud (AWS, Azure, GCP), while also gaining opportunities to work with Snowflake, Databricks, BigQuery, and Fabric. We work with near real-time/streaming data and geospatial data, and use modern AI tooling to accelerate development. About You
re pragmatic about our technology choices. These are some of the things we use at the moment: TypeScript, React, styled-components; Python, NodeJS; PostgreSQL, BigQuery, MySQL; Jest, React Testing Library, Cypress, pytest; AWS, GCP; Kubernetes, Docker, Terraform, GitHub, CircleCI. How we expect you to work: We expect you to
AWS EMR and PySpark to generate real-time (fast-moving) features for the feature store. Develop and maintain batch processing pipelines using dbt and BigQuery to generate batch (slow-moving) features, ensuring data quality, consistency and reliability. Work with the Feast feature store, manage the feature lifecycle and maintain data
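As a loose illustration of how batch (slow-moving) features built in BigQuery might be registered with Feast, here is a minimal sketch. It assumes a recent Feast release with the BigQuery offline store; the project, table, entity, and feature names are all hypothetical, and exact argument names vary between Feast versions.

```python
from datetime import timedelta

from feast import BigQuerySource, Entity, FeatureView, Field
from feast.types import Float32, Int64

# Hypothetical entity keyed on user_id.
user = Entity(name="user", join_keys=["user_id"])

# Hypothetical dbt-built BigQuery table holding slow-moving (batch) features.
batch_source = BigQuerySource(
    table="my-project.features.user_daily_stats",
    timestamp_field="event_timestamp",
)

# Feature view describing which columns the feature store serves.
user_daily_stats = FeatureView(
    name="user_daily_stats",
    entities=[user],
    ttl=timedelta(days=2),
    schema=[
        Field(name="orders_last_7d", dtype=Int64),
        Field(name="avg_basket_value", dtype=Float32),
    ],
    source=batch_source,
)
```

In the usual Feast workflow these definitions are registered with `feast apply` and pushed to the online store by materialization jobs; that is an assumption about standard Feast usage rather than anything stated in the listing.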
London, South East England, United Kingdom Hybrid / WFH Options
HUG
The tech stack includes: Infrastructure: Google Cloud. Backend: Python for the internal API codebase, running on a Postgres database hosted with Google Cloud SQL. Data: BigQuery, dbt, and Tableau. Frontend: TypeScript and React. Mobile/Apps: Flutter. What’s on Offer: Generous time off. Equity package. Comprehensive health coverage. Pension
data processing. Requirements: Basics: 3+ years of experience in technical support or project management, preferably with GCP environments. Understanding of Google Cloud services including BigQuery, Workflows, Batch, Dataproc, Dataflow, Cloud Run, and GCS. Familiarity with HPC concepts including job schedulers (e.g., Slurm, LSF), parallel computing frameworks, and compute cluster
and Operations, you'll work in an agile environment that values knowledge sharing and diversity. You'll analyze large datasets using tools like Google BigQuery and Amazon Redshift and leverage one of Europe's largest Tableau Server platforms for advanced reporting and insights. You will: Write efficient SQL queries
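As a rough sketch of the kind of SQL work described above, this uses the official google-cloud-bigquery Python client; the project, dataset, and column names are invented for the example.

```python
from google.cloud import bigquery

# Uses application-default credentials (e.g. gcloud auth application-default login).
client = bigquery.Client()

# Hypothetical reporting query over a made-up orders table.
sql = """
    SELECT order_date, SUM(revenue) AS daily_revenue
    FROM `my-project.sales.orders`
    WHERE order_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY)
    GROUP BY order_date
    ORDER BY order_date
"""

for row in client.query(sql).result():
    print(row.order_date, row.daily_revenue)
```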
London (Wimbledon), South East England, United Kingdom
Harnham
role, ideally in retail or fashion. Strong commercial awareness and ability to work cross-functionally with non-technical teams. Hands-on experience with SQL (BigQuery ideally), plus Looker Studio or similar BI tools. Confident communicator, comfortable presenting to senior stakeholders and influencing decisions. Experience with tools like Fospha, ContentSquare
our data models to answer multiple business needs through collaboration with data engineers. Nice to haves: Experience with Looker/ThoughtSpot. Experience with Google BigQuery. Experience with Mobile Measurement Platforms. Why join Muzz? We're a profitable Consumer Tech startup, backed by Y Combinator (S17) and based in London
Greater London, England, United Kingdom Hybrid / WFH Options
Finatal
data architecture and processes to improve reliability, data quality and consistency. Requirements: Strong experience working across a modern cloud environment, a GCP stack including BigQuery or Snowflake DWH, Fivetran and transformation in dbt. Strong skills in building dashboards and using visualisation tools such as Tableau (beneficial), Power BI or
ability to develop data-driven hypotheses. Data Science Knowledge: Familiarity with concepts like clustering, predictive modelling, and basic statistical techniques. Technical Skills: Knowledge of BigQuery/SQL/Python is desirable but not essential. Strong communication skills - experience presenting to clients and collaborating with internal teams. Experience working with
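Since the role pairs clustering with BigQuery/SQL/Python, here is a minimal, illustrative sketch of segmenting customers with scikit-learn on data pulled from BigQuery. The table and feature names are hypothetical, and it assumes pandas support (the db-dtypes extra) is installed alongside the BigQuery client.

```python
from google.cloud import bigquery
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

client = bigquery.Client()

# Hypothetical RFM-style feature table.
df = client.query(
    "SELECT customer_id, recency, frequency, monetary "
    "FROM `my-project.marketing.customer_features`"
).to_dataframe()

# Scale the features, then assign each customer to one of four segments.
features = StandardScaler().fit_transform(df[["recency", "frequency", "monetary"]])
df["segment"] = KMeans(n_clusters=4, n_init=10, random_state=42).fit_predict(features)

print(df.groupby("segment").size())
```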
concept to full deployment. Cloud Computing: Experience working with cloud platforms (AWS, Azure, GCP) for deploying AI solutions, including familiarity with data platforms like BigQuery, Databricks or Snowflake. Business Acumen: Ability to translate technical AI concepts into clear business outcomes, advising clients on strategies to harness AI for growth.
or GCP. Experience with Terraform to define and manage cloud infrastructure through code. Desirables: Experience in SQL-based transformation workflows, particularly using dbt in BigQuery. Experience with containerisation technologies (Docker, Kubernetes). Familiarity with streaming data ingestion technologies (Kafka, Debezium). Exposure to data management and Linux administration. Interview
good. We're searching for an experienced Analytics Engineer to help us enhance our data warehouse. You'll primarily work in SQL with dbt (BigQuery) but have the opportunity to work in other programming languages, expanding your skillset and making a significant impact on our data capabilities. We're
position, the following experience is required: Strong technical leadership (minimum of 2-3 years at Lead level). Experience across GCP and the Google suite (e.g. BigQuery, Google Console, etc.). Strong history of Data Engineering and Backend Development with Python. Hands-on experience across SQL and other DB technologies. This position
quality control processes. What We're Looking For: Proven experience in a data analysis, reporting, or business intelligence role. Strong working knowledge of Google BigQuery and SQL to write and customise queries. Proficiency in data visualisation tools such as Power BI, Tableau, or Google Data Studio. Strong skills in
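One common way to customise BigQuery queries safely is to parameterise them rather than build SQL strings by hand; a brief sketch with the Python client follows, using a made-up table and parameter.

```python
from google.cloud import bigquery

client = bigquery.Client()

# Hypothetical report query with a date parameter supplied at run time.
sql = """
    SELECT store_id, COUNT(*) AS orders
    FROM `my-project.retail.orders`
    WHERE order_date >= @start_date
    GROUP BY store_id
"""
job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter("start_date", "DATE", "2024-01-01"),
    ]
)

for row in client.query(sql, job_config=job_config).result():
    print(row.store_id, row.orders)
```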
in SQL and data visualization tools like Tableau or Power BI. Experience with ETL processes and BI solutions based on cloud platforms (e.g., Google BigQuery) is a must. Ability to work independently and handle both large and small projects in a fast-paced environment. If you're data-savvy
and DAX. Knowledge of all laws relating to data. The ability to analyse data to identify exceptions/areas for attention. Knowledge of Google BigQuery, Looker and other GCP tools, or a desire to learn. Required Documents: CV/Resume. Application Process: Interested and qualified candidates should kindly CLICK HERE
would put you at the top of our list: Google Cloud Storage, Google Data Transfer Service, Google Dataflow (Apache Beam), Google Pub/Sub, Google Cloud Run, BigQuery or any RDBMS, Python, Debezium/Kafka, dbt (data build tool). Interview process: Interviewing is a two-way process and we want you to
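To show how a few of those pieces commonly fit together, here is a minimal, hedged sketch of a streaming Apache Beam pipeline (runnable on Dataflow) that reads from Pub/Sub and appends to BigQuery. The topic, table, and one-column schema are invented, and a real pipeline would add parsing, windowing, and dead-lettering.

```python
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# Hypothetical resource names.
TOPIC = "projects/my-project/topics/events"
TABLE = "my-project:analytics.raw_events"


def run() -> None:
    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(topic=TOPIC)
            | "Decode" >> beam.Map(lambda msg: {"payload": msg.decode("utf-8")})
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                TABLE,
                schema="payload:STRING",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )


if __name__ == "__main__":
    run()
```

Run it locally with the DirectRunner for testing, or pass the usual --runner=DataflowRunner options to submit it as a Dataflow job.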
access, storage, manipulation, and interpretation of media performance data across cloud infrastructure technologies, specifically Google Cloud and API tools. Proven experience with SQL, Google BigQuery, Snowflake or similar. Proven experience with technical and commercial stakeholder management. Nice to have: experience within a matrixed organization. Experience with Data Visualization software
scalable, secure, and efficient. Database Expertise: Familiarity with both SQL and NoSQL database technologies, such as Firestore and PostgreSQL. Experience with data warehouses like BigQuery is a significant plus. Authentication and Authorization: Knowledge of authentication and authorization mechanisms, including JWT (JSON Web Tokens), and experience with custom Identity Providers
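On the authentication side, JWT handling usually comes down to signing and verifying tokens; below is a minimal sketch with PyJWT. The HS256 shared secret keeps the example short, whereas a custom identity provider would more likely use RS256 with published JWKS keys; all names and values are illustrative.

```python
from datetime import datetime, timedelta, timezone

import jwt  # PyJWT

SECRET = "change-me"  # illustrative only; load from a secret manager in practice


def issue_token(user_id: str) -> str:
    claims = {
        "sub": user_id,
        "exp": datetime.now(timezone.utc) + timedelta(hours=1),
    }
    return jwt.encode(claims, SECRET, algorithm="HS256")


def verify_token(token: str) -> dict:
    # Raises jwt.ExpiredSignatureError / jwt.InvalidTokenError on bad tokens.
    return jwt.decode(token, SECRET, algorithms=["HS256"])


print(verify_token(issue_token("user-123"))["sub"])
```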
proficiency in Python (NumPy, Pandas, Scikit-Learn, etc.), SQL, and cloud platforms such as GCP or AWS. Have experience working with modern databases like BigQuery, Snowflake, or Redshift. Have successfully deployed machine learning models or optimisation algorithms in production environments. Have a solid understanding of digital marketing and marketing