similar (for real-time data streaming). Experience using data tools in at least one cloud service - AWS, Azure or GCP (e.g. S3, EMR, Redshift, Glue, Azure Data Factory, Databricks, BigQuery, Dataflow, Dataproc). Would you like to join us as we work hard, have fun and make history?
About FairPlay Sports Media: We’re a sports media network, focused on building and nurturing a portfolio of highly engaged and connected communities of sports fans and bettors to create value for our partners. We empower sports fans with real
City of London, London, United Kingdom Hybrid / WFH Options
FairPlay Sports Media
data warehouse setup (incl. modeling), data activation, data quality, data governance, real-time reporting; Hands-on experience in a GCP (Google Cloud Platform) environment, preferably with most of the following: BigQuery, Cloud Composer, Cloud Run, Cloud Monitoring & Logging, Dataplex, Beam, Tentacles and Pub/Sub; Fluent Python and SQL skills with real-life project experience; Experience with orchestration tools such as … Airflow and DBT; Experience with one of the major analytical DWHs is a plus: BigQuery, Redshift, Snowflake, Databricks, Synapse; Work experience with the following technologies is noteworthy and might be seen as a bonus: AWS (and Data-related proprietary technologies), Azure (and Data-related proprietary technologies), Adverity, Fivetran, Looker, PBI, Tableau, RDBMS, Spark, Redis, Kafka; Being a fair, kind and reliable
skills (NumPy, Pandas, SQLAlchemy) and expert-level SQL across multiple database platforms; Hands-on experience with modern data stack tools including dbt, Airflow, and cloud data warehouses (Snowflake, BigQuery, Redshift); Strong understanding of data modelling, schema design, and building maintainable ELT/ETL pipelines; Experience with cloud platforms (AWS, Azure, GCP) and infrastructure-as-code practices; Familiarity with
comfortably across both engineering and analytics, and is excited about building internal tools that directly improve product and customer experiences. You'll be working with a mature stack (Python, BigQuery, dbt, FastAPI, Metabase), and your day-to-day will include both writing production-level code and making data actually useful for decision-makers. Main responsibilities: Build, maintain, and optimize … version control (GitLab) Experience Required: 5+ years of Python, including writing production-level APIs; Strong SQL and dbt for data transformation and modeling; Experience with modern data stack components: BigQuery, GCS, Docker, FastAPI; Solid understanding of data warehousing principles; Proven ability to work cross-functionally with both technical and non-technical stakeholders; Comfortable maintaining and optimizing BI dashboards (Metabase
Infrastructure: Design, develop, and maintain scalable data pipelines and infrastructure to enable reliable analytics and data-driven decisions. ETL/ELT Development: Build, optimise, and manage ETL processes in BigQuery and associated tools to ensure timely and accurate data ingestion. Data Warehousing: Maintain and evolve our BigQuery data warehouse, ensuring it is performant, reliable, and aligned with business … Automate and streamline infrastructure deployment and management processes, improving operational efficiency. What we're looking for: Experience: Practical production experience building, optimising and maintaining data pipelines and warehouses, using BigQuery or similar cloud data solutions. Technical Skills: Strong proficiency in SQL and ETL/ELT frameworks and experience with data modelling, optimisation, and pipeline orchestration. Python or similar programming
similar scripting languages for data science Experience with data processes and building ETL pipelines Experience with cloud data warehouses such as Snowflake, Azure Data Warehouse, Amazon Redshift, or Google BigQuery Proficiency in creating visualizations using Power BI or Tableau Experience designing ETL/ELT solutions with tools like SSIS, Alteryx, AWS Glue, Databricks, IBM DataStage Strong analytical and technical
+ BigQuery + Dashboard Experience Location: Heathrow (Hybrid, 2x week onsite) Rate: £550 Inside IR35 We are seeking a talented Data Analyst with expertise in utilising Google BigQuery and Google Cloud Platform (GCP) for data analysis and insights generation. The ideal candidate should have a strong analytical mindset, proficiency in dashboards, querying and manipulating large datasets, and … complex data into actionable insights that drive business decisions. Candidate Specification What you’ll bring: Proven experience as a Data Analyst, with a focus on utilising GCP services and BigQuery for data analysis. Proficiency in SQL and experience in writing complex queries for data extraction and transformation. Strong understanding of data modelling concepts and data warehousing principles. Familiarity with
City of London, London, United Kingdom Hybrid / WFH Options
83data
more junior members of staff. Required Skills & Experience: Proven experience as a Data Engineer in a commercial environment. Strong hands-on experience with Google Cloud Platform (GCP) services (e.g., BigQuery, Dataflow, Pub/Sub). Solid understanding of Azure data services and hybrid cloud environments. Advanced SQL skills and proficiency in Python for data engineering tasks. Experience working in
SaaS or tech environment, but this isn't set in stone! Familiarity with digital advertising platforms (Google Ads, Facebook Ads, LinkedIn Ads, TikTok, etc.) We use Google Analytics 4, BigQuery, Fivetran, Amplitude, Metabase and DBT amongst other things for our data stack, so experience here would be beneficial, however, experience in similar tools (if not these ones) is a
used backend programming languages (Python, Node.js, Java, PHP, Go, C#, C++) Familiar with version control tools and proper branching techniques (GitLab preferred) Experience working with data warehouses (Google Cloud BigQuery), data governance, payments and treasury or capital markets systems So, what's in it for you? Our people are constantly striving to be the best through operational excellence. The
and non-technical stakeholders alike. The Head of Data Engineering & Insight will work within a modern, cloud-based BI ecosystem, including: Data Integration: Fivetran, HVR, Databricks, Apache Kafka, Google BigQuery, Google Analytics 4 Data Lake & Storage: Databricks Delta Lake, Amazon S3 Data Transformation: dbt Cloud Data Warehouse: Snowflake Analytics & Reporting: Power BI, Excel, Snowflake SQL REST API Advanced Analytics
Senior Data Engineer London (Hybrid) We are Manufacturing the Future! Geomiq is revolutionizing traditional manufacturing by providing engineers worldwide with instant access to reliable production methods through our digital platform. As the UK's leading Digital Manufacturing Marketplace, we offer
Experience as a Data Product Owner or Product Owner for data/analytics platforms. Experience managing data products or platforms, ideally customer data platforms (CDPs), data warehouses (e.g. Snowflake, BigQuery), or data lakes. Familiarity with data engineering workflows, data tools (dbt, Airflow), and cloud data platforms (AWS, GCP, Azure). Familiarity with data modelling, data pipelines, ETL/ELT
years’ experience in a similar role. Ability to lead and mentor the architects. Mandatory Skills [at least 2 Hyperscalers]: GCP, AWS, Azure, Big Data, Apache Spark, Beam on BigQuery/Redshift/Synapse, Pub/Sub/Kinesis/MQ/Event Hubs, Kafka, Dataflow/Airflow/ADF Desirable Skills: Designing Databricks-based solutions for Azure/AWS, Jenkins
You Strong experience as a Product Owner or Data Product Owner in Agile environments. Experience managing data products or platforms, ideally customer data platforms (CDPs), data warehouses (e.g. Snowflake, BigQuery), or data lakes. Familiarity with data modelling, data pipelines, ETL/ELT processes, and APIs. Knowledge of GDPR, data ethics, and governance practices. Strong understanding of data quality frameworks
expertise to grow top-line revenues and guide commercial initiatives from our data. You'll own the analysis of the end-to-end customer journey, using our data stack (BigQuery, dbt, Hex) to create data models, data products, metrics and find insights that fuel our growth. You'll work closely with other engineers, marketers, product teams, and commercial teams
management practices, system development life cycle management, IT services management, agile and lean methodologies, infrastructure and operations, and EA and ITIL frameworks. Proficiency with data warehousing solutions (e.g., Google BigQuery, Snowflake). Expertise in data modeling tools and techniques (e.g., SAP PowerDesigner, EA Sparx). Strong knowledge of SQL and NoSQL databases (e.g., MongoDB, Cassandra). Familiarity with cloud
Streams)? Which statement best describes your hands-on responsibility for architecting and tuning cloud-native data lake/warehouse solutions (e.g., AWS S3 + Glue/Redshift, GCP BigQuery, Azure Synapse)? What best reflects your experience building ETL/ELT workflows with Apache Airflow (or similar) and integrating them into containerised CI/CD pipelines (Docker, GitHub