infrastructure as code (IaC), automation, CI/CD pipelines, and application modernization on GCP. Serve as a subject matter expert on various GCP services, including Compute Engine, Kubernetes Engine, BigQuery, Cloud Storage, IAM, and more. Troubleshoot and resolve complex technical issues, providing expert guidance to project teams. Quality Assurance and Best Practices: Conduct regular reviews of project deliverables to … Solid understanding of networking principles and their application within GCP. Experience with containerization technologies (Docker, Kubernetes/GKE). Familiarity with data management and analytics services on GCP (e.g., BigQuery, Cloud Dataflow, Cloud Storage). Strong understanding of identity and access management (IAM) within GCP. Familiarity with other cloud platforms such as AWS and Azure is nice to have. Consulting and …
Biggleswade, England, United Kingdom Hybrid / WFH Options
Kramp
testing), and contribute to data governance. You will help manage structured and unstructured data, enabling consistent, accurate, and timely reporting and analytics across the organization. Tech Stack: dbt | Google BigQuery/SQL | Tableau (occasionally) | Python (occasionally) Your team: You’ll join a dynamic department (AI&T) of ~30 professionals, including project managers, solution builders, and business experts - all working … English Key technical skills: SQL Mastery & Data Modeling Expertise: Deep experience in SQL and modeling principles (Kimball, Data Vault). Data Pipeline Development in dbt: Proven experience with dbt on Google BigQuery or similar platforms. Cloud Environment Experience: Hands-on experience with Google Cloud Platform (GCP) or AWS. CI/CD & Version Control: Familiarity with git and CI/CD workflows …
a) Migration of portfolio from our UW legacy CRM system to Gentrack. b) Build extract scripts to pull data from the Gentrack Junifer database and load the data into the UW BigQuery data warehouse to assist with management, regulatory, operational and other reports. This role will be particularly focused on (b) above and will include: interpreting Junifer data models, building ETL scripts, and populating energy data models in our UW BigQuery environment. We deliver progress. What you’ll do and how you will make an impact. This role would need: Experience in data migration projects (ideally energy platform migration). Experience in the Gentrack Junifer application/data models. Experience in building data warehouse data models. Proficient in SQL, as both data extraction and data loading scripts will be SQL-based. Experience working with Google BigQuery and Dataform environments. Experience in working collaboratively with a multitude of teams to meet timelines and deliverables. What you’ll do: Analyse Gentrack Junifer data models to understand …
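For illustration only (a hedged sketch, not UW's actual tooling), a minimal extract-and-load step into BigQuery could look like the Python below; the CSV path, date column, and table id are invented for the example.

```python
from google.cloud import bigquery
import pandas as pd

def load_junifer_extract(csv_path: str, table_id: str) -> None:
    """Load a CSV extracted from the source database into a BigQuery table."""
    client = bigquery.Client()  # uses application-default credentials
    # "meter_read_date" is a hypothetical column used only for this sketch.
    df = pd.read_csv(csv_path, parse_dates=["meter_read_date"])
    job_config = bigquery.LoadJobConfig(
        write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
    )
    job = client.load_table_from_dataframe(df, table_id, job_config=job_config)
    job.result()  # block until the load job completes

# Example call (hypothetical names): reload the target table on each run.
load_junifer_extract("meter_reads.csv", "my-project.energy.meter_reads")
```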
modelling, data warehousing, and advanced SQL query crafting. Skilled in ETL processes, preferably with tools like dbt and Sigma. Experienced with cloud platforms and database technologies, particularly GCP and BigQuery. Python proficiency is desirable, but not mandatory. Experience working with AI, or interest in AI products. Here’s what your role will entail: Extract and integrate data from diverse …
years’ experience in a similar role. Ability to lead and mentor architects. Mandatory Skills [at least 2 Hyperscalers]: GCP, AWS, Azure, big data, Apache Spark, Beam on BigQuery/Redshift/Synapse, Pub/Sub/Kinesis/MQ/Event Hubs, Kafka, Dataflow/Airflow/ADF. Desirable Skills: Designing Databricks-based solutions for Azure/AWS, Jenkins …
City of London, London, United Kingdom Hybrid / WFH Options
Signify Technology
a focus on data platforms or internal tools Strong understanding of data warehousing, data modelling, and schema design Proficiency in SQL and experience working with Google Cloud Platform and BigQuery Familiarity with Power BI and API integrations Solid knowledge of GDPR and data governance best practices Excellent communication skills and the ability to collaborate across teams Experience working in … better if you have: Hands-on experience with CDP platforms (e.g. Segment, Tealium, mParticle) Background in media, sports, or betting industries Python or similar scripting skills Tech Environment Data: BigQuery, GCP, Power BI Languages & Tools: SQL, APIs, CDP platforms, Python (bonus) Accessibility Statement: Read and apply for this role in the way that works for you by using our …
London, England, United Kingdom Hybrid / WFH Options
HipHopTune Media
using Tableau, Amplitude, and Google Sheets. Create clear, compelling reports that translate data into actionable business recommendations. Use Dataform to transform raw data into structured, usable datasets in our BigQuery data warehouse. Contribute to the development of source-of-truth data models. Maintain and optimize scalable data pipelines. Qualification & Experience: Must haves: 3+ years in data analytics. Strong proficiency …
Central London, London, United Kingdom Hybrid / WFH Options
Singular Recruitment
ETL/ELT Pipelines: Proven experience designing, building, and maintaining production-grade data pipelines using Google Cloud Dataflow (Apache Beam) or similar technologies. GCP Stack: Hands-on expertise with BigQuery , Cloud Storage , Pub/Sub , and orchestrating workflows with Composer or Vertex Pipelines. Data Architecture & Modelling: Ability to translate diverse business requirements into scalable data models and architect a …
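As a rough sketch of the kind of pipeline this listing describes (a hypothetical example, not the client's codebase), a minimal streaming Apache Beam job reading from Pub/Sub into BigQuery might look like the following; the project, subscription, table, and schema are all invented.

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def run() -> None:
    # Streaming mode; Dataflow runner flags (project, region, etc.) omitted.
    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadEvents" >> beam.io.ReadFromPubSub(
                subscription="projects/my-project/subscriptions/events-sub")
            | "Parse" >> beam.Map(json.loads)  # Pub/Sub payloads arrive as bytes
            | "WriteToBQ" >> beam.io.WriteToBigQuery(
                "my-project:analytics.events",
                schema="event_id:STRING,user_id:STRING,ts:TIMESTAMP",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )

if __name__ == "__main__":
    run()
```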
Nice to have: Proficiency in data modelling, building high-quality datasets and products. Experience in Python. Experience with dbt, Airflow, Dagster or similar tools. Experience with Google Cloud Platform (BigQuery, Spark). Background in online trading or related financial sectors. Benefits: State-of-the-art technologies (Power BI, Metabase, BigQuery, Dagster, dbt, Datahub, Spark, Python, SQL). A collaborative and …
each use case. Support knowledge sharing and technical best practice. Essential Experience: Proven expertise in building data warehouses and ensuring data quality on GCP. Strong hands-on experience with BigQuery, Dataproc, Dataform, Composer, Pub/Sub. Skilled in PySpark, Python and SQL. Solid understanding of ETL/ELT processes. Clear communication skills and ability to document processes effectively. Desirable …
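To give a flavour of the PySpark-on-GCP work listed above, here is a hedged sketch of a simple data-quality check using the Spark BigQuery connector (bundled on recent Dataproc images); the table names, staging bucket, and validation rule are invented.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-quality-check").getOrCreate()

# Read a source table from BigQuery (hypothetical table).
orders = (
    spark.read.format("bigquery")
    .option("table", "my-project.sales.orders")
    .load()
)

# Simple quality rule for illustration: flag null or negative amounts.
flagged = orders.withColumn(
    "is_valid",
    F.col("amount").isNotNull() & (F.col("amount") >= 0),
)

# Write results back to BigQuery; indirect writes need a staging bucket.
(
    flagged.write.format("bigquery")
    .option("table", "my-project.sales.orders_checked")
    .option("temporaryGcsBucket", "my-staging-bucket")
    .mode("overwrite")
    .save()
)
```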
London, England, United Kingdom Hybrid / WFH Options
ZipRecruiter
Tool selection, cost management, and team management. Experience required: Experience in building and scaling BI and data architecture. Expertise in modern BI and data warehouse platforms such as Snowflake, BigQuery, Redshift, Power BI, etc. Background in ETL/ELT tooling and data pipelines such as dbt, Fivetran, Airflow. Experience with cloud-based solutions (Azure, AWS, or Google).
ThoughtSpot (beneficial). Familiarity with marketing platforms like GA4, Google Ads, Meta, Klaviyo, etc. Web analytics experience. Solid understanding of e-commerce metrics and attribution models. Data warehouse exposure (Snowflake, BigQuery, etc.). Excel, Google Sheets. Beneficial Experience: Familiarity with Shopify or another e-commerce platform. Fivetran, data engineering, etc. GitHub experience. Business analytics. Effective verbal and written communication skills, demonstrating …
Biggleswade, England, United Kingdom Hybrid / WFH Options
Kramp
cross-functional team combining Business Intelligence, DevOps, FinOps, Data Science, and Software Development; Working within Agile SCRUM methodologies and the Atlassian toolchain (JIRA, Confluence, GitHub, Git); Using Python, Java, BigQuery SQL, and Google Cloud Platform services like Cloud Build, Data Transfers, Workflows, Schedulers, and Functions to build our state-of-the-art cloud data warehouse; Utilizing dbt Cloud, Terraform …
similar area. 5+ years of experience working as a professional software engineer on data in industry. Expertise in Python and its type system. Expertise in writing SQL (GQL, PostgreSQL, and BigQuery are a plus). Experience with building both batch and streaming ETL pipelines using data processing engines. Deep understanding of building knowledge graphs entailing biological ontologies, and leveraging graph DBs …
focus on backend technologies and building distributed services. Proficiency in one or more programming languages including Java, Python, Scala or Golang. Experience with columnar, analytical cloud data warehouses (e.g., BigQuery, Snowflake, Redshift) and data processing frameworks like Apache Spark is essential. Experience with cloud platforms like AWS, Azure, or Google Cloud. Strong proficiency in designing, developing, and deploying microservices …
data into clear, actionable insights. Deliver proactive analyses that are commercially relevant and decision-focused. Analytics Engineering & Data Modelling: Develop and maintain well-structured, scalable data models in Google BigQuery. Apply analytics engineering best practices, including modular SQL development, version control and documentation. Contribute to improving data quality and consistency by helping define and standardise key business metrics and …
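As a purely illustrative example of the analytics-engineering work described (assumed names throughout; nothing here comes from the listing), a reusable reporting model can be defined as a BigQuery view from Python so every dashboard reads one standardised metric definition:

```python
from google.cloud import bigquery

client = bigquery.Client()

# Define an "orders per customer" model once, as a view, so downstream
# reports share a single definition of order_count and revenue.
ddl = """
CREATE OR REPLACE VIEW `my-project.reporting.orders_per_customer` AS
SELECT customer_id,
       COUNT(*) AS order_count,
       SUM(total_amount) AS revenue
FROM `my-project.sales.orders`
GROUP BY customer_id
"""
client.query(ddl).result()  # DDL statements run as ordinary query jobs
```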
and user guides, helping to enforce data management standards. Participate in agile ceremonies and provide occasional client interaction. Engage in DataOps practices and improve data delivery performance. GCP: GCS, BigQuery, GKE, Artifact Registry, Vertex AI, App Engine, Datastore, Secret Manager, Pub/Sub. What do I need to bring with me? It is essential that your personal attributes … role you will need the following skills and experience: A minimum of 3 years relevant commercial experience, with experience of scalable data pipelines using Argo on GKE, SQL on BigQuery, and Python libraries like Pandas. Comfortable with APIs and cloud storage systems. Experience with containerisation (Docker) and orchestration (Kubernetes). Familiarity with Terraform and data systems optimisation. Commitment to data …
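For context, pulling BigQuery results into a pandas DataFrame, one of the building blocks this role mentions, might look like the sketch below; the SQL and table name are hypothetical.

```python
from google.cloud import bigquery

client = bigquery.Client()  # application-default credentials

# Hypothetical aggregation over the last seven days of events.
sql = """
SELECT user_id, event_date, COUNT(*) AS events
FROM `my-project.analytics.events`
WHERE event_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY)
GROUP BY user_id, event_date
"""
df = client.query(sql).to_dataframe()  # requires the db-dtypes extra
print(df.head())
```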
create and manage the platforms and data infrastructure that hold, secure, cleanse and validate, govern, and manage our data. Manage our data platform, incorporating services using Airflow/Composer, BigQuery, Snowflake, Kafka, and Redis running on Kubernetes, GCP, and AWS. Support our Data Science teams with access to data, performing code reviews, aiding model evaluation and testing, and deploying models … transactional, global, industry. Experience with advertising technology (AdTech) is highly desired. Proven experience and a passion for developing and operating data-oriented solutions using Python, Airflow/Composer, Kafka, Snowflake, BigQuery, and a mix of data platforms such as Spark, AWS Athena, Postgres and Redis. Excellent SQL development, query optimization and data pipeline development skills required. Strong experience using public …
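As an illustrative sketch of the Airflow/Composer side of such a platform (not this employer's actual DAGs), a minimal two-task DAG could look like this; the DAG id, schedule, and task bodies are invented.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract() -> None:
    print("extract")  # placeholder: pull raw events from an upstream source

def load() -> None:
    print("load")  # placeholder: load the cleaned batch into the warehouse

with DAG(
    dag_id="daily_events_etl",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+ keyword; older versions use schedule_interval
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task  # run extract before load
```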