promised outcomes. Drive high client value and broaden relationships at senior levels with current and prospective clients.
Our Tech Stack
Cloud: Azure, sometimes GCP & AWS
Data Platform: Databricks, Snowflake, BigQuery
Data Engineering tools: PySpark, Polars, DuckDB, Malloy, SQL
Infrastructure-as-code: Terraform, Pulumi
Data Management and Orchestration: Airflow, dbt
Databases and Data Warehouses: SQL Server, PostgreSQL, MongoDB, Qdrant, Pinecone …
similar (for real-time data streaming) Experience using data tools in at least one cloud service - AWS, Azure or GCP (e.g. S3, EMR, Redshift, Glue, Azure Data Factory, Databricks, BigQuery, Dataflow, Dataproc) Would you like to join us as we work hard, have fun and make history?
and compliance standards. Strong analytical and problem-solving skills with attention to detail. Excellent communication and documentation abilities. Preferred: Exposure to cloud data platforms (AWS Redshift, Azure Synapse, Google BigQuery, Snowflake). Knowledge of Python, R, or other scripting languages for data manipulation. Experience in banking, financial services, or enterprise IT environments.
techniques. Excellent analytical, problem-solving, and critical thinking skills. Proven ability to communicate technical findings to non-technical stakeholders. Preferred: Experience with cloud data platforms (e.g., AWS Redshift, Google BigQuery, Snowflake). Familiarity with A/B testing, forecasting, and predictive modelling techniques. Experience in eCommerce as well as Finance and Healthcare industries. This fantastic new role is to …
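The A/B testing familiarity asked for above comes down to comparing conversion rates between two variants and deciding whether the difference is statistically significant. A minimal sketch of a two-proportion z-test using only the Python standard library (function and variable names are illustrative, not from any specific posting):

```python
import math

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference in conversion rates.

    conv_a / conv_b: conversion counts; n_a / n_b: sample sizes.
    Uses the pooled standard error, the textbook formulation.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF (via erf).
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value
```

In practice a library such as SciPy or statsmodels would be used instead, but the arithmetic is exactly this.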
One Big Table (OBT) methodologies.
• Translate business requirements from stakeholders into robust, well-documented and tested dbt models.
• Develop and own workflows within Google Cloud Platform environments, primarily using BigQuery and dbt.
• Write high-quality, optimised SQL for data transformation and analysis.
• Develop and maintain scalable data pipelines within the Google Cloud Platform, ensuring efficient and cost-effective data …
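The dbt-on-BigQuery modelling described in these duties lives in SQL model files. A hypothetical One Big Table model might look like the following sketch (the file path, table and column names are invented for illustration; `{{ ref(...) }}` is dbt's standard mechanism for depending on another model):

```sql
-- models/marts/obt_orders.sql  (hypothetical path)
with orders as (
    select * from {{ ref('stg_orders') }}
),
customers as (
    select * from {{ ref('stg_customers') }}
)
select
    o.order_id,
    o.order_date,
    o.amount,
    c.customer_id,
    c.customer_name,
    c.customer_region
from orders o
join customers c on o.customer_id = c.customer_id
```

The "well-documented and tested" part would be declared alongside the model in a schema `.yml` file (column descriptions, `not_null`/`unique` tests).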
of experience in data engineering, analytics engineering, or a similar technical data role Advanced skills in SQL and Python Proven experience with cloud data warehousing platforms like Redshift, Databricks, BigQuery, or Snowflake Experience building and maintaining ELT pipelines using dbt in a production environment Strong understanding of data modelling principles (e.g., bronze/silver/gold layer design) Hands …
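The bronze/silver/gold layer design this listing asks for is easiest to see in miniature: raw data lands untouched (bronze), is cleaned and typed (silver), then aggregated into business-level tables (gold). A toy Python sketch of the idea (record shapes are invented; real implementations would be dbt models or warehouse tables, not in-memory dicts):

```python
# Bronze: raw events exactly as ingested, including duplicates and bad rows.
bronze = [
    {"order_id": "1", "amount": "10.5", "country": "gb"},
    {"order_id": "1", "amount": "10.5", "country": "gb"},   # duplicate
    {"order_id": "2", "amount": "oops", "country": "US"},   # unparseable amount
    {"order_id": "3", "amount": "7.0", "country": "us"},
]

def to_silver(rows):
    """Silver: deduplicated, typed, standardised records."""
    seen, out = set(), []
    for r in rows:
        if r["order_id"] in seen:
            continue  # drop duplicates
        try:
            amount = float(r["amount"])
        except ValueError:
            continue  # a real pipeline would quarantine bad rows
        seen.add(r["order_id"])
        out.append({"order_id": r["order_id"], "amount": amount,
                    "country": r["country"].upper()})
    return out

def to_gold(rows):
    """Gold: business-level aggregate, here revenue per country."""
    totals = {}
    for r in rows:
        totals[r["country"]] = totals.get(r["country"], 0.0) + r["amount"]
    return totals
```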
Streams)?
Which statement best describes your hands-on responsibility for architecting and tuning cloud-native data lake/warehouse solutions (e.g., AWS S3 + Glue/Redshift, GCP BigQuery, Azure Synapse)?
What best reflects your experience building ETL/ELT workflows with Apache Airflow (or similar) and integrating them into containerised CI/CD pipelines (Docker, GitHub …
as Looker Core, Looker Studio, Tableau, Power BI, or similar.
* Strong SQL Skills - Ability to write efficient queries for data exploration, transformation, and visualization.
* Google Cloud Expertise - Familiarity with BigQuery, Dataform, and GCP analytics services, Looks, and data integration/transformation strategies.
* Data Modeling Expertise - Understanding of LookML or similar semantic modeling frameworks.
* Financial Services Knowledge (Plus) - Understanding of …
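The LookML semantic modelling this listing refers to declares dimensions and measures on top of a warehouse table, so BI users query names rather than raw SQL. A minimal hypothetical view (table and field names invented) might look like:

```lookml
view: orders {
  sql_table_name: analytics.orders ;;

  dimension: country {
    type: string
    sql: ${TABLE}.country ;;
  }

  measure: total_revenue {
    type: sum
    sql: ${TABLE}.amount ;;
  }
}
```

Looker then generates the underlying SQL (grouping by `country`, summing `amount`) whenever a user explores these fields.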
impact of business changes, new product launches, pricing strategies and market shifts, and creating comprehensive financial reports and visualisations using Looker. You'll be working with data from Google BigQuery, though prior experience with this is not required, as long as you bring SQL skills for data extraction and manipulation, and a desire to learn how to work with …
London, South East, England, United Kingdom Hybrid / WFH Options
Interquest
rail industry's revenue allocation system through a suite of technology upgrades and methodological improvements. Key enhancements include migrating the platform to Google Cloud Platform (GCP), implementing a modern BigQuery-based Data Warehouse, and replacing the legacy solution for allocation factor calculation with an innovative, graph database-driven solution. Duties: Lead the design of data architectures and the development …
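At its core, the allocation-factor calculation this posting describes splits each fare across the operators on a journey's path. A highly simplified stand-in, using a plain Python structure rather than a graph database, and allocating by distance purely for illustration (all names, figures, and the allocation rule itself are invented, not the actual industry methodology):

```python
# Each journey lists the operators on its path, with leg distances, and the fare paid.
journeys = [
    {"fare": 100.0, "legs": [("OperatorA", 60), ("OperatorB", 40)]},
    {"fare": 50.0,  "legs": [("OperatorB", 25), ("OperatorC", 25)]},
]

def allocate(journeys):
    """Allocate each fare to operators in proportion to distance travelled."""
    totals = {}
    for j in journeys:
        total_distance = sum(d for _, d in j["legs"])
        for operator, distance in j["legs"]:
            share = j["fare"] * distance / total_distance
            totals[operator] = totals.get(operator, 0.0) + share
    return totals
```

A graph database earns its keep here because real journeys are paths through a large network, and path queries (which operators lie between two stations, over which routes) are awkward in relational SQL.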
Wilmslow, England, United Kingdom Hybrid / WFH Options
The Citation Group
components Programming experience in Python Skills and experience we’d love you to have... Understanding of cloud computing security concepts Experience in relational cloud-based database technologies like Snowflake, BigQuery, Redshift Experience in open-source technologies like Spark, Kafka, Beam Good understanding of Cloud providers – AWS, Microsoft Azure, Google Cloud Familiarity with dbt, Delta Lake, Databricks Experience working in …
skills and ability to explain technical ideas to non-technical audiences. Experience with real-time data pipelines, event-driven architectures (Kafka/Kinesis), or modern data warehouses (Snowflake, Redshift, BigQuery) is a plus. Our Benefits: Paid Vacation Days Health insurance Commuter benefit Employee Stock Purchase Plan (ESPP) Mental Health & Family Forming Benefits Continuing education and corridor travel benefits Our …
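The event-driven architectures mentioned here (Kafka/Kinesis) follow a producer/consumer pattern: producers append events to a topic, consumers process them asynchronously. A broker-free sketch of that shape using only Python's standard-library queue (event names and shapes are invented; a real system would use a Kafka client, not an in-process queue):

```python
import queue
import threading

topic = queue.Queue()  # stands in for a Kafka topic / Kinesis stream
SENTINEL = object()    # signals end of stream (real streams are unbounded)

def producer(events):
    """Publish events to the topic, then signal completion."""
    for e in events:
        topic.put(e)
    topic.put(SENTINEL)

def consumer(out):
    """Consume events until the sentinel, aggregating amounts per event type."""
    while True:
        event = topic.get()
        if event is SENTINEL:
            break
        out[event["type"]] = out.get(event["type"], 0) + event["amount"]

results = {}
worker = threading.Thread(target=consumer, args=(results,))
worker.start()
producer([{"type": "click", "amount": 1},
          {"type": "purchase", "amount": 30},
          {"type": "click", "amount": 1}])
worker.join()
```

The decoupling is the point: the producer never waits for processing, and consumers can be scaled out independently.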
Newbury, Berkshire, England, United Kingdom Hybrid / WFH Options
Intuita
and architecture, with a focus on developing scalable cloud solutions in Azure, GCP or AWS. Data Modelling using Kimball, 3NF or Dimensional methodologies. An analytics-engineering lean, with experience in BigQuery (GCP) and data modelling in dbt; mobile/telecoms industry experience would be beneficial, with deep dbt experience being highly beneficial. Depth of knowledge and understanding across core …
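The Kimball-style dimensional modelling listed above separates descriptive attributes (dimensions) from measurable events (facts), joined by surrogate keys. A tiny star-schema sketch using Python's built-in sqlite3, with an invented schema purely for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension table: one row per customer, holding descriptive attributes.
cur.execute("""create table dim_customer
               (customer_key integer primary key, name text, region text)""")
# Fact table: one row per order, referencing the dimension by surrogate key.
cur.execute("""create table fact_orders
               (order_id integer, customer_key integer, amount real)""")

cur.executemany("insert into dim_customer values (?, ?, ?)",
                [(1, "Acme", "North"), (2, "Globex", "South")])
cur.executemany("insert into fact_orders values (?, ?, ?)",
                [(101, 1, 250.0), (102, 1, 100.0), (103, 2, 75.0)])

# The typical dimensional query: measures from the fact table,
# sliced by an attribute from a dimension.
rows = cur.execute("""
    select d.region, sum(f.amount)
    from fact_orders f
    join dim_customer d on d.customer_key = f.customer_key
    group by d.region
    order by d.region
""").fetchall()
```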
london (city of london), south east england, united kingdom
Focused Futures Consultancy LTD
shaping solution strategies, guiding delivery teams, and acting as a trusted advisor to CDOs, CIOs, and Heads of Data.
🔑 What You’ll Do
Lead enterprise solution design on GCP, BigQuery, Databricks, Snowflake
Drive cloud migration & modernisation strategies (Terraform, CI/CD, Azure DevOps, GitHub)
Define enterprise data models (ERwin, ER/Studio, PowerDesigner)
Architect ETL/ELT frameworks & pipelines …
London, South East, England, United Kingdom Hybrid / WFH Options
Awin
equivalent (expired is acceptable) Working knowledge of SQL and data modelling concepts Experience with BI tools (e.g., Power BI, Looker, Tableau) Familiarity with cloud data platforms such as Snowflake, BigQuery, or AWS Redshift Understanding of modern data architecture and APIs Our Offer Flexi-Week and Work-Life Balance: We prioritise your mental health and wellbeing, offering you a flexible …
modern data lake architectures Advanced proficiency in Python (including PySpark) and SQL, with experience building scalable data pipelines and analytics workflows Strong background in cloud-native data infrastructure (e.g., BigQuery, Redshift, Snowflake, Databricks) Demonstrated ability to lead teams, set technical direction, and collaborate effectively across business and technology functions. Desirable skills: Familiarity with machine learning pipelines and MLOps practices …
Reading, Berkshire, England, United Kingdom Hybrid / WFH Options
Hays Specialist Recruitment Limited
per week) Location: Reading - Hybrid The must-haves: Proven experience in building and managing data pipelines using cloud-native tools, ideally on Google Cloud Platform. Hands-on experience with BigQuery and data transformation frameworks such as Dataform or dbt is essential. A strong background in SQL (particularly PostgreSQL), ETL/ELT processes, and working with large datasets is required.
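A common building block of the ELT work this role describes is an idempotent incremental load: re-running it refreshes recent rows without duplicating them. On PostgreSQL this is typically written with `INSERT ... ON CONFLICT`; a hypothetical example (schema, table and column names are invented):

```sql
-- Refresh the last 7 days of a daily aggregate; safe to re-run.
insert into analytics.daily_sales (day, store_id, revenue)
select order_date, store_id, sum(amount)
from raw.orders
where order_date >= current_date - interval '7 days'
group by order_date, store_id
on conflict (day, store_id)
do update set revenue = excluded.revenue;
```

In Dataform or dbt the same pattern is expressed declaratively as an "incremental" table with a unique key, and the tool generates the merge statement.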