promised outcomes. Drive high client value and broaden relationships at senior levels with current and prospective clients. Our Tech Stack: Cloud: Azure, sometimes GCP & AWS; Data Platform: Databricks, Snowflake, BigQuery; Data Engineering tools: PySpark, Polars, DuckDB, Malloy, SQL; Infrastructure-as-code: Terraform, Pulumi; Data Management and Orchestration: Airflow, dbt; Databases and Data Warehouses: SQL Server, PostgreSQL, MongoDB, Qdrant, Pinecone …
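For illustration only (not part of the listing above): a minimal sketch of the kind of lightweight transformation this stack implies, using Polars for in-memory work and DuckDB for SQL over the same data. All table and column names are invented.

```python
import duckdb
import polars as pl

# Toy input; a real pipeline would read from the platforms named above.
orders = pl.DataFrame({
    "customer_id": [1, 1, 2, 3],
    "amount": [120.0, 80.0, 45.5, 300.0],
})

# Aggregate revenue per customer with Polars (group_by requires Polars >= 0.19).
revenue = (
    orders
    .group_by("customer_id")
    .agg(pl.col("amount").sum().alias("total_amount"))
    .sort("customer_id")
)

# DuckDB can run SQL directly over the in-scope Polars frame.
top = duckdb.sql(
    "SELECT customer_id, total_amount FROM revenue WHERE total_amount > 100"
).pl()

print(top)
```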
similar (for real-time data streaming). Experience using data tools in at least one cloud service - AWS, Azure or GCP (e.g. S3, EMR, Redshift, Glue, Azure Data Factory, Databricks, BigQuery, Dataflow, Dataproc). Would you like to join us as we work hard, have fun and make history? …
and compliance standards. Strong analytical and problem-solving skills with attention to detail. Excellent communication and documentation abilities. Preferred: Exposure to cloud data platforms (AWS Redshift, Azure Synapse, Google BigQuery, Snowflake). Knowledge of Python, R, or other scripting languages for data manipulation. Experience in banking, financial services, or enterprise IT environments. …
Streams)? Which statement best describes your hands-on responsibility for architecting and tuning cloud-native data lake/warehouse solutions (e.g., AWS S3 + Glue/Redshift, GCP BigQuery, Azure Synapse)? What best reflects your experience building ETL/ELT workflows with Apache Airflow (or similar) and integrating them into containerised CI/CD pipelines (Docker, GitHub …
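Purely as a sketch of what the Airflow question above is probing (hypothetical task names; assumes Airflow 2.x with the TaskFlow API), an ETL-style DAG might be structured like this:

```python
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def daily_revenue_etl():
    @task
    def extract() -> list[dict]:
        # Stand-in for pulling from an API, object store or database.
        return [{"customer_id": 1, "amount": 120.0}, {"customer_id": 2, "amount": 45.5}]

    @task
    def transform(rows: list[dict]) -> list[dict]:
        # Keep only rows above a threshold; real logic would be far richer.
        return [r for r in rows if r["amount"] > 50]

    @task
    def load(rows: list[dict]) -> None:
        # Stand-in for a warehouse load (Redshift, BigQuery, Synapse, ...).
        print(f"loading {len(rows)} rows")

    load(transform(extract()))


daily_revenue_etl()
```

In a containerised CI/CD setup the same DAG file would typically be linted and unit-tested in the pipeline, then shipped in the scheduler image or synced to the DAGs folder.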
London, South East, England, United Kingdom Hybrid / WFH Options
Interquest
rail industry's revenue allocation system through a suite of technology upgrades and methodological improvements. Key enhancements include migrating the platform to Google Cloud Platform (GCP), implementing a modern BigQuery-based Data Warehouse, and replacing the legacy solution for allocation factor calculation with an innovative, graph database-driven solution. Duties: Lead the design of data architectures and the development …
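As a purely hypothetical illustration of graph-driven allocation factors (not the actual system; networkx stands in for the graph database, and the operators, stations and distances are invented), revenue for a journey could be split in proportion to the distance each operator carries the passenger:

```python
import networkx as nx

journey = nx.DiGraph()
# Each edge is one leg of a journey: origin -> destination, with operator and distance.
journey.add_edge("London", "York", operator="OperatorA", distance_km=300)
journey.add_edge("York", "Edinburgh", operator="OperatorB", distance_km=330)


def allocation_factors(graph: nx.DiGraph) -> dict[str, float]:
    """Share of total journey distance carried by each operator."""
    totals: dict[str, float] = {}
    for _, _, data in graph.edges(data=True):
        totals[data["operator"]] = totals.get(data["operator"], 0.0) + data["distance_km"]
    grand_total = sum(totals.values())
    return {op: dist / grand_total for op, dist in totals.items()}


fare = 150.00
for operator, factor in allocation_factors(journey).items():
    print(f"{operator}: factor={factor:.3f}, allocated={fare * factor:.2f}")
```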
skills and ability to explain technical ideas to non-technical audiences. Experience with real-time data pipelines, event-driven architectures (Kafka/Kinesis), or modern data warehouses (Snowflake, Redshift, BigQuery) is a plus. Our Benefits: Paid Vacation Days, Health insurance, Commuter benefit, Employee Stock Purchase Plan (ESPP), Mental Health & Family Forming Benefits, Continuing education and corridor travel benefits. Our …
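For a concrete (and entirely hypothetical) flavour of the event-driven side mentioned above, a minimal consumer with the kafka-python package might look like this; topic, broker and group names are invented, and Kinesis or another broker would use a different client:

```python
import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "orders",                                  # invented topic name
    bootstrap_servers="localhost:9092",        # invented broker address
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    auto_offset_reset="earliest",
    group_id="analytics-demo",
)

for message in consumer:
    order = message.value
    # A real handler would validate, enrich and forward to a warehouse or stream job.
    print(f"partition={message.partition} offset={message.offset} order={order}")
```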
London (City of London), South East England, United Kingdom
Focused Futures Consultancy LTD
shaping solution strategies, guiding delivery teams, and acting as a trusted advisor to CDOs, CIOs, and Heads of Data. 🔑 What You’ll Do: Lead enterprise solution design on GCP, BigQuery, Databricks, Snowflake; Drive cloud migration & modernisation strategies (Terraform, CI/CD, Azure DevOps, GitHub); Define enterprise data models (ERwin, ER/Studio, PowerDesigner); Architect ETL/ELT frameworks & pipelines …
London, South East, England, United Kingdom Hybrid / WFH Options
Awin
equivalent (expired is acceptable); Working knowledge of SQL and data modelling concepts; Experience with BI tools (e.g., Power BI, Looker, Tableau); Familiarity with cloud data platforms such as Snowflake, BigQuery, or AWS Redshift; Understanding of modern data architecture and APIs. Our Offer: Flexi-Week and Work-Life Balance: We prioritise your mental health and wellbeing, offering you a flexible …
modern data lake architectures; Advanced proficiency in Python (including PySpark) and SQL, with experience building scalable data pipelines and analytics workflows; Strong background in cloud-native data infrastructure (e.g., BigQuery, Redshift, Snowflake, Databricks); Demonstrated ability to lead teams, set technical direction, and collaborate effectively across business and technology functions. Desirable skills: Familiarity with machine learning pipelines and MLOps practices …
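As a rough sketch of the PySpark proficiency described above (local session, invented columns; production code would read from and write back to the lake or warehouse):

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("demo-pipeline").getOrCreate()

events = spark.createDataFrame(
    [("2024-01-01", "search", 3), ("2024-01-01", "click", 1), ("2024-01-02", "click", 5)],
    ["event_date", "event_type", "count"],
)

# Typical batch aggregation step in a scalable pipeline.
daily = (
    events
    .groupBy("event_date", "event_type")
    .agg(F.sum("count").alias("total"))
    .orderBy("event_date")
)

daily.show()
# e.g. daily.write.mode("overwrite").parquet("s3://example-bucket/daily_events/")  # invented path
spark.stop()
```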
City of London, London, United Kingdom Hybrid / WFH Options
Medialab Group
version control best practices in a collaborative engineering environment. Strong communicator - able to engage with both technical and non-technical colleagues. Nice to Have Skills: Experience working with GCP (BigQuery) or other modern cloud-native data warehouses (e.g. Snowflake, Redshift). Familiarity with data pipelining and orchestration systems (e.g. Airflow). Understanding of modern analytics architectures and data visualisation …
the team. Required skills and experience: Advanced Google Standard SQL and MySQL or similar relational database language; Fluency in Python and its application in data analysis; Familiarity with Google BigQuery or other data warehouse solution; Experience with Tableau, creating & managing data sources and converting complex insights into a digestible format; Working knowledge of Google Analytics or other web-based …
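To make the Google Standard SQL and BigQuery requirement above concrete, here is a minimal, hypothetical sketch using the google-cloud-bigquery client (assumes default application credentials; project, dataset and table names are invented):

```python
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # invented project id

sql = """
    SELECT event_date, COUNT(*) AS sessions
    FROM `example-project.analytics.web_sessions`
    WHERE event_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY)
    GROUP BY event_date
    ORDER BY event_date
"""

# Run the query and pull the result into a pandas DataFrame for analysis.
df = client.query(sql).to_dataframe()
print(df.head())
```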
City of London, London, United Kingdom Hybrid / WFH Options
McGregor Boyall Associates Limited
re Looking For Hands-on Palantir Foundry expertise or transferable, client-facing data engineering experience with large-scale data platforms such as: Snowflake, Databricks, AWS Glue/Redshift, Google BigQuery. Software engineering skills in Python, Java, or TypeScript/React. Strong data modelling, pipeline development, and API design experience. Excellent problem-solving and communication skills. Why This Role Stands …
London, South East, England, United Kingdom Hybrid / WFH Options
Opus Recruitment Solutions Ltd
two days of work per week. Key Requirements: Proficiency with Python and SQL for data transformation, pipeline development, and model integration; Experience with modern data lake technologies such as BigQuery or Snowflake; Experience working with large, complex, and high-volume data sets; The ability to unify, collate, and interpret data to support data-driven decisions; Proficient in using analytics …
London, South East, England, United Kingdom Hybrid / WFH Options
Office Angels
cloud platforms, preferably Azure (AWS, GCP experience is also valuable). Experience managing large and complex datasets, with a strong command of SQL for cloud-based environments (Fabric, Snowflake, BigQuery, Redshift, etc.). A solid understanding of data modelling techniques (star schema, data vault, dimensional modelling). Proficiency in Excel-based data workflows for various Agile Retail projects. Hands …
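As a toy illustration of the star-schema modelling mentioned above (invented tables and figures; sqlite3 is used only as a stand-in for a cloud warehouse such as Fabric, Snowflake, BigQuery or Redshift):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, product_name TEXT, category TEXT);
    CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, calendar_date TEXT, month_name TEXT);
    CREATE TABLE fact_sales  (date_key INTEGER, product_key INTEGER, units INTEGER, revenue REAL);

    INSERT INTO dim_product VALUES (1, 'Widget', 'Hardware'), (2, 'Gizmo', 'Hardware');
    INSERT INTO dim_date    VALUES (20240101, '2024-01-01', 'January'), (20240102, '2024-01-02', 'January');
    INSERT INTO fact_sales  VALUES (20240101, 1, 10, 59.90), (20240102, 1, 4, 23.96), (20240102, 2, 7, 13.93);
""")

# Classic star-schema query: facts aggregated by dimension attributes.
rows = con.execute("""
    SELECT d.month_name, p.category, SUM(f.revenue) AS total_revenue
    FROM fact_sales f
    JOIN dim_date d    ON d.date_key = f.date_key
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY d.month_name, p.category
""").fetchall()

for month, category, revenue in rows:
    print(month, category, round(revenue, 2))
```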
London, South East, England, United Kingdom Hybrid / WFH Options
Holland & Barrett International Limited
deliver end-to-end data products. Mentor mid-level analysts and contribute to capability-building across the Core Business Analytics team. Work with a modern data stack, including Redshift, BigQuery, Matillion, and Retool. Location: This is a hybrid role, with 2 days per week expected in either our London or Nuneaton office. The Person - Core Skills & Behaviours: SQL expertise …