Plus) Experience with a distributed computing framework like Apache Spark (using PySpark). Familiarity with cloud data services (AWS S3/Redshift, Azure Data Lake/Synapse, or Google BigQuery/Cloud Storage). Exposure to workflow orchestration tools (Apache Airflow, Prefect, or Dagster). Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field.
Sheffield, South Yorkshire, England, United Kingdom Hybrid / WFH Options
Vivedia Ltd
and Python. Strong grasp of ETL/ELT pipelines, data modeling, and data warehousing. Experience with cloud platforms (AWS, Azure, GCP) and tools like Snowflake, Databricks, or BigQuery. Familiarity with streaming technologies (Kafka, Spark Streaming, Flink) is a plus. Tools & Frameworks: Airflow, dbt, Prefect, CI/CD pipelines, Terraform. Mindset: Curious, data-obsessed, and driven to …
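For context on the ETL/ELT pattern this listing asks for, a minimal extract-transform-load sketch in plain Python. All table and column names are invented for illustration, and an in-memory sqlite3 database stands in for a warehouse like Snowflake or BigQuery:

```python
import sqlite3

# Extract: raw event rows as they might arrive from a source system (made-up data).
raw_events = [
    {"user_id": 1, "amount_pence": 1250, "country": "GB"},
    {"user_id": 2, "amount_pence": 480, "country": "GB"},
    {"user_id": 1, "amount_pence": 300, "country": "FR"},
]

# Transform: aggregate spend per user.
totals = {}
for event in raw_events:
    totals[event["user_id"]] = totals.get(event["user_id"], 0) + event["amount_pence"]

# Load: write the modelled result into a warehouse table
# (sqlite3 is a stand-in for Snowflake/BigQuery/Redshift here).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE user_totals (user_id INTEGER PRIMARY KEY, total_pence INTEGER)")
conn.executemany("INSERT INTO user_totals VALUES (?, ?)", sorted(totals.items()))
conn.commit()

rows = conn.execute(
    "SELECT user_id, total_pence FROM user_totals ORDER BY user_id"
).fetchall()
print(rows)  # [(1, 1550), (2, 480)]
```

In an ELT variant the raw rows would be loaded first and the aggregation expressed as SQL (e.g. a dbt model) inside the warehouse instead of in application code.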
in A/B testing design and evaluation. Experience in data visualization and reporting tools (Tableau, Grafana, Firebase, Devtodev, Power BI). Proficiency in SQL, Python, and/or BigQuery/Postgres for data analysis and modeling. Advanced Excel/VBA skills. Strong understanding of game/product economics, monetization, and user behavior metrics. Excellent analytical skills with the …
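As an illustration of the A/B-test evaluation skill mentioned above, a two-proportion z-test in plain Python. The conversion counts are invented for the example:

```python
from math import sqrt, erf

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference in conversion rates between two variants."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis of no difference.
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF, via erf.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical experiment: control converts 200/10,000, variant 260/10,000.
z, p = two_proportion_z(200, 10_000, 260, 10_000)
print(round(z, 2), round(p, 4))
```

In practice a tool like BigQuery would supply the counts and a library such as SciPy or statsmodels would replace the hand-rolled test; the sketch only shows the underlying calculation.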
City of London, London, United Kingdom Hybrid / WFH Options
Retelligence
practices for governance, lineage, and performance – Enhance data observability and automate key workflows for efficiency You Will Need – Advanced hands-on experience with SQL, Python, and core GCP tools (BigQuery, Dataflow, Pub/Sub, Composer or similar) – Proven background in full-lifecycle data engineering and automation – Strong understanding of CI/CD and infrastructure-as-code for data systems …
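The "data observability" responsibility above often starts with simple automated checks: asserting volume and freshness on a table before downstream jobs run. A sketch in plain Python, with invented table contents and thresholds; in a GCP stack the queries would go to BigQuery rather than the in-memory database used here:

```python
import sqlite3
from datetime import datetime, timedelta, timezone

# Stand-in warehouse table with synthetic load timestamps (newest is 5 minutes old).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, loaded_at TEXT)")
now = datetime.now(timezone.utc)
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [(i, (now - timedelta(minutes=5 * i)).isoformat()) for i in range(1, 11)],
)

def check_observability(conn, min_rows=5, max_staleness=timedelta(hours=1)):
    """Row-count and freshness checks; returns (ok, details) for alerting."""
    count = conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
    latest = conn.execute("SELECT MAX(loaded_at) FROM orders").fetchone()[0]
    staleness = datetime.now(timezone.utc) - datetime.fromisoformat(latest)
    ok = count >= min_rows and staleness <= max_staleness
    return ok, {"rows": count, "staleness_seconds": staleness.total_seconds()}

ok, details = check_observability(conn)
print(ok, details["rows"])
```

A real deployment would run checks like these on a schedule (e.g. from Composer/Airflow) and page on failure; dedicated tools add lineage and anomaly detection on top of the same idea.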
Manchester Area, United Kingdom Hybrid / WFH Options
Morson Edge
pipelines Automating and orchestrating workflows using tools such as AWS Glue, Azure Data Factory or Google Cloud Dataflow Working with leading data platforms like Amazon S3, Azure Synapse, Google BigQuery and Snowflake Implementing Lakehouse architectures using Databricks or Snowflake Collaborating with engineers, analysts and client teams to deliver value-focused data solutions What We’re Looking For: Strong experience …
City of London, London, United Kingdom Hybrid / WFH Options
Paritas Recruitment
strategy definition, vendor assessment, and governance adoption. Experience establishing reference data management and data quality practices at enterprise scale. Knowledge of analytical data architectures: data lakes, warehouses (e.g., Snowflake, BigQuery), BI/reporting, and ETL pipelines. Proficiency in public cloud platforms (AWS, Google Cloud Platform, Microsoft Azure). Experience with both transactional RDBMS and modern distributed/cloud-native …
data accuracy, availability, security, and compliance. Partner with leadership to shape the company's tech and data strategy. Experience: Proven experience in data engineering, analytics, and reporting (SQL, Google BigQuery, Power BI, Looker, Tableau). Strong capability with API integrations and low/no-code tools (n8n, Zapier, Airtable, Make). Broad technical problem-solving skills, able to implement …
offline and online methods. Skilled at navigating ambiguity and turning complex, messy data into actionable insights. Proficiency in SQL and navigating large-scale data lakes and warehouses (e.g. Google BigQuery, Redshift). Experience with cloud platforms such as AWS, GCP is a plus. At JET, this is on the menu Our teams forge connections internally and work with some …
London, South East, England, United Kingdom Hybrid / WFH Options
Utility Warehouse Limited
functional teams and communicating with both technical and non-technical stakeholders. Curious, collaborative, and impact-driven — you enjoy turning data into action. Experience working with cloud-based data environments (BigQuery, GCP); exposure to dbt or Dataform is a plus. Experience with Python is desirable. This is a 6 month contract. It is a hybrid role, based in central London …
solutions. Promote a culture of insight, simplicity, and innovation in data use. Skills & Experience Expert in Microsoft Power Platform; strong SQL skills. Experience with cloud analytics (e.g. Databricks, Snowflake, BigQuery). Skilled in automation, agile delivery, and stakeholder management. Strong understanding of data governance, CI/CD, and version control. Excellent data storytelling and visual communication skills. If you …
london (city of london), south east england, united kingdom
Digital Waffle
everything they build. What You’ll Do Build and own data pipelines that connect product, analytics, and operations Design scalable architectures using tools like dbt, Airflow, and Snowflake/BigQuery Work with engineers and product teams to make data easily accessible and actionable Help evolve their data warehouse and ensure high data quality and reliability Experiment with automation, streaming …
if you Have 7+ years in data or backend engineering, including 2+ years in a lead or technical decision-making role. Are fluent in the modern data stack: DBT, BigQuery, Airflow, Terraform, GCP or AWS. Bring strong software engineering skills: Python, SQL, CI/CD, DevOps mindset. Understand data warehousing, ETL/ELT, orchestration, and streaming pipelines. Thrive in …
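The orchestration concept behind tools like Airflow named above reduces to running tasks in dependency order. A minimal sketch in plain Python using the standard library's topological sorter; the task names are hypothetical, and real orchestrators add scheduling, retries, and persisted state:

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline DAG: each key lists the tasks it depends on.
dag = {
    "transform": {"extract"},
    "load": {"transform"},
    "quality_check": {"transform"},
    "report": {"load", "quality_check"},
}

executed = []

def run(task):
    # A real orchestrator would shell out to dbt, submit a BigQuery job, etc.
    executed.append(task)

# static_order() yields tasks so that every dependency runs before its dependents.
for task in TopologicalSorter(dag).static_order():
    run(task)

print(executed)  # 'extract' always runs first, 'report' always last
```

Airflow expresses the same graph with operators and `>>` dependencies, and adds parallel execution of independent tasks (here, `load` and `quality_check` could run concurrently).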