experience in Python development for data workflows Experience building and maintaining ETL pipelines, ideally with Apache Airflow or a similar orchestration tool Hands-on experience with Google Cloud Platform (BigQuery, GCS, etc.) or another major cloud provider Good understanding of data modelling principles and the ability to translate business needs into technical solutions Strong collaboration and communication skills, with More ❯
years of experience in data engineering or analytics, including designing and delivering enterprise-grade data solutions Strong hands-on experience with Snowflake (or comparable cloud data warehouses like BigQuery, Redshift, Synapse) including data modelling, performance tuning and cost management Familiarity with SQL best practices, ELT patterns, and modern data transformation frameworks such as dbt Competence in at least one More ❯
London, England, United Kingdom Hybrid/Remote Options
Route Research Ltd
/ELT pipelines using any major programming language, with a strong preference for Python · Databases & SQL: Extensive experience working with databases, SQL, and data warehouses. Familiarity with Postgres and BigQuery required. · Data Ingestion: Experience ingesting and processing data from diverse sources (databases, data warehouses, and external APIs) · Infrastructure-as-Code (IaC): Hands-on experience with an IaC solution, such More ❯
closely with stakeholders to translate business requirements into technical solutions. Requirements 5+ years of experience in Data Engineering or a similar role. Strong proficiency in Google Cloud Platform (GCP), BigQuery, Dataflow, Pub/Sub, Composer, etc. Expertise in SQL and programming with Python or Scala. Experience with Airflow, dbt, or other orchestration frameworks. Solid understanding of data warehousing, ETL More ❯
Airflow, or dbt. Strong SQL skills and experience with cloud platforms (Azure preferred). Solid programming background in Python, Scala, or Java. Knowledge of data warehousing solutions (e.g. Snowflake, BigQuery, Redshift). Strong understanding of data governance, security, and compliance (experience within financial services is a plus). Leadership experience, with the ability to mentor, influence, and set technical More ❯
London, South East, England, United Kingdom Hybrid/Remote Options
CV TECHNICAL LTD
Uxbridge, England, United Kingdom Hybrid/Remote Options
Pepper Advantage
Data Science, or a related field. 5+ years of experience in data architecture, solution design, or similar roles. Strong experience with modern data platforms and technologies (e.g., Snowflake, Databricks, BigQuery, Azure/AWS/GCP data services). Deep knowledge of data modeling, APIs, event-driven architectures, and cloud-native data architectures. Proven ability to design and implement scalable More ❯
in a simple manner. Technical Skills: Advanced SQL skills essential. Experience with visualisation tools, such as Metabase, Tableau, Looker and Power BI. Knowledge of dbt. Familiarity with data warehouses like BigQuery, Snowflake and Redshift. Knowledge of another programming language such as Python or R, or a keen desire to develop programming skills. Good statistical understanding. Other Skills: Strong ability and More ❯
City of London, London, United Kingdom Hybrid/Remote Options
Singular Recruitment
in data architecture, data modeling, and scalable storage design Solid engineering practices: Git and CI/CD for data systems Highly Desirable Skills GCP Stack: Hands-on expertise with BigQuery, Cloud Storage, Pub/Sub, and orchestrating workflows with Composer or Vertex Pipelines. Domain Knowledge: Understanding of sports-specific data types (event, tracking, scouting, video) API Development: Experience building More ❯
technology solutions. Strong programming skills (JavaScript, TypeScript, Python, or similar). Experience with cloud platforms (preferably Google Cloud Platform) and serverless architecture. Advanced SQL and data warehousing skills (e.g. BigQuery). Familiarity with front-end frameworks such as React. Experience with ETL/ELT processes and data visualisation tools (e.g. Looker Studio). Knowledge of AI integrations and agent More ❯
Plus) Experience with a distributed computing framework like Apache Spark (using PySpark). Familiarity with cloud data services (AWS S3/Redshift, Azure Data Lake/Synapse, or Google BigQuery/Cloud Storage). Exposure to workflow orchestration tools (Apache Airflow, Prefect, or Dagster). Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field. More ❯
City of London, London, United Kingdom Hybrid/Remote Options
Retelligence
practices for governance, lineage, and performance – Enhance data observability and automate key workflows for efficiency You Will Need – Advanced hands-on experience with SQL, Python, and core GCP tools (BigQuery, Dataflow, Pub/Sub, Composer or similar) – Proven background in full-lifecycle data engineering and automation – Strong understanding of CI/CD and infrastructure-as-code for data systems More ❯
strategy definition, vendor assessment, and governance adoption. Experience establishing reference data management and data quality practices at enterprise scale. Knowledge of analytical data architectures: data lakes, warehouses (e.g., Snowflake, BigQuery), BI/reporting, and ETL pipelines. Proficiency in public cloud platforms (AWS, Google Cloud Platform, Microsoft Azure). Experience with both transactional RDBMS and modern distributed/cloud-native More ❯
City of London, London, United Kingdom Hybrid/Remote Options
Paritas Recruitment
solutions. Promote a culture of insight, simplicity, and innovation in data use. Skills & Experience Expert in Microsoft Power Platform; strong SQL skills. Experience with cloud analytics (e.g. Databricks, Snowflake, BigQuery). Skilled in automation, agile delivery, and stakeholder management. Strong understanding of data governance, CI/CD, and version control. Excellent data storytelling and visual communication skills. If you More ❯
if you Have 7+ years in data or backend engineering, including 2+ years in a lead or technical decision-making role. Are fluent in the modern data stack: dbt, BigQuery, Airflow, Terraform, GCP or AWS. Bring strong software engineering skills: Python, SQL, CI/CD, DevOps mindset. Understand data warehousing, ETL/ELT, orchestration, and streaming pipelines. Thrive in More ❯
data pipelines in production environments Strong Python and SQL skills (Pandas, PySpark, query optimisation) Cloud experience (AWS preferred) including S3, Redshift, Glue, Lambda Familiarity with data warehousing (Redshift, Snowflake, BigQuery) Experience with workflow orchestration tools (Airflow, Dagster, Prefect) Understanding of distributed systems, batch and streaming data (Kafka, Kinesis) Knowledge of IaC (Terraform, CloudFormation) and containerisation (Docker, Kubernetes) Nice to More ❯