role and graduate degree in Computer Science, Statistics, Informatics, Information Systems or another quantitative field ● Proficiency in writing SQL queries and knowledge of cloud-based databases like Snowflake, Redshift, BigQuery or other big data solutions ● Experience in data modelling, ETL processes, and data warehousing, with tools such as dbt ● Experience with at least one programming language such as …
experience in Python development for data workflows Experience building and maintaining ETL pipelines, ideally with Apache Airflow or a similar orchestration tool Hands-on experience with Google Cloud Platform (BigQuery, GCS, etc.) or another major cloud provider Good understanding of data modelling principles and the ability to translate business needs into technical solutions Strong collaboration and communication skills, with …
years of experience in data engineering or analytics, including designing and delivering enterprise-grade data solutions Strong hands-on experience with Snowflake (or comparable cloud data warehouses like BigQuery, Redshift, Synapse) including data modelling, performance tuning and cost management Familiarity with SQL best practices, ELT patterns, and modern data transformation frameworks such as dbt Competence in at least one …
big data platforms. Knowledge of data modeling, replication, and query optimization. Hands-on experience with SQL and NoSQL databases is desirable. Familiarity with data warehousing solutions (e.g., Snowflake, Redshift, BigQuery) would be beneficial. Data Platform Management: Comfortable operating in hybrid environments (cloud and on-prem). Experience integrating diverse data sources and systems. Understanding of secure data transfer protocols …
to structure information residing in the data under analysis hierarchically. Robust knowledge of statistical concepts, accompanied by expertise with a set of analytical tools ranging from databases (e.g. SQL, BigQuery, NoSQL) to programming languages (e.g. R, Python, Spark), and from data visualisation (e.g. Power BI, Tableau) to machine learning. Deep expertise in advanced analytics, data science and traditional analytics. High …
and infrastructure as code •Understanding of machine learning workflows or MLOps for deploying and monitoring models in production. •Experience with cloud platforms, especially AWS and its data services (e.g., Redshift, Athena). •Knowledge of network protocols and programming around HTTP(S), SSH, TCP and UDP •Development experience within secure or classified environments, particularly where compliance or restricted networks …
London, England, United Kingdom Hybrid/Remote Options
Route Research Ltd
/ELT pipelines using any major programming language, with a strong preference for Python · Databases & SQL: Extensive experience working with databases, SQL, and data warehouses. Familiarity with Postgres and BigQuery required. · Data Ingestion: Experience ingesting and processing data from diverse sources (databases, data warehouses, and external APIs) · Infrastructure-as-Code (IaC): Hands-on experience with an IaC solution, such as …
including APIs, databases, and flat files. Assist in the setup and optimisation of data infrastructure and storage solutions, such as data warehouses and data lakes, using platforms like Snowflake, BigQuery, or Azure Synapse. Help manage cloud-based infrastructure and services across AWS, Azure, or GCP environments. Learn to monitor data quality and integrity across systems and gain an understanding …
and problem-solving skills, with the ability to convert data into actionable insights. Proficiency in data visualization tools such as Power BI or similar. Experience with Snowflake and GCP (BigQuery) or other cloud-based data warehousing technologies is highly desirable. Familiarity with Python is a plus. Working knowledge of lead scoring criteria, build and implementation is a plus. Great …
closely with stakeholders to translate business requirements into technical solutions. Requirements 5+ years of experience in Data Engineering or a similar role. Strong proficiency in Google Cloud Platform (GCP), BigQuery, Dataflow, Pub/Sub, Composer, etc. Expertise in SQL and programming with Python or Scala. Experience with Airflow, dbt, or other orchestration frameworks. Solid understanding of data warehousing, ETL …
AI/ML and AI-enabled analytics (LLMs, RAG, agents). Strong hands-on coding skills with Spark (PySpark/Scala), SQL, dbt, and modern data platforms (Databricks, Snowflake, BigQuery). Experience with cloud platforms (AWS preferred). Proven expertise in BI and semantic modeling using tools such as Power BI, Tableau, Looker, or Mode. Strong understanding of data …
DevOps, integrating automated testing into a CI/CD pipeline using Jenkins or a GitLab runner. Experience with AWS, Azure or GCP. Good knowledge of GCP Storage buckets, BigQuery, Dataflow and Cloud Functions (or equivalents on another cloud). Hands-on experience with one or more programming languages (Java/JavaScript). Experience with non-functional testing using JMeter or similar …
Airflow, or dbt. Strong SQL skills and experience with cloud platforms (Azure preferred). Solid programming background in Python, Scala, or Java. Knowledge of data warehousing solutions (e.g. Snowflake, BigQuery, Redshift). Strong understanding of data governance, security, and compliance (experience within financial services is a plus). Leadership experience, with the ability to mentor, influence, and set technical …
functionally with engineers, analysts, and product managers to advance data platform capabilities. Own the roadmap for analytics features and data products across internal business units. Requirements Strong experience in BigQuery and dbt for data modelling and transformation. Advanced SQL skills and deep understanding of modern data architecture principles, especially Medallion architecture. Familiarity with orchestration tools such as Airflow. Experience …
London, South East, England, United Kingdom Hybrid/Remote Options
CV TECHNICAL LTD
Uxbridge, England, United Kingdom Hybrid/Remote Options
Pepper Advantage
Data Science, or a related field. 5+ years of experience in data architecture, solution design, or similar roles. Strong experience with modern data platforms and technologies (e.g., Snowflake, Databricks, BigQuery, Azure/AWS/GCP data services). Deep knowledge of data modeling, APIs, event-driven architectures, and cloud-native data architectures. Proven ability to design and implement scalable …
in a simple manner. Technical Skills: Advanced SQL skills essential. Experience with visualisation tools such as Metabase, Tableau, Looker and Power BI. Knowledge of dbt. Familiarity with data warehouses like BigQuery, Snowflake and Redshift. Knowledge of another programming language such as Python or R, or a keen desire to develop programming skills. Good statistical understanding. Other Skills: Strong ability and …
Cucumber or similar framework. Experience in Azure DevOps, integrating automated testing into a CI/CD pipeline using Jenkins or a GitLab runner. Good knowledge of GCP Storage buckets, BigQuery, Dataflow and Cloud Functions (any cloud). Hands-on experience with one or more programming languages. Experience with non-functional testing with JMeter or similar. Proven experience and ability …
City of London, London, United Kingdom Hybrid/Remote Options
Singular Recruitment
in data architecture, data modeling, and scalable storage design Solid engineering practices: Git and CI/CD for data systems Highly Desirable Skills GCP Stack: Hands-on expertise with BigQuery, Cloud Storage, Pub/Sub, and orchestrating workflows with Composer or Vertex Pipelines. Domain Knowledge: Understanding of sports-specific data types ( event, tracking, scouting, video ) API Development: Experience building …
technology solutions. Strong programming skills (JavaScript, TypeScript, Python, or similar). Experience with cloud platforms (preferably Google Cloud Platform) and serverless architecture. Advanced SQL and data warehousing skills (e.g. BigQuery). Familiarity with front-end frameworks such as React. Experience with ETL/ELT processes and data visualisation tools (e.g. Looker Studio). Knowledge of AI integrations and agent …