needed. Requirements: Strong proficiency in Python for data engineering tasks. Experience with cloud platforms (e.g., AWS, Azure, or GCP), including services like S3, Lambda, BigQuery, or Databricks. Solid understanding of ETL processes, data modeling, and data warehousing. Familiarity with SQL and relational databases. Knowledge of big data technologies, such …
South East London, England, United Kingdom Hybrid / WFH Options
83data
Lead role. Proven experience designing and delivering enterprise data strategies. Exceptional communication and stakeholder management skills. Expertise in enterprise-grade data warehouses (Snowflake, BigQuery, Redshift). Hands-on experience with Apache Airflow (or similar orchestration tools). Strong proficiency in Python and SQL for pipeline development. Deep understanding …
lake/lakehouse architectures, and cloud-native analytics platforms. Hands-on experience with SQL and cloud data platforms (e.g., Snowflake, Azure, AWS Redshift, GCP BigQuery). Experience with BI/analytics tools (e.g., Power BI, Tableau) and data visualization best practices. Strong knowledge of data governance, data privacy, and …
fostering a positive and inclusive team culture What we’re looking for: Hands-on experience building and maintaining cloud-based data systems (e.g., Redshift, BigQuery, Snowflake) Strong coding skills in languages commonly used for data work (e.g., Python, Java, Scala) Deep understanding of ETL/ELT tools and workflow …
Owner or Product Owner for data/analytics platforms. Experience managing data products or platforms, ideally customer data platforms (CDPs), data warehouses (e.g. Snowflake, BigQuery), or data lakes. Familiarity with data engineering workflows, data tools (dbt, Airflow), and cloud data platforms (AWS, GCP, Azure). Familiarity with data modelling …
Slough, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
or AI transformation projects. Proven track record in designing and maintaining large-scale data warehouses and data lakes. Expertise in GCP data services including BigQuery, Composer, Dataform, DataProc, and Pub/Sub. Strong programming experience with Python, PySpark, and SQL. Hands-on experience with data modelling, ETL processes, and …
Slough, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
and internal teams. Required Experience: Proven track record of delivering large-scale data platforms using Google Cloud Platform. Hands-on experience with GCP tools: BigQuery, Dataform, Dataproc, Composer, Pub/Sub. Strong programming skills in Python, PySpark, and SQL. Deep understanding of data engineering concepts, including ETL, data warehousing …
similar role. Ability to lead and mentor architects. Mandatory Skills [at least 2 Hyperscalers]: GCP, AWS, Azure, Big Data, Apache Spark, Beam on BigQuery/Redshift/Synapse, Pub/Sub/Kinesis/MQ/Event Hubs, Kafka, Dataflow/Airflow/ADF. Desirable Skills: Designing Databricks-based …
data platforms on Google Cloud Platform, with a focus on data quality at scale. Hands-on expertise in core GCP data services such as BigQuery, Composer, Dataform, Dataproc, and Pub/Sub. Strong programming skills in PySpark, Python, and SQL. Proficiency in ETL processes, data mining, and data storage …
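Several of the listings above stress "data quality at scale" alongside PySpark skills. A minimal sketch of the kind of row-level validation such pipelines run is below; in production this would typically be a PySpark job on Dataproc or a Dataform assertion in BigQuery, and the `orders`-style record shape and rule set here are hypothetical, for illustration only.

```python
# Minimal data-quality check sketch (plain-Python stand-in for a PySpark job).
# The record schema and validation rules below are hypothetical examples.

def validate_row(row: dict) -> list:
    """Return a list of rule violations for one record (empty = clean)."""
    errors = []
    if not row.get("order_id"):
        errors.append("missing order_id")
    if row.get("amount") is None or row["amount"] < 0:
        errors.append("amount must be non-negative")
    if row.get("currency") not in {"GBP", "EUR", "USD"}:
        errors.append("unknown currency")
    return errors

def quality_report(rows: list) -> dict:
    """Summarise pass/fail counts, the metric a pipeline would publish."""
    failures = {
        r.get("order_id", "<missing>"): validate_row(r)
        for r in rows
        if validate_row(r)
    }
    return {"total": len(rows), "failed": len(failures), "violations": failures}

rows = [
    {"order_id": "a1", "amount": 10.0, "currency": "GBP"},
    {"order_id": "a2", "amount": -5.0, "currency": "GBP"},
    {"order_id": "a3", "amount": 3.0, "currency": "XXX"},
]
report = quality_report(rows)
```

The same pass/fail summary could be emitted as a pipeline metric or written back to a monitoring table.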
South East London, England, United Kingdom Hybrid / WFH Options
Singular Recruitment
experience designing, building, and maintaining production-grade data pipelines using Google Cloud Dataflow (Apache Beam) or similar technologies. GCP Stack: Hands-on expertise with BigQuery, Cloud Storage, Pub/Sub, and orchestrating workflows with Composer or Vertex Pipelines. Data Architecture & Modelling: Ability to translate diverse business requirements into scalable …
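The listing above centres on Dataflow/Apache Beam pipelines, whose core idea is composing a pipeline from small, independent transforms. The sketch below shows that transform-composition shape in dependency-free Python; it is not the real Beam API (which uses `beam.Pipeline` and the `|` operator over PCollections), just an illustration of the concept.

```python
# Sketch of the transform-composition idea behind a Beam-style pipeline,
# in plain Python so it runs without the apache-beam dependency.

def pipeline(source, *transforms):
    """Apply each transform (a function over the whole collection) in order."""
    data = source
    for transform in transforms:
        data = transform(data)
    return list(data)

# Collection-level transforms, loosely analogous to beam.Map / beam.Filter.
def parse(lines):
    return (line.split(",") for line in lines)

def keep_valid(records):
    return (r for r in records if len(r) == 2)

def to_amounts(records):
    return (float(amount) for _key, amount in records)

result = pipeline(
    ["a,1.5", "b,2.5", "bad-line", "c,4.0"],
    parse,
    keep_valid,
    to_amounts,
)
```

Each stage stays lazy (generators) until the final materialisation, mirroring how Beam defers execution until the pipeline runs.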
familiar with the auditing process to verify the efficacy of the data being captured. Naturally you’ll be comfortable working with SQL and ideally BigQuery (though a similar data warehousing technology is fine), with Power BI experience being a big bonus, though by no means essential. Bonus points if you …
Milton Keynes, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
current with the rapidly evolving landscape. Proficient with Business Intelligence (BI) tools (e.g., Power BI, Tableau, Looker) and data platforms (e.g., SQL, Snowflake, Google BigQuery). Experience with learning management systems (LMS) and education data standards (e.g., xAPI, LTI, SCORM). Strong analytical mindset with the ability to communicate complex insights …
and accountability. Familiarity with CRM and ERP systems such as Salesforce, Oracle, or SAP. Working knowledge of data warehousing and cloud platforms (e.g., Snowflake, BigQuery, Azure). Ability to identify and apply AI and machine learning tools to enhance forecasting, automate insights, and improve strategic decision-making. Qualifications: Bachelor’s …
AI products. Your responsibilities: Extract and integrate data from various sources based on stakeholder needs. Consolidate data on platforms like dbt, Sigma, Census, Gravity, BigQuery for visualization. Manage the entire ETL cycle within our tech stack. Define and monitor KPIs, maintain documentation, and ensure data governance. Collaborate with the …
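The responsibilities above describe a full extract-consolidate-query cycle. A minimal stdlib sketch of that cycle is below, with `sqlite3` standing in for a cloud warehouse such as BigQuery; the `events` table, its columns, and the sample records are all hypothetical.

```python
# Minimal ETL sketch: extract from a source, transform, load into a warehouse.
# sqlite3 stands in for a cloud warehouse such as BigQuery; the `events`
# table and its columns are hypothetical examples.
import sqlite3

# Extract: pretend these records came from an API export or upstream system.
source_events = [
    {"user": "u1", "action": "signup", "value": 0},
    {"user": "u2", "action": "purchase", "value": 30},
    {"user": "u1", "action": "purchase", "value": 12},
]

# Transform: keep only revenue-generating events.
purchases = [e for e in source_events if e["action"] == "purchase"]

# Load: write the transformed rows into the warehouse table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user TEXT, action TEXT, value REAL)")
conn.executemany(
    "INSERT INTO events VALUES (:user, :action, :value)", purchases
)

# A downstream KPI query, the kind a BI tool or dashboard would issue.
total_revenue = conn.execute("SELECT SUM(value) FROM events").fetchone()[0]
```

In a real stack, the load step would target a warehouse client (e.g., a BigQuery insert job) and the KPI query would feed a dashboard, but the shape of the cycle is the same.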