London, South East England, United Kingdom Hybrid / WFH Options
Noir
pipelines. Work with Python and SQL for data processing, transformation, and analysis. Leverage a wide range of GCP services including Cloud Composer (Apache Airflow), BigQuery, Cloud Storage, Dataflow, Pub/Sub, Cloud Functions, and IAM. Design and implement data models and ETL processes. Apply infrastructure-as-code practices using tools …
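By way of illustration of the pipeline work this listing describes, the sketch below shows a minimal Cloud Composer (Airflow) DAG that loads a daily CSV export from Cloud Storage into BigQuery; the bucket, dataset, and table names are hypothetical placeholders, not taken from the listing.

```python
# Minimal sketch, assuming hypothetical bucket/dataset/table names.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator

with DAG(
    dag_id="daily_sales_load",            # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Load the day's CSV export from Cloud Storage into a BigQuery staging table.
    load_to_bq = GCSToBigQueryOperator(
        task_id="gcs_to_bigquery",
        bucket="example-raw-data",                        # placeholder bucket
        source_objects=["sales/{{ ds }}.csv"],            # one file per execution date
        destination_project_dataset_table="example-project.staging.sales",
        source_format="CSV",
        skip_leading_rows=1,
        autodetect=True,
        write_disposition="WRITE_TRUNCATE",
    )
```

On Cloud Composer the same pattern extends naturally with downstream transformation tasks and IAM-scoped service accounts.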
techniques, including star and snowflake schemas, for efficient data analysis. Familiarity with cloud platforms such as AWS or GCP, including services like Databricks, Redshift, BigQuery, and Snowflake. Strong Python skills for data manipulation, scripting, and automation using libraries like Pandas and NumPy. Experience managing data architecture within data warehouses …
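As a small illustration of the star-schema modelling and Pandas skills mentioned above, here is a toy sketch that joins a fact table to two dimension tables and aggregates revenue; all table and column names are invented for the example.

```python
import pandas as pd

# Hypothetical dimension and fact tables for a star schema.
dim_product = pd.DataFrame({
    "product_id": [1, 2],
    "product_name": ["Widget", "Gadget"],
})
dim_date = pd.DataFrame({
    "date_id": [20240101, 20240102],
    "calendar_date": pd.to_datetime(["2024-01-01", "2024-01-02"]),
})
fact_sales = pd.DataFrame({
    "date_id": [20240101, 20240101, 20240102],
    "product_id": [1, 2, 1],
    "revenue": [120.0, 80.0, 95.0],
})

# Denormalise for analysis: join the fact table to its dimensions,
# then aggregate revenue by product and day.
report = (
    fact_sales
    .merge(dim_product, on="product_id")
    .merge(dim_date, on="date_id")
    .groupby(["calendar_date", "product_name"], as_index=False)["revenue"]
    .sum()
)
print(report)
```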
e.g., Spark, Kafka). Strong experience with cloud data platforms (e.g., AWS, GCP, Azure). Hands-on experience with data warehousing tools (e.g., Snowflake, BigQuery, Redshift). Comfortable with infrastructure-as-code tools and modern DevOps practices. Strong understanding of data governance, security, and compliance. Excellent communication and leadership …
designing, building, and maintaining scalable data pipelines and ETL processes. Proficiency in SQL and experience working with relational and non-relational databases (e.g. Snowflake, BigQuery, PostgreSQL, MySQL, MongoDB). Hands-on experience with big data technologies such as Apache Spark, Kafka, Hive, or Hadoop. Proficient in at least one …
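To make the Spark/Kafka requirement above concrete, below is a minimal PySpark Structured Streaming sketch that reads events from a Kafka topic and persists them to Parquet; the broker address, topic name, and paths are assumptions, and the job needs the spark-sql-kafka connector on the classpath.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("kafka_events_ingest").getOrCreate()

# Read a stream of events from a (hypothetical) Kafka topic.
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")   # placeholder broker
    .option("subscribe", "events")                      # placeholder topic
    .load()
    .select(col("key").cast("string"), col("value").cast("string"), col("timestamp"))
)

# Persist the raw events to Parquet with a checkpoint for progress tracking.
query = (
    events.writeStream.format("parquet")
    .option("path", "/data/raw/events")                 # placeholder output path
    .option("checkpointLocation", "/data/checkpoints/events")
    .start()
)
query.awaitTermination()
```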
Salisbury, South West England, United Kingdom Hybrid / WFH Options
Ascentia Partners
understanding of data security, GDPR, access control, and encryption best practices. Nice-to-Have Skills: Exposure to AWS Redshift, Glue, or Snowflake. Familiarity with BigQuery and Google Analytics APIs. Proficiency in Python, PySpark, or dbt for data transformations. Background in insurance, especially in pricing analytics or actuarial data. …
data from diverse sources. Strong knowledge of SQL/NoSQL databases and cloud data warehouse technology such as Snowflake/Databricks/Redshift/BigQuery, including performance tuning and optimisation. Understanding of best practices for designing scalable and efficient data models, leveraging dbt for transformations. Familiarity with CircleCI, Terraform …
London, South East England, United Kingdom Hybrid / WFH Options
Queen Square Recruitment
data engineering, with a strong focus on Google Cloud Platform (GCP)-based solutions. Proficiency in the GCP platform, particularly in Data & AI services (e.g., BigQuery, Dataproc, Cloud SQL, Dataflow, Pub/Sub, Cloud Data Fusion, Cloud Composer, Python, SQL). Designing, developing, and deploying scalable, reliable, and secure cloud …
London, South East England, United Kingdom Hybrid / WFH Options
Tata Consultancy Services
data engineering, with a strong focus on Google Cloud Platform (GCP)-based solutions. Proficiency in the GCP platform, particularly in Data & AI services (e.g., BigQuery, Dataproc, Cloud SQL, Dataflow, Pub/Sub, Cloud Data Fusion, Cloud Composer, Python, SQL). Designing, developing, and deploying scalable, reliable, and secure cloud …
Cambridge, England, United Kingdom Hybrid / WFH Options
Intellect Group
client-facing environment. Familiarity with tools and frameworks such as Databricks, PySpark, Pandas, and Airflow or dbt. Experience deploying solutions using cloud-native services (e.g., BigQuery, AWS Glue, S3, Lambda). What’s On Offer: Fully remote working with the flexibility to work from anywhere in the UK. Optional weekly in …
to understand our customers better, harness data, and discover insights across all touchpoints, campaigns, and marketing activities. Using tools and databases including Google Sheets, BigQuery (SQL), Tableau, and Python, you'll tackle intriguing problems such as: measuring, quantifying, and optimising complex marketing funnels and consumer journeys; running tests to …
understand our customers better, harnessing data, and discovering insights across all our touchpoints, campaigns, and marketing activities. Using tools and databases including Google Sheets, BigQuery (SQL), Tableau, and Python, you'll tackle intriguing and exciting problems that we're only just starting to understand, in order to: Deepen our …
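As a flavour of the funnel analysis described in the two listings above, here is a sketch that runs a conversion query against BigQuery from Python; the project, dataset, table, and event names are placeholders rather than anything taken from the listings.

```python
from google.cloud import bigquery

client = bigquery.Client()

# Hypothetical funnel: count users reaching each step of a marketing journey.
query = """
    SELECT
      COUNTIF(event_name = 'landing_page_view') AS visits,
      COUNTIF(event_name = 'sign_up')           AS sign_ups,
      COUNTIF(event_name = 'purchase')          AS purchases
    FROM `example-project.marketing.events`      -- placeholder table
    WHERE event_date BETWEEN '2024-01-01' AND '2024-01-31'
"""

row = next(iter(client.query(query).result()))
print(f"visit -> sign-up rate: {row.sign_ups / row.visits:.1%}")
print(f"sign-up -> purchase rate: {row.purchases / row.sign_ups:.1%}")
```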
ELT workflows. Strong analytic skills related to working with unstructured datasets. Engineering best practices and standards. Experience with data warehouse software (e.g. Snowflake, Google BigQuery, Amazon Redshift). Experience with data tools: Hadoop, Spark, Kafka, etc. Code versioning (GitHub integration and automation). Experience with scripting languages such as …
London, England, United Kingdom Hybrid / WFH Options
Focus on SAP
is a plus). Experience with Airflow or similar orchestration tools. Familiarity with MLflow or MLOps practices. Knowledge of data warehousing solutions (Snowflake, Redshift, BigQuery). Consulting background is a plus. Strong communication skills (oral & written). Right to work in the UK is a must (no sponsorship available). Responsibilities: Design …
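To illustrate the MLflow tracking practice mentioned above, a minimal sketch follows; the experiment name, parameters, and metric value are invented for the example.

```python
import mlflow

mlflow.set_experiment("demand-forecast")          # hypothetical experiment name

with mlflow.start_run():
    # Record the configuration and outcome of a (toy) training run.
    mlflow.log_param("model", "gradient_boosting")
    mlflow.log_param("learning_rate", 0.05)
    mlflow.log_metric("rmse", 12.4)
```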
Runcorn, Cheshire, North West, United Kingdom Hybrid / WFH Options
Forward Role
much more. Key Responsibilities: Develop and maintain ETL/ELT data pipelines using Python and SQL. Work with enterprise-level cloud platforms, ideally GCP (BigQuery, Airflow, Cloud Functions). Integrate APIs and process data from multiple sources. Design and optimise data warehouses and reporting systems. Build reports and dashboards …
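As an illustration of the API-integration and pipeline work in this listing, the sketch below pulls JSON from a hypothetical REST endpoint and streams the rows into a BigQuery table; the URL and table ID are placeholders.

```python
import requests
from google.cloud import bigquery

# Hypothetical source API and destination table.
API_URL = "https://api.example.com/v1/orders"
TABLE_ID = "example-project.reporting.orders"

def ingest_orders() -> None:
    """Fetch orders from the API and append them to a BigQuery table."""
    response = requests.get(API_URL, timeout=30)
    response.raise_for_status()
    rows = response.json()                      # expected: a list of flat dicts

    client = bigquery.Client()
    errors = client.insert_rows_json(TABLE_ID, rows)   # streaming insert
    if errors:
        raise RuntimeError(f"BigQuery insert failed: {errors}")

if __name__ == "__main__":
    ingest_orders()
```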
databases. Hands-on experience working with visualization tools including ThoughtSpot, Power BI or Tableau. Familiarity with leading cloud-based data warehouses such as Azure, BigQuery, AWS Redshift, or Snowflake. Strong analytical and problem-solving abilities to address complex data challenges. Detail-oriented mindset with a focus on data accuracy …
commercial environment creating production-grade ETL and ELT pipelines in Python. Comfortable implementing data architectures in analytical data warehouses such as Snowflake, Redshift or BigQuery. Hands-on experience with data orchestrators such as Airflow. Knowledge of Agile development methodologies. Awareness of cloud technology, particularly AWS. Knowledge of automated delivery …
through sharing knowledge and mentoring. Minimum Requirements: Min. 3 years of experience as a Data Analyst/Scientist. Proficiency in a query language/framework (SQL, BigQuery, MySQL) is a MUST. Experience handling big data projects. Experienced in R or Python. Experience with data visualisation tools like Looker. Mastered various …
organisation. Essential Criteria: Bachelor's degree in Statistics, Mathematics, Computer Science, or a related quantitative discipline; 7+ years of experience with advanced SQL (Snowflake, BigQuery, Redshift, Oracle, PostgreSQL, MSSQL, etc.); 5+ years of experience with reporting/visualization tools (Looker, Tableau, Power BI, etc.); strong knowledge of Looker/ …