Remote Senior Data Analyst Jobs in High Wycombe

2 of 2 Remote Senior Data Analyst Jobs in High Wycombe

Senior Data Analyst - GCP, PostgreSQL, AWS, ETL, SQL - Hybrid, Milton Keynes

High Wycombe, South East England, United Kingdom
Hybrid / WFH Options
MRP-Global
A large global organisation is looking to hire a Senior Data Analyst to lead end-to-end data workflows from requirement to delivery, including data product creation and secure data transfer from Google Cloud Platform to PostgreSQL. This is a full-time, permanent position. The role will be on a Hybrid/Remote … to automate those views, ensuring reliable execution on daily, weekly, or customized schedules.

Execute Cross-Platform ETL with AWS Glue: Develop, deploy, and maintain AWS Glue jobs to extract data from GCP (such as BigQuery or GCS) and load it into PostgreSQL. Set up secure connectivity, schedule jobs via cron or trigger mechanisms, and ensure data pipelines are … errors. Conduct root cause analysis for pipeline failures—whether due to schema mismatches or performance bottlenecks—and apply robust fixes. Document resolutions to strengthen system resilience.

Design, Build, & Govern Data Products: Architect, construct, and maintain reusable data products, embedding clean datasets, metadata, governance policies, and clearly defined data contracts. Ensure compliance with FAIR principles—data being …
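As a rough illustration of the "automate those views" duty described above, the sketch below orders view refreshes so each view runs only after the relations it reads from. The view names and dependency graph are hypothetical; a real deployment would hand this ordering to a scheduler such as Airflow rather than compute it inline.

```python
from graphlib import TopologicalSorter

# Hypothetical dependency graph: each SQL view maps to the set of
# relations it reads from. Names are invented for this sketch.
deps = {
    "vw_daily_sales": {"raw_orders"},
    "vw_customer_ltv": {"vw_daily_sales", "raw_customers"},
}

def refresh_order(graph: dict) -> list:
    """Return a dependency-safe execution order for the SQL views."""
    return list(TopologicalSorter(graph).static_order())

order = refresh_order(deps)
# Upstream relations appear before any view that reads from them,
# so a daily job can refresh the views in this order safely.
```

The same ordering logic is what a DAG scheduler applies internally; expressing the dependencies explicitly keeps weekly and custom schedules from refreshing a view before its sources.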
Sr Data Analyst

High Wycombe, South East England, United Kingdom
Hybrid / WFH Options
beatmysalary
Location: Milton Keynes, hybrid. Type of employment: Contract or Permanent.

Lead end-to-end data workflows—from requirement to delivery—including data product creation and secure data transfer from Google Cloud Platform to PostgreSQL.

1. Develop & Schedule SQL Views via DAGs: Design and implement SQL views aligned with business needs, prioritizing clarity, reusability, and efficiency. Build and … automate those views, ensuring reliable execution on daily, weekly, or customized schedules.

2. Execute Cross-Platform ETL with AWS Glue: Develop, deploy, and maintain AWS Glue jobs to extract data from GCP (such as BigQuery or GCS) and load it into PostgreSQL. Set up secure connectivity, schedule jobs via cron or trigger mechanisms, and ensure data pipelines are … Conduct root cause analysis for pipeline failures—whether due to schema mismatches or performance bottlenecks—and apply robust fixes. Document resolutions to strengthen system resilience.

4. Design, Build, & Govern Data Products: Architect, construct, and maintain reusable data products, embedding clean datasets, metadata, governance policies, and clearly defined data contracts. Ensure compliance with FAIR principles—data being …
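The root-cause-analysis duty mentioned above can be made concrete with a small sketch: validating extracted rows against the target PostgreSQL schema before loading, so schema mismatches surface with a clear message instead of failing mid-load. The schema, column names, and sample rows are invented for illustration; a production job would run inside AWS Glue with real BigQuery and PostgreSQL connections.

```python
# Hypothetical target schema for the PostgreSQL load step.
TARGET_SCHEMA = {"order_id": int, "total": float, "currency": str}

def validate_rows(rows: list) -> list:
    """Return schema-mismatch descriptions (empty list means the batch is clean)."""
    errors = []
    for i, row in enumerate(rows):
        missing = TARGET_SCHEMA.keys() - row.keys()
        if missing:
            errors.append(f"row {i}: missing columns {sorted(missing)}")
        for col, expected in TARGET_SCHEMA.items():
            if col in row and not isinstance(row[col], expected):
                errors.append(
                    f"row {i}: {col} expected {expected.__name__}, "
                    f"got {type(row[col]).__name__}"
                )
    return errors

# Sample extracted batch: the second row has a type mismatch and a
# missing column, the kind of failure the role asks to diagnose.
rows = [
    {"order_id": 1, "total": 9.99, "currency": "GBP"},
    {"order_id": "2", "total": 5.00},
]
problems = validate_rows(rows)
```

Failing fast with a per-row, per-column message turns a vague pipeline failure into a documented root cause, which is the resilience practice the listing describes.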