Remote PySpark Jobs in the Thames Valley

4 of 4 Remote PySpark Jobs in the Thames Valley

Principal Consultant - Data Engineering Lead DBT

Newbury, Berkshire, England, United Kingdom
Hybrid / WFH Options
Intuita
- …including Azure DevOps or GitHub
- Considerable experience designing and building operationally efficient pipelines, utilising core cloud components such as Azure Data Factory, BigQuery, Airflow, Google Cloud Composer and PySpark
- Proven experience in modelling data through a medallion-based architecture, with curated dimensional models in the gold layer built for analytical use
- Strong understanding and/or use of …
Employment Type: Full-Time
Salary: £75,000 - £95,000 per annum

Senior or Consultant - Data Engineering DBT

Newbury, Berkshire, England, United Kingdom
Hybrid / WFH Options
Intuita
- …including Azure DevOps or GitHub
- Considerable experience designing and building operationally efficient pipelines, utilising core cloud components such as Azure Data Factory, BigQuery, Airflow, Google Cloud Composer and PySpark
- Proven experience in modelling data through a medallion-based architecture, with curated dimensional models in the gold layer built for analytical use
- Strong understanding and/or use of …
Employment Type: Full-Time
Salary: £45,000 - £68,000 per annum

Clinical Data Engineer

Reading, United Kingdom
Hybrid / WFH Options
Royal Berkshire NHS Foundation Trust
…Learning, Deep Learning or LLM frameworks)
Desirable:
- Minimum 2 years' experience in a data-related field
- Minimum 2 years in a business or management consulting field
- Experience of Docker, Hadoop, PySpark, Apache or MS Azure
- Minimum 2 years' NHS/healthcare experience
Disclosure and Barring Service Check: this post is subject to the Rehabilitation of Offenders Act (Exceptions Order …
Employment Type: Fixed-Term
Salary: £55,690 - £62,682 a year

Data Governance and AI

Slough, South East England, United Kingdom
Hybrid / WFH Options
83zero
…controls.
AI & Technology Enablement
- Build tools and processes for metadata management, data quality, and data sharing
- Leverage AI and automation tools to improve data governance capabilities
- Use Python, SQL, PySpark, Power BI, and related tools for data processing and visualization
Strategy & Stakeholder Engagement
- Provide subject matter expertise in data governance and AI governance
- Collaborate with business, data, and tech …