4 of 4 Permanent PySpark Jobs in Oxfordshire

Data Engineer (Fabric-Platforms)

Hiring Organisation
Methods
Location
Oxford, Oxfordshire, UK
Employment Type
Full-time
Spark-Based Solutions: You have experience with Spark-based platforms such as Azure Synapse, Databricks, Microsoft Fabric, or on-premises Spark clusters, using PySpark or Spark SQL to manage and process large datasets. Expertise in Building ETL and ELT Pipelines: You are skilled in building robust ...

Senior Data Engineer

Hiring Organisation
ScaleOps Search Ltd
Location
Oxford, Oxfordshire, UK
Employment Type
Full-time
practices for data governance and security.
Required Skills & Experience
5–10 years in data engineering roles, ideally within consultancy environments.
Strong proficiency in:
- Databricks (PySpark)
- Azure Data Factory (ADF)
- SQL (complex queries and optimisation)
- Azure Data Services (Data Lake, Synapse, etc.)
Excellent communication and stakeholder management skills.
Nice ...

Data Engineer

Hiring Organisation
The Difference Engine
Location
Oxford, Oxfordshire, UK
Employment Type
Full-time
years of commercial software engineering experience, with a strong focus on data engineering at scale. Strong experience building robust and scalable data pipelines using Python and PySpark. A deep knowledge of schema design, SQL and database optimisation. Experience of being a bar-raiser and helping improve engineering best practices. Experience ...

Senior Data Engineer

Hiring Organisation
Tenth Revolution Group
Location
Oxford, Oxfordshire, United Kingdom
Employment Type
Permanent
Salary
£60,000–£70,000/annum
responsible for:
- Designing end-to-end data architecture aligned with modern best practices.
- Building and managing ingestion pipelines using Databricks and related tools.
- Developing PySpark/Spark SQL notebooks for complex transformations and cleansing.
- Applying governance, security, and CI/CD best practices across cloud environments.
- Leading technical discussions … producing professional documentation.
To be successful in this role, you will have:
- Hands-on experience with Databricks, including Unity Catalog.
- Strong PySpark/Spark SQL skills for large-scale transformations.
- Experience integrating with diverse data sources such as APIs, cloud storage and databases.
- Experience with the Azure cloud data ...