Remote PySpark Jobs in Leeds

3 of 3 Remote PySpark Jobs in Leeds

Data Engineer

Leeds, West Yorkshire, Yorkshire, United Kingdom
Hybrid / WFH Options
Forward Role
… financial datasets
- Python experience, particularly for data processing and ETL workflows
- Hands-on experience with cloud platforms (Azure)
- Experience designing and maintaining data pipelines using tools like Databricks and PySpark
- Knowledge of data warehousing solutions; Snowflake experience would be brilliant
- Understanding of CI/CD processes for deploying data solutions
- Some exposure to big data technologies and distributed processing
…
Employment Type: Permanent, Work From Home
Salary: £55,000
Posted:

Azure Data Engineer

Leeds, West Yorkshire, United Kingdom
Hybrid / WFH Options
Tenth Revolution Group
… architecture
- Ensuring best practices in data governance, security, and performance tuning
Requirements:
- Proven experience with Azure Data Services (ADF, Synapse, Data Lake)
- Strong hands-on experience with Databricks (including PySpark or SQL)
- Solid SQL skills and understanding of data modelling and ETL/ELT processes
- Familiarity with Delta Lake and lakehouse architecture
- A proactive, collaborative approach to problem-solving
…
Employment Type: Permanent
Salary: £55000 - £65000/annum
Posted:

Senior Data Engineer (Data Governance, Databricks, PySpark)

Leeds, Yorkshire, United Kingdom
Hybrid / WFH Options
PEXA Group Limited
… the transformation pipeline from start to finish, guaranteeing that datasets are robust, tested, secure, and business-ready. Our data platform is built using Databricks, with data pipelines written in PySpark and orchestrated using Airflow. You will be expected to challenge and improve current transformations, ensuring they meet our performance, scalability, and data governance needs. This includes work with complex …

… days per year for meaningful collaboration in either Leeds or Thame.

Key Responsibilities
- Ensure end-to-end data quality, from raw ingested data to business-ready datasets
- Optimise PySpark-based data transformation logic for performance and reliability
- Build scalable and maintainable pipelines in Databricks and Airflow
- Implement and uphold GDPR-compliant processes around PII data
- Collaborate with stakeholders to …
- … management, metadata management, and wider data governance practices
- Help shape our approach to reliable data delivery for internal and external customers

Skills & Experience Required
- Extensive hands-on experience with PySpark, including performance optimisation
- Deep working knowledge of Databricks (development, architecture, and operations)
- Proven experience working with Airflow for orchestration
- Proven track record in managing and securing PII data, with …
Employment Type: Permanent
Salary: GBP Annual
Posted:
PySpark salary in Leeds (median): £55,233