4 of 4 PySpark Jobs in Bristol

Data Engineer

Hiring Organisation
Artis Recruitment
Location
Greater Bristol Area, United Kingdom
... concept initiatives

About You
To be considered, you should have:
- Proven experience in a Data Engineering role
- Strong proficiency in SQL, Python and Spark (PySpark)
- Experience designing data models and implementing reliable pipeline patterns
- Knowledge of orchestration tools such as Airflow
- Experience with low-code or no-code integration ...

Azure Data Engineer

Hiring Organisation
Opus Recruitment Solutions Ltd
Location
Bristol, Avon, England, United Kingdom
Employment Type
Contractor
Contract Rate
£400 - £500 per day
... legacy warehouse assets into a modern Azure environment
- Contribute to cloud architecture decisions, data standards and best-practice engineering patterns
- Develop reliable Python and PySpark code to support data ingestion, transformation and end-to-end processing

What you’ll bring
- Strong hands-on experience across Azure Data Services ...

Lead Data Engineer

Hiring Organisation
Canada Life UK
Location
Bristol, Avon, South West, United Kingdom
Employment Type
Part Time
... data modelling for Finance and Enterprise data products, working closely with architecture where required
- Implement and maintain data pipelines and ETL workflows in Databricks (PySpark, Delta Lake)
- Contribute to CI/CD pipelines for data applications using Azure DevOps and infrastructure-as-code (Terraform) in line with established ...
... influence within the team and communicate clearly with technical and non-technical stakeholders

Data Engineer (New Technology/Microsoft)
- Strong experience with Databricks (Spark, PySpark, Delta Lake; Unity Catalog advantageous)
- Proficiency in Azure data services (Azure Data Factory, Data Lake; Azure Functions advantageous)
- Experience contributing ...

Senior BI Developer

Hiring Organisation
Tenth Revolution Group
Location
Bristol, Avon, England, United Kingdom
Employment Type
Full-Time
Salary
£60,000 - £75,000 per annum
... maintain scalable ETL/ELT data pipelines within Microsoft Fabric Lakehouse environments
- Develop and optimise data ingestion/transformation workflows
- Develop Python and PySpark processes for data transformation and large-scale processing
- Support the development and optimisation of Power BI datasets, reports and dashboards
- Build and maintain scalable semantic ...
... data engineering solutions

Skills and Experience
- Hands-on expertise with Microsoft Fabric, Lakehouse architecture and Data Pipelines
- Experience developing Python or PySpark solutions for large-scale data processing
- Understanding of ETL/ELT design patterns, data modelling and modern data platform architecture
- Experience developing Power BI datasets, semantic models ...