Databricks, or equivalent)
- Proficiency in ELT/ETL development using tools such as Data Factory, Dataflow Gen2, Databricks Workflows, or similar orchestration frameworks
- Experience with Python and/or PySpark for data transformation, automation, or pipeline development
- Familiarity with cloud services and deployment automation (e.g., Azure, AWS, Terraform, CI/CD, Git)
- Ability to deliver clear, insightful, and performant …
Wiltshire, England, United Kingdom Hybrid / WFH Options
Data Science Talent
and DevOps and collaborate with the Software Delivery Manager and data engineering leadership.
What you'll need:
- Hands-on Databricks experience
- Strong Azure Cloud knowledge
- Proficient in SQL, Python, PySpark
- ETL & pipeline design (Matillion preferred, alternatives acceptable)
- Practical data modelling & pipeline architecture
- Terraform or Bicep for IaC
About the company: The company is one of the longest-established financial …
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Datatech
and dimensional data models, and contribute to fostering a culture of innovation and evidence-led decision-making.
Experience Required:
- Essential experience with Azure Databricks, including Unity Catalog, Python (ideally PySpark), and SQL.
- Practical knowledge of modern ELT workflows, with a focus on the Extract and Load stages.
- Experience working across both technical and non-technical teams, with the ability …
Bristol, Avon, South West, United Kingdom Hybrid / WFH Options
Datatech Analytics
and dimensional data models, and contribute to fostering a culture of innovation and evidence-led decision-making.
Experience Required:
- Essential experience with Azure Databricks, including Unity Catalog, Python (ideally PySpark), and SQL.
- Practical knowledge of modern ELT workflows, with a focus on the Extract and Load stages.
- Experience working across both technical and non-technical teams, with the ability to …
Bristol, Avon, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
term data strategy with a strong focus on data integrity and GDPR compliance.
To be successful in the role you will have:
- Hands-on coding experience with Python or PySpark
- Proven expertise in building data pipelines using Azure Data Factory or Fabric Pipelines
- Solid experience with Azure technologies like Lakehouse Architecture, Data Lake, Delta Lake, and Azure Synapse
- Strong …
Bournemouth, Dorset, United Kingdom Hybrid / WFH Options
LV=
About The Role
We are looking for an experienced Test Manager with a strong background in data platform testing, particularly across Microsoft Fabric, Azure services, PySpark notebooks, and automated testing frameworks. You will play a pivotal role in ensuring quality and governance across data pipelines, lakehouses, and reporting layers. This role requires a hands-on leader who can define … Microsoft Fabric workloads (Lakehouses, Pipelines, Notebooks, Power BI).
- Lead and manage the QA effort across multiple agile teams.
- Drive the development and maintenance of automated testing for:
  1. PySpark notebooks in Fabric/Databricks
  2. Data pipelines and transformations
  3. Delta tables and lakehouse validation
- Embed testing into CI/CD pipelines (Azure DevOps or GitHub Actions). …
South West London, London, England, United Kingdom
Tenth Revolution Group
Data Engineer - PySpark - Palantir - London - £75K
Join a dynamic team dedicated to innovative data solutions in Soho! We are seeking a Data Engineer with expertise in PySpark to play a pivotal role as the team and business grow. This permanent, hybrid position offers a unique opportunity to contribute to impactful projects while enjoying a supportive and collaborative work … environment.
Key Responsibilities:
- Develop and optimise data pipelines using PySpark to ensure efficient data processing
- Collaborate with cross-functional teams to deliver high-quality, data-driven insights
- Engage in continuous learning and implement best practices in data management
- Participate in the deployment and maintenance of data solutions within the Microsoft ecosystem
The ideal candidate will have strong analytical skills and a …