PySpark Jobs in Bolton

4 of 4 PySpark Jobs in Bolton

Senior Data Engineer

Bolton, Greater Manchester, North West England, United Kingdom
Scrumconnect Consulting
Experience: Active SC Clearance (mandatory at application stage). Proven expertise in: Azure Data Factory & Azure Synapse; Azure DevOps & the Microsoft Azure ecosystem; Power BI (including semantic models); Python (incl. PySpark) and advanced SQL; dbt with SQL databases (data transformation & modelling); dimensional data modelling; Terraform for infrastructure-as-code deployments. Strong experience with both structured and unstructured data. Delivery track …

Senior Data Engineer - UK based - Remote/Hybrid - £75-95k DOE + Equity

Bolton, Greater Manchester, United Kingdom
Hybrid / WFH Options
Futureheads Recruitment | B Corp™
… while establishing and maintaining best practices for data governance, security, and reliability. What we’re looking for: 5+ years’ experience in data engineering and cloud infrastructure; expertise in Python (PySpark), SQL, and dbt (or similar); strong DevOps skills (Terraform, Docker, CI/CD, monitoring/logging tools); AWS experience (S3, ECS/Fargate, RDS, Lambda); data warehousing experience (PostgreSQL …

Pricing Manager

Bolton, Greater Manchester, North West England, United Kingdom
Hybrid / WFH Options
Gerrard White
Knowledge of the technical differences between different packages for some of these model types would be an advantage. Experience in statistical and data science programming languages (e.g. R, Python, PySpark, SAS, SQL). A good quantitative degree (Mathematics, Statistics, Engineering, Physics, Computer Science, Actuarial Science). Experience of WTW's Radar software is preferred. Proficient at communicating results in a concise …

Platform Engineer

Bolton, Greater Manchester, North West England, United Kingdom
TalkTalk
… source of truth. Develop and optimise CI/CD pipelines in Azure DevOps to automate deployment of workspaces, Unity Catalog, networking, and security. Work with Databricks (Spark/Scala, PySpark) to support ingestion frameworks, data processing, and platform-level libraries. Implement secure connectivity (VNET injection, Private Link, firewall, DNS, RBAC). Monitor, troubleshoot, and optimise platform reliability and performance. … processes, and standards for wider engineering adoption. Must have: Proven expertise with Microsoft Azure (networking, security, storage, compute). Strong proficiency in Databricks with hands-on Scala (Spark) and PySpark. Deep experience with Terraform for Azure resource deployment and governance. Hands-on with Azure DevOps pipelines (YAML, agents, service connections). Understanding of Azure Active Directory/Entra …