understanding of AI/ML/DL and Statistics, as well as coding proficiency using related open-source libraries and frameworks. Significant proficiency in SQL and languages like Python, PySpark and/or Scala. Able to lead and work independently, as well as play a key role in a team. Good communication and interpersonal skills for working in a multicultural work …
Manchester, Lancashire, England, United Kingdom Hybrid / WFH Options
Vermelo RPO
ideally with a focus in Motor
- Experience and detailed technical knowledge of GLMs/Elastic Nets, GBMs, GAMs, Random Forests, and clustering techniques
- Experience in programming languages (e.g. Python, PySpark, R, SAS, SQL)
- Proficient at communicating results concisely, both verbally and in writing
Behaviours:
- Motivated by technical excellence
- Team player
- Self-motivated with a drive to learn …
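To illustrate the GLM techniques this role lists, here is a minimal stdlib-only sketch: a logistic regression (a GLM with a logit link) fitted by batch gradient descent. The toy data, learning rate, and function names are invented for illustration; a production model would use a library such as statsmodels or scikit-learn.

```python
import math

def fit_logistic_glm(X, y, lr=0.5, epochs=2000):
    """Fit a logistic-regression GLM (logit link) by batch gradient descent."""
    n_features = len(X[0])
    w = [0.0] * n_features
    b = 0.0
    for _ in range(epochs):
        grad_w = [0.0] * n_features
        grad_b = 0.0
        for xi, yi in zip(X, y):
            z = sum(wj * xj for wj, xj in zip(w, xi)) + b
            p = 1.0 / (1.0 + math.exp(-z))  # inverse logit link
            err = p - yi                     # gradient of log loss w.r.t. z
            for j in range(n_features):
                grad_w[j] += err * xi[j]
            grad_b += err
        w = [wj - lr * gj / len(X) for wj, gj in zip(w, grad_w)]
        b -= lr * grad_b / len(X)
    return w, b

def predict_proba(w, b, xi):
    z = sum(wj * xj for wj, xj in zip(w, xi)) + b
    return 1.0 / (1.0 + math.exp(-z))

# Toy, linearly separable data: label is 1 when the single feature exceeds 0.5.
X = [[0.1], [0.2], [0.3], [0.7], [0.8], [0.9]]
y = [0, 0, 0, 1, 1, 1]
w, b = fit_logistic_glm(X, y)
```

The same fitting loop generalises to other link functions, which is what distinguishes a GLM from plain linear regression.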
Manchester, Lancashire, England, United Kingdom Hybrid / WFH Options
Opus Recruitment Solutions Ltd
operations, and commercial teams, all while growing your skills in a modern, cloud-first environment.
Why This Role Stands Out:
- You’ll build scalable, production-grade data pipelines using PySpark, SQL, and Databricks
- Your work will directly power analytics, BI, and data science across the business
- You’ll be part of a collaborative, forward-thinking data team that values … engineering
- You’ll get space to grow, learn new tools, and make a real impact
What You’ll Bring:
- 2–5 years’ experience in data engineering or similar
- Strong PySpark and advanced SQL skills
- Hands-on experience with Databricks and building ETL/ELT pipelines
- Familiarity with CI/CD and version control
Bonus Points For:
- Experience with Databricks …
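The ETL/ELT pipelines this role describes follow a common extract-transform-load shape. PySpark itself needs a Spark runtime, so the sketch below uses only the standard library as a stand-in; the field names, cleaning rules, and "warehouse" list are invented for illustration, with comments noting the PySpark operations each step approximates.

```python
import csv
import io

def extract(raw_csv: str):
    """Extract: parse raw CSV rows into dicts (stand-in for spark.read.csv)."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows):
    """Transform: drop bad rows and derive a column (stand-in for filter/withColumn)."""
    out = []
    for r in rows:
        if not r["amount"]:
            continue  # drop rows with a missing amount
        amount = float(r["amount"])
        out.append({
            "customer": r["customer"].strip().lower(),  # normalise the key
            "amount": amount,
            "is_large": amount >= 100.0,                # derived flag
        })
    return out

def load(rows, table):
    """Load: append to a target 'table' (stand-in for df.write.saveAsTable)."""
    table.extend(rows)
    return table

raw = "customer,amount\nAlice ,120.5\nBob,\ncarol,30\n"
warehouse = []
load(transform(extract(raw)), warehouse)
```

In Databricks the same three stages would typically be a notebook or job reading from cloud storage, transforming with DataFrame operations, and writing to a Delta table.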
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
Parking Network BV
commercial, and operations - and this role will have a big say in what we build next. You'll be responsible for designing and building robust, scalable data pipelines using PySpark, SQL and Databricks - enabling our analytics, BI and data science colleagues to unlock real value across the business. This is a brilliant opportunity for someone who's passionate about … your expertise further - especially with tools like Databricks.
Here's what will help you thrive in this role:
- 2-5 years in data engineering or a related field
- Strong PySpark and advanced SQL skills
- Practical experience building and maintaining ETL/ELT pipelines in Databricks
- Familiarity with CI/CD pipelines and version control practices
Nice to have:
- Experience …
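As a small illustration of the "advanced SQL" these roles ask for, the snippet below runs a grouped aggregation with a HAVING filter against an in-memory SQLite database via Python's stdlib sqlite3 module. The table schema and data are invented; the same query pattern carries over directly to Spark SQL on Databricks.

```python
import sqlite3

# In-memory database with a toy events table (schema and data invented).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?)",
    [("a", 10.0), ("a", 25.0), ("b", 5.0), ("b", 2.0), ("c", 100.0)],
)

# Total spend per user, keeping only users whose total exceeds 10.
rows = conn.execute("""
    SELECT user_id, SUM(amount) AS total
    FROM events
    GROUP BY user_id
    HAVING total > 10
    ORDER BY total DESC
""").fetchall()
```

Here `rows` holds `("c", 100.0)` and `("a", 35.0)`; user "b" totals 7.0 and is filtered out by the HAVING clause.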
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
MAG (Airports Group)
For airports, for partners, for people. We are CAVU. At CAVU our purpose is to find new and better ways to make …
practices. This is a fantastic opportunity for a curious, solutions-focused data scientist to help build out our capability, working with cutting-edge tools like Databricks, AWS data services, PySpark, and CI/CD pipelines.
What's in it for you? You'll be joining a collaborative, supportive team with a real passion for data-led innovation. It's … business impact - we'd love to hear from you.
About you:
- 2-5 years of experience in Data Science or a related field
- Strong programming skills in Python and PySpark
- Strong data science modelling skills across classification, regression, forecasting, and/or NLP
- Analytical mindset with the ability to present insights to both technical and non-technical audiences
- Experience …
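Of the modelling areas this role names, forecasting is the easiest to sketch without a library. Below is a stdlib-only rolling moving-average forecast; the series, window, and horizon are invented for illustration, and a real model would more likely use something like statsmodels or Prophet.

```python
def moving_average_forecast(series, window=3, horizon=2):
    """Forecast the next `horizon` points as the mean of the trailing window,
    rolling each forecast back into the history (naive recursive forecasting)."""
    history = list(series)
    forecasts = []
    for _ in range(horizon):
        nxt = sum(history[-window:]) / window
        forecasts.append(nxt)
        history.append(nxt)  # roll the forecast forward for the next step
    return forecasts

sales = [10.0, 12.0, 11.0, 13.0, 12.0]
preds = moving_average_forecast(sales, window=3, horizon=2)
```

The first forecast is the mean of the last three observations (11, 13, 12), i.e. 12.0; the second reuses that forecast as if it were observed.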
alerting systems to maintain data health and accuracy
- Define KPIs and thresholds in collaboration with technical and non-technical stakeholders
- Develop and productionise machine learning and statistical models (Python, PySpark)
- Deploy monitoring solutions on AWS infrastructure
- Create scalable frameworks for future monitoring needs
- Investigate anomalies and ensure quick resolution of issues in the data pipeline
- Advocate for data quality … best practices across the business
- Provide mentorship and contribute to a culture of continuous improvement
About You:
- Proficient in Python and SQL
- Experience working with large datasets, preferably using PySpark
- Solid understanding of AWS or similar cloud infrastructure
- Methodical, detail-oriented, and comfortable working independently
- Able to translate business needs into technical solutions
- Previous experience building monitoring or data …
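The KPI-threshold monitoring and anomaly investigation this role describes can be sketched as a simple z-score check: flag any data point that deviates from the sample mean by more than a chosen number of standard deviations. The metric (daily row counts) and threshold are invented for illustration; production alerting would feed this from pipeline metrics on AWS.

```python
import statistics

def detect_anomalies(values, z_threshold=3.0):
    """Return indices of points whose z-score exceeds the threshold."""
    mean = statistics.fmean(values)
    stdev = statistics.pstdev(values)
    if stdev == 0:
        return []  # a constant series has no outliers
    return [i for i, v in enumerate(values)
            if abs(v - mean) / stdev > z_threshold]

# Hypothetical KPI: rows ingested per day; the last day spikes abnormally.
daily_row_counts = [1000, 1020, 990, 1010, 1005, 4000]
alerts = detect_anomalies(daily_row_counts, z_threshold=2.0)
```

Only the final spike is flagged; in a real monitoring system each flagged index would trigger an alert and an investigation of that day's pipeline run.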