… understanding of AI/ML/DL and Statistics, as well as coding proficiency using related open-source libraries and frameworks. Significant proficiency in SQL and languages such as Python, PySpark and/or Scala. Able to lead, to work independently, and to play a key role in a team. Good communication and interpersonal skills for working in a multicultural work …
Manchester, North West, United Kingdom Hybrid / WFH Options
Gerrard White
Knowledge of the technical differences between different packages for some of these model types would be an advantage. Experience in statistical and data science programming languages (e.g. R, Python, PySpark, SAS, SQL). A good quantitative degree (Mathematics, Statistics, Engineering, Physics, Computer Science, Actuarial Science). Experience of WTW's Radar software is preferred. Proficient at communicating results in a concise …
Manchester, Lancashire, England, United Kingdom Hybrid / WFH Options
Vermelo RPO
… ideally with a focus in Motor. Experience and detailed technical knowledge of GLMs/Elastic Nets, GBMs, GAMs, Random Forests, and clustering techniques. Experience in programming languages (e.g. Python, PySpark, R, SAS, SQL). Proficient at communicating results in a concise manner, both verbally and in writing. Behaviours: motivated by technical excellence; a team player; self-motivated with a drive to learn …
Manchester, North West, United Kingdom Hybrid / WFH Options
Gerrard White
… ideally with a focus in Motor. Experience and detailed technical knowledge of GLMs/Elastic Nets, GBMs, GAMs, Random Forests, and clustering techniques. Experience in programming languages (e.g. Python, PySpark, R, SAS, SQL). Proficient at communicating results in a concise manner, both verbally and in writing. Behaviours: motivated by technical excellence; a team player; self-motivated with a drive to learn …
Azure Data Engineer - 1-2 days onsite. Summary: Join a team building a modern Azure-based data platform. This hands-on engineering role involves designing and developing scalable, automated data pipelines using tools like Data Factory, Databricks, Synapse, and …
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
MAG (Airports Group)
For airports, for partners, for people. We are CAVU. At CAVU our purpose is to find new and better ways to make …
… practices. This is a fantastic opportunity for a curious, solutions-focused data scientist to help build out our capability, working with cutting-edge tools like Databricks, AWS data services, PySpark, and CI/CD pipelines. What's in it for you? You'll be joining a collaborative, supportive team with a real passion for data-led innovation. It's … business impact - we'd love to hear from you. About you: 2-5 years of experience in Data Science or a related field; strong programming skills in Python and PySpark; strong data science modelling skills across classification, regression, forecasting, and/or NLP; an analytical mindset with the ability to present insights to both technical and non-technical audiences; experience …
… alerting systems to maintain data health and accuracy. Define KPIs and thresholds in collaboration with technical and non-technical stakeholders. Develop and productionise machine learning and statistical models (Python, PySpark). Deploy monitoring solutions on AWS infrastructure. Create scalable frameworks for future monitoring needs. Investigate anomalies and ensure quick resolution of issues in the data pipeline. Advocate for data quality … best practices across the business. Provide mentorship and contribute to a culture of continuous improvement. About You: proficient in Python and SQL; experience working with large datasets, preferably using PySpark; a solid understanding of AWS or similar cloud infrastructure; methodical, detail-oriented, and comfortable working independently; able to translate business needs into technical solutions; previous experience building monitoring or data …
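By way of illustration only, and not taken from the listing itself, the kind of PySpark KPI-and-threshold check described in this role might look like the minimal sketch below; the column names, the chosen KPI (null rate on a premium field) and the 5% threshold are all hypothetical assumptions.

```python
# Minimal, hypothetical sketch of a data-quality KPI check in PySpark.
# Column names, the chosen KPI (null rate) and the 5% threshold are
# illustrative assumptions, not details from the listing.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("data-quality-check").getOrCreate()

# Example input: a small table of policy records where "premium" must be populated.
df = spark.createDataFrame(
    [("A1", 120.0), ("A2", None), ("A3", 95.5), ("A4", 210.0)],
    ["policy_id", "premium"],
)

total = df.count()
missing = df.filter(F.col("premium").isNull()).count()
null_rate = missing / total if total else 0.0

THRESHOLD = 0.05  # flag the KPI if more than 5% of rows are missing a premium

if null_rate > THRESHOLD:
    # In production this branch would raise an alert (e.g. via an AWS
    # notification service); here we simply print the breach.
    print(f"ALERT: premium null rate {null_rate:.1%} exceeds {THRESHOLD:.0%}")
else:
    print(f"OK: premium null rate {null_rate:.1%}")

spark.stop()
```

In practice a check like this would be parameterised per dataset and scheduled alongside the pipeline it monitors, with the thresholds agreed with stakeholders as the listing describes.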