Leeds, West Yorkshire, Yorkshire, United Kingdom Hybrid / WFH Options
Forward Role
financial datasets
· Python experience, particularly for data processing and ETL workflows
· Hands-on experience with cloud platforms - Azure
· Experience designing and maintaining data pipelines using tools like Databricks and PySpark
· Knowledge of data warehousing solutions - Snowflake experience would be brilliant
· Understanding of CI/CD processes for deploying data solutions
· Some exposure to big data technologies and distributed processing …
· Azure, especially Synapse, ADF and Power BI (Datasets and Reports); ideally SSIS, SSRS, SSAS, with some understanding of Power App design and delivery
· Expert in SQL and Python (PySpark); any other object-oriented language skills would be a benefit
· Expert in data modelling and data architecture concepts
· Experience of setup and management of code management & deployment tools …
Leeds, West Yorkshire, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
architecture
· Ensuring best practices in data governance, security, and performance tuning
Requirements:
· Proven experience with Azure Data Services (ADF, Synapse, Data Lake)
· Strong hands-on experience with Databricks (including PySpark or SQL)
· Solid SQL skills and understanding of data modelling and ETL/ELT processes
· Familiarity with Delta Lake and lakehouse architecture
· A proactive, collaborative approach to problem-solving …
· understanding of AI/ML/DL and statistics, as well as coding proficiency with related open-source libraries and frameworks
· Significant proficiency in SQL and in languages like Python, PySpark and/or Scala
· Able to lead and work independently, as well as play a key role in a team
· Good communication and interpersonal skills for working in a multicultural work …
Trafford Park, Manchester, Lancashire, England, United Kingdom Hybrid / WFH Options
Ikhoi Recruitment
Health & Safety. About You: Here's what we're looking for:
· Proficiency in Python and SQL for analysis, model development, and data interrogation
· Experience in handling large datasets with PySpark and managing distributed data processing
· Comfortable deploying statistical or ML models into production environments
· Strong understanding of cloud infrastructure, preferably AWS
· A methodical, problem-solving mindset with high attention to detail
Manchester, Lancashire, England, United Kingdom Hybrid / WFH Options
Vermelo RPO
· Knowledge of the technical differences between different packages for some of these model types would be an advantage
· Experience in statistical and data science programming languages (e.g. R, Python, PySpark, SAS, SQL)
· A good quantitative degree (Mathematics, Statistics, Engineering, Physics, Computer Science, Actuarial Science)
· Experience of WTW's Radar software is preferred
· Proficient at communicating results in a concise manner, both verbally and in writing
Manchester, North West, United Kingdom Hybrid / WFH Options
Gerrard White
· Knowledge of the technical differences between different packages for some of these model types would be an advantage
· Experience in statistical and data science programming languages (e.g. R, Python, PySpark, SAS, SQL)
· A good quantitative degree (Mathematics, Statistics, Engineering, Physics, Computer Science, Actuarial Science)
· Experience of WTW's Radar software is preferred
· Proficient at communicating results in a concise manner, both verbally and in writing
warrington, cheshire, north west england, united kingdom Hybrid / WFH Options
Gerrard White
· Knowledge of the technical differences between different packages for some of these model types would be an advantage
· Experience in statistical and data science programming languages (e.g. R, Python, PySpark, SAS, SQL)
· A good quantitative degree (Mathematics, Statistics, Engineering, Physics, Computer Science, Actuarial Science)
· Experience of WTW's Radar software is preferred
· Proficient at communicating results in a concise manner, both verbally and in writing
bolton, greater manchester, north west england, united kingdom Hybrid / WFH Options
Gerrard White
· Knowledge of the technical differences between different packages for some of these model types would be an advantage
· Experience in statistical and data science programming languages (e.g. R, Python, PySpark, SAS, SQL)
· A good quantitative degree (Mathematics, Statistics, Engineering, Physics, Computer Science, Actuarial Science)
· Experience of WTW's Radar software is preferred
· Proficient at communicating results in a concise manner, both verbally and in writing
Manchester, North West, United Kingdom Hybrid / WFH Options
Gerrard White
ideally with a focus on Motor
· Experience and detailed technical knowledge of GLMs/Elastic Nets, GBMs, GAMs, Random Forests, and clustering techniques
· Experience in programming languages (e.g. Python, PySpark, R, SAS, SQL)
· Proficient at communicating results in a concise manner, both verbally and in writing
Behaviours:
· Motivated by technical excellence
· Team player
· Self-motivated with a drive to learn …
Lead Data Engineer (Databricks) - Leeds (Lead Data Engineer, Team Lead, Technical Lead, Senior Data Engineer, Data Engineer, Python, PySpark, SQL, Big Data, Databricks, R, Machine Learning, AI, Agile, Scrum, TDD, BDD, CI/CD, SOLID principles, GitHub, Azure DevOps, Jenkins, Terraform, AWS CDK, AWS CloudFormation, Azure, Lead Data Engineer, Team Lead, Technical Lead, Senior Data Engineer, Data Engineer)

Our … modern data platform using cutting-edge technologies, architecting big data solutions and developing complex enterprise data ETL and ML pipelines and projections. The successful candidate will have strong Python, PySpark and SQL experience, a clear understanding of Databricks, and a passion for data science (R, Machine Learning and AI). Database experience with SQL and No … Benefits: To apply for this position please send your CV to Nathan Warner at Noir Consulting.
Manchester, Lancashire, England, United Kingdom Hybrid / WFH Options
Opus Recruitment Solutions Ltd
operations, and commercial teams - all while growing your skills in a modern, cloud-first environment.
Why This Role Stands Out:
· You'll build scalable, production-grade data pipelines using PySpark, SQL, and Databricks
· Your work will directly power analytics, BI, and data science across the business
· You'll be part of a collaborative, forward-thinking data team that values … engineering
· You'll get space to grow, learn new tools, and make a real impact
What You'll Bring:
· 2-5 years' experience in data engineering or similar
· Strong PySpark and advanced SQL skills
· Hands-on experience with Databricks and building ETL/ELT pipelines
· Familiarity with CI/CD and version control
Bonus Points For:
· Experience with Databricks …
build, and optimize cloud-native data ecosystems that power analytics, governance, and business intelligence across the firm. You'll work with cutting-edge technologies like Azure Synapse, Databricks, Python, PySpark, and Collibra to deliver secure, high-performance solutions.
What You'll Do:
· Engineer and optimize modern data platforms using Azure and Databricks
· Build robust data pipelines with Python and PySpark
· Automate deployments and infrastructure with DevOps and CI/CD
· Implement access controls and data protection policies
· Integrate governance tools like Collibra for metadata and lineage tracking
· Collaborate with cross-functional teams to enable data-driven decision-making
What You'll Bring:
· Proven experience in cloud-based data engineering (Azure, Databricks)
· Strong Python/PySpark skills and …
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
Parking Network BV
commercial, and operations - and this role will have a big say in what we build next. You'll be responsible for designing and building robust, scalable data pipelines using PySpark, SQL and Databricks - enabling our analytics, BI and data science colleagues to unlock real value across the business. This is a brilliant opportunity for someone who's passionate about … your expertise further - especially with tools like Databricks.
Here's what will help you thrive in this role:
· 2-5 years in data engineering or a related field
· Strong PySpark and advanced SQL skills
· Practical experience building and maintaining ETL/ELT pipelines in Databricks
· Familiarity with CI/CD pipelines and version control practices
Nice to have:
· Experience …
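Several of the roles on this page centre on building ETL/ELT pipelines. As a rough illustration of the extract-transform-load pattern they describe, here is a minimal pure-Python sketch; the field names, validation rules, and "warehouse" stand-in are invented for illustration, and in the roles themselves this logic would live in PySpark DataFrame transformations on Databricks rather than plain Python:

```python
# Minimal ETL sketch: extract raw records, transform (validate, cast,
# filter), load into a keyed store. All names are hypothetical.

def extract():
    # Stand-in for reading raw records from a source system / landing zone.
    return [
        {"order_id": "1", "amount": "19.99", "country": "UK"},
        {"order_id": "2", "amount": "bad",   "country": "UK"},
        {"order_id": "3", "amount": "5.00",  "country": "FR"},
    ]

def transform(rows):
    # Cast amount to float, skipping malformed records, and keep UK orders.
    out = []
    for row in rows:
        try:
            amount = float(row["amount"])
        except ValueError:
            continue  # quarantine/skip records that fail validation
        if row["country"] == "UK":
            out.append({"order_id": int(row["order_id"]), "amount": amount})
    return out

def load(rows):
    # Stand-in for writing to a warehouse table keyed by order_id.
    return {row["order_id"]: row for row in rows}

warehouse = load(transform(extract()))
```

The same shape scales up: extract becomes a read from the landing zone, transform becomes DataFrame operations, and load writes a curated table.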
Azure Data Engineer - 1/2 days onsite
Summary: Join a team building a modern Azure-based data platform. This hands-on engineering role involves designing and developing scalable, automated data pipelines using tools like Data Factory, Databricks, Synapse, and …
and secure architectures on Databricks and distributed systems.
· Anticipate scaling challenges and ensure platforms are future-proof
· Lead the design and development of robust, high-performance data pipelines using PySpark and Databricks
· Define and ensure testing frameworks for data workflows
· Ensure end-to-end data quality from raw ingestion to curated, trusted datasets powering analytics
· Establish and enforce best … they add value to the ecosystem
Skills & Experience Required:
· Broad experience as a Data Engineer, including technical leadership
· Broad cloud experience, ideally both Azure and AWS
· Deep expertise in PySpark and distributed data processing at scale
· Extensive experience designing and optimising in Databricks
· Advanced SQL optimisation and schema design for analytical workloads
· Strong understanding of data security, privacy, and …
Leeds, Yorkshire, United Kingdom Hybrid / WFH Options
PEXA Group Limited
the transformation pipeline from start to finish, guaranteeing that datasets are robust, tested, secure, and business-ready. Our data platform is built using Databricks, with data pipelines written in PySpark and orchestrated using Airflow. You will be expected to challenge and improve current transformations, ensuring they meet our performance, scalability, and data governance needs. This includes work with complex … days per year for meaningful collaboration in either Leeds or Thame.
Key Responsibilities:
· Ensure end-to-end data quality, from raw ingested data to business-ready datasets
· Optimise PySpark-based data transformation logic for performance and reliability
· Build scalable and maintainable pipelines in Databricks and Airflow
· Implement and uphold GDPR-compliant processes around PII data
· Collaborate with stakeholders to … management, metadata management, and wider data governance practices
· Help shape our approach to reliable data delivery for internal and external customers
Skills & Experience Required:
· Extensive hands-on experience with PySpark, including performance optimisation
· Deep working knowledge of Databricks (development, architecture, and operations)
· Proven experience working with Airflow for orchestration
· Proven track record in managing and securing PII data, with …
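One common building block of the GDPR-compliant PII handling this ad mentions is pseudonymisation: replacing direct identifiers with keyed hashes before data leaves a restricted zone. Here is a minimal plain-Python sketch; the column names, key, and key handling are hypothetical, and in a platform like the one described this would be implemented as a PySpark column transformation with the key held in a secrets manager:

```python
import hashlib
import hmac

# Pseudonymise PII columns by replacing values with a keyed (HMAC-SHA256)
# hash: the same input always maps to the same token, but the token cannot
# be reversed without the secret key. Key and column names are illustrative.

SECRET_KEY = b"example-only-key"  # hypothetical; never hard-code in production

def pseudonymise(value: str) -> str:
    # Keyed hash of a single PII value, returned as a 64-char hex token.
    return hmac.new(SECRET_KEY, value.encode("utf-8"), hashlib.sha256).hexdigest()

def mask_pii(row: dict, pii_columns: set) -> dict:
    # Return a copy of the row with every PII column replaced by its token.
    return {
        col: pseudonymise(val) if col in pii_columns and val is not None else val
        for col, val in row.items()
    }

row = {"email": "jane@example.com", "order_total": 42.0}
masked = mask_pii(row, {"email"})
```

Because the mapping is deterministic, pseudonymised columns can still be joined and aggregated downstream without exposing the raw identifier.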
alerting systems to maintain data health and accuracy
· Define KPIs and thresholds in collaboration with technical and non-technical stakeholders
· Develop and productionise machine learning and statistical models (Python, PySpark)
· Deploy monitoring solutions on AWS infrastructure
· Create scalable frameworks for future monitoring needs
· Investigate anomalies and ensure quick resolution of issues in the data pipeline
· Advocate for data quality … best practices across the business
· Provide mentorship and contribute to a culture of continuous improvement
About You:
· Proficient in Python and SQL
· Experience working with large datasets, preferably using PySpark
· Solid understanding of AWS or similar cloud infrastructure
· Methodical, detail-oriented, and comfortable working independently
· Able to translate business needs into technical solutions
· Previous experience building monitoring or data …
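The monitoring and alerting work described here usually reduces to computing data-health metrics and comparing them against thresholds agreed with stakeholders. A minimal sketch of that loop in plain Python; the metric, threshold value, and rows are invented for illustration, and a production version would compute metrics over PySpark datasets and push alerts to an AWS service such as CloudWatch or SNS:

```python
# Compare simple data-health metrics against KPI thresholds and collect
# alert messages for any breaches. All names and values are illustrative.

def null_rate(rows, column):
    # Fraction of rows where `column` is missing or None.
    return sum(1 for r in rows if r.get(column) is None) / len(rows)

def check_thresholds(metrics, thresholds):
    # Return an alert message for every metric above its agreed threshold.
    return [
        f"ALERT: {name}={value:.2f} exceeds threshold {thresholds[name]:.2f}"
        for name, value in metrics.items()
        if value > thresholds[name]
    ]

rows = [{"customer_id": 1}, {"customer_id": None},
        {"customer_id": 3}, {"customer_id": None}]
metrics = {"customer_id_null_rate": null_rate(rows, "customer_id")}
thresholds = {"customer_id_null_rate": 0.10}  # agreed with stakeholders

alerts = check_thresholds(metrics, thresholds)
```

Keeping thresholds in data (rather than code) is what lets non-technical stakeholders own the KPI definitions while engineers own the checking machinery.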
Salford, Greater Manchester, North West, United Kingdom Hybrid / WFH Options
Searchworks Ltd
and implement scalable monitoring and alerting solutions across data pipelines
· Define critical metrics and thresholds in collaboration with technical and non-technical teams
· Build and deploy models using Python, PySpark, and AWS services
· Troubleshoot data anomalies and promote best practices in data quality
· Contribute to future-proof frameworks that support growth across teams and projects
· Mentor junior colleagues and … support a culture of curiosity and rigor
What We're Looking For:
· Strong Python and SQL skills
· Experience with large datasets and distributed processing (PySpark)
· Familiarity with deploying models into production environments
· Solid understanding of cloud infrastructure (preferably AWS)
· A proactive, detail-oriented mindset and the ability to deliver independently
· Bonus: Experience building monitoring or data quality frameworks …
Manchester Area, United Kingdom Hybrid / WFH Options
Searchability®
to £70,000 p/a plus bonus & excellent benefits
· Shape the future of monitoring & alerting infrastructure for a global data insights provider
· We're looking for strong Python, PySpark, and AWS skills, with experience in large-scale data systems
· Hybrid working - 2-3 days in the office
ABOUT THE CLIENT: Our client is a pioneering force in the … embedding best practices in data quality and monitoring.
KEY SKILLS/EXPERIENCE:
· Proficient in Python & SQL for analysis, modelling, and interrogation
· Experience with large datasets and distributed data processing (PySpark)
· Skilled in deploying ML/statistical models into production
· Knowledge of AWS cloud infrastructure
· Strong analytical, problem-solving, and troubleshooting skills
· Ability to work with technical and non-technical … express consent for us to process & submit (subject to required skills) your application to our client in conjunction with this vacancy only.
KEY SKILLS: Data Scientist/Python/PySpark/AWS/SQL/Monitoring & Alerting/Data Quality/Machine Learning/Statistical Modelling
bolton, greater manchester, north west england, united kingdom Hybrid / WFH Options
Searchability®
to £70,000 p/a plus bonus & excellent benefits
· Shape the future of monitoring & alerting infrastructure for a global data insights provider
· We're looking for strong Python, PySpark, and AWS skills, with experience in large-scale data systems
· Hybrid working - 2-3 days in the office
ABOUT THE CLIENT: Our client is a pioneering force in the … embedding best practices in data quality and monitoring.
KEY SKILLS/EXPERIENCE:
· Proficient in Python & SQL for analysis, modelling, and interrogation
· Experience with large datasets and distributed data processing (PySpark)
· Skilled in deploying ML/statistical models into production
· Knowledge of AWS cloud infrastructure
· Strong analytical, problem-solving, and troubleshooting skills
· Ability to work with technical and non-technical … express consent for us to process & submit (subject to required skills) your application to our client in conjunction with this vacancy only.
KEY SKILLS: Data Scientist/Python/PySpark/AWS/SQL/Monitoring & Alerting/Data Quality/Machine Learning/Statistical Modelling
warrington, cheshire, north west england, united kingdom Hybrid / WFH Options
Searchability®
to £70,000 p/a plus bonus & excellent benefits
· Shape the future of monitoring & alerting infrastructure for a global data insights provider
· We're looking for strong Python, PySpark, and AWS skills, with experience in large-scale data systems
· Hybrid working - 2-3 days in the office
ABOUT THE CLIENT: Our client is a pioneering force in the … embedding best practices in data quality and monitoring.
KEY SKILLS/EXPERIENCE:
· Proficient in Python & SQL for analysis, modelling, and interrogation
· Experience with large datasets and distributed data processing (PySpark)
· Skilled in deploying ML/statistical models into production
· Knowledge of AWS cloud infrastructure
· Strong analytical, problem-solving, and troubleshooting skills
· Ability to work with technical and non-technical … express consent for us to process & submit (subject to required skills) your application to our client in conjunction with this vacancy only.
KEY SKILLS: Data Scientist/Python/PySpark/AWS/SQL/Monitoring & Alerting/Data Quality/Machine Learning/Statistical Modelling
Manchester, Lancashire, England, United Kingdom Hybrid / WFH Options
Searchability
to £70,000 p/a plus bonus & excellent benefits
· Shape the future of monitoring & alerting infrastructure for a global data insights provider
· We're looking for strong Python, PySpark, and AWS skills, with experience in large-scale data systems
· Hybrid working - 2-3 days in the office
ABOUT THE CLIENT: Our client is a pioneering force in the … embedding best practices in data quality and monitoring.
KEY SKILLS/EXPERIENCE:
· Proficient in Python & SQL for analysis, modelling, and interrogation
· Experience with large datasets and distributed data processing (PySpark)
· Skilled in deploying ML/statistical models into production
· Knowledge of AWS cloud infrastructure
· Strong analytical, problem-solving, and troubleshooting skills
· Ability to work with technical and non-technical … express consent for us to process & submit (subject to required skills) your application to our client in conjunction with this vacancy only.
KEY SKILLS: Data Scientist/Python/PySpark/AWS/SQL/Monitoring & Alerting/Data Quality/Machine Learning/Statistical Modelling/Manchester/Hybrid Working