Databricks, Azure Data Lake Storage, Delta Lake, Azure SQL, Purview, and APIM. Proficiency in developing CI/CD data pipelines and strong programming skills in Python, SQL, Bash, and PySpark for automation. Strong aptitude for data pipeline monitoring and an understanding of data security practices such as RBAC and encryption. Implemented data and pipeline observability dashboards, ensuring high data
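The RBAC practice mentioned in this listing can be sketched in a few lines of Python; the role names and permissions below are hypothetical, purely to illustrate the allow/deny idea:

```python
# Minimal role-based access control (RBAC) sketch.
# Roles map to sets of permissions; a user is granted roles, and an
# action is allowed only if at least one of their roles permits it.
# Role and permission names here are illustrative, not from any platform.

ROLE_PERMISSIONS = {
    "data_engineer": {"read_pipeline", "write_pipeline"},
    "analyst": {"read_pipeline"},
}

def is_allowed(user_roles, permission):
    """Return True if any of the user's roles grants the permission."""
    return any(permission in ROLE_PERMISSIONS.get(role, set())
               for role in user_roles)

print(is_allowed(["analyst"], "write_pipeline"))        # False: analysts are read-only
print(is_allowed(["data_engineer"], "write_pipeline"))  # True
```

In Azure itself, RBAC is configured on the platform (role assignments scoped to resources) rather than hand-rolled, but the allow/deny logic follows the same shape.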
of) Azure Databricks, Data Factory, Storage, Key Vault
Experience with source control systems, such as Git
dbt (Data Build Tool) for transforming and modelling data
SQL (Spark SQL) & Python (PySpark)
Certifications (Ideal)
SAFe POPM or Scrum PSPO
Microsoft Certified: Azure Fundamentals (AZ-900)
Microsoft Certified: Azure Data Fundamentals (DP-900)
What's in it for you
Skipton values work
Leeds, West Yorkshire, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
architecture
Ensuring best practices in data governance, security, and performance tuning
Requirements:
Proven experience with Azure Data Services (ADF, Synapse, Data Lake)
Strong hands-on experience with Databricks (including PySpark or SQL)
Solid SQL skills and understanding of data modelling and ETL/ELT processes
Familiarity with Delta Lake and lakehouse architecture
A proactive, collaborative approach to problem-solving
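The ETL/ELT processes these requirements refer to come down to turning raw records into validated, typed, business-ready rows. A minimal sketch in plain Python, with hypothetical field names (in practice this would run in Databricks with PySpark over far larger data):

```python
# Toy ELT-style transform: land raw records as-is, then transform them
# into a cleaned, typed shape. Bad rows are quarantined rather than
# failing the whole load. All field names are illustrative.

raw = [
    {"id": "1", "amount": "10.50", "country": " uk "},
    {"id": "2", "amount": "bad", "country": "US"},
]

def transform(records):
    clean, rejects = [], []
    for r in records:
        try:
            clean.append({
                "id": int(r["id"]),
                "amount": float(r["amount"]),
                "country": r["country"].strip().upper(),
            })
        except (KeyError, ValueError):
            rejects.append(r)  # quarantine rows that fail typing
    return clean, rejects

clean, rejects = transform(raw)
print(clean)    # one valid, typed row
print(rejects)  # the row with the unparseable amount
```

The same raw-then-refined layering is what the lakehouse/Delta Lake architecture mentioned above formalises at platform scale.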
name within technology. They are looking for Data Engineers with significant Databricks experience to join an exceptional Agile engineering team. We are seeking a Data Engineer with strong Python, PySpark, and SQL experience, a clear understanding of Databricks, and a passion for Data Science (R, Machine Learning, and AI). Database experience with SQL and NoSQL databases such as
understanding of AI/ML/DL and Statistics, as well as coding proficiency using related open source libraries and frameworks. Significant proficiency in SQL and languages like Python, PySpark and/or Scala. Can lead, work independently as well as play a key role in a team. Good communication and interpersonal skills for working in a multicultural work
South Yorkshire, England, United Kingdom Hybrid / WFH Options
Erin Associates
SQL Server. Understanding of applying master data management principles, data quality frameworks, and data governance best practices. Understanding of Azure Data Factory, Fabric, and similar technologies. Tech Stack – Python, PySpark, SQL, XPath, XML, Azure-based Data Science tools, BI tools, Data Visualisation, Agile. The company has an excellent reputation within their sector and has shown consistent growth year-on
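The data quality frameworks mentioned here typically express checks as rules that report failing rows instead of silently dropping them. A toy, stdlib-only sketch (field names are made up):

```python
# Minimal data-quality check sketch: each rule returns the ids of the
# rows that fail it, so failures can surface on a dashboard or alert.

def check_not_null(rows, field):
    """Ids of rows where the field is missing or empty."""
    return [r["id"] for r in rows if r.get(field) in (None, "")]

def check_unique(rows, field):
    """Ids of rows whose field value duplicates an earlier row's."""
    seen, dupes = set(), []
    for r in rows:
        v = r.get(field)
        if v in seen:
            dupes.append(r["id"])
        seen.add(v)
    return dupes

rows = [
    {"id": 1, "email": "a@example.com"},
    {"id": 2, "email": None},
    {"id": 3, "email": "a@example.com"},
]
print(check_not_null(rows, "email"))  # [2]
print(check_unique(rows, "email"))    # [3]
```

Production frameworks (e.g. dbt tests or Great Expectations) package the same rule-per-column idea with scheduling and reporting on top.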
Manchester, North West, United Kingdom Hybrid / WFH Options
Gerrard White
Knowledge of the technical differences between different packages for some of these model types would be an advantage. Experience in statistical and data science programming languages (e.g. R, Python, PySpark, SAS, SQL). A good quantitative degree (Mathematics, Statistics, Engineering, Physics, Computer Science, Actuarial Science). Experience of WTW's Radar software is preferred. Proficient at communicating results in a concise
Manchester, Lancashire, England, United Kingdom Hybrid / WFH Options
Vermelo RPO
ideally with a focus in Motor. Experience and detailed technical knowledge of GLMs/Elastic Nets, GBMs, GAMs, Random Forests, and clustering techniques. Experience in programming languages (e.g. Python, PySpark, R, SAS, SQL). Proficient at communicating results in a concise manner, both verbally and in writing. Behaviours: motivated by technical excellence; a team player; self-motivated with a drive to learn
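Of the model families listed, a GLM with a logit link (logistic regression) is the simplest to sketch. The toy example below fits one by gradient descent on made-up, separable data; in practice these models are fitted with dedicated libraries (e.g. statsmodels, scikit-learn, or pricing tools like Radar), not by hand:

```python
import math

# Tiny logistic-regression GLM (logit link) fitted by gradient descent
# on a one-feature toy dataset. Data and learning rate are illustrative.

xs = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
ys = [0,   0,   0,   1,   1,   1]   # labels separable around x = 2.5

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

w, b, lr = 0.0, 0.0, 0.5
for _ in range(2000):
    gw = gb = 0.0
    for x, y in zip(xs, ys):
        err = sigmoid(w * x + b) - y   # gradient of the log-loss
        gw += err * x
        gb += err
    w -= lr * gw / len(xs)
    b -= lr * gb / len(xs)

preds = [1 if sigmoid(w * x + b) >= 0.5 else 0 for x in xs]
print(preds)  # recovers the labels on this separable toy set
```

GBMs, GAMs, and random forests differ in how they build the predictor, but all are evaluated and communicated in the same fit-predict-validate loop shown here.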
Lead Data Engineer (Databricks) - Leeds (Lead Data Engineer, Team Lead, Technical Lead, Senior Data Engineer, Data Engineer, Python, PySpark, SQL, Big Data, Databricks, R, Machine Learning, AI, Agile, Scrum, TDD, BDD, CI/CD, SOLID principles, GitHub, Azure DevOps, Jenkins, Terraform, AWS CDK, AWS CloudFormation, Azure, Lead Data Engineer, Team Lead, Technical Lead, Senior Data Engineer, Data Engineer) Our … modern data platform using cutting-edge technologies, architecting big data solutions and developing complex enterprise data ETL and ML pipelines and projections. The successful candidate will have strong Python, PySpark, and SQL experience, a clear understanding of Databricks, and a passion for Data Science (R, Machine Learning, and AI). Database experience with SQL and No … Benefits To apply for this position please send your CV to Nathan Warner at Noir Consulting. (Lead Data Engineer, Team Lead, Technical Lead, Senior Data Engineer, Data Engineer, Python, PySpark, SQL, Big Data, Databricks, R, Machine Learning, AI, Agile, Scrum, TDD, BDD, CI/CD, SOLID principles, GitHub, Azure DevOps, Jenkins, Terraform, AWS CDK, AWS CloudFormation, Azure, Lead Data
Data Engineer (Databricks) - Leeds (Data Engineer, Python, PySpark, SQL, Big Data, Databricks, R, Machine Learning, AI, Agile, Scrum, TDD, BDD, CI/CD, SOLID principles, GitHub, Azure DevOps, Jenkins, Terraform, AWS CDK, AWS CloudFormation, Azure, Data Engineer) Our client is a global innovator and world leader with one of the most recognisable names within technology. They are looking for … Data Engineers with significant Databricks experience to join an exceptional Agile engineering team. We are seeking a Data Engineer with strong Python, PySpark, and SQL experience, a clear understanding of Databricks, and a passion for Data Science (R, Machine Learning, and AI). Database experience with SQL and NoSQL - Aurora, MS SQL Server, MySQL is … top performers. Location: Leeds Salary: £40k - £50k + Pension + Benefits To apply for this position please send your CV to Nathan Warner at Noir Consulting. (Data Engineer, Python, PySpark, SQL, Big Data, Databricks, R, Machine Learning, AI, Agile, Scrum, TDD, BDD, CI/CD, SOLID principles, GitHub, Azure DevOps, Jenkins, Terraform, AWS CDK, AWS CloudFormation, Azure, Data Engineer
Azure Data Engineer - 1/2 days onsite Summary: Join a team building a modern Azure-based data platform. This hands-on engineering role involves designing and developing scalable, automated data pipelines using tools like Data Factory, Databricks, Synapse, and
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
Parking Network BV
For airports, for partners, for people. We are CAVU. At CAVU our purpose is to find new and better ways to make airport travel seamless and enjoyable for everybody. From the smallest ideas to the biggest changes. Every day here
practices. This is a fantastic opportunity for a curious, solutions-focused data scientist to help build out our capability, working with cutting-edge tools like Databricks, AWS data services, PySpark, and CI/CD pipelines. What's in it for you? You'll be joining a collaborative, supportive team with a real passion for data-led innovation. It's … business impact - we'd love to hear from you. About you: 2-5 years of experience in Data Science or a related field Strong programming skills in Python and PySpark Strong data science modelling skills across classification, regression, forecasting, and/or NLP Analytical mindset with the ability to present insights to both technical and non-technical audiences Experience
Leeds, Yorkshire, United Kingdom Hybrid / WFH Options
PEXA Group Limited
the transformation pipeline from start to finish, guaranteeing that datasets are robust, tested, secure, and business-ready. Our data platform is built using Databricks, with data pipelines written in PySpark and orchestrated using Airflow. You will be expected to challenge and improve current transformations, ensuring they meet our performance, scalability, and data governance needs. This includes work with complex … days per year for meaningful collaboration in either Leeds or Thame. Key Responsibilities Ensure end-to-end data quality, from raw ingested data to business-ready datasets Optimise PySpark-based data transformation logic for performance and reliability Build scalable and maintainable pipelines in Databricks and Airflow Implement and uphold GDPR-compliant processes around PII data Collaborate with stakeholders to … management, metadata management, and wider data governance practices Help shape our approach to reliable data delivery for internal and external customers Skills & Experience Required Extensive hands-on experience with PySpark, including performance optimisation Deep working knowledge of Databricks (development, architecture, and operations) Proven experience working with Airflow for orchestration Proven track record in managing and securing PII data, with
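The Airflow orchestration described above runs tasks in dependency order. A toy stand-in using Python's stdlib `graphlib` shows the core idea; the task names are hypothetical, not from the actual platform:

```python
from graphlib import TopologicalSorter

# Toy illustration of Airflow-style orchestration: tasks declare their
# upstream dependencies, and the "scheduler" executes them in a valid
# topological order. Real Airflow adds scheduling, retries, and state.

ran = []
tasks = {
    "ingest": lambda: ran.append("ingest"),
    "transform": lambda: ran.append("transform"),
    "publish": lambda: ran.append("publish"),
}
deps = {"transform": {"ingest"}, "publish": {"transform"}}

for name in TopologicalSorter(deps).static_order():
    tasks[name]()

print(ran)  # ['ingest', 'transform', 'publish']
```

In an actual Airflow DAG the same chain would be written with operators and `ingest >> transform >> publish`; the dependency-ordered execution is identical.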