Python (PySpark). Ingest, transform, and curate data from multiple sources into Azure Data Lake and Delta Lake formats. Build and optimize datasets for performance and reliability in Azure Databricks. Collaborate with analysts and business stakeholders to translate data requirements into robust technical solutions. Implement and maintain ETL/ELT pipelines using Azure Data Factory or Synapse Pipelines. … Monitor and troubleshoot production jobs and processes. Preferred Skills & Experience: Strong proficiency in SQL for data transformation and performance tuning. Solid experience with Python, ideally using PySpark in Azure Databricks. Hands-on experience with Azure Data Lake Storage Gen2. Understanding of data warehouse concepts, dimensional modelling, and data architecture. Experience working with Delta Lake and large-scale …
Interpath is an international and fast-growing advisory business with deep expertise in a broad range of specialisms spanning Deals, Advisory and Restructuring capabilities. We deliver tangible results for global businesses, their investors, and stakeholders when complex problems arise, and …
Liverpool, Lancashire, United Kingdom Hybrid / WFH Options
Intuita - Vacancies
reliable data movement and transformation. • Data Modelling using Kimball, 3NF or Dimensional methodologies • Utilize SQL and Python languages to extract, transform, and load data from various sources into Azure Databricks and Azure SQL/SQL Server. • Design and implement metadata driven pipelines to automate data processing tasks. • Collaborate with cross-functional teams to understand data requirements and implement appropriate solutions. … are some obvious required Skills and Experience we are going to be seeking out. For this role we'd be expecting to see: • Solid experience in Azure, specifically Azure Databricks and Azure SQL/SQL Server. • Proficiency in SQL and Python languages. • Hands-on experience in designing and building data pipelines using Azure Data Factory (ADF). • Familiarity with building …
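To illustrate the "metadata driven pipelines" pattern the listing above refers to, here is a minimal PySpark sketch of a config-driven ingest loop. The storage paths, table names, and config layout are hypothetical examples, not details from the advertiser's environment; in practice the metadata would usually live in a control table or JSON file and scheduling would sit in ADF or Databricks Workflows.

```python
# Illustrative sketch only: a config-driven (metadata-driven) ingest loop in PySpark.
# Paths, table names, and config structure are hypothetical; assumes a
# Databricks/Delta-enabled Spark session with access to the storage account.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Pipeline behaviour is driven by metadata rather than hard-coded per source.
ingest_config = [
    {"path": "abfss://raw@examplelake.dfs.core.windows.net/sales/",  "format": "csv",     "table": "bronze.sales"},
    {"path": "abfss://raw@examplelake.dfs.core.windows.net/orders/", "format": "parquet", "table": "bronze.orders"},
]

for entry in ingest_config:
    reader = spark.read.format(entry["format"])
    if entry["format"] == "csv":
        reader = reader.option("header", "true").option("inferSchema", "true")
    df = reader.load(entry["path"])
    # Land each source as a Delta table so downstream layers share one format.
    df.write.format("delta").mode("overwrite").saveAsTable(entry["table"])
```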
SFTP/FTPS), and remediation of security vulnerabilities (DAST, Azure Defender). Expertise in Python for writing efficient code and maintaining reusable libraries. Experienced with microservice design patterns, and Databricks/Spark for big data processing. Strong knowledge of SQL/NoSQL databases and corresponding ELT workflows. Excellent problem-solving, communication, and collaboration skills in fast-paced environments. 3 years' professional …
Manchester, Lancashire, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
working model (1-2 days in office) Pension contribution Great opportunities for career progression And many more Role & Responsibilities Design and deliver solutions using MS Fabric, ADF, ADL, Synapse, Databricks, SQL, and Python. Work closely with a variety of clients to gather requirements and deliver solutions. Be willing to engage and assist in pre-sales activities, bids, proposals etc. Use … in building out, developing, and training the data engineering function. What do I need to apply Strong MS data engineering expertise (Fabric, ADF, ADL, Synapse, SQL) Expert use of Databricks Strong Python experience Consultancy experience Leadership experience My client are looking to book in first stage interviews for next week and slots are already filling up fast. I have limited …
Manchester, Lancashire, England, United Kingdom Hybrid / WFH Options
Noir
Data Engineer - Leading Pharmaceutical Company - Manchester (Tech Stack: Data Engineer, Databricks, Python, Power BI, Azure, TSQL, ETL, Agile Methodologies) About the Role: We are seeking a talented and experienced Data Engineer on behalf of our client, a leading Software House. This is a fully remote position, offering the opportunity to work with cutting-edge technologies and contribute to exciting projects …
Manchester Area, United Kingdom Hybrid / WFH Options
Noir
Data Engineer - Leading Pharmaceutical Company - Manchester (Tech Stack: Data Engineer, Databricks, Python, Power BI, Azure, TSQL, ETL, Agile Methodologies) About the Role: We are seeking a talented and experienced Data Engineer on behalf of our client, a leading Software House. This is a fully remote position, offering the opportunity to work with cutting-edge technologies and contribute to exciting projects …
managing teams & cross-functional data projects. Proven experience in data engineering, analytics, or data architecture, with experience leading & mentoring others. Strong understanding of Microsoft Azure (Data Lakes, Blob Storage), Databricks, and modern data stack tooling. Proficiency in SQL and Python for data manipulation and transformation. Familiarity with data visualisation tools such as Power BI. Strong interpersonal and leadership skills - able …
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
MAG (Airports Group)
models, exploring customer behaviours, and supporting personalisation strategies - with opportunities to work on NLP projects too. You'll also take ownership of projects, support our data science tooling (including Databricks and AWS), and collaborate closely with experts in Data Engineering, BI, Analytics, and Data Governance to solve problems and create scalable solutions that make a tangible difference. What's in … and continuously develop your skills in a collaborative, hybrid working environment. About you Role Responsibilities: Design, build, and maintain scalable machine learning pipelines using Python and PySpark. Work within Databricks to develop, schedule, and monitor data workflows, utilising Databricks Asset Bundles. Collaborate with data analysts, engineers, and other scientists to deliver clean, reliable, and well-documented datasets. Develop and maintain … skills with a problem-solving mindset. Strong analytical and communication skills, with the ability to tailor complex insights for both technical and non-technical audiences. Hands-on experience with Databricks for deploying, monitoring, and maintaining machine learning pipelines. Experience working with AWS data services and architectures. Good understanding of code versioning and CI/CD tools and practices. Familiarity with …
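For context on the "scalable machine learning pipelines using Python and PySpark" this role describes, a minimal sketch using the pyspark.ml Pipeline API is shown below. The source table, feature columns, and label are hypothetical placeholders rather than anything from the advertiser's environment; in practice a job like this would be scheduled and versioned through Databricks Asset Bundles, as the listing mentions.

```python
# Minimal illustrative sketch of a PySpark ML pipeline of the kind described above.
# The source table, feature columns, and numeric 0/1 label are assumed placeholders.
from pyspark.sql import SparkSession
from pyspark.ml import Pipeline
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.classification import LogisticRegression

spark = SparkSession.builder.getOrCreate()

# Assumed curated feature table; in Databricks this would usually be a Delta table.
df = spark.table("silver.customer_features")

assembler = VectorAssembler(
    inputCols=["recency", "frequency", "monetary"],  # assumed feature columns
    outputCol="features",
)
classifier = LogisticRegression(featuresCol="features", labelCol="churned")

pipeline = Pipeline(stages=[assembler, classifier])
train, test = df.randomSplit([0.8, 0.2], seed=42)
model = pipeline.fit(train)

# Score the hold-out set and inspect a few predictions.
model.transform(test).select("churned", "prediction", "probability").show(5)
```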
Southeast Asia, Australia, and New Zealand - including many of the world's largest Fortune 1000 and Global 2000 companies. With strong global momentum, a growing partner ecosystem including SentinelOne, Databricks, and Google Cloud, and a major fundraise on the horizon, we're scaling quickly toward long-term growth and IPO readiness. Join us as we define the future of SaaS … Key Qualifications: 1-2 years of experience in software development. Experience with data pipeline development and optimization (ETL/ELT processes). Solid understanding of distributed systems, databases (PostgreSQL, Databricks, ClickHouse, Elasticsearch), and performance tuning. Familiarity with modern web frameworks and front-end technologies (React, Vue, Angular, etc.) Experience with data processing frameworks (Apache Spark, Kafka, Airflow, Dagster or similar …
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
First Central Services
and version management of large numbers of data science models (Azure DevOps). You'll support the implementation of Machine Learning Ops on cloud (Azure & Azure ML; experience with Databricks is advantageous). You'll protect against model degradation and operational performance issues through the development and continual automated monitoring of model execution and model quality. You'll manage automatic model … and integration. Basic understanding of networking concepts within Azure. Familiarity with Docker and Kubernetes is advantageous. Experience within financial/insurance services industry is advantageous. Experience with AzureML and Databricks is advantageous. Skills & Qualifications: Strong understanding of Microsoft Azure (Azure ML, Azure Stream Analytics, Cognitive Services, Event Hubs, Synapse, and Data Factory). Fluency in common data science coding capabilities such …
tools applied to business problems. Solid stakeholder management skills - able to work with teams across the business. Background in insurance or financial services is a strong advantage. Familiarity with Databricks is a plus …
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
Cornwallis Elt Ltd
enhance analytics or operations. Comfortable coding in Python. Strong DAX and T-SQL skills, including query optimization. Industry experience in insurance or financial services is a plus. Familiarity with Databricks is advantageous …
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
First Central Services
on experience with Terraform and Bicep, deploying Azure infrastructure with automation, templates, and CI/CD pipelines. Azure Expertise: Solid experience with Azure services such as Synapse, Data Factory, Databricks, DevOps, networking, and security best practices. Effective Communicator & Team Player: Ability to translate technical concepts for diverse audiences and collaborate effectively. Creative Problem Solver: Enjoy solving complex infrastructure challenges with …
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
First Central Services
practices and tools (Azure DevOps preferred). Experience with microservices architecture, RESTful API development, and system integration. Prior experience in financial services or insurance sectors advantageous. Familiarity with AzureML, Databricks, related Azure technologies, Docker, Kubernetes, and containerization is advantageous. Advanced proficiency in Python, and familiarity with AI frameworks such as LangChain. Skilled in designing and operationalising AI Ops frameworks within …
Manchester, North West, United Kingdom Hybrid / WFH Options
Fdo Consulting Limited
Lead Data Engineer, £70,000 - £80,000 + benefits. SQL, ETL, Data Warehousing, Databricks etc. Home Based with one day a month at the office in Nottingham. Expanding SaaS product company are looking for a Lead Data Engineer as they continue to grow. In this hands-on role you will be part of the team responsible for designing, creating, deploying and managing …
Azure Data Platform Engineer, you will be responsible for:- Designing, building, and maintaining scalable data solutions on Microsoft Azure. Designing and implementing scalable data pipelines using Azure Data Factory, Databricks, Synapse Analytics, and other Azure services. Leading technical workstreams and supporting project delivery. Acting as the subject-matter expert for cloud-based data engineering. Ensure data governance, security, and compliance … maintain best practices. If you possess a combination of some of the following skills, then LET'S TALK! Proven experience with Azure data services (SQL, Data Factory, Data Lake, Synapse, Databricks, Azure SQL and Cosmos DB). Strong proficiency in designing and operating scalable data solutions and pipelines. Platform engineering - Terraform. Familiar with Cloud security, performance optimisation and monitoring tools (Azure …
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
The Co-operative Group
to join our Digital Technology & Data team in Manchester to help take our data driven decision-making to the next level. We already have modern cloud platforms in place (Databricks, Azure) and use data far and wide across Co-op. We want your help to scale our team, empower more self-service, and help our Data Engineers be even more … have Significant experience in data engineering and data platform design. Experience leading a data engineering or data-focussed software engineering team. Proficiency with our data software (Azure Data Factory, Databricks). A track record of supporting their team's learning and development. Good communication and relationship-building skills, with the ability to influence and challenge people, of all levels of seniority …
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
Capgemini
to a wide variety of audiences. For candidates applying for the Senior Consultant role, we additionally require: Working experience with at least one Cloud Platform (AWS, Azure, GCP, Snowflake, Databricks etc.) and exposure to Cloud Architecture principles. Demonstrated experience in people management, product owner or workstream management. Experience supporting and participating in the commercial cycle, including defining project scope and …
in our data science practices. This is a fantastic opportunity for a curious, solutions-focused data scientist to help build out our capability, working with cutting-edge tools like Databricks, AWS data services, PySpark, and CI/CD pipelines. What's in it for you? You'll be joining a collaborative, supportive team with a real passion for data-led … classification, regression, forecasting, and/or NLP. Analytical mindset with the ability to present insights to both technical and non-technical audiences. Experience deploying and maintaining ML pipelines within Databricks. Comfortable working with AWS data services and modern data architectures. Experience with CI/CD pipelines and code versioning best practices. Preferred skills: Familiarity with Databricks Asset Bundles (DAB) for …
Liverpool, Merseyside, North West, United Kingdom Hybrid / WFH Options
Forward Role
BI. Proven track record managing secure, high-performance database environments. A genuine passion for continuous learning and staying ahead of industry innovation. Experience with Azure Synapse/Sharedo/Databricks (nice to have). Python (nice to have). Ready to Take the Next Step? If you're a Senior Database Developer who's ready to elevate your career, APPLY NOW or …
Bolton, Greater Manchester, United Kingdom Hybrid / WFH Options
Datatech Analytics
a range of data science tools, especially Python, R, and SQL. Experience of cloud computing - Azure, AWS or GCP. Substantial experience working in cloud-based tools like Databricks for Machine Learning, Azure Machine Learning and Azure AI Foundry, as well as experience helping others to use them. If you are seeking a Data Science leadership role get in …