Data Engineering Manager £110,000-£115,000 + 10% bonus Databricks, Snowflake, Terraform, PySpark, Azure London, hybrid working (2 days in office) Leading property data & risk software company We are partnered with a leading property data & risk software company that contributes valuations, insights, and decisioning technology to over 1 million mortgage approvals each year. They are looking for a hands … in Azure, data visualization, and data modeling. Engage in projects that influence the company's bottom line. Drive the business forward by enabling better decision-making processes. Tech Stack: Databricks, Azure, Python, PySpark, Terraform. What's in it for you 7.5% pension contribution by the company Discretionary annual bonus up to 10% of base salary 25 days annual leave plus More ❯
The primary focus is on building and maintaining the infrastructure to support the full data science lifecycle from data ingestion to model deployment, monitoring, and upgrades within Azure and Databricks environments. The engineer will work closely with data scientists in a collaborative, cross-functional setting, helping transition models from research into production. Key Responsibilities: Own and develop deployment frameworks for … cross-functional teams to ensure smooth productionisation of models. Write clean, production-ready Python code. Apply software engineering best practices, CI/CD, TDD. Required Skills: Proficiency in Python, Databricks, and Azure. Experience with deployment tools (e.g., AKS, managed endpoints). Strong software engineering background (CI/CD, VCS, TDD). Ability to integrate ML into business workflows. Desirable: Background More ❯
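As a purely illustrative sketch of the TDD and production-ready-Python practices the posting above lists, the test-first style might look like the following (all function and field names are hypothetical, not from the posting):

```python
# Hypothetical TDD-style sketch: the tests are written first, then the
# minimal production code to satisfy them. Names are illustrative only.

def validate_features(record: dict, required: tuple = ("age", "income")) -> dict:
    """Return a cleaned feature record, raising on missing required fields."""
    missing = [k for k in required if k not in record]
    if missing:
        raise ValueError(f"missing required features: {missing}")
    # Coerce numeric fields defensively before they reach a model endpoint.
    return {k: float(record[k]) for k in required}


def test_validate_features_passes_clean_record():
    result = validate_features({"age": "41", "income": 52000})
    assert result == {"age": 41.0, "income": 52000.0}


def test_validate_features_rejects_incomplete_record():
    try:
        validate_features({"age": 41})
    except ValueError as exc:
        assert "income" in str(exc)
    else:
        raise AssertionError("expected ValueError for missing field")
```

In a real CI/CD pipeline these tests would typically run under Pytest on every commit before a model or its supporting code is promoted to production.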
remote. Our client is looking for an experienced Lead Data Engineer with the following skill set: Experience with Azure data services, Azure Data Factory (ADF), API development and integration, Databricks, Python programming, Unit testing, Git version control, Data management and data governance, Data security. If this is you, get in touch ASAP and send us your CV More ❯
Data QA will be responsible for executing comprehensive manual tests to ensure the quality of data flows across various Chambers products and platforms, with a primary focus on the Databricks environment. This role plays a critical part in maintaining the integrity and reliability of our data services. Main Duties and Responsibilities Testing Execution : Conduct detailed manual tests to validate data … flows and processing within the Databricks platform. Ensure that data integrity, accuracy, and consistency are maintained across all stages of data handling. Strategy and Planning : Assist in developing and defining the QA testing strategy and test plans specifically tailored for continuous data verification and validation. Issue Resolution : Identify, document, and track data discrepancies and inconsistencies. Collaborate with development and data More ❯
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
role at the intersection of business, technology, and insight. As Data Solution Architect, you'll define how data flows across an evolving digital ecosystem anchored in Microsoft Azure and Databricks, with integrations across business-critical platforms including Tableau, CDPs, CRMs, and GraphQL-powered applications. Responsibilities Strategic Architecture & Integration Design and implement scalable, secure data architectures across cloud platforms. Lead integration … party data agencies and ensure alignment to architectural goals and project timelines. Skills & Experience Essential: 10+ years in enterprise-scale data architecture, preferably within Azure environments. Strong background with Databricks, data modelling, and schema design. 5+ years in team leadership and partner/vendor management. Hands-on experience building integrations with BI, CDP, and CRM platforms. Advanced knowledge of GraphQL More ❯
team. The Azure data warehouse, built in the last few years, is robust and provides data, but it needs improving. They're also looking to add Databricks, with Unity Catalog, and are keen to optimise its use. They're moving away from a monolithic stack, adopting GraphQL for the new digital platforms, which will feature ML-driven … it from its current state to meet our ambitious future goals. YOUR SKILLS AND EXPERIENCE: The ideal Data Architect will have: Essential Technologies: Strong hands-on experience with Azure, Databricks, and Microsoft Dynamics. Industry Experience: Proven track record in B2C environments such as retail, entertainment, or media. Skills: A deep interest in the data element, coupled with a solid engineering More ❯
Key Requirements: Commercial experience as a Senior Data Engineer, with an active SC + NPPV3 Clearance Experience with Azure Data Factory for building and orchestrating data pipelines Experience with Databricks for data transformation Experience with Azure Synapse Familiar with Azure Data Lake Active SC Clearance + NPPV3 Clearance Nice to have: Immediate availability Hays Specialist Recruitment Limited acts as an employment More ❯
years+ Data Tester experience - 3-4 years' experience testing exclusively with Python and Pytest. - Experience with C# is useful - Experience of ETL Processing/Data Warehousing testing, including Databricks and Data Factory. - Hands-on experience with SQL or Azure SQL. - Experience using automated testing on Python frameworks (Pytest/PySpark) - Experience with SpecFlow and other frameworks. If you are interested More ❯
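As a hedged illustration of the automated ETL testing the role above describes, the checks below sketch two common data-quality assertions (null checks and source-to-target reconciliation). In practice these would run under Pytest against Databricks tables; here the data, rules, and function names are all hypothetical and use plain Python lists so the sketch stays self-contained:

```python
# Hypothetical data-test sketch of the kind a Pytest-based ETL suite
# might contain; table contents and rules are illustrative only.

def check_no_nulls(rows, column):
    """Fail if any row has a null in the given column."""
    bad = [i for i, r in enumerate(rows) if r.get(column) is None]
    assert not bad, f"null {column!r} in rows: {bad}"


def check_row_count_preserved(source_rows, target_rows):
    """A basic reconciliation check between ETL source and target."""
    assert len(source_rows) == len(target_rows), (
        f"row count mismatch: {len(source_rows)} -> {len(target_rows)}"
    )


source = [{"id": 1, "amount": 10.0}, {"id": 2, "amount": 12.5}]
target = [{"id": 1, "amount": 10.0}, {"id": 2, "amount": 12.5}]
check_no_nulls(target, "amount")
check_row_count_preserved(source, target)
```

Under Pytest, each check would normally be a separate `test_*` function so that one failing rule does not mask the others.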
City of London, Greater London, UK Hybrid / WFH Options
MANNING SERVICES LIMITED
Data Science & ML Expert in Python, data wrangling, EDA, modelling, validation, and deployment Strong in supervised & unsupervised learning; exposure to NLP/OCR a plus Proficient with AWS SageMaker, Databricks; Palantir Foundry/AIP is a big plus Insurance/underwriting/claims background is a bonus Confident communicator with client-facing experience Interview Process: Round 1: Technical/scenario More ❯
City of London, London, United Kingdom Hybrid / WFH Options
Sanderson Recruitment
opportunity to join a prestigious company and to take your career to the next level: Key skills/Experience: Strong experience working with the Azure Data Stack Advanced experience with Databricks Insurance experience is a must Solid data warehousing experience Strong experience with Python; T-SQL and SQL are a must Experience leading large projects Experienced working on finance-related projects More ❯
to define, test and deliver technical and functional requirements. - You will need experience implementing Master Data Management programmes, this is mandatory for the role. - SQL and PySpark knowledge - Azure Databricks experience. - Experience with Data Querying, and Data Profiling. - Experience working with Large data sets especially analysing and cleansing said data. - Strong Communication Skills. - Experience working within an Agile Environment. If More ❯
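As a purely illustrative sketch of the data-profiling work the role above mentions, the snippet below summarises completeness and distinct values for one column. It uses only the standard library so it stays self-contained; in the role described this would typically run over PySpark DataFrames in Azure Databricks instead, and the field names here are hypothetical:

```python
# Hypothetical data-profiling sketch (stdlib only); a real pipeline in this
# role would usually express the same checks over Spark DataFrames.
from collections import Counter


def profile_column(rows, column):
    """Summarise completeness and cardinality for one column."""
    values = [r.get(column) for r in rows]
    non_null = [v for v in values if v is not None]
    return {
        "count": len(values),
        "nulls": len(values) - len(non_null),
        "distinct": len(set(non_null)),
        "top": Counter(non_null).most_common(1),
    }


rows = [{"city": "London"}, {"city": "Leeds"}, {"city": "London"}, {"city": None}]
print(profile_column(rows, "city"))
```

Profiles like this are typically the first step of the "analysing and cleansing" work the posting describes: null counts flag incomplete records, and low distinct counts flag candidate reference-data columns for Master Data Management.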
will involve delivering high-quality insights into customer behaviour, marketing performance and loyalty schemes to drive strategic decisions and business growth. Insights Analyst, key skills: Python and SQL knowledge Databricks or BigQuery - highly desirable Stakeholder management Previously worked within customer behaviour, marketing and loyalty schemes Retail experience We are committed to fostering a diverse and inclusive recruitment process. We More ❯
Comfortable working independently with minimal supervision in a dynamic setting Desirable Experience Familiarity with coordinate systems and transforms Exposure to workflow orchestration tools such as Flyte, Apache Spark, or Databricks Experience working with autonomous vehicle (AV) datasets or multi-sensor rigs in production Prior work involving debugging sensor metadata issues (e.g., misaligned extrinsics, inaccurate timestamps More ❯
teams and delivery squads (technical teams are adjacent but not within your remit). Champion best practices in change management and data adoption within the insurance context. Experience in Databricks, Power BI, DevOps More ❯
aligned data solutions. Key Skills & Experience: Proven expertise in creating and maintaining complex data models. Strong experience with data modelling tools and techniques. Experience working with Microsoft Azure and Databricks Solid understanding of database management systems and effective database solution design. Knowledge of data governance, data quality, and metadata management. Ability to translate business requirements into clear data models. Excellent More ❯
of professional experience PhD with 5+ years of professional experience Linux/Unix experience Object-oriented programming language experience Strong experience with data warehousing and data lakes Experience with Databricks, Advana or similar tools Preferred Self-starter with a willingness to learn new skills Possess strong verbal and written communication skills More ❯
data governance frameworks, metadata management, and data quality best practices. Strong communication and stakeholder management skills, able to work across technical and non-technical teams. Hands-on experience with Databricks . Analytical thinker with a structured, problem-solving approach and an eye for detail. More ❯
throughout the project lifecycle Supporting pre-sales with solution design and technical positioning What They’re Looking For: Strong experience designing data platforms on Azure (e.g. ADF, Synapse, Fabric, Databricks) Hands-on ability in Python, PySpark, and SQL to validate solutions and guide teams Knowledge/hands-on experience with Azure AI Foundry, Azure OpenAI services and other Azure native More ❯
Bury St. Edmunds, Suffolk, England, United Kingdom
Sanderson
translate business requirements into effective data models. Excellent communication and stakeholder engagement skills. Strong analytical mindset with a structured, problem-solving approach. Hands-on experience with Microsoft Azure and Databricks is essential. Reasonable Adjustments: Respect and equality are core values to us. We are proud of the diverse and inclusive community we have built, and we welcome applications from people More ❯
Data QA will be responsible for executing comprehensive manual tests to ensure the quality of data flows across various Chambers products and platforms, with a primary focus on the Databricks environment. This role plays a critical part in maintaining the integrity and reliability of our data services. Main Duties and Responsibilities: Collaborate with cross-functional teams to understand data migration More ❯