Hiring: Snr. Databricks ML Engineer
📍 Remote (UK-based) | 💰 Up to £110,000
We’re working with a cutting-edge AI consultancy that is scaling its ML team with a key hire: a Senior Databricks ML Engineer who’s as comfortable in pre-sales conversations as they are deploying models to production.
🧠 You’ll be someone who: Has deep ML theory … productivity tools and uses them daily; Is commercially minded, tech-savvy, and client-facing
🎯 Must-haves: 8+ years in Data Science/ML Engineering; 2+ years hands-on with Databricks; Strong track record delivering production-grade ML models; Solid grasp of MLOps best practices; Confident speaking to technical and non-technical stakeholders
🛠️ Tech you’ll be using: Python, SQL, Spark … R; MLflow, vector databases; GitHub/GitLab/Azure DevOps; Jira, Confluence
🎓 Bonus points for: MSc/PhD in ML or AI; Databricks ML Engineer (Professional) certification
Experience in designing and implementing pipelines using DAGs (e.g., Kubeflow, DVC, Ray); Ability to construct batch and streaming microservices exposed as gRPC and/or GraphQL endpoints; Experience with Databricks MLflow for ML lifecycle management and model versioning; Hands-on experience with Databricks Model Serving for production ML deployments; Proficiency with GenAI frameworks/tools and technologies such as Apache …
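For context on the MLflow lifecycle and model-versioning workflow this listing refers to, below is a minimal, illustrative sketch (not the client's actual setup). The experiment and model names ("demo-churn", "churn-classifier") and the dataset are hypothetical; on Databricks, the registered model is what Model Serving would typically expose as an endpoint.

```python
# Minimal sketch, assuming MLflow and scikit-learn are installed.
# Experiment and model names are hypothetical; on Databricks the
# experiment name would usually be a workspace path.
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, n_features=10, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

mlflow.set_experiment("demo-churn")
with mlflow.start_run():
    model = RandomForestClassifier(n_estimators=100, random_state=42)
    model.fit(X_train, y_train)
    acc = accuracy_score(y_test, model.predict(X_test))

    # Track parameters and metrics for this run.
    mlflow.log_param("n_estimators", 100)
    mlflow.log_metric("accuracy", acc)

    # Registering the model creates a new version in the Model Registry,
    # decoupling training from deployment; a serving layer (e.g. Databricks
    # Model Serving) can then be pointed at a registry version or alias.
    mlflow.sklearn.log_model(model, "model", registered_model_name="churn-classifier")
```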
data is accessed and utilised. As a Senior AWS Data Engineer, you'll play a key role in designing and building a cutting-edge data platform using technologies like Databricks, Snowflake, and AWS Glue. Key Responsibilities: Build and maintain scalable data pipelines, warehouses, and lakes. Design secure, high-performance data architectures. Develop processing and analysis algorithms for complex data sets. … Python and SQL skills for data processing and analysis. Deep understanding of data governance, quality, and security. Knowledge of market data and its business applications. Desirable Experience: Experience with Databricks and Snowflake. Familiarity with machine learning and data science concepts. Strategic thinking and ability to influence cross-functional teams. This role offers the chance to work across multiple business areas …
improvement in process efficiency. JB Hunt - Increased permitted use cases for cloud analytics by 100% by managing access to 100+ databases while achieving cost savings. • Technology partners include Snowflake, Databricks, AWS, Azure, Google Cloud, and Starburst. Immuta was recognized as the Snowflake Data Security Partner of the Year in June 2023. • Immuta has been recognized by Forbes as a top … one of the best workplaces, and by Fast Company as one of the top 50 most innovative companies. • $267 million in total funding. Lead investors include NightDragon, Snowflake, and Databricks, along with additional funding from ServiceNow, Citi Ventures, Dell Technologies Capital, DFJ Growth, IAG, Intel Capital, March Capital, Okta Ventures, StepStone, Ten Eleven Ventures, and Wipro Ventures. • A hybrid workplace … current best practices. Contribute to internal knowledge bases and provide ad-hoc support to Immuta's internal teams. Maintain competency in Immuta's technical adjacencies, including data platforms (Snowflake, Databricks, Starburst), commercial cloud offerings (AWS, Azure, GCP), IDM systems, and data catalog tools. WHAT YOU'LL BRING U.S. Citizenship with an Active TS/SCI with Polygraph is required due …
edge data and analytics solutions. In this strategic and hands-on role, you'll lead technical teams and architect enterprise-grade solutions using the Microsoft Azure data stack (plus Databricks, Power BI, Synapse, and more). You'll take the lead in designing scalable data platforms, influencing decision-makers at all levels, and ensuring high-quality outcomes across the full … Factory, Synapse, Power BI); Strong SQL and modern data architecture experience; Ability to lead teams and engage with C-level stakeholders; Bonus points for experience with Microsoft Fabric or Databricks. To speak in absolute confidence about this opportunity please contact Jess Ritchie, IT Recruitment Consultant at MCS Group or click the apply button below. If this position is not right …
and develop high-performance big data applications, manage complex data sets, and work across the full development lifecycle. You'll explore big data technologies such as SQL, AWS and Databricks to add business value and collaborate with global clients. The role offers extensive training, structured career progression, and opportunities to work with leading brands in a vibrant, technology-driven environment. … Computer Science or in a directly related IT discipline; Strong knowledge of - and initial experience with - handling big data projects using SQL and Python; Initial experience with AWS and Databricks is strongly desirable; Excellent communication skills and an ability to communicate with impact, ensuring complex information is articulated in a meaningful way to wide and varied audiences; A passion for technology …
teams understand where new technology could pragmatically deliver value. Manage and monitor the cost, efficiency, and speed of data processing. Our Data Tech Stack: Azure Cloud (SQL Server, Databricks, Cosmos DB, Blob Storage); ETL/ELT (Python, Spark, SQL); Messaging (Service Bus, Event Hub); DevOps (Azure DevOps, GitHub Actions, Terraform). Who you are: A driven, ambitious individual who's … in ELT and ETL processes and tools; Ability to write efficient code for data extraction, transformation, and loading (e.g. Python, Spark, and SQL); Proficiency with cloud platforms (particularly Azure Databricks and SQL Server); Ability to work independently; Ability to communicate complex technical concepts clearly to both technical and non-technical audiences. Desirable skills: NoSQL databases (e.g. Cosmos DB, DynamoDB, MongoDB …
and contribute to our current libraries and build robust and scalable libraries. Contribute to and maintain our repositories. Support our integration with cloud-based computing and storage solutions through Databricks and Azure. Occasionally the employee might be part of collaborations across different teams and must report to different project managers. Role Requirements: Advanced programming with Python and related data science libraries … managing projects independently; Master's/PhD in a highly numerate discipline (preferably mathematics and statistics). Desirables: Familiarity with repository-based version control (e.g., git/Azure DevOps); Experience with the Azure Databricks platform. At Kantar we have an integrated way of rewarding our people based around a simple, clear, and consistent set of principles. Our approach helps to ensure we are market competitive …
About Us: LG Ad Solutions is a global leader in connected TV (CTV) and cross-screen advertising. We pride ourselves on delivering state-of-the-art advertising solutions that integrate seamlessly with today's ever-evolving digital media landscape. The …
City of London, London, United Kingdom Hybrid / WFH Options
Kantar Media
As people increasingly move across channels and platforms, Kantar Media’s data and audience measurement, targeting, analytics and advertising intelligence services unlock insights to inform powerful decision-making. Working with panel and first-party data in over 80 countries, we …
Manchester, North West, United Kingdom Hybrid / WFH Options
Fdo Consulting Limited
Senior Data Engineer, Databricks, £60,000 - £65,000 + benefits. Strong performant SQL and Databricks required. Home based with one day a month at the office in Nottingham. Strong commercial knowledge of Databricks and performant SQL is required for this role. Knowledge of testing, the finance sector or energy sector is a distinct advantage. Expanding SaaS product company are looking for a … Data Engineers and BAs to understand the data needs of the business. Skills Required Include: Previous experience as a Data Engineer in a delivery-focused environment. Excellent knowledge of Databricks and performant SQL. Experience analysing complex business problems and designing workable technical solutions. Excellent knowledge of the SDLC, including testing and delivery in an agile environment. Excellent knowledge of ETL … expand its data team. In these roles you will use your technical skills and soft skills/people skills, allowing the data team to further develop. Strong, hands-on Databricks and performant SQL are mandatory for this role. This role is home based with one day a month at their office in Nottingham. Salary is in the range …
PSC/PSA). (Required). Your knowledge of compliance frameworks relevant to cloud data will be invaluable in maintaining a secure and compliant data environment. Snowflake and Databricks (Optional, but highly desired): Leverage your experience with Snowflake and Databricks to enhance our data platform's capabilities and performance. While not mandatory, experience with these technologies is a significant …
Sheffield, South Yorkshire, Yorkshire, United Kingdom Hybrid / WFH Options
Fdo Consulting Limited
Senior Data Engineer, Databricks (Mid-Level roles also available), £50,000 - £65,000 + benefits. SQL Performance Tuning, Databricks. Home based with one day a month at the office in Nottingham. Strong commercial knowledge of Databricks is required for this role. Knowledge of the finance sector or energy sector is an advantage. Expanding SaaS product company are looking for a number of … Data Engineers and BAs to understand the data needs of the business. Skills Required Include: Previous experience as a Data Engineer in a delivery-focused environment. Excellent knowledge of Databricks. Experience analysing complex business problems and designing workable technical solutions. Excellent knowledge of the SDLC, including testing and delivery in an agile environment. Excellent knowledge of SQL (SQL Performance Tuning … expand its data team. In these roles you will use your technical skills and soft skills/people skills, allowing the data team to further develop. Strong, hands-on Databricks skills are mandatory for this role. This role is home based with one day a month at their office in Nottingham. Salary is in the range £50,000 - £65,000 + benefits.
through data. In this role, you’ll work closely with clients to understand their needs, gather requirements, and shape solutions that leverage modern data platforms—particularly Microsoft Azure and Databricks—to accelerate business outcomes. Your work will span the largest global organisations as well as dynamic mid-tier businesses, providing a truly exciting opportunity to influence how data-driven enterprises … outcomes. Facilitate workshops, discovery sessions, and proof-of-concept activities to build alignment and ensure stakeholder engagement. Work closely with technical experts within Arreoblue and with our partners (Microsoft, Databricks) to ensure solutions are both feasible and aligned with client objectives. Champion transparency and collaboration, ensuring that solutions meet real business needs rather than short-term fixes. What We’re … understanding of how data platforms, analytics, and AI/ML capabilities drive business value. Familiarity with Microsoft’s Intelligent Data Platform stack (Azure Data Services, Fabric/Synapse) and Databricks is highly desirable. Business Analysis Expertise: Demonstrable experience in business analysis within complex enterprise environments. Strong capability in requirements elicitation, process modelling, and translating needs into actionable delivery items. Stakeholder …
is a key role within our organisation, combining technical excellence with strong people and strategic leadership. You'll bring significant experience in data engineering and system development, particularly using Databricks, Microsoft Azure, Power Platform, M365, and integrations. You'll lead by example, remaining hands-on with technology while developing and mentoring a small, agile team. We're based in … stakeholders across the business to understand needs and deliver impactful solutions. Ensure best practices in system design, architecture, and performance. What We're Looking For: Deep technical expertise in Databricks, Azure, Power Platform, M365, system integrations and automation. Proven track record of leading and mentoring technical teams, ideally in a small, agile environment. A leader who is equally comfortable …
data solutions using Azure and Databricks. Design medallion-style data architectures (Bronze, Silver, Gold) for structured and unstructured data. Develop robust ETL/ELT pipelines using Azure Data Factory, Databricks, and event-driven frameworks. Model insurance data across domains (policy, claims, quotes, pricing) using 3NF and dimensional schemas. Collaborate with business stakeholders to define and deliver data products aligned with … quality, lineage, and governance using tools like Unity Catalog, Purview, and metadata-driven frameworks. Good experience in data architecture and design. Strong hands-on experience with Azure Data Services, Databricks, and ADF. Proven experience in insurance data domains and product-oriented data design. Excellent communication and stakeholder management skills. Experience with data governance and metadata management tools. ABOUT CAPGEMINI Capgemini …
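As context for the medallion-style (Bronze, Silver, Gold) architecture this listing describes, the sketch below shows one plausible Bronze-to-Silver refinement step in PySpark on Databricks. It is illustrative only: the table and column names (bronze.claims_raw, silver.claims, claim_id, claim_amount) are hypothetical and not taken from the listing.

```python
# Minimal sketch of a Bronze -> Silver step in a medallion architecture.
# Table and column names are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# Bronze: raw, append-only data as landed (e.g. by Azure Data Factory).
bronze_df = spark.read.table("bronze.claims_raw")

# Silver: deduplicated, typed, and filtered records ready for modelling
# into Gold-layer dimensional or 3NF structures downstream.
silver_df = (
    bronze_df
    .dropDuplicates(["claim_id"])
    .withColumn("claim_date", F.to_date("claim_date"))
    .withColumn("claim_amount", F.col("claim_amount").cast("decimal(18,2)"))
    .filter(F.col("claim_id").isNotNull())
)

# Persist as a Delta table so lineage and governance tooling
# (e.g. Unity Catalog) can track it.
silver_df.write.mode("overwrite").format("delta").saveAsTable("silver.claims")
```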
nurturing existing accounts across the UK, with a focus on FSI and GSI. Selling cloud-native services and data/AI transformation solutions across platforms like AWS, Azure, GCP, Snowflake, and Databricks. Building and expanding executive-level relationships within enterprise clients. Collaborating with marketing, business development, and delivery teams to build pipeline and close deals. Acting as a trusted advisor to … level contacts within large enterprises. Experience selling services (not just products) in a consultative, value-based model. Familiarity with modern data stacks and cloud platforms (AWS, Azure, GCP, Snowflake, Databricks). Based in or near London, with flexibility to travel 25-30%. If you're a strategic seller ready to make a big impact in a high-growth environment …