UK and Europe, building strong relationships and learning from experts in the field. Develop a solid understanding of clean architecture principles, optimisation techniques, and engineering best practices within a Databricks environment. Partner with teams across the business to ensure smooth data flows and deliver reliable, well-structured solutions end-to-end. What we're looking for: Python skills (must-have … you advance in your career with us. What success would look like: Building Reliable Data Pipelines: Consistently delivering well-tested and robust data pipelines using Python and PySpark on Databricks, adhering to established coding standards and software engineering best practices. Growing Technical Proficiency: Rapidly developing your skills in our core technologies (Python, PySpark, Databricks, SQL, Git, GitHub Actions, Terraform) and …
City of London, London, United Kingdom - Hybrid / WFH Options
Omnis Partners
to support experimentation and deployment. 🛠️ Key Responsibilities Build and maintain high-performance data pipelines to power AI/ML use cases Architect cloud-native data platforms using tools like Databricks, Airflow, Snowflake, and Spark Collaborate with AI/ML teams to align data processing with model requirements Develop ETL/ELT workflows to support feature engineering, model training, and inference … Scala or Java Experience supporting AI/ML workflows and working with Data Scientists Exposure to cloud platforms: AWS, Azure, or GCP Hands-on with modern data tooling: Spark, Databricks, Snowflake, Airflow Solid grasp of data modelling, orchestration, and infrastructure-as-code (Terraform, Docker, CI/CD) Excellent communication and client-facing skills - comfortable leading on technical delivery 🎁 What’s …
tolerance for production-grade robustness. Enable LLM-driven workflows by shaping data to be AI-consumable (e.g. chunking, embeddings, metadata). Reduce tech debt and simplify orchestration across Flyte, Databricks, and Azure-based infrastructure. Example Projects: Design and optimise distributed data pipelines to handle large-scale video and image data processing. Re-design and optimise existing analytics pipelines. Collaborate with … the data platform team to integrate pipelines with Databricks for governance and compliance - and unlock massive scale for offline evaluation from third party datasets. Shape evaluation data to support future use cases like Retrieval-Augmented Generation (RAG) and natural language analytics. What we are looking for in our candidate Essential Proficiency in Python and SQL, with experience in frameworks like … stakeholders to understand requirements and shape data pipelines to meet user needs effectively. Desirable 5+ years of experience in a data engineering or similar role Experience with Docker, Kubernetes, Databricks Familiarity with shaping data for AI/LLM-based systems This is a full-time role based in our office in London. At Wayve we want the best of all …
London, South East, England, United Kingdom - Hybrid / WFH Options
WüNDER TALENT
Senior Data Engineer | AWS/Databricks/PySpark | London/Glasgow (Hybrid) | August Start Role: Senior Data Engineer Location: This is a hybrid engagement with 2 days/week onsite, either in Central London or Glasgow. Start Date: Must be able to start mid-August. Salary: £80k-£90k (Senior) | £90k-£95k (Lead) About The Role Our partner is looking … thinking environment. You'll be involved in designing and building production-grade ETL pipelines, driving DevOps practices across data systems and contributing to high-availability architectures using tools like Databricks, Spark and Airflow - all within a modern AWS ecosystem. Responsibilities Architect and build scalable, secure data pipelines using AWS, Databricks and PySpark. Design and implement robust ETL/ELT solutions … reviews and solution design. Requirements Proven experience as a Data Engineer in cloud-first environments. Strong commercial knowledge of AWS services (e.g. S3, Glue, Redshift). Advanced PySpark and Databricks experience (Delta Lake, Unity Catalog, Databricks Jobs, etc.). Proficient in SQL (T-SQL/SparkSQL) and Python for data transformation and scripting. Hands-on experience with workflow orchestration tools …
collaborate with cross-functional teams to manage complex ETL processes, implement best practices in code management, and ensure seamless data flow across platforms. Projects may include connecting SharePoint to Databricks, optimising Spark jobs, and managing GitHub-based code promotion workflows. This is a hybrid role based in London, with 1-2 days per week in the office. What You'll … Succeed You'll bring 5+ years of data engineering experience, with expert-level skills in Python and/or Scala, SQL, and Apache Spark. You're highly proficient with Databricks and Databricks Asset Bundles, and have a strong understanding of data transformation best practices. Experience with GitHub, DBT, and handling large structured and unstructured datasets is essential. You're detail …
Shape and enforce data quality, lineage, and contracts during platform migration. Partner with engineers and architects to build scalable, validated pipelines. Integrate tooling across Alation, Azure Data Services, Databricks, and Great Expectations. Align with Data Governance to support data domain stewards during migration. 💼 You Bring: Proven track record delivering data platform and migration initiatives as a Product Manager.
insurance and is investing heavily in modernising its pricing approach across multiple product lines. You'll play a key role in: Building and testing predictive models using Python and Databricks Exploring and integrating external data enrichments Contributing to model validation and peer code reviews Supporting pricing decisions that directly impact business growth and profitability Ideal candidate: 2-8 years of …
in a regulated environment e.g. FCA, PRA and the wider regulatory and compliance environment for global insurance businesses (e.g. GDPR) Knowledge of Business Intelligence and Data platform technologies (e.g. Microsoft Fabric, Databricks, etc.). Good knowledge of technology governance standards Good knowledge of technology best practice, including SDLC Working knowledge of Enterprise and Solution Architecture frameworks and methodologies (e.g. TOGAF …
to transform how companies support their employees, and this role will challenge you to continually learn how we provide value to our customers. Those customers speak for themselves: Toyota, Databricks, Unilever, Marriott, and Snowflake, just to name a few. But before they became customers - and before they began saving more than a million dollars each year with Moveworks - they were …
from Google and McKinsey, and is headquartered in Silicon Valley. We are backed by Index Ventures, Theory Ventures, MMC and a range of executives from leading tech companies including Databricks, ClickHouse, and MongoDB, among others. We're based across San Francisco, London, Tel Aviv and Budapest - small team, friendly vibe, high expectations. What we're looking for: Mindset & attitude (most …
at least $6M+ in new bookings. Experience co-owning and driving proposals and SOWs of $1M+ (help from Ness is a bonus). Experience selling alliance partnership services (Salesforce, Databricks, AWS, Confluent). Willing to travel to France monthly and as needed. Willing to commit at least 3 years to the role, with a view to greater responsibilities at Ness.
City of London, London, United Kingdom - Hybrid / WFH Options
Kubrick Group
We find incredible minds from all backgrounds to train with us, creating a diverse team of experts. We’re the preferred partner of today’s leading technology providers, including Databricks, Snowflake, and Collibra, to accelerate delivery and co-create revolutionary solutions. We transform the lives of our consultants, clients, and their customers through data and AI. The Role: This is …
business needs our type of product, you'll work with a variety of new clients and industries as Zip scales. Current clients include OpenAI, Coinbase, Snowflake, Notion, Canva, Samsara, Databricks and many more! Your Role We're looking for a Senior Solutions Consultant - Managed Services with experience in NetSuite, Sage, Coupa or QuickBooks implementation services to lead post-implementation services …
business needs our type of product, you'll work with a variety of new clients and industries as Zip scales. Current clients include OpenAI, Coinbase, Snowflake, Notion, Canva, Samsara, Databricks, etc. You Will Lead onboarding for new customers, with a heavy emphasis on understanding requirements and creatively configuring the product to solve their problems. Responsible for leading the end-to …
London, South East, England, United Kingdom - Hybrid / WFH Options
Randstad Technologies
focused on designing modern, scalable, and secure data platforms for enterprise clients. You'll play a key role in shaping data architecture across the full Azure stack - including Azure Databricks and Azure Data Factory (ADF) - and will guide engineering teams in delivering robust, future-proof solutions using lakehouse and medallion architecture principles. Key Responsibilities Design end-to-end data … architectures using Azure services, including Azure Databricks, ADF, Synapse Analytics, and Data Lake Storage Define scalable data models and implement architectural patterns such as lakehouse and medallion Lead technical solution design during client engagements, from discovery to delivery Establish and enforce data governance, modelling, and lifecycle standards Support engineering and DevOps teams with guidance on best practices, CI/CD … and infrastructure-as-code Requirements 7+ years in data architecture or senior engineering roles Strong hands-on experience with Azure Databricks and Azure Data Factory Proficient in SQL, Python, and Spark Expertise in data modelling and architectural patterns for analytics (e.g., lakehouse, medallion, dimensional modelling) Solid understanding of cloud security, private networking, GDPR, and PII compliance Excellent communication skills with …
Summary: Join a team building a modern Azure-based data platform. This hands-on engineering role involves designing and developing scalable, automated data pipelines using tools like Data Factory, Databricks, Synapse, and Logic Apps. You'll work across the full data lifecycle - from ingestion to transformation and delivery - enabling smarter, faster insights. Key Responsibilities: * Develop and maintain data pipelines using … Collaborate with cross-functional teams in an agile environment. Collaboration With: * Data Engineers, Architects, Product Owners, Test Analysts, and BI Teams. Skills & Experience: * Proficiency in Azure tools (Data Factory, Databricks, Synapse, etc.). * Strong SQL and experience with data warehousing (Kimball methodology). * Programming skills in Python, Scala, or PySpark. * Familiarity with Power BI, SharePoint, and data integration technologies. * Understanding …
London, South East, England, United Kingdom - Hybrid / WFH Options
Involved Solutions
London. Key Responsibilities - Azure Data Engineer: Design, build and maintain scalable and secure data pipelines on the Azure platform. Develop and deploy data ingestion processes using Azure Data Factory, Databricks (PySpark), and Azure Synapse Analytics. Optimise ETL/ELT processes to improve performance, reliability and efficiency. Integrate multiple data sources including Azure Data Lake (Gen2), SQL-based systems and APIs. … incl. GDPR and ISO standards). Required Skills & Experience - Azure Data Engineer: Proven commercial experience as a Data Engineer delivering enterprise-scale solutions in Azure: Azure Data Factory, Azure Databricks (PySpark), Azure Synapse Analytics, Azure Data Lake Storage (Gen2), SQL & Python. Understanding of CI/CD in a data environment, ideally with tools like Azure DevOps. Experience working within consultancy …
investment, real urgency, and real interest in doing this properly - not endless meetings and PowerPoints. What you'll be doing: Designing, building, and optimising Azure-based data pipelines using Databricks, PySpark, ADF, and Delta Lake Implementing a medallion architecture - from raw to curated Collaborating with analysts to make data business-ready Applying CI/CD and DevOps best practices (Git … code Exploring real-time logistics datasets What they're looking for: A strong communicator - someone who can build relationships and help connect silos Experience building pipelines in Azure using Databricks, ADF, and PySpark Strong SQL and Python skills Bonus points if you've worked with Power BI, Azure Purview, or streaming tools You're versatile - happy to support analysts and …
around the world, we've built partnerships with more than 1,000 clients during our 30 years of history. Artificial Intelligence is our reality. Principal Architect (Data focused on Databricks), UK General Description We are looking for a Senior Architect with strong Data experience to be part of our team working for EMEA clients. In this position, you will lead … software solutions, working closely with teams in the UK, EU and other countries in Latin America. We are looking for someone who has: At least 3 years' experience with the Databricks data platform (setting up workspaces, Delta Lake, ingestion and transformation workflows, data catalogs, …) Solid experience in architecture design and technical leadership Solid experience designing, developing and deploying pipelines, data lakes …