City of London, London, United Kingdom Hybrid/Remote Options
Clarksons Research
permanently 4 years’ experience in data analytics or data science working with complex data. Excellent demonstrable SQL skills, including complex queries and stored procedures. Solid Python and Spark/Databricks skills and an understanding of delta tables. Strong numerical, data-handling and analytical skills in relation to complicated data models. A degree in a relevant numerate or analytical subject. Ability to …
principal capacity. Proven experience designing and governing complex, multi-cloud or hybrid solutions. Deep technical expertise in cloud, data, integration, and security architecture. Strong understanding of enterprise data platforms (Databricks, S3, Redshift), integration patterns (API Gateway, AppFlow, Logic Apps), and cloud-native services. Demonstrable leadership in delivering large-scale transformation or digital programmes. Proficiency in DevOps tooling (Terraform, GitHub, CodePipeline …
Glasgow, Scotland, United Kingdom Hybrid/Remote Options
Bioscript Group
GitLab), including branching, merging, and collaborative code management. Proficiency in programming languages such as Python and R, experience with relevant libraries and frameworks, and data pipeline tools like Fabric, Databricks or Snowflake. Familiarity with biological and clinical data types and their challenges. Excellent leadership, communication, and interpersonal skills. Our people are at the heart of our business. We are focused …
Databricks Engineer - SC Cleared Fully Remote 6-month initial contract Start Date: 17th November 2025 £550 per day Inside IR35 We are seeking two SC Cleared Databricks Analysts to support a high-profile data engineering and analytics programme with one of our GovTech Consultancy clients. The successful candidates will play a key role in designing, developing, and optimising data pipelines … and analytics solutions using Databricks within a secure environment. You will work with large and complex datasets from UK and potentially international sources, enabling advanced analytics and data-driven insights to inform strategic decisions. This role is highly technical and data-focused, requiring strong experience in Databricks, Spark, and related data engineering tools. While stakeholder engagement is minimal, you will … need to align technical solutions with business objectives and ensure data integrity and compliance throughout. Key Requirements: Active SC Clearance (used within the last 12 months). Proven experience with Databricks (including notebooks, clusters, and job orchestration). Strong knowledge of Apache Spark, PySpark, and distributed data processing. Experience building and optimising ETL pipelines and data workflows. Familiarity with Delta Lake, SQL …
experience Essential: Proven experience in data engineering, data integration and data modelling. Expertise with cloud platforms (e.g. AWS, Azure, GCP). Expertise with modern cloud data platforms (e.g. Microsoft Fabric, Databricks). Expertise with multiple data analytics tools (e.g. Power BI). Deep understanding of data warehousing concepts, ETL/ELT pipelines and dimensional modelling. Proficiency in advanced programming languages (Python/… and adaptability to client needs. Experience defining and enforcing data engineering standards, patterns, and reusable frameworks. Professional certifications in relevant technologies (e.g. Microsoft Azure Data Engineer, AWS Data Analytics, Databricks Certified Professional Data Engineer). Skills: Data Development Process. Design, build and test data products that are complex or large scale. Build and lead teams to complete data integration services …
Bournemouth, Dorset, South West, United Kingdom Hybrid/Remote Options
Sanderson Recruitment
Lead Databricks Engineer - Single Customer View Location: Bournemouth (Hybrid) Contract: 12-month FTC Salary: £85,000 + Benefits Lead Databricks Engineer: About the Role Join our Single Customer View (SCV) Programme, a strategic initiative within Financial Services aimed at delivering a unified, trusted view of customer data. We're seeking a highly skilled Lead Databricks Engineer to design and implement … scalable data pipelines that form the backbone of our Lakehouse platform, enabling accurate analytics, reporting, and regulatory compliance. You'll work with cutting-edge technologies including Databricks, PySpark, and Azure Data Factory, applying best practices in data engineering and governance to support this critical programme. Lead Databricks Engineer: Key Responsibilities Build and maintain Databricks pipelines (batch and incremental) using PySpark … business teams, and engineers to deliver consistent, well-documented datasets. Support deployments and automation via Azure DevOps CI/CD. Gather and refine requirements from business stakeholders. Lead Databricks Engineer: About You Strong PySpark development skills for large-scale data engineering. Proven experience with Databricks pipelines and workflow management. Expertise in Azure Data Factory orchestration. Solid knowledge of Delta …
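The batch-and-incremental pipeline pattern this role centres on typically reduces to an upsert: each new batch of records is merged into an existing table by key. A minimal sketch of that merge logic in plain Python — real Databricks pipelines would use Delta Lake's MERGE INTO; the field names and sample records here are invented for illustration:

```python
def merge_incremental(target, updates, key="customer_id"):
    """Upsert a batch of records into an existing table.

    Mirrors the semantics of Delta Lake's MERGE INTO:
    WHEN MATCHED THEN UPDATE / WHEN NOT MATCHED THEN INSERT.
    target and updates are lists of dicts; the key column is configurable.
    """
    merged = {row[key]: row for row in target}  # index existing rows by key
    for row in updates:
        # update the matched row's fields, or insert a brand-new row
        merged[row[key]] = {**merged.get(row[key], {}), **row}
    return list(merged.values())

# Invented single-customer-view style records
existing = [{"customer_id": 1, "email": "a@x.com"},
            {"customer_id": 2, "email": "b@x.com"}]
batch = [{"customer_id": 2, "email": "b@new.com"},  # changed record
         {"customer_id": 3, "email": "c@x.com"}]    # new record

result = merge_incremental(existing, batch)
```

An incremental run only ships `batch` rather than re-reading the full source, which is what makes the pattern cheaper than full-refresh batch loads.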
Role: Databricks Developer Location: Houston, TX - prefers candidates in Houston Onsite 4 days a week Visa: anything that can work W2 without sponsorship Overview: Client is building a team to conduct a full-scale Databricks implementation with Priority Power. 3 MDC resources alongside US Solutions Architect: 1 Business/Data Analyst; 2 Data Engineers. Solutions Architect will be responsible for … setting up the Databricks environment, starting with the AWS infra/hosting, Databricks workspace (alongside DBX SA), and data migration solution. Engineers will be responsible for building all of the "plumbing": data ingestion, ETL, and modeling. We are looking for a Senior Engineer with strong Databricks experience and hands-on skills in Python, PySpark, and SQL. They will be responsible … for building data ingestion processes/ETL pipelines to ingest data from various sources into Databricks (bronze layer). They should have experience in data modeling, transformation and curation of data through a medallion architecture. Needs cloud experience, AWS preferred, to migrate workflows from AWS to Databricks. The Engineers will be more responsible for getting the data into Databricks, data curation …
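The medallion flow this posting describes — land raw data as-is in a bronze layer, then curate it through silver and gold — can be sketched without any Databricks-specific APIs. The bronze/silver/gold layer names follow the standard convention; the record fields (meter readings) and source name are invented for illustration:

```python
from datetime import datetime, timezone

def to_bronze(raw_records, source):
    """Bronze: land raw records unchanged, tagged with ingestion metadata."""
    ts = datetime.now(timezone.utc).isoformat()
    return [{**r, "_source": source, "_ingested_at": ts} for r in raw_records]

def to_silver(bronze):
    """Silver: drop malformed rows and deduplicate on a business key."""
    seen, silver = set(), []
    for r in bronze:
        if r.get("meter_id") is None or r.get("kwh") is None:
            continue  # discard malformed readings
        if r["meter_id"] in seen:
            continue  # keep the first occurrence per meter
        seen.add(r["meter_id"])
        silver.append(r)
    return silver

def to_gold(silver):
    """Gold: business-level aggregate ready for reporting."""
    return {"total_kwh": sum(r["kwh"] for r in silver)}

raw = [{"meter_id": "M1", "kwh": 10.0},
       {"meter_id": "M1", "kwh": 10.0},  # duplicate reading
       {"meter_id": None, "kwh": 5.0},   # malformed reading
       {"meter_id": "M2", "kwh": 7.5}]
gold = to_gold(to_silver(to_bronze(raw, source="scada_feed")))
```

In an actual Databricks build each layer would be a Delta table written by a PySpark job rather than an in-memory list, but the responsibility split per layer is the same.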
Databricks Data Architect x2 UK Wide Hybrid Working £80,000-£90,000 As a Solution Architect with an Azure and Databricks focus, you will be an integral part of our team dedicated to building scalable and secure data platforms. You will leverage your expertise in Databricks, Apache Spark, and Azure to design, develop, and implement data warehouses, data lakehouses, and AI …/ML models that fuel our data-driven operations. Duties Design and build high-performance data platforms: Utilize Databricks and Apache Spark to extract, transform, and load data into Azure Data Lake Storage and other Azure services. Design and oversee the delivery of secure data warehouses and data lakehouses: Implement data models, data quality checks, and governance practices to ensure … reliable and accurate data. Design, build and deploy AI/ML models: Integrate machine learning into data pipelines, leveraging Databricks ML and Azure ML to develop predictive models and drive business insights. Monitor and optimize data pipelines and infrastructure: Analyze performance metrics, identify bottlenecks, and implement optimizations for efficiency and scalability. Collaborate with cross-functional teams: Work …
three core responsibilities: 1️⃣ Data Architecture & Strategy – shaping the Azure-based data platform and overall data strategy. 2️⃣ Technical Delivery – building scalable, efficient data solutions and ETL pipelines using Azure Data Factory, Databricks, Synapse, SQL, Python, Terraform and modern CI/CD practices. 3️⃣ Team Leadership – mentoring and developing a talented data team while promoting best practice and collaboration. This …
real-world problems Proven experience managing production data pipelines Understanding of predictive modelling, machine-learning, clustering and classification techniques Fluency in Python and SQL Nice to have: Experience using Databricks Experience using Microsoft Azure Experience with RabbitMQ and Docker Experience using dbt …
London, South East, England, United Kingdom Hybrid/Remote Options
Executive Facilities
Experience domains. Proficiency in SQL for data extraction, transformation, and pipeline development. Experience with dashboarding and visualization tools (Tableau, Qlik, or similar). Familiarity with big data tools (Snowflake, Databricks, Spark) and ETL processes. Useful experience: Python or R for advanced analytics, automation, or experimentation support. Knowledge of statistical methods and experimentation (A/B testing) preferred. Machine learning and …
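The A/B testing skill mentioned above usually comes down to comparing conversion rates between two variants. A minimal two-proportion z-test in pure Python — the traffic and conversion figures are invented, and production work would normally use a stats library rather than hand-rolling this:

```python
from math import sqrt

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """z statistic for H0: variants A and B convert at the same rate.

    Uses the pooled conversion rate to estimate the standard error,
    the standard form of the two-proportion z-test.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Invented example: B converts 12% vs A's 10%, 5,000 users per arm
z = two_proportion_z(conv_a=500, n_a=5000, conv_b=600, n_b=5000)
significant = abs(z) > 1.96  # 5% two-sided significance threshold
```

With these sample numbers the lift clears the 1.96 threshold, so the experiment would be read as significant at the 5% level.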
South West, England, United Kingdom Hybrid/Remote Options
Harnham - Data & Analytics Recruitment
to enhance reporting, analytics, and business intelligence solutions. What You Bring: 5+ years of experience in SQL production environments and data engineering. Strong expertise in SQL, T-SQL, Azure, Databricks, and SSIS. Experience with data integration, modeling, and performance optimization. Problem-solving mindset and a collaborative approach. Desirable: Experience handling data from APIs and secure protocols. Familiarity with visualization tools …
in implementing end-to-end ML pipelines (data, training, validation, serving) Experience with ML workflow orchestration tools (e.g., Airflow, Prefect, Kubeflow) and ML feature or data platforms (e.g., Tecton, Databricks) Experience with cloud platforms (AWS, GCP/Vertex, Azure), Docker, and Kubernetes Solid coding practices (Git, automated testing, CI/CD). Proficiency with Linux Familiarity with time-series …
ETL, ELT operations and administration using modern tools, programming languages and systems, securely and in accordance with enterprise data standards. Required Proficiency: Advanced - SQL, Snowflake, Tamr, Python, GitHub. Intermediate - Databricks, Airflow. Skills and Qualifications: We are looking for someone who has technical depth and broad tool experience related to data engineering and has the following required qualifications: Minimum bachelor's …
Hands-on experience implementing end-to-end ML pipelines (data ingestion, training, validation, serving). Familiarity with ML workflow orchestration tools (Airflow, Prefect, Kubeflow) and feature/data platforms (Databricks, Tecton, etc.). Strong experience with cloud platforms (AWS, GCP, or Azure), Docker, and Kubernetes. Solid coding practices, including Git, automated testing, and CI/CD. Proficiency with Linux environments. …
technologies and their application in cloud environments (Azure would be ideal). Proficiency in ETL/ELT processes, data integration, and engineering tools. Hands-on experience with Python, Airflow, Snowflake, Databricks, and Spark. The values and ethos of our business: Innovation with real purpose and for real results. Support one another - pull together and be helpful. We are working hard but …
routines, and monitoring processes to maintain data integrity and reliability. * Optimise data workflows for performance, cost-efficiency, and maintainability using tools such as Azure Data Factory, AWS Data Pipeline, Databricks, or Apache Spark. * Integrate and prepare data for Tableau dashboards and reports, ensuring optimal performance and alignment with business needs. * Collaborate with visualisation teams to develop, maintain, and enhance …
City of London, London, United Kingdom Hybrid/Remote Options
Areti Group | B Corp™
critical projects across the public sector, defence, and government organisations, delivering real-world innovation powered by data and technology. 🔧 Tech Stack & Skills We're Looking For: Palantir Azure Databricks Microsoft Azure Python Docker & Kubernetes Linux Apache Tools Data Pipelines IoT (Internet of Things) Scrum/Agile Methodologies ✅ Ideal Candidate: Already DV Cleared or at least SC Strong communication skills …
data segmentation, handoffs and statistical reporting. Collaborate closely with Data Science, Business and Data Modeling teams to perform feature engineering, enhancing model accuracy and relevance to business needs. Develop Databricks notebooks to deliver analytical insights and support strategic business decisions, ensuring transparency and reproducibility of analysis. Demonstrate expertise in SQL, Spark, PySpark, Python, metadata management, and translate complex analytics into …
Nottingham, England, United Kingdom Hybrid/Remote Options
Understanding Recruitment
Machine Learning, or a related field (Master’s or PhD preferred) Proven experience developing and deploying ML models in real-world applications Solid coding and data skills (Python, SQL, Databricks, cloud platforms) Deep understanding of the full ML lifecycle and productionisation of models Passion for applying AI to solve real-world problems Why Join: Work with a modern AI/…