London (City of London), South East England, United Kingdom Hybrid / WFH Options
CipherTek Recruitment
very large datasets. You will be responsible for building out a greenfield standardised framework for capital markets. The core platform is built on Azure Databricks Lakehouse, consolidating data from various front- and middle-office systems to support BI, MI, and advanced AI/ML analytics. As a lead, you will … risk, etc.). Essential Requirements: 2+ years of experience in MLOps and at least 3 years in AI/ML engineering. Knowledge of Azure Databricks and associated services. Proficiency with ML frameworks and libraries in Python. Proven experience deploying and maintaining LLM services and solutions. Expertise in Azure DevOps and … GitHub Actions. Familiarity with the Databricks CLI and Databricks Asset Bundles. Strong programming skills in Python and SQL; familiarity with Scala is a plus. Solid understanding of AI/ML algorithms, model training, evaluation (including hyperparameter tuning), deployment, monitoring, and governance. Experience in handling large datasets and performing data preparation and More ❯
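For context on the training, evaluation, and hyperparameter-tuning workflow this listing references, here is a minimal sketch in Python using scikit-learn. The dataset, model choice, and parameter grid are illustrative assumptions, not details from the posting.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.metrics import classification_report

# Toy dataset standing in for features derived from front/middle-office data
X, y = make_classification(n_samples=2000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Hyperparameter tuning via cross-validated grid search
param_grid = {"n_estimators": [100, 200], "max_depth": [2, 3], "learning_rate": [0.05, 0.1]}
search = GridSearchCV(GradientBoostingClassifier(random_state=42), param_grid, cv=3, scoring="roc_auc")
search.fit(X_train, y_train)

# Evaluation on held-out data before any deployment step
print("Best params:", search.best_params_)
print(classification_report(y_test, search.best_estimator_.predict(X_test)))
```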
Fareham, England, United Kingdom Hybrid / WFH Options
Careerwise
billing inconsistencies Conduct deep dive data analysis to uncover root causes Collaborate with stakeholders to improve reconciliation and reporting logic Support data migration to Databricks and optimize current reporting tools Document leakage issues and contribute to long-term financial improvement strategies ✅ Must-Have Skills: Proven experience in SQL and data More ❯
optimisation. Proficiency in Python, SQL, and Git, with hands-on experience in tools like Jupyter notebooks, Pandas, and PyTorch. Expertise in cloud platforms (AWS, Databricks, Snowflake) and containerisation tools (Docker, Kubernetes). Strong leadership skills with experience mentoring and managing data science teams. Deep knowledge of media measurement techniques, such More ❯
Self-starter with a proactive, team-player mindset. Nice to have: Azure services (DevOps, Test Plans); Python scripting for data validation; data-platform testing (Databricks experience highly advantageous); familiarity with non-functional testing (performance, security More ❯
Apache Spark or AWS Glue. Cloud-native storage: Iceberg, RDS, Redshift, Kafka. IaC: Terraform, Ansible. CI/CD: Jenkins, GitLab. Other platforms such as Databricks or Snowflake will be considered. You will have a fantastic opportunity to lead the Data Platform Engineering division whether you decide to take your career More ❯
teams Hands-on experience with MLOps tools such as MLflow, DVC, Kubeflow, Docker/Kubernetes, and GitOps practices Strong working knowledge of Azure and Databricks services Proficient with observability and monitoring tools (e.g. Prometheus, Grafana, Datadog) Curious and commercially minded — focused on delivering scalable, valuable solutions Familiarity with additional cloud More ❯
experience as a Platform Technical Lead, Data Architect, or similar role , leading enterprise data platforms at scale. Strong technical expertise in Azure, AWS, Snowflake, Databricks, Data Lakes, and Kafka . Deep understanding of modern data platform architectures , including Lakehouse, Kappa, and Lambda . Experience designing and managing secure, scalable, and More ❯
London, South East England, United Kingdom Hybrid / WFH Options
Loadsure
about emerging technology and Insurtech. Desirable: experience with dbt; experience with version control software; experience within a modern cloud data warehouse, e.g. BigQuery, Snowflake, Databricks; experience working on data within insurance and/or a B2B company. About Us: We’ve combined groundbreaking AI and industry expertise to create a More ❯
Motivated to expand technical skills. Desirable Experience with Microsoft BI Tools such as Power BI, SSRS & SQL Server. Experience with Azure Data Factory and Databricks (Python). Experience working with DevOps, IaC & CI/CD pipelines (e.g. Terraform and Databricks Asset Bundles). More ❯
London, South East England, United Kingdom Hybrid / WFH Options
DATAHEAD
Senior Data Engineer Location: London | Hybrid Tech Stack: Azure | Synapse | Databricks | Power BI Cloud Focus: Cloud-native + hybrid integration Salary: £80,000 - £100,000 + 20% bonus We’re hiring a Senior Data Engineer on behalf of a global financial services firm undergoing large-scale digital and cloud transformation. … practices. Support and mentor engineers, and contribute to code reviews and standards. What You Bring: Strong hands-on experience with Azure Data Factory, Synapse, Databricks, and Power BI. Solid SQL and DAX skills; basic Python for Spark environments is a plus. Experience building pipelines in a lakehouse architecture with More ❯
platforms to support business insights, analytics, and other data-driven initiatives. Job Specification (Technical Skills): Cloud Platforms: Expert-level proficiency in Azure (Data Factory, Databricks, Spark, SQL Database, DevOps/Git, Data Lake, Delta Lake, Power BI), with working knowledge of Azure WebApp and Networking. Conceptual understanding of Azure AI … collaborate with business segments (MW Snacking, Petcare, Royal Canin) to identify data platform requirements and develop tailored solutions, working with vendors such as Microsoft, Databricks, and Snowflake. Develop and implement strategies for activating new platforms, frameworks, and technologies within Mars, deploying them to segments and corporate teams within MGS. Ensure More ❯
Engineer will actively contribute throughout the Agile development lifecycle, participating in planning, refinement, and review ceremonies. Key Responsibilities: Develop and maintain ETL pipelines in Databricks, leveraging Apache Spark and Delta Lake. Design, implement, and optimize data transformations and treatments for structured and unstructured data. Work with Hive Metastore and … mechanisms for maintaining stateful processing in Spark Structured Streaming. Handle DataFrames efficiently for large-scale data processing and analytics. Schedule, monitor, and troubleshoot Databricks pipelines for automated workflow execution. Enable pause/resume functionality in pipelines based on responses from external API calls. Ensure scalability, reliability, and performance optimization … including technical impact assessments and rationales. Work within GitLab repository structures and adhere to project-specific processes. Required Skills and Experience: Strong expertise in Databricks, Apache Spark, and Delta Lake. Experience with Hive Metastore and Unity Catalog for data governance. Proficiency in Python, SQL, Scala, or other relevant languages. More ❯
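To make the stateful-streaming responsibility above concrete, here is a minimal PySpark sketch of a Delta-to-Delta Structured Streaming aggregation with checkpointed state. The table paths, column names, and trigger settings are hypothetical, not taken from the role description.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# On Databricks a SparkSession already exists; getOrCreate keeps the sketch self-contained
spark = SparkSession.builder.appName("trade-stream").getOrCreate()

# Stream from a bronze Delta table (path is a placeholder)
trades = spark.readStream.format("delta").load("/mnt/bronze/trades")

# Stateful windowed aggregation; the watermark bounds how much state Spark retains
agg = (
    trades.withWatermark("event_time", "10 minutes")
    .groupBy(F.window("event_time", "5 minutes"), "instrument_id")
    .agg(F.count("*").alias("trade_count"), F.sum("notional").alias("total_notional"))
)

# Write to a silver Delta table; the checkpoint location persists streaming state,
# so a paused job can resume where it left off when rescheduled
query = (
    agg.writeStream.format("delta")
    .outputMode("append")
    .option("checkpointLocation", "/mnt/checkpoints/trade_counts")
    .trigger(availableNow=True)
    .start("/mnt/silver/trade_counts")
)
query.awaitTermination()
```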
Set up a test automation framework from scratch; ability to work independently. Nice to have: Azure experience, Python, data platform testing experience; experience with Databricks would be a big advantage. Working in an Agile team, in a fast-paced environment
days/month, reimbursed) Industry: Motor Fleet Insurance Experience: 15–20 years Core Skills Required: Strong data architecture expertise AWS (Aurora, S3, Lambda) Snowflake, Databricks, Reltio Reporting tools (Power BI, Tableau, or similar) Data mapping, lineage, migration, ETL, and modeling API-based data integration Nice to Have: Data governance frameworks More ❯
London (City of London), South East England, United Kingdom
Wilson Brown
scalable data capabilities; designing ingestion pipelines, high-performance APIs, and real-time data processing systems. Key Responsibilities Stack: Python, PySpark, Linux, PostgreSQL, SQL Server, Databricks, Azure, AWS Design and implement large-scale data ingestion, transformation, and analysis solutions Model datasets and develop indicators to improve data quality Collaborate on Infrastructure More ❯
Romford, South East England, United Kingdom Hybrid / WFH Options
Adecco
focus is on developing a portfolio of business intelligence services and products that support their customers’ needs. The ideal candidate will have experience with Databricks, as well as reading and writing in Python. Duties of the role: You will be responsible for the division's business intelligence domain (i.e. ingestion More ❯
Management, Storage Accounts, Virtual Machines, and core networking components Business automation using Power Automate Data integration and transformation tools including Azure Data Factory, Azure Databricks and SQL server AI tools such as Copilot Studio and AI cognitive services Working knowledge of commonly used programming languages and frameworks relevant to integration More ❯
Stevenage, Hertfordshire, South East, United Kingdom Hybrid / WFH Options
Queen Square Recruitment Limited
installation on Linux systems. Experience with container technologies (Kubernetes, Docker). Proficiency in scripting and automation using Python and R. Familiarity with platforms like Databricks and Domino. Strong problem-solving, communication, and stakeholder management skills. Desirable Qualities: Collaborative team player with the ability to work independently. Confident communicator, capable of More ❯
product engineering, data marketplace architecture, data developer portals, platform engineering. Experience co-selling partner solutions with hyperscalers or platforms (e.g. AWS, Azure, GCP, Snowflake, Databricks). Outstanding communication skills - able to translate complex ideas for both technical and business audiences. Demonstrated thought leadership in AI/ML such as speaking More ❯
on experience with LLMs, vector databases, RAG pipelines, prompt engineering, and model fine-tuning Familiarity with cloud platforms (especially Azure), and tools such as Databricks, Hugging Face, LangChain, and open-source GenAI frameworks Deep curiosity and a passion for staying ahead of AI and data science developments Commercial acumen and More ❯
London, South East England, United Kingdom Hybrid / WFH Options
Careerwise
Data Scientist with Databricks Experience. Salary: up to £90K base + bonus + benefits. Location: work from home, but accessible to travel to London when needed. Our client is an international company that requires a Senior Data Scientist with experience in Azure Databricks, Knowledge Graph, Neo4j Graph Database, and RAG … pipelines for LLM to join the team. Job Description: Responsibilities: Develop and implement data models and algorithms to solve complex business problems. Utilize Databricks to manage and analyse large datasets efficiently. Collaborate with cross-functional teams to understand business requirements and deliver data-driven insights. Design and build scalable data … industry trends and best practices in data science and big data technologies. Requirements: Proven experience as a Data Scientist or similar role. Proficiency with Databricks and its ecosystem. Strong programming skills in Python, R, or Scala. Experience with big data technologies such as Apache Spark, Databricks. Knowledge of SQL and More ❯
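As a rough illustration of how a graph database can feed a RAG pipeline, here is a minimal Python sketch using the official neo4j driver. The connection details, node labels, and prompt format are assumptions for illustration, and the LLM call itself is left out.

```python
from neo4j import GraphDatabase

# Placeholder connection details
driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

def retrieve_facts(entity: str, limit: int = 5) -> list[str]:
    """Pull relationships for an entity from the knowledge graph (schema is hypothetical)."""
    cypher = (
        "MATCH (e:Entity {name: $name})-[r]->(t:Entity) "
        "RETURN type(r) AS rel, t.name AS target LIMIT $limit"
    )
    with driver.session() as session:
        return [f"{entity} {rec['rel']} {rec['target']}"
                for rec in session.run(cypher, name=entity, limit=limit)]

def build_prompt(question: str, entity: str) -> str:
    """Augment the question with graph-derived context before sending it to an LLM."""
    context = "\n".join(retrieve_facts(entity))
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"

print(build_prompt("Which systems feed the reporting layer?", "Databricks"))
```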
technical roadblocks, and create measurable value. This role requires deep familiarity with modern data ecosystems. You’ll leverage cloud data warehouses such as Snowflake, Databricks, and others to architect robust, scalable solutions. You’ll help customers harness the power of AI and large language models (LLMs) to build intelligent agents … enhanced insights and automation Working knowledge of database technologies (relational, columnar, NoSQL), e.g., MySQL, Oracle, MongoDB Experience with modern cloud data warehouses (e.g., Snowflake, Databricks, BigQuery) Excellent organisational and multitasking skills across multiple sales cycles Agile and adaptable to evolving customer needs and priorities Creative problem-solver with a strategic More ❯
The Skills You'll Need: IT skills, technical engineering, big data processing, cloud architecture focused on Microsoft Azure. Your New Salary: £19 p/h - £34 p/h. Location: West London, Weybridge. Duration: 4-month assignment from 01/