opportunity to join one of the largest integrated energy and commodity trading companies in the world. We are looking for a Senior Data Engineer with strong technical expertise in Databricks, data engineering, and cloud-native analytics platforms. You will contribute to the development and expansion of our global analytics platform—supporting Front Office Trading across commodities—by building scalable … robust data pipelines and delivering end-to-end analytics and ML/AI capabilities. Key Responsibilities Design, build, and maintain scalable data pipelines and Delta Lake architectures in Databricks on AWS. Develop and enhance the Front Office data warehouse to ensure performance, reliability, and data quality for trading analytics. Partner with data scientists and quants to prepare ML-ready … Implement and maintain CI/CD pipelines, testing frameworks, and observability tools for data engineering workflows. Contribute to MLOps practices, including model tracking, deployment, and monitoring using MLflow and Databricks tools. Participate in code reviews, data modelling sessions, and collaborative solutioning across cross-functional teams. Ensure compliance with data governance, security, and performance standards. Stay current with Databricks platform enhancements …
real-world problems Proven experience managing production data pipelines Understanding of predictive modelling, machine learning, clustering and classification techniques Fluency in Python and SQL Nice to have: Experience using Databricks Experience using Microsoft Azure Experience with RabbitMQ and Docker Experience using dbt
London, South East, England, United Kingdom Hybrid / WFH Options
Executive Facilities
Experience domains. Proficiency in SQL for data extraction, transformation, and pipeline development. Experience with dashboarding and visualization tools (Tableau, Qlik, or similar). Familiarity with big data tools (Snowflake, Databricks, Spark) and ETL processes. Useful experience: Python or R for advanced analytics, automation, or experimentation support. Knowledge of statistical methods and experimentation (A/B testing) preferred. Machine learning and …
City of London, London, United Kingdom Hybrid / WFH Options
Intelix.AI
to-have Text-to-Cypher/SPARQL with safety filters and small eval sets. MCP-style tool contracts for safe agent access. Streaming/ELT at scale (Kafka/Databricks/PySpark).
NoSQL databases. AI Integration: Understanding of AI integration frameworks, including ONNX and TensorFlow Serving. Experience with A2A & MCP. Cloud Platforms: Hands-on experience with Azure and Databricks. DevOps: Knowledge of CI/CD pipelines, Terraform, Docker, Kubernetes. Version Control: Proficiency with Git (GitHub).
South West, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
to enhance reporting, analytics, and business intelligence solutions. What You Bring: 5+ years of experience in SQL production environments and data engineering. Strong expertise in SQL, T-SQL, Azure, Databricks, and SSIS. Experience with data integration, modeling, and performance optimization. Problem-solving mindset and a collaborative approach. Desirable: Experience handling data from APIs and secure protocols. Familiarity with visualization tools …
in implementing end-to-end ML pipelines (data, training, validation, serving) Experience with ML workflow orchestration tools (e.g., Airflow, Prefect, Kubeflow) and ML feature or data platforms (e.g., Tecton, Databricks) Experience with cloud platforms (AWS, GCP/Vertex, Azure), Docker, and Kubernetes Solid coding practices (Git, automated testing, CI/CD). Proficiency with Linux Familiarity with time-series …
Hands-on experience implementing end-to-end ML pipelines (data ingestion, training, validation, serving). Familiarity with ML workflow orchestration tools (Airflow, Prefect, Kubeflow) and feature/data platforms (Databricks, Tecton, etc.). Strong experience with cloud platforms (AWS, GCP, or Azure), Docker, and Kubernetes. Solid coding practices, including Git, automated testing, and CI/CD. Proficiency with Linux environments.
City of London, London, United Kingdom Hybrid / WFH Options
Areti Group | B Corp™
critical projects across the public sector, defence, and government organisations, delivering real-world innovation powered by data and technology. 🔧 Tech Stack & Skills We're Looking For: Palantir Azure Databricks Microsoft Azure Python Docker & Kubernetes Linux Apache Tools Data Pipelines IoT (Internet of Things) Scrum/Agile Methodologies ✅ Ideal Candidate: Already DV Cleared or at least SC Strong communication skills …
top-tier consultancy or blue-chip corporate. Demonstrable experience leading AI value delivery, CoE mobilisation, or multi-disciplinary product teams. Technical literacy across data and AI ecosystems (Azure, GCP, Databricks, Snowflake, Power BI, Kafka, LLMs). Exceptional stakeholder management up to CIO/CDO level with a track record of influence and measurable delivery. Strong grasp of responsible AI, governance …
Research Teams – Translate experimental findings into production-grade systems that extend the autonomy and reliability of agents. Run Experiments End-to-End – Own your compute environment (e.g. Jupyter, Colab, Databricks) and iterate on large-scale LLM training and evaluation. What You'll Bring: 4+ years' experience in Machine Learning or AI, with exposure to LLM agent systems, tool-use frameworks …
City of London, London, United Kingdom Hybrid / WFH Options
Carrington Recruitment Solutions Limited
DATA HEAVY Product Owners who have managed complex, Global products. Read on for more details. Experience required: Technical proficiency: Familiarity with Azure services (e.g., Data Lake, Synapse, Fabric) and Databricks for data engineering, analytics, performance optimisation, and governance. Experience with implementing and optimising scalable cloud infrastructure is highly valued. Backlog management: Demonstrated expertise in maintaining and prioritising product backlogs, writing …
London, South East, England, United Kingdom Hybrid / WFH Options
Carrington Recruitment Solutions Ltd
DATA HEAVY Product Owners who have managed complex, Global products. Read on for more details... Experience required: Technical proficiency: Familiarity with Azure services (e.g., Data Lake, Synapse, Fabric) and Databricks for data engineering, analytics, performance optimisation, and governance. Experience with implementing and optimising scalable cloud infrastructure is highly valued. Backlog management: Demonstrated expertise in maintaining and prioritising product backlogs, writing …
Newbury, Berkshire, South East, United Kingdom Hybrid / WFH Options
Fdo Consulting Limited
into a management (mentoring, coaching, team development etc.) Very strong technical skills that will include - SQL, SSIS, SSRS, SAS, Power BI, Power Platform, Azure Data Factory, Azure Data Lake, Databricks A good understanding of dimensional modelling techniques, including Kimball's Business Dimensional Lifecycle Ability to design hybrid data solutions across on-prem and cloud data sources Expert with data engineering …
Southampton, Hampshire, South East, United Kingdom
Spectrum It Recruitment Limited
background in data visualisation (Power BI/Looker/Looker Studio). Experience with regulated data environments and a disciplined approach to quality and documentation. Desirable: BigQuery, Snowflake, or Databricks; reverse-ETL tools; marketing attribution methods. Next Steps If you're a data scientist passionate about using data and making an impact, we would love to hear from you. Apply …
across data modelling, integration, governance, and transformation. Experience with AWS (S3, Glue, Redshift, Lambda, Kinesis) and/or Azure (ADF, Synapse, Fabric). Familiarity with modern data platforms (e.g. Databricks, Snowflake, or Lakehouse environments). Ability to operate confidently in highly secure, mission-focused settings. Why Join? Work on meaningful security projects with real-world impact. Join a culture built …