City of London, London, United Kingdom Hybrid / WFH Options
Focused Futures Consultancy LTD
CDO level and translating requirements into solution blueprints, proposals, and presentations. Enterprise Solution Design – Architect end-to-end data and cloud solutions across platforms such as Azure, AWS, GCP, Databricks, Snowflake, Synapse, or Azure Fabric. Cloud Strategy & Adoption – Define and lead cloud migration, modernisation, and optimisation strategies using tools such as Terraform, Azure DevOps, and CI/CD pipelines. Data …
City of London, London, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
viable and technically sound solutions. Enterprise Solution Design – Architect and oversee the delivery of scalable data platforms (data lakes, lakehouses, warehouses) using GCP technologies such as BigQuery, Cloud Storage, Databricks, and Snowflake. Cloud Data Strategy – Lead cloud migration and modernisation strategies using GCP and tools like Terraform, CI/CD pipelines, Azure DevOps, and GitHub. Data Modelling – Apply hands-on …
London, South East, England, United Kingdom Hybrid / WFH Options
Asset Resourcing Limited
engineering teams to resolve them. Strong analytical mindset and attention to detail. Clear, concise communicator able to present technical findings simply. Desirable: Experience testing in big data environments using Databricks, Snowflake, or Redshift. Knowledge of data governance and lineage tracking tools. Exposure to data performance and load testing. Experience in an Agile delivery environment.
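The big-data testing experience mentioned above usually means automating data-quality checks against warehouse extracts. A minimal sketch of two such checks (dataset, column names, and rules are hypothetical; a real suite would run against Databricks, Snowflake, or Redshift through their SQL clients):

```python
# Hypothetical data-quality checks of the kind a data test engineer automates.

def check_not_null(rows, column):
    """Return indices of rows where `column` is missing or None."""
    return [i for i, row in enumerate(rows) if row.get(column) is None]

def check_unique(rows, column):
    """Return values of `column` that appear more than once."""
    seen, dupes = set(), set()
    for row in rows:
        value = row.get(column)
        if value in seen:
            dupes.add(value)
        seen.add(value)
    return sorted(dupes)

if __name__ == "__main__":
    # Tiny stand-in for a warehouse extract (hypothetical table).
    extract = [
        {"order_id": 1, "customer": "acme"},
        {"order_id": 2, "customer": None},
        {"order_id": 2, "customer": "globex"},
    ]
    print(check_not_null(extract, "customer"))  # [1]
    print(check_unique(extract, "order_id"))    # [2]
```

In practice checks like these are wrapped in a test framework (pytest, dbt tests) and run on a schedule so regressions surface before downstream reports break.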
and NLP models within production environments effectively. Required Skills 8+ years of software engineering experience, partially focused on LLM/AI/ML solutions. Experience with Python notebooks in Databricks (or similar tech stack) is essential. Robust hands-on experience with LLM SaaS providers (OpenAI, Claude, Gemini). Demonstrated expertise and managed services within cloud architecture (primarily AWS). Proficiency …
disciplines: Cloud Engineering, Data Engineering (not building pipelines but designing and building the framework), DevOps, MLOps/LLMOps. Often work with the following technologies: Azure, AWS, GCP; Airflow, dbt, Databricks, Snowflake, etc.; GitHub, Azure DevOps and related developer tooling and CI/CD platforms; Terraform or other Infra-as-Code; MLflow, AzureML or similar for MLOps; LangSmith, Langfuse and similar …
Opportunities Structured career progression with increasing exposure to technical leadership, client strategy, and people management. Access to 20 dedicated development days per year. Funded certifications in technologies like Snowflake, Databricks, AWS, and Azure. Hands-on leadership development, exposure to strategic projects, and collaboration with senior consultants. Role Details Contract Type: Permanent Salary: £70,000 + (negotiable based on experience) + …
City of London, London, United Kingdom Hybrid / WFH Options
McGregor Boyall Associates Limited
small, high-performing team. What We're Looking For Hands-on Palantir Foundry expertise or transferable, client-facing data engineering experience with large-scale data platforms such as: Snowflake, Databricks, AWS Glue/Redshift, Google BigQuery. Software engineering skills in Python, Java, or TypeScript/React. Strong data modelling, pipeline development, and API design experience. Excellent problem-solving and communication …
London, South East, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
develop a team of Data Engineers to deliver robust, scalable data solutions. Provide technical leadership across the Microsoft and Azure Data Platform (SQL Server, Power BI, Azure Data Factory, Databricks, Data Lake). Champion Data Governance , Data Quality , and Data Management best practices. Collaborate with business and technology teams to align data solutions with strategic goals. Oversee Agile delivery, CI …
City of London, London, United Kingdom Hybrid / WFH Options
I3 Resourcing Limited
development experience with: C# (.NET/.NET Core), REST APIs, SQL Server (T-SQL), HTML/CSS/JavaScript/React, Python. Comfortable with tools like Azure Data Factory, Databricks, Power Automate, Power Platform, Copilot Studio. Strong knowledge of cloud security, automation, and DevOps principles. Bonus Points for: Experience in the insurance market Familiarity with data architecture and data …
solution strategies, guiding delivery teams, and acting as a trusted advisor to CDOs, CIOs, and Heads of Data. 🔑 What You’ll Do Lead enterprise solution design on GCP, BigQuery, Databricks, Snowflake Drive cloud migration & modernisation strategies (Terraform, CI/CD, Azure DevOps, GitHub) Define enterprise data models (ERwin, ER/Studio, PowerDesigner) Architect ETL/ELT frameworks & pipelines (Dataflow, Dataproc …
Python and AI/ML frameworks such as PyTorch, LangChain, LangGraph, GraphRAG, and AutoGen. Experience with modern vector and graph databases (e.g., ChromaDB, Neo4j) and LLMOps platforms (e.g., Azure, Databricks, Azure OpenAI). Proven track record of delivering scalable AI solutions in enterprise settings, preferably in life sciences. Excellent communication and interpersonal skills, with the ability to lead projects and …
quality, lineage, and documentation. Strong technical expertise in SQL, Python (or similar), cloud data stacks (Azure/AWS), modern ELT tools (e.g., dbt), data warehouses/lakehouses (BigQuery/Databricks), and Power BI. Strong domain knowledge of commercial analytics (revenue/cost/margin), audience/POI data usage for planning and measurement, and data privacy/DPIA processes. Excellent …
East London, London, United Kingdom Hybrid / WFH Options
InfinityQuest Ltd,
with trading background) Location: Canary Wharf, UK (Hybrid: 3 days onsite, 2 days remote) Role Type: 6-month contract with possibility of extensions Mandatory Skills: Python, PostgreSQL, Azure Databricks, AWS (S3), Git, Azure DevOps CI/CD, Apache Airflow, Energy Trading experience Job Description: Data Engineer (Python enterprise developer): 6+ years of experience in Python scripting. Proficient in developing applications in …
London, South East, England, United Kingdom Hybrid / WFH Options
Harnham - Data & Analytics Recruitment
background in data architecture, data governance, and data management. Experience in modern data engineering practices (pipelines, orchestration, DevOps). Excellent leadership and stakeholder engagement skills. Bonus: experience with Databricks or TimeXtender. Package & Benefits: Salary up to £100,000 + 5% bonus. Hybrid working: London office 3 times per week. Comprehensive benefits package. Global organisation with opportunities …
Preferred Qualifications Exposure to machine learning workflows, model lifecycle management, or data engineering platforms. Experience with distributed systems, event-driven architectures (e.g., Kafka), and big data platforms (e.g., Spark, Databricks). Familiarity with banking or financial domain use cases, including data governance and compliance-focused development. Knowledge of platform security, monitoring, and resilient architecture patterns.
you'll specialise in Azure Integration Services, Data Engineering in Fabric, and DevOps. You'll work directly with leading Azure tools including Azure Data Factory, Logic Apps, Fabric, Power BI, Databricks, Azure DevOps, OpenAI and so on. Your role will involve creating robust, scalable cloud-native solutions, automating deployments, and ensuring seamless integrations. Our culture is collaborative, fast-paced, and built around …
Databricks Engineer. London, hybrid, 3 days per week on-site. 6 months +, umbrella only, inside IR35. Key Responsibilities: Design, develop, and maintain ETL/ELT pipelines using Airflow for orchestration and scheduling. Build and manage data transformation workflows in dbt running on Databricks. Optimize data models in Delta Lake for performance, scalability, and cost efficiency. Collaborate with analytics, BI, and data science teams to deliver clean, reliable datasets. Implement data quality checks (dbt tests, monitoring) and ensure governance standards. Manage and monitor Databricks clusters & SQL Warehouses to support workloads. Contribute to CI/CD practices for data pipelines (version control, testing, deployments). Troubleshoot pipeline failures, performance bottlenecks, and scaling challenges. Document workflows, transformations, and data models for knowledge sharing. Required Skills & Qualifications: 3-6 years of experience as a Data Engineer (or similar). Hands-on expertise with: dbt (dbt-core, dbt-databricks adapter, testing & documentation). Apache Airflow (DAG design, operators, scheduling, dependencies). Databricks (Spark, SQL, Delta Lake, job clusters, SQL Warehouses). Strong SQL skills and understanding of data modeling (Kimball, Data Vault, or …
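The dbt tests this role mentions are declared in a model's YAML properties file and executed with `dbt test`. A minimal sketch, assuming hypothetical model and column names (`fct_orders`, `order_id`, `customer_id`), using dbt's built-in `not_null` and `unique` generic tests:

```yaml
# models/schema.yml — hypothetical model; dbt runs these tests against the warehouse
version: 2

models:
  - name: fct_orders
    description: "Cleaned orders delivered to BI and analytics teams"
    columns:
      - name: order_id
        tests:
          - not_null
          - unique
      - name: customer_id
        tests:
          - not_null
```

With the dbt-databricks adapter, each declared test compiles to a SQL query that runs on the Databricks SQL Warehouse and fails the build if any offending rows are returned.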
model training and inference. Driving best practices in data governance, security, and performance optimisation. What We're Looking For Strong experience with Azure cloud services, especially Data Factory, Synapse, Databricks, and Azure Functions. Proficiency in Python, SQL, and data transformation techniques. Experience working with both structured and unstructured data sources. Understanding of AI/ML data requirements and how to …
market, credit, and operational risk, compliance, and trade lifecycle Proven in delivering complex risk and compliance solutions Strong leadership and stakeholder management skills Bonus: Python, Azure Data Factory/Databricks, data analysis, and COTS risk platforms (e.g., CubeLogic, deltaconX) Package: £85,000 base salary, 15% non-contributory pension, hybrid working (3 days per week), performance-based bonus scheme, comprehensive benefits …