City of London, England, United Kingdom Hybrid / WFH Options
CipherTek Recruitment
greenfield MLOps pipelines that handle very large datasets. You will be responsible for building out a greenfield standardised framework for capital markets. The core platform is built on Azure Databricks Lakehouse, consolidating data from various Front and Middle Office systems to support BI, MI, and advanced AI/ML analytics. As a lead, you will shape the MLOps framework and … data sources (orders, quotes, trades, risk, etc.). Essential Requirements: 2+ years of experience in MLOps and at least 3 years in AI/ML engineering. Knowledge of Azure Databricks and associated services. Proficiency with ML frameworks and libraries in Python. Proven experience deploying and maintaining LLM services and solutions. Expertise in Azure DevOps and GitHub Actions. Familiarity with the Databricks … CLI and Databricks Asset Bundles. Strong programming skills in Python and SQL; familiarity with Scala is a plus. Solid understanding of AI/ML algorithms, model training, evaluation (including hyperparameter tuning), deployment, monitoring, and governance. Experience in handling large datasets and performing data preparation and integration. Experience with Agile methodologies and SDLC practices. Strong problem-solving, analytical, and communication skills.
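The model evaluation and hyperparameter tuning this listing asks for boils down, at its simplest, to a search loop over parameter combinations. A minimal, library-free sketch — the objective function here is a stand-in for a real train/validate cycle, and all names are illustrative:

```python
from itertools import product

def grid_search(train_eval, param_grid):
    """Evaluate every combination in param_grid and return the
    best-scoring (params, score) pair. train_eval is any callable
    that trains a model and returns a validation score."""
    names = sorted(param_grid)
    best_params, best_score = None, float("-inf")
    for values in product(*(param_grid[n] for n in names)):
        params = dict(zip(names, values))
        score = train_eval(**params)
        if score > best_score:
            best_params, best_score = params, score
    return best_params, best_score

# Stand-in objective: peaks at lr=0.1, depth=3. A real pipeline would
# fit and score an actual model here instead.
score_fn = lambda lr, depth: -abs(lr - 0.1) - abs(depth - 3)
best, score = grid_search(score_fn, {"lr": [0.01, 0.1, 1.0], "depth": [2, 3, 4]})
print(best)  # {'depth': 3, 'lr': 0.1}
```

In production this loop is usually replaced by a managed tuner (e.g. Hyperopt or Optuna on Databricks), but the contract — a scoring callable plus a parameter space — stays the same.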
City of London, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
greenfield MLOps pipelines that handle very large datasets. You will be responsible for building out a greenfield standardised framework for capital markets. The core platform is built on Azure Databricks Lakehouse, consolidating data from various Front and Middle Office systems to support BI, MI, and advanced AI/ML analytics. As a lead, you will shape the MLOps framework and … across various data sources (orders, quotes, trades, risk, etc.). 2+ years of experience in MLOps and at least 3 years in AI/ML engineering. Knowledge of Azure Databricks and associated services. Proficiency with ML frameworks and libraries in Python. Proven experience deploying and maintaining LLM services and solutions. Expertise in Azure DevOps and GitHub Actions. Familiarity with the Databricks … CLI and Databricks Asset Bundles. Strong programming skills in Python and SQL; familiarity with Scala is a plus. Solid understanding of AI/ML algorithms, model training, evaluation (including hyperparameter tuning), deployment, monitoring, and governance. Experience in handling large datasets and performing data preparation and integration. Experience with Agile methodologies and SDLC practices. Strong problem-solving, analytical, and communication skills.
London, England, United Kingdom Hybrid / WFH Options
Artefact
engineering and a proven track record of leading data projects in a fast-paced environment. Key Responsibilities Design, build, and maintain scalable and robust data pipelines using SQL, Python, Databricks, Snowflake, Azure Data Factory, AWS Glue, Apache Airflow, and PySpark. Lead the integration of complex data systems and ensure consistency and accuracy of data across multiple platforms. Implement continuous integration … engineering with strong technical proficiency in SQL, Python, and big data technologies. Extensive experience with cloud services such as Azure Data Factory and AWS Glue. Demonstrated experience with Databricks and Snowflake. Solid understanding of CI/CD principles and DevOps practices. Proven leadership skills and experience managing data engineering teams. Strong project management skills and the ability to lead … Strong communication and interpersonal skills. Excellent understanding of data architecture, including data mesh, data lake, and data warehouse. Preferred Qualifications: Certifications in Azure, AWS, or similar technologies. Certifications in Databricks, Snowflake, or similar technologies. Experience leading large-scale data engineering projects. Working Conditions: This position may require occasional travel. Hybrid work arrangement: two days per week working from …
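Pipelines built with orchestrators like the Apache Airflow workflows mentioned above are, at their core, dependency graphs of tasks. A minimal sketch of how such a graph resolves into an execution order, using only the standard library — the task names are made up for illustration:

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

# Hypothetical pipeline: an ingest task feeds a cleaning step, which
# in turn feeds both the warehouse load and a model-training task.
deps = {
    "clean": {"ingest"},
    "load_warehouse": {"clean"},
    "train_model": {"clean"},
}
order = list(TopologicalSorter(deps).static_order())
print(order)  # 'ingest' first, then 'clean', then the two leaf tasks
```

Airflow's scheduler does the same resolution over its DAG objects, with the added ability to run independent tasks (here `load_warehouse` and `train_model`) in parallel.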
Manchester, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
forward-thinking organization using data to drive innovation and business performance. They’re expanding their team and are looking for a talented Data Engineer with experience in Azure and Databricks to join the team. Salary and Benefits £55,000 – £65,000 salary depending on experience 10% performance-related bonus Hybrid working model – 2 days in the Greater Manchester office Supportive … do I need to apply for the role: Solid hands-on experience with Azure data tools, including Data Factory, Data Lake, Synapse Analytics, and Azure SQL. Strong proficiency with Databricks, including Spark, Delta Lake, and notebooks. Skilled in Python and SQL for data transformation and processing. Experience with Git and modern CI/CD workflows. Strong analytical mindset and effective …
Lutterworth, England, United Kingdom Hybrid / WFH Options
PharmaLex
database. Collaborate with Data Analysts and Scientists to optimise data quality, reliability, security, and automation. Skills & Responsibilities: Core responsibility will be using the NHS Secure Data Environment, which utilises Databricks, to design and extract regular datasets. Configure and troubleshoot Microsoft Azure, manage data ingestion using LogicApps and Data Factory. Develop ETL scripts using MS SQL and Python; handle web scraping, APIs …
West Midlands, England, United Kingdom Hybrid / WFH Options
Amtis - Digital, Technology, Transformation
ll play a key role in shaping the data strategy, enhancing platform capabilities, and supporting business intelligence initiatives. Key Responsibilities Design and develop Azure data pipelines using Data Factory, Databricks, and related services. Implement and optimize ETL processes for performance, reliability, and cost-efficiency. Build scalable data models and support analytics and reporting needs. Design and maintain Azure-based data …
London, England, United Kingdom Hybrid / WFH Options
ScanmarQED
Experience: 3–5 years in Data Engineering, Data Warehousing, or programming within a dynamic (software) project environment. Data Infrastructure and Engineering Foundations: Data Warehousing: Knowledge of tools like Snowflake, Databricks, ClickHouse and traditional platforms like PostgreSQL or SQL Server. ETL/ELT Development: Expertise in building pipelines using tools like Apache Airflow, dbt, Dagster. Cloud providers: Proficiency in Microsoft Azure …
Nottingham, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
Associate. Responsibilities: Your daily tasks will include, but are not limited to: Designing, building, and optimizing high-performance data pipelines and ETL workflows using tools like Azure Synapse, Azure Databricks, or Microsoft Fabric. Implementing scalable solutions to ingest, store, and transform large datasets, ensuring data quality and availability. Writing clean, efficient, and reusable Python code for data engineering tasks. Collaborating …
Maidstone, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
Analytics Engineer Associate. Responsibilities: Your daily tasks will include, but are not limited to: Designing, building, and optimizing high-performance data pipelines and ETL workflows using Azure Synapse, Azure Databricks, or Microsoft Fabric. Implementing scalable solutions for data ingestion, storage, and transformation, ensuring data quality and availability. Writing clean, efficient, reusable Python code for data engineering tasks. Collaborating with data …
Stockport, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
Analytics Engineer Associate. Responsibilities Daily responsibilities include, but are not limited to: Designing, building, and optimizing high-performance data pipelines and ETL workflows using tools like Azure Synapse, Azure Databricks, or Microsoft Fabric. Implementing scalable solutions for data ingestion, storage, and transformation, ensuring data quality and availability. Writing clean, efficient, and reusable Python code for data engineering tasks. Collaborating with …
London, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
teams, where required. Essential Skills and Experience: Educated to degree level or have equivalent professional experience. Experience translating business requirements into solution design and implementation. Experience of MS Azure Databricks. Experience working with database technologies such as SQL Server, and data warehouse architecture with knowledge of big data, data lakes, and NoSQL. Experience following product/solution development lifecycles using …
Chester, England, United Kingdom Hybrid / WFH Options
Forge Holiday Group Ltd
ETL/ELT processes. Exposure to Python or any other object-oriented programming languages. Experience with modern data stack tools and cloud-based data warehouses like MS Fabric, Snowflake, Databricks, Teradata or AWS. Experience in designing and constructing effective reports and dashboards that transform data into actionable insights with Tableau or Power BI. Proven ability to manage work within set …
London, England, United Kingdom Hybrid / WFH Options
First Central Services
Power BI. Experience & knowledge: Requires extensive experience developing and implementing end-to-end data solutions in the cloud, preferably in Azure. Experience engineering with big data technologies such as Databricks and/or Synapse Analytics, using PySpark. Solution design experience across end-to-end data solutions (sourcing to consumption). Experience in Azure services such as Data Factory, Azure Functions, ADLS …
Glasgow, Scotland, United Kingdom Hybrid / WFH Options
Capgemini
Driven, use of Dataverse) and/or Power Automate. Copilot Studio experience also desirable. Programming language (Python, R, or SQL). Hands-on experience with tools such as Microsoft Fabric, Databricks, or Snowflake. Consulting Experience: Current experience in a major consulting firm and/or significant consulting background, with evidence of effective stakeholder management to address business challenges. Experience working across …
Passionate about Databricks and Data Engineering? Aivix has been in operation for over a year and recently achieved AWS Select Partner & Databricks Registered Partner status. We excel in offering analytics solutions that facilitate data-driven digital transformations for our customers. Aivix is currently looking for data engineers who are interested in Databricks. You will be shaping the beating heart of … we are an enthusiastic group. We encourage everyone to be themselves while actively empowering an entrepreneurial mindset. Ready to contribute and help build our growing story? As a Databricks Engineer you will be involved in: making use of best practices in Databricks, Big Data, and AI projects; having conversations with the customer to correctly identify their data needs; developing …
London, England, United Kingdom Hybrid / WFH Options
ShareForce, Inc
This role is an opportunity to lead the build of bespoke data systems for our clients. Responsibilities Design and implement scalable data pipelines and ETL processes using Azure and Databricks technologies, including Delta Live Tables. Lead technical discussions with clients and stakeholders to gather requirements and propose solutions. Help clients realise the potential of data science, machine learning, and scaled … data processing within the Azure/Databricks ecosystem. Mentor junior team members and support their personal development. Take ownership of the delivery of core solution components. Support with planning, requirements refinement, and work estimation. Skills and Experiences Design and develop end-to-end data solutions leveraging Azure services for batch, real-time, and streaming workloads (including data ingestion, cleansing, modelling, and … data platform development, concepts, and methods such as data warehouses and data lakehouses, with the ability to adapt and tailor based on requirements. Experience with Azure Synapse Analytics, Azure Databricks, Microsoft Fabric, Data Factory. Expertise in Python, SQL, and developer tooling such as Visual Studio Code, Azure DevOps. Good experience with CI/CD practices and tools for data platforms …
Leeds, England, United Kingdom Hybrid / WFH Options
VIQU Limited
team based in Leeds, working mostly remotely with just one day on-site per week. You’ll lead the design and delivery of scalable, cloud-based data solutions using Databricks, Python, and SQL, while mentoring a team and driving engineering best practices. About You You might currently be a Senior Data Engineer ready to grow your leadership skills. You’re … passionate about building robust, efficient data pipelines and shaping cloud data architecture in an agile environment. Key Responsibilities Lead development of data pipelines and solutions using Databricks, Python, and SQL Design and maintain data models supporting analytics and business intelligence Build and optimise ELT/ETL processes on AWS or Azure Collaborate closely with analysts, architects, and stakeholders to deliver … as code Mentor and support your team, taking ownership of technical delivery and decisions Drive continuous improvements in platform performance, cost, and reliability Key Requirements Hands-on experience with Databricks or similar data engineering platforms Strong Python and SQL skills in data engineering contexts Expertise in data modelling and building analytics-ready datasets Experience with AWS or Azure cloud data …
London, South East, England, United Kingdom Hybrid / WFH Options
McGregor Boyall
next-generation data platform. Working as part of a growing data team, you will play a critical role in designing and deploying scalable data pipelines and solutions using Azure Databricks and related technologies. This is an opportunity to contribute to a cloud-first, modern data strategy within a collaborative and forward-thinking environment. Key Responsibilities: Design and develop end-to-end data pipelines (batch and streaming) using Azure Databricks, Spark, and Delta Lake. Implement the Medallion Architecture and ensure consistency across raw, enriched, and curated data layers. Build and optimise ETL/ELT processes using Azure Data Factory and PySpark. Enforce data governance through Azure Purview and Unity Catalog. Apply DevOps and CI/CD practices using Git and Azure … analysts and business stakeholders to ensure data quality and usability. Contribute to performance optimisation and cost efficiency across data solutions. Required Skills & Experience: Proven hands-on experience with Azure Databricks, Data Factory, Delta Lake, and Synapse. Strong proficiency in Python, PySpark, and advanced SQL. Understanding of Lakehouse architecture and medallion data patterns. Familiarity with data governance, lineage, and access control …
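The Medallion Architecture mentioned above layers data as raw (Bronze), cleansed (Silver), and curated (Gold). A toy, Spark-free sketch of that flow under assumed trade-data fields — in a real Lakehouse each layer would be a Delta table and the transforms would run in PySpark:

```python
# Bronze keeps raw ingested records untouched, warts and all.
bronze = [
    {"trade_id": "T1", "qty": "100", "symbol": "abc"},
    {"trade_id": "T2", "qty": "bad", "symbol": "abc"},   # unparseable qty
    {"trade_id": "T3", "qty": "50",  "symbol": "xyz"},
]

def to_silver(rows):
    """Cleanse: cast types, normalise symbols, drop invalid rows."""
    out = []
    for r in rows:
        try:
            out.append({"trade_id": r["trade_id"],
                        "qty": int(r["qty"]),
                        "symbol": r["symbol"].upper()})
        except (ValueError, KeyError):
            pass  # a real pipeline would route these to a quarantine table
    return out

def to_gold(rows):
    """Curate: aggregate traded quantity per symbol for reporting."""
    totals = {}
    for r in rows:
        totals[r["symbol"]] = totals.get(r["symbol"], 0) + r["qty"]
    return totals

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'ABC': 100, 'XYZ': 50}
```

The key property the pattern guarantees is replayability: because Bronze is never mutated, Silver and Gold can always be rebuilt from it when cleansing or aggregation logic changes.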
architectures in a modern cloud environment. You will play a key role in building and optimizing our Medallion architecture (Bronze, Silver, Gold layers), working with modern tools such as Databricks, dbt, Azure Data Factory, and Python/SQL to support critical business analytics and AI/ML initiatives. Key Responsibilities ETL Development: Design and build robust and reusable ETL/ELT pipelines through the Medallion architecture in Databricks. Data Transformation: Create and manage data models and transformations using dbt, ensuring clear lineage, version control, and modularity. Pipeline Orchestration: Develop and manage workflow orchestration using Azure Data Factory, including setting up triggers, pipelines, and integration runtimes. System Maintenance: Monitor, maintain, and optimize existing data pipelines, including cron job scheduling and … Ensure compliance with data security policies, data retention rules, and privacy regulations. Required Skills and Experience 5+ years of experience in data engineering or similar roles. Strong experience with Databricks, including notebooks, cluster configuration, and Delta Lake. Proficiency in dbt for transformation logic and version-controlled data modeling. Deep knowledge of Azure Data Factory, including pipeline orchestration and integration with …
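Scheduled or triggered pipeline runs of the kind described above commonly rely on a high-water mark, so each run processes only rows newer than the last one seen. A minimal, hypothetical sketch of the pattern — field names are illustrative, and in an ADF/Databricks setup the watermark would live in a control table rather than an in-memory dict:

```python
def incremental_load(source_rows, state):
    """Return rows newer than the stored watermark and advance it."""
    wm = state.get("watermark", 0)
    new_rows = [r for r in source_rows if r["updated_at"] > wm]
    if new_rows:
        state["watermark"] = max(r["updated_at"] for r in new_rows)
    return new_rows

state = {}
src = [{"id": 1, "updated_at": 10}, {"id": 2, "updated_at": 20}]
first = incremental_load(src, state)    # first run: everything is new
src.append({"id": 3, "updated_at": 30})  # new row arrives in the source
second = incremental_load(src, state)   # second run: only the new row
print(len(first), len(second))  # 2 1
```

Because a rerun with an unchanged source returns nothing, the load is idempotent — the property that makes retry-on-failure safe in orchestrated pipelines.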
Newcastle upon Tyne, England, United Kingdom Hybrid / WFH Options
Noir
Data Engineer - FinTech Company - Newcastle (Tech Stack: Data Engineer, Databricks, Python, Azure, Power BI, AWS QuickSight, AWS, TSQL, ETL, Agile Methodologies) I'm working with a leading Software House in the FinTech industry, based in Newcastle, who are looking to hire a talented Data Engineer. This is a fantastic opportunity to join a forward-thinking company where you'll play … working with data and delivering value to stakeholders. Strong proficiency in SQL, Python, and Apache Spark, with hands-on experience using these technologies in a production environment. Experience with Databricks and Microsoft Azure is highly desirable. Financial Services experience is a plus but not essential. Excellent communication skills, with the ability to explain complex data concepts in a clear and …
Bath, England, United Kingdom Hybrid / WFH Options
Noir
Job Reference: NC/RG/DE_1745802035 Posted: 28.04.2025 Expiry Date: 12.06.2025 Job Description: Data Engineer - FinTech Company - Newcastle (Tech Stack: Data Engineer, Databricks, Python, Azure, Power BI, AWS QuickSight, AWS, TSQL, ETL, Agile Methodologies) I’m working with a leading Software House in the FinTech industry, based in Newcastle, who are looking to … working with data and delivering value to stakeholders. * Strong proficiency in SQL, Python, and Apache Spark, with hands-on experience using these technologies in a production environment. * Experience with Databricks and Microsoft Azure is highly desirable. * Financial Services experience is a plus but not essential. * Excellent communication skills, with the ability to explain complex data concepts in a clear and …