in doing this properly - not endless meetings and PowerPoints. What you'll be doing:
- Designing, building, and optimising end-to-end data pipelines using Azure Databricks, PySpark, ADF, and Delta Lake
- Implementing a medallion architecture - from raw to enriched to curated (see the sketch below)
- Working with Delta Lake and Spark for both batch and streaming data
- Collaborating with analysts …
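For illustration only, a minimal PySpark/Delta Lake sketch of the raw-to-enriched-to-curated (bronze/silver/gold) flow this listing describes. The paths, table layout and column names are hypothetical placeholders, not the client's actual pipeline.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("medallion-sketch").getOrCreate()

# Bronze: land raw files into a Delta table as-is (hypothetical source path)
raw = spark.read.json("/mnt/landing/orders/")
raw.write.format("delta").mode("append").save("/mnt/bronze/orders")

# Silver: clean and conform the raw data (assumed business key: order_id)
bronze = spark.read.format("delta").load("/mnt/bronze/orders")
silver = (
    bronze.dropDuplicates(["order_id"])
          .withColumn("order_ts", F.to_timestamp("order_ts"))
          .filter(F.col("amount") > 0)
)
silver.write.format("delta").mode("overwrite").save("/mnt/silver/orders")

# Gold: curated aggregate for analysts and reporting
gold = silver.groupBy("customer_id").agg(F.sum("amount").alias("total_spend"))
gold.write.format("delta").mode("overwrite").save("/mnt/gold/customer_spend")
```

The same layered pattern extends to streaming by reading the bronze Delta table with `spark.readStream` instead of `spark.read`.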
Birmingham, West Midlands, United Kingdom Hybrid / WFH Options
MYO Talent
Data Engineer/Data Engineering/Data Consultant/Lakehouse/Delta Lake/Data Warehousing/ETL/Azure/Azure Databricks/Python/SQL/Based in the West Midlands/Solihull/Birmingham area, Permanent role, £50,000 - £70,000 + car/allowance (£5,000) + 15% bonus. One of our leading clients … + car/allowance + bonus
Experience:
- Experience in a Data Engineer/Data Engineering role
- Large and complex datasets
- Azure, Azure Databricks
- Microsoft SQL Server
- Lakehouse, Delta Lake
- Data Warehousing
- ETL
- Database Design
- Python/PySpark
- Azure Blob Storage
- Azure Data Factory
Desirable:
- Exposure to ML/Machine Learning/AI/Artificial Intelligence …
machine learning models in production. What you'll be doing as Lead ML Ops Engineer:
- Leading the design and implementation of robust ML Ops pipelines using Azure, Databricks, and Delta Lake
- Architecting and overseeing API services and caching layers (e.g., Azure Cache for Redis) - see the caching sketch below
- Driving integration with cloud-based data storage solutions such as Snowflake
- Collaborating with data …
… from the Machine Learning Operations Lead:
- Proven experience in ML Ops leadership, with deep expertise in Azure, Databricks, and cloud-native architectures
- Strong understanding of Postgres, Redis, Snowflake, and Delta Lake architecture
- Hands-on experience with Docker, container orchestration, and scalable API design
- Excellent communication and stakeholder management skills
- Ability to drive strategic initiatives and influence technical direction …
Employment Type: Permanent
Salary: £70000 - £90000/annum 25+bank, bonus + more
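To make the caching-layer bullet concrete, here is a hypothetical Python sketch of serving model predictions through a Redis cache (for example Azure Cache for Redis). The endpoint, key scheme and model interface are illustrative assumptions, not part of the role description.

```python
import json
import redis

# Hypothetical Azure Cache for Redis endpoint; credentials would come from config or Key Vault
cache = redis.Redis(host="my-cache.redis.cache.windows.net", port=6380,
                    password="<access-key>", ssl=True)

def predict_with_cache(features: dict, model, ttl_seconds: int = 300) -> float:
    """Return a cached prediction if present, otherwise call the model and cache the result."""
    key = "pred:" + json.dumps(features, sort_keys=True)          # deterministic cache key
    hit = cache.get(key)
    if hit is not None:
        return json.loads(hit)                                    # cache hit: skip the model call
    result = float(model.predict([list(features.values())])[0])   # assumed sklearn-style model
    cache.setex(key, ttl_seconds, json.dumps(result))             # cache with a short TTL
    return result
```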
data warehousing, or analytics engineering
- Strong SQL and Python skills with hands-on experience in PySpark
- Exposure to Azure Databricks, Microsoft Fabric, or similar cloud data platforms
- Understanding of Delta Lake, Git, and CI/CD workflows
- Experience with relational data modelling and dimensional modelling
- Awareness of data governance tools such as Purview or Unity Catalog
- Excellent analytical …
London, South East England, United Kingdom Hybrid / WFH Options
Robert Half
years as a Data Engineer
- Hands-on with Databricks, Spark, Python, SQL
- Cloud experience (Azure, AWS, or GCP)
- Strong understanding of data quality, governance, and security (example checks sketched below)
Nice to Have: Delta Lake, DBT, Snowflake, Terraform, CI/CD, or DevOps exposure
You'll:
- Build and optimise ETL pipelines
- Enable analytics and reporting teams
- Drive automation and best practices
Why …
Slough, South East England, United Kingdom Hybrid / WFH Options
Robert Half
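As a small illustration of the data-quality checks mentioned above, here is a minimal PySpark sketch; the table path, column names and rules are hypothetical assumptions rather than the client's actual tests.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()
orders = spark.read.format("delta").load("/mnt/silver/orders")     # hypothetical table

total = orders.count()
null_ids = orders.filter(F.col("order_id").isNull()).count()       # completeness check
dupes = total - orders.dropDuplicates(["order_id"]).count()        # uniqueness check
negatives = orders.filter(F.col("amount") < 0).count()             # validity check

# Fail the pipeline run (or raise an alert) if any rule is breached
assert null_ids == 0, f"{null_ids} orders with missing order_id"
assert dupes == 0, f"{dupes} duplicate order_id values"
assert negatives == 0, f"{negatives} orders with negative amounts"
```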
initiative for the business, directly impacting customer experience and long-term data strategy.
Key Responsibilities:
- Design and build scalable data pipelines and transformation logic in Databricks
- Implement and maintain Delta Lake physical models and relational data models (see the upsert sketch below).
- Contribute to design and coding standards, working closely with architects.
- Develop and maintain Python packages and libraries to support engineering work.
…
- Participate in Agile ceremonies (stand-ups, backlog refinement, etc.).
Essential Skills:
- PySpark and SparkSQL.
- Strong knowledge of relational database modelling
- Experience designing and implementing in Databricks (DBX notebooks, Delta Lakes).
- Azure platform experience.
- ADF or Synapse pipelines for orchestration.
- Python development
- Familiarity with CI/CD and DevOps principles.
Desirable Skills:
- Data Vault 2.0.
- Data Governance & Quality …
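One common way to maintain a Delta Lake physical model is an idempotent upsert with MERGE. The sketch below is illustrative only; the table name and business key are assumptions, not the client's schema.

```python
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Staged changes produced upstream (hypothetical path)
updates = spark.read.format("delta").load("/mnt/silver/customers")

# Upsert into the curated dimension table (assumed name and key)
target = DeltaTable.forName(spark, "curated.dim_customer")
(target.alias("t")
       .merge(updates.alias("s"), "t.customer_id = s.customer_id")
       .whenMatchedUpdateAll()
       .whenNotMatchedInsertAll()
       .execute())
```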
machine learning use cases. Support the migration of legacy reporting tools into Databricks and modern BI solutions.
Key Skills & Experience - Essential:
- Strong hands-on experience with Databricks (SQL, PySpark, Delta Lake).
- Solid knowledge of BI and data visualisation tools (e.g., Power BI, Tableau, Qlik).
- Strong SQL and data modelling skills.
- Experience working with large, complex financial …
London (City of London), South East England, United Kingdom
Miryco Consultants Ltd
environment. What you'll be doing as the Machine Learning Operations Engineer:
- Developing and maintaining API services using Azure and Databricks
- Managing caching layers with Azure Cache (Redis)
- Using Delta Live Tables for data processing and analytics (see the sketch below)
- Integrating with cloud-based data storage solutions like Snowflake
- Collaborating with cross-functional teams in an agile environment
- Supporting analytics, model deployment …
What we're looking for from the Machine Learning Operations Engineer:
- Solid experience in ML Ops, particularly with Azure and Databricks
- Familiarity with Postgres, Redis, and Snowflake
- Understanding of Delta Lake architecture, Docker, and container services
- Experience building and orchestrating APIs
- Strong problem-solving and communication skills
- Bonus: exposure to Azure Functions, Containers, or Insights
Benefits for the …
Employment Type: Permanent
Salary: £50000 - £70000/annum bonus, 25 days holiday + more
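For context on the Delta Live Tables bullet, here is a minimal DLT sketch. It only runs inside a Databricks DLT pipeline (where `spark` is provided by the runtime), and the source path, columns and expectation are hypothetical.

```python
import dlt
from pyspark.sql import functions as F

@dlt.table(comment="Raw events landed from cloud storage")
def events_raw():
    # hypothetical landing path; `spark` is injected by the DLT runtime
    return spark.read.json("/mnt/landing/events")

@dlt.table(comment="Cleaned events for downstream analytics")
@dlt.expect_or_drop("valid_user", "user_id IS NOT NULL")   # drop rows that fail the rule
def events_clean():
    return dlt.read("events_raw").withColumn("event_date", F.to_date("event_ts"))
```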
Herefordshire, West Midlands, United Kingdom Hybrid / WFH Options
IO Associates
and maintain platform software, libraries, and dependencies. Set up and manage Spark clusters, including migrations to new platforms. Manage user accounts and permissions across identity platforms. Maintain the Delta Lake and ensure platform-wide security standards. Collaborate with the wider team to advise on system design and delivery. What we're looking for: Strong Linux engineering …
Kubernetes stack with secure-by-design tools
- Update security, software, dependencies and libraries
- Set up and migrate Spark clusters across platforms
- Manage user accounts and IdP permissions
- Maintain the Delta Lake (see the maintenance sketch below)
- Ensure secure-by-design assurance throughout the platform
Experience Required:
- Strong Linux engineering background
- Expertise in Kubernetes and Docker
- Proficient in scripting (Python, Bash)
- Experience with air …
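"Maintain the Delta Lake" typically involves routine table maintenance such as compaction and vacuuming; the sketch below is a hypothetical example, with table names and the retention window chosen purely for illustration.

```python
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

for table_name in ["bronze.events", "silver.events"]:   # assumed table names
    tbl = DeltaTable.forName(spark, table_name)
    tbl.optimize().executeCompaction()                   # compact small files
    tbl.vacuum(168)                                      # drop unreferenced files older than 7 days (hours)
```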