Key Responsibilities Lead the design, development, and maintenance of scalable, high-performance data pipelines on Databricks. Architect and implement data ingestion, transformation, and integration workflows using PySpark, SQL, and Delta Lake. Guide the team in migrating legacy ETL processes to modern cloud-based data pipelines. Ensure data accuracy, schema consistency, row counts, and KPIs during migration and transformation. Collaborate … and analytics. ________________________________________ Required Skills & Qualifications 10-12 years of experience in data engineering, with at least 3 years in a technical lead role. Strong expertise in Databricks, PySpark, Delta Lake, and dbt. Advanced proficiency in SQL, ETL/ELT pipelines, and data modelling. Experience with Azure Data Services (ADLS, ADF, Synapse) or other major cloud platforms (AWS … of data warehousing, transformation logic, SLAs, and dependencies. Hands-on experience with real-time streaming and near-real-time batch processing is a plus, as is optimisation of Databricks and dbt workloads and Delta Lake. Familiarity with CI/CD pipelines, DevOps practices, and Git-based workflows. Knowledge of data security, encryption, and compliance frameworks (GDPR, SOC 2, ISO) is good to have. Excellent
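The migration duties above include reconciling row counts and KPIs between legacy and migrated systems. A rough sketch of that kind of reconciliation check, using in-memory SQLite databases as stand-ins for the two stores (the `orders` table and helper names are invented for the illustration):

```python
import sqlite3

def row_count(conn, table):
    # Count rows in the given table.
    return conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]

def validate_migration(legacy, migrated, table):
    """Compare row counts between a legacy store and its migrated copy."""
    src, dst = row_count(legacy, table), row_count(migrated, table)
    return {"table": table, "source": src, "target": dst, "match": src == dst}

# Illustrative in-memory stand-ins for the legacy and target databases.
legacy = sqlite3.connect(":memory:")
migrated = sqlite3.connect(":memory:")
for conn in (legacy, migrated):
    conn.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
legacy.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 10.0), (2, 20.0)])
migrated.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 10.0), (2, 20.0)])

result = validate_migration(legacy, migrated, "orders")
```

In a real migration the same pattern extends to schema comparisons and KPI aggregates (sums, distinct counts) rather than raw counts alone.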
develop, and maintain ETL/ELT pipelines using Airflow for orchestration and scheduling. Build and manage data transformation workflows in dbt running on Databricks. Optimize data models in Delta Lake for performance, scalability, and cost efficiency. Collaborate with analytics, BI, and data science teams to deliver clean, reliable datasets. Implement data quality checks (dbt tests, monitoring) and … Data Engineer (or similar). Hands-on expertise with: dbt (dbt-core, dbt-databricks adapter, testing & documentation). Apache Airflow (DAG design, operators, scheduling, dependencies). Databricks (Spark, SQL, Delta Lake, job clusters, SQL Warehouses). Strong SQL skills and understanding of data modeling (Kimball, Data Vault, or similar). Proficiency in Python for scripting and pipeline development.
in doing this properly - not endless meetings and PowerPoints. What you'll be doing: Designing, building, and optimising end-to-end data pipelines using Azure Databricks, PySpark, ADF, and Delta Lake Implementing a medallion architecture - from raw to enriched to curated Working with Delta Lake and Spark for both batch and streaming data Collaborating with analysts
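The raw-to-enriched-to-curated flow described above is the medallion (bronze/silver/gold) layering pattern. A minimal plain-Python stand-in for the three layers (in production each function would be a PySpark job writing a Delta table; the field names are invented for the sketch):

```python
def to_bronze(raw_records):
    # Bronze: land raw records as-is, tagged with their source.
    return [{"source": "api", **r} for r in raw_records]

def to_silver(bronze):
    # Silver: clean and conform - drop malformed rows, normalise types.
    return [
        {"id": int(r["id"]), "amount": float(r["amount"])}
        for r in bronze
        if r.get("id") is not None and r.get("amount") is not None
    ]

def to_gold(silver):
    # Gold: curated aggregate ready for BI consumption.
    return {
        "order_count": len(silver),
        "total_amount": sum(r["amount"] for r in silver),
    }

# Invented raw feed: one malformed row is dropped at the silver layer.
raw = [
    {"id": "1", "amount": "10.5"},
    {"id": None, "amount": "3"},
    {"id": "2", "amount": "4.5"},
]
gold = to_gold(to_silver(to_bronze(raw)))
```

The point of the layering is that each table is reproducible from the one below it, so bad data can be reprocessed without re-ingesting from source.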
Birmingham, West Midlands, United Kingdom Hybrid / WFH Options
MYO Talent
Data Engineer/Data Engineering/Data Consultant/Lakehouse/Delta Lake/Data Warehousing/ETL/Azure/Azure Databricks/Python/SQL/Based in the West Midlands/Solihull/Birmingham area, Permanent role, £50,000 - £70,000 + car/allowance (£5,000) + 15% bonus. One of our leading clients … + car/allowance + bonus Experience: Experience in a Data Engineer/Data Engineering role Large and complex datasets Azure, Azure Databricks Microsoft SQL Server Lakehouse, Delta Lake Data Warehousing ETL Database Design Python/PySpark Azure Blob Storage Azure Data Factory Desirable: Exposure to ML/Machine Learning/AI/Artificial Intelligence
Bristol, Avon, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
on coding experience with Python or PySpark Proven expertise in building data pipelines using Azure Data Factory or Fabric Pipelines Solid experience with Azure technologies like Lakehouse Architecture, Data Lake, Delta Lake, and Azure Synapse Strong command of SQL Excellent communication and collaboration skills What's in It for You: Up to £60,000 salary depending on
machine learning models in production. What you'll be doing as Lead ML Ops Engineer: Leading the design and implementation of robust ML Ops pipelines using Azure, Databricks, and Delta Lake Architecting and overseeing API services and caching layers (e.g., Azure Cache for Redis) Driving integration with cloud-based data storage solutions such as Snowflake Collaborating with data … from the Machine Learning Operations Lead: Proven experience in ML Ops leadership, with deep expertise in Azure, Databricks, and cloud-native architectures Strong understanding of Postgres, Redis, Snowflake, and Delta Lake architecture Hands-on experience with Docker, container orchestration, and scalable API design Excellent communication and stakeholder management skills Ability to drive strategic initiatives and influence technical direction
Employment Type: Permanent
Salary: £70000 - £90000/annum 25+bank, bonus + more
data warehousing, or analytics engineering Strong SQL and Python skills with hands-on experience in PySpark Exposure to Azure Databricks, Microsoft Fabric, or similar cloud data platforms Understanding of Delta Lake, Git, and CI/CD workflows Experience with relational data modelling and dimensional modelling Awareness of data governance tools such as Purview or Unity Catalog Excellent analytical
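The "dimensional modelling" listed above refers to Kimball-style star schemas: facts joined to dimensions by surrogate keys. A minimal sketch using the stdlib `sqlite3` module (table names, keys, and data are invented for the illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# A minimal star schema: one fact table joined to one dimension table.
conn.executescript("""
CREATE TABLE dim_customer (customer_key INTEGER PRIMARY KEY, region TEXT);
CREATE TABLE fact_sales (customer_key INTEGER, amount REAL);
INSERT INTO dim_customer VALUES (1, 'UK'), (2, 'US');
INSERT INTO fact_sales VALUES (1, 100.0), (1, 50.0), (2, 75.0);
""")

# Analytical query: revenue by region, resolved via the dimension join.
rows = conn.execute("""
    SELECT d.region, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_customer d ON d.customer_key = f.customer_key
    GROUP BY d.region
    ORDER BY d.region
""").fetchall()
```

The same shape carries over to a lakehouse: facts and dimensions become Delta tables, and BI tools query them through the warehouse's SQL endpoint.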
City of London, London, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
Lambda, EMR) Strong communication skills and a collaborative mindset Comfortable working in Agile environments and engaging with stakeholders Bonus Skills Experience with Apache Iceberg or similar table formats (e.g., Delta Lake, Hudi) Exposure to CI/CD tools like GitHub Actions, GitLab CI, or Jenkins Familiarity with data quality frameworks such as Great Expectations or Deequ Interest in
across SQL Server, PostgreSQL, and cloud databases Proven track record with complex data migration projects (terabyte+ datasets, multiple legacy source systems, structured and unstructured data) Proficiency with Parquet/Delta Lake or other modern data storage formats Experience with streaming architectures using Kafka, Event Hubs, or Kinesis for real-time data processing Knowledge of data architectures supporting AI
machine learning use cases. Support the migration of legacy reporting tools into Databricks and modern BI solutions. Key Skills & Experience Essential: Strong hands-on experience with Databricks (SQL, PySpark, Delta Lake). Solid knowledge of BI and data visualisation tools (e.g., Power BI, Tableau, Qlik). Strong SQL and data modelling skills. Experience working with large, complex financial
environment. What you'll be doing as the Machine Learning Operations Engineer: Developing and maintaining API services using Azure and Databricks Managing caching layers with Azure Cache (Redis) Using Delta Live Tables for data processing and analytics Integrating with cloud-based data storage solutions like Snowflake Collaborating with cross-functional teams in an agile environment Supporting analytics, model deployment … What we're looking for from the Machine Learning Operations Engineer: Solid experience in ML Ops, particularly with Azure and Databricks Familiarity with Postgres, Redis, and Snowflake Understanding of Delta Lake architecture, Docker, and container services Experience building and orchestrating APIs Strong problem-solving and communication skills Bonus: exposure to Azure Functions, Containers, or Insights Benefits for the
Employment Type: Permanent
Salary: £50000 - £70000/annum bonus, 25 days holiday + more
data pipelines and ETL processes Integrating data from multiple sources into centralised repositories Required skills & experience: Strong SQL skills; any experience in Python will be an advantage Databricks, including Delta Lake and Unity Catalog SQL Managed Instance (SQL MI) Power BI Excellent communication and stakeholder engagement skills Experience in building data pipelines Experience working in structured project environments
methodologies Experience designing models across raw, business, and consumption layers Solid understanding of metadata, cataloguing, and data governance practices Working knowledge of modern data platforms such as Databricks, Azure, Delta Lake, etc. Excellent communication and stakeholder engagement skills Data Architect - Benefits: Competitive base salary with regular reviews Car allowance - circa £5k per annum Discretionary company bonus Enhanced pension
unify and democratize data, analytics and AI. Databricks is headquartered in San Francisco, with offices around the globe, and was founded by the original creators of the lakehouse architecture, Apache Spark, Delta Lake and MLflow. To learn more, follow Databricks on Twitter, LinkedIn and Facebook. Benefits At Databricks, we strive to provide comprehensive benefits and perks that meet the needs
Kubernetes stack with secure-by-design tools Update security, software, dependencies and libraries Set up and migrate Spark clusters across platforms Manage user accounts and IdP permissions Maintain the Delta Lake Ensure secure-by-design assurance throughout the platform Experience Required: Strong Linux engineering background Expertise in Kubernetes and Docker Proficient in scripting (Python, Bash) Experience with air