in building and deploying modern data solutions based on Azure Databricks, enabling faster and more informed business decisions. You'll work hands-on with Azure Databricks, Azure Data Factory, Delta Lake, and Power BI to design scalable data pipelines, implement efficient data models, and ensure high-quality data delivery. This is a great opportunity to shape the future … within the organisation while working with advanced cloud technologies. Key Responsibilities and Deliverables Design, develop, and optimise end-to-end data pipelines (batch & streaming) using Azure Databricks, Spark, and Delta Lake. Implement Medallion Architecture to structure raw, enriched, and curated data layers efficiently. Build scalable ETL/ELT processes with Azure Data Factory and PySpark. Support data governance initiatives … Collaborate with analysts to validate and refine datasets for reporting. Apply DevOps and CI/CD best practices (Git, Azure DevOps) for automated testing and deployment. Optimise Spark jobs, Delta Lake tables, and SQL queries for performance and cost-effectiveness. Troubleshoot and proactively resolve data pipeline issues. Partner with data architects, analysts, and business teams to deliver end …
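Editor's note: as context for the Medallion Architecture responsibility above, here is a minimal, hedged PySpark sketch of bronze/silver/gold layering on Delta Lake. All table names, paths, and columns are illustrative assumptions, not details taken from the listing.

```python
# Minimal medallion-layer sketch (bronze -> silver -> gold) on Delta Lake.
# Assumes a Databricks-style environment where `spark` is a live SparkSession
# and the bronze/silver/gold schemas already exist.
from pyspark.sql import functions as F

# Bronze: land raw JSON as-is, stamped with ingestion time.
raw = spark.read.format("json").load("/mnt/landing/orders/")  # hypothetical path
(raw.withColumn("_ingested_at", F.current_timestamp())
    .write.format("delta").mode("append").saveAsTable("bronze.orders"))

# Silver: deduplicate and conform types.
silver = (spark.read.table("bronze.orders")
          .dropDuplicates(["order_id"])
          .filter(F.col("order_id").isNotNull())
          .withColumn("order_ts", F.to_timestamp("order_ts")))
silver.write.format("delta").mode("overwrite").saveAsTable("silver.orders")

# Gold: curated aggregate for reporting (e.g., Power BI).
gold = (silver.groupBy(F.to_date("order_ts").alias("order_date"))
              .agg(F.sum("amount").alias("daily_revenue")))
gold.write.format("delta").mode("overwrite").saveAsTable("gold.daily_revenue")
```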
Reading, Berkshire, United Kingdom Hybrid / WFH Options
Bowerford Associates
Degree in Computer Science, Software Engineering, or similar (applied to Data/Data Specialisation). Extensive experience in Data Engineering, in both Cloud & On-Prem, Big Data and Data Lake environments. Expert knowledge in data technologies, data transformation tools, data governance techniques. Strong analytical and problem-solving abilities. Good understanding of Quality and Information Security principles. Effective communication, ability … monitoring/security is necessary. Significant AWS or Azure hands-on experience. ETL tools such as Azure Data Factory (ADF) and Databricks, or similar. Data Lakes: Azure Data, Delta Lake, Data Lake or Databricks Lakehouse. Certifications: AWS, Azure, or Cloudera certifications are a plus. To be considered for this role you MUST have in-depth experience … role. KEYWORDS Lead Data Engineer, Senior Data Engineer, Spark, Java, Python, PySpark, Scala, Big Data, AWS, Azure, Cloud, On-Prem, ETL, Azure Data Factory, ADF, Hadoop, HDFS, Azure Data, Delta Lake, Data Lake Please note that due to a high level of applications, we can only respond to applicants whose skills and qualifications are suitable for this …
Employment Type: Permanent
Salary: £75000 - £80000/annum Pension, Good Holiday, Healthcare
Data Science, Analytics, and DevOps teams to align operational strategies with technical and business requirements. Optimize operational performance and cost management for services including Azure Data Factory, Azure Databricks, Delta Lake, and Azure Data Lake Storage. Serve as the domain expert in DataOps by providing strategic guidance, mentoring colleagues, and driving continuous process improvements. What you will … and driving automation of data workflows within the Microsoft Azure ecosystem. Hands-on expertise with the Azure Data Platform, including components such as Azure Data Factory, Azure Databricks, Azure Data Lake Storage, Delta Lake, Azure SQL, Purview and APIM. Proficiency in developing CI/CD data pipelines and strong programming skills in Python, SQL, Bash, and PySpark for …
Leeds, West Yorkshire, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
for someone with strong Databricks expertise to join their team. About the role: Designing and developing robust data pipelines using Azure Data Factory, Databricks, and Synapse Analytics. Working with Delta Lake and Azure Data Lake Storage to manage and optimise large datasets. Collaborating with data analysts, engineers, and business stakeholders to deliver clean, reliable data. Supporting the … of legacy systems to a modern Azure-based architecture. Ensuring best practices in data governance, security, and performance tuning. Requirements: Proven experience with Azure Data Services (ADF, Synapse, Data Lake). Strong hands-on experience with Databricks (including PySpark or SQL). Solid SQL skills and understanding of data modelling and ETL/ELT processes. Familiarity with Delta Lake …
principles, including data modeling, data warehousing, data integration, and data governance. Databricks Expertise: They have hands-on experience with the Databricks platform, including its various components such as Spark, Delta Lake, MLflow, and Databricks SQL. They are proficient in using Databricks for various data engineering and data science tasks. Cloud Platform Proficiency: They are familiar with cloud platforms … integration patterns. Extensive experience with big data technologies and cloud computing, specifically Azure (minimum 3+ years hands-on experience with Azure data services). Strong experience with Azure Databricks, Delta Lake, and other relevant Azure services. Active Azure Certifications: At least one of the following is required: Microsoft Certified: Azure Data Engineer Associate Microsoft Certified: Azure Data Scientist …
key role in the design and delivery of advanced Databricks solutions within the Azure ecosystem. Responsibilities: Design, build, and optimise end-to-end data pipelines using Azure Databricks, including Delta Live Tables. Collaborate with stakeholders to define technical requirements and propose Databricks-based solutions. Drive best practices for data engineering. Help clients realise the potential of data science, machine … Support with planning, requirements refinements, and work estimation. Skills & Experiences: Proven experience designing and implementing data solutions in Azure using Databricks as a core platform. Hands-on expertise in Delta Lake, Delta Live Tables and Databricks Workflows. Strong coding skills in Python and SQL, with experience in developing modular, reusable code in Databricks. Deep understanding of lakehouse …
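For readers unfamiliar with Delta Live Tables (named in the listing above), a minimal illustrative pipeline might look like the following; source paths and table names are assumptions.

```python
# Sketch of a Delta Live Tables pipeline: raw ingestion plus a cleansed layer.
# Runs inside a Databricks DLT pipeline, where `spark` is provided.
import dlt
from pyspark.sql import functions as F

@dlt.table(comment="Raw orders ingested from cloud storage via Auto Loader.")
def orders_raw():
    return (spark.readStream.format("cloudFiles")      # Auto Loader
            .option("cloudFiles.format", "json")
            .load("/mnt/landing/orders/"))             # hypothetical path

@dlt.table(comment="Cleansed orders with typed timestamps.")
@dlt.expect_or_drop("valid_order_id", "order_id IS NOT NULL")  # data-quality rule
def orders_clean():
    return (dlt.read_stream("orders_raw")
            .withColumn("order_ts", F.to_timestamp("order_ts")))
```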
a high-performing and secure environment. The role reports to a project delivery lead and works closely with internal technical teams. Key Responsibilities: Design and implement Databricks Lakehouse architecture (Delta Lake, Unity Catalog, etc.) Develop ETL/ELT pipelines using Spark, Python, SQL, and Databricks workflows Integrate with Azure services and BI tools (e.g., Power BI) Optimise performance … and support CI/CD and MLOps pipelines Enable knowledge transfer through code reviews, training, and reusable templates Key Skills: In-depth experience with Databricks (Delta Lake, Unity Catalog, Lakehouse architecture). Strong knowledge of Azure services (e.g. Data Lake, Data Factory, Synapse). Solid hands-on skills in Spark, Python, PySpark, and SQL. Understanding of …
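As a companion to the Unity Catalog skills listed above, a minimal governance sketch; all catalog, schema, and principal names here are hypothetical.

```python
# Unity Catalog setup sketch: three-level namespace plus a grant,
# issued from a Databricks notebook where `spark` is available.
spark.sql("CREATE CATALOG IF NOT EXISTS analytics")
spark.sql("CREATE SCHEMA IF NOT EXISTS analytics.silver")
spark.sql("""
    CREATE TABLE IF NOT EXISTS analytics.silver.orders (
        order_id STRING,
        amount   DECIMAL(10, 2),
        order_ts TIMESTAMP
    ) USING DELTA
""")
# Governed, auditable access for an analyst group (hypothetical principal).
spark.sql("GRANT SELECT ON TABLE analytics.silver.orders TO `data_analysts`")
```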
spoke data architectures, optimising for performance, scalability, and security. Collaborate with business stakeholders, data engineers, and analytics teams to ensure solutions are fit for purpose. Implement and optimise Databricks Delta Lake, Medallion Architecture, and Lakehouse patterns for structured and semi-structured data. Ensure best practices in Azure networking, security, and federated data access. Key Skills & Experience 5+ years of experience in data architecture, solution architecture, or cloud data engineering roles. Strong expertise in Azure data services, including: Azure Databricks (Delta Lake, Unity Catalog, MLflow) Azure Event Hub and Azure Data Explorer for real-time and streaming data pipelines Azure Storage (ADLS Gen2, Blob Storage) for scalable data lakes Azure Purview or equivalent for data governance …
Bristol, Avon, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
on coding experience with Python or PySpark. Proven expertise in building data pipelines using Azure Data Factory or Fabric Pipelines. Solid experience with Azure technologies like Lakehouse Architecture, Data Lake, Delta Lake, and Azure Synapse. Strong command of SQL. Excellent communication and collaboration skills. What's in It for You: Up to £60,000 salary depending on …
Birmingham, West Midlands, United Kingdom Hybrid / WFH Options
MYO Talent
Data Engineer/Data Engineering/Lakehouse/Delta Lake/Data Warehousing/ETL/Azure/Azure Databricks/Python/SQL/Based in the West Midlands/Solihull/Birmingham area (1 day per week), Permanent role, £50,000 - £70,000 + car/allowance + bonus. One of our leading clients is looking … + car/allowance + bonus Experience: Experience in a Data Engineer/Data Engineering role Large and complex datasets Azure, Azure Databricks Microsoft SQL Server Lakehouse, Delta Lake Data Warehousing ETL CDC Stream Processing Database Design ML Python/PySpark Azure Blob Storage Parquet Azure Data Factory Desirable: Any exposure working in a software house, consultancy, retail …
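For the CDC and stream-processing keywords above, one common Delta Lake pattern is reading a table's Change Data Feed as a stream. The sketch below is illustrative: table names and paths are assumptions, and CDF must already be enabled on the source table.

```python
# Sketch: stream inserts/updates out of a Delta table's Change Data Feed (CDC).
# Assumes table property `delta.enableChangeDataFeed = true` on silver.orders.
changes = (spark.readStream.format("delta")
           .option("readChangeFeed", "true")
           .table("silver.orders"))

(changes.filter("_change_type IN ('insert', 'update_postimage')")
        .writeStream.format("delta")
        .option("checkpointLocation", "/mnt/chk/orders_cdc")  # hypothetical path
        .toTable("gold.orders_current"))
```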
and managing data and AI solutions - Experience building ETL pipelines, managing data pipelines, and working with large datasets using tools like Spark, Python, and SQL - Experience with technologies like Delta Lake, Delta Live Tables, and Databricks Workflows - Experience collaborating with data scientists - Familiarity with Advana - Strong Python programming skills - Solid SQL knowledge for querying and data manipulation …
data estate, built primarily on Microsoft Azure and Databricks. Key Responsibilities: Design and implement scalable, secure cloud-based data architecture (Azure & Databricks) Develop optimised data models and pipelines using Delta Lake and Azure services Define data standards, policies, and governance practices aligned with compliance Enable real-time analytics and machine learning use cases across business functions Ensure data … engineering and architecture Collaborate with internal stakeholders and third-party vendors Key Skills & Experience: Proven background designing and delivering enterprise-scale data platforms Strong knowledge of Microsoft Azure, Databricks, Delta Lake, and data warehousing Advanced data modelling and ETL/ELT optimisation experience Familiarity with regulatory frameworks such as IFRS 17, BCBS 239, and UK Data Protection Excellent …
Azure-based data solutions, with a minimum of 5 years' hands-on experience in Azure implementations. Strong technical expertise across key Azure services, including Azure Data Factory, Databricks, Data Lake, Delta Lake, Synapse Analytics, Power BI, Key Vault, Automation Account, PowerShell, SQL Database, and broader Big Data platforms. Comprehensive understanding of the Azure ecosystem and its architectural …
and Lakehouse architectures to ensure our data solutions are secure, efficient, and optimized. Key Responsibilities: Design and implement data solutions using Azure services, including Azure Databricks, ADF, and Data Lake Storage. Develop and maintain ETL/ELT pipelines to process structured and unstructured data from multiple sources. Automate loads using Databricks workflows and Jobs. Develop, test and build CI … modeling, warehousing, and real-time streaming. Knowledge of developing and processing full and incremental loads. Experience of automated loads using Databricks workflows and Jobs. Expertise in Azure Databricks, including Delta Lake, Spark optimizations, and MLflow. Strong experience with Azure Data Factory (ADF) for data integration and orchestration. Hands-on experience with Azure DevOps, including pipelines, repos, and infrastructure …
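The "full and incremental loads" requirement above typically maps to a Delta Lake MERGE upsert; a minimal sketch with assumed table and column names:

```python
# Incremental upsert into a Delta table via MERGE; names are illustrative.
from delta.tables import DeltaTable

target = DeltaTable.forName(spark, "silver.customers")
updates = spark.read.table("bronze.customers_batch")  # latest incremental batch

(target.alias("t")
 .merge(updates.alias("s"), "t.customer_id = s.customer_id")
 .whenMatchedUpdateAll()       # update rows that changed
 .whenNotMatchedInsertAll()    # insert rows not yet present
 .execute())
```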
Spark • Strong programming skills in Python, with experience in data manipulation libraries (e.g., PySpark, Spark SQL) • Experience with core components of the Databricks ecosystem: Databricks Workflows, Unity Catalog, and Delta Live Tables • Solid understanding of data warehousing principles, ETL/ELT processes, data modeling and techniques, and database systems • Proven experience with at least one major cloud platform (Azure … and scalable ETL processes to ingest, transform, and load data from various sources (databases, APIs, streaming platforms) into cloud-based data lakes and warehouses • Leveraging the Databricks ecosystem (SQL, Delta Lake, Workflows, Unity Catalog) to deliver reliable and performant data workflows • Integrating with cloud services such as Azure, AWS, or GCP to enable secure, cost-effective data solutions …
least 10 years' experience in Business Intelligence, with 5+ years in a BI leadership role in a global or matrixed organisation. Proven expertise in modern BI architecture (Data Lake, EDW, Streaming, APIs, Real-Time & Batch Processing). Demonstrated experience delivering cloud-based analytics platforms (Azure, AWS, GCP). Strong knowledge of data governance, cataloguing, security, automation, and self … The Head of Data Engineering & Insight will work within a modern, cloud-based BI ecosystem, including: Data Integration: Fivetran, HVR, Databricks, Apache Kafka, Google BigQuery, Google Analytics 4 Data Lake & Storage: Databricks Delta Lake, Amazon S3 Data Transformation: dbt Cloud Data Warehouse: Snowflake Analytics & Reporting: Power BI, Excel, Snowflake SQL REST API Advanced Analytics: Databricks (AI & Machine …
Deep knowledge of ETL/ELT frameworks and orchestration tools (e.g., Airflow, Azure Data Factory, Dagster). Proficient in cloud platforms (preferably Azure) and services such as Data Lake, Synapse, Event Hubs, and Functions. Authoring reports and dashboards with either open source or commercial products (e.g. PowerBI, Plot.ly, matplotlib) Programming OOP DevOps Web technologies HTTP/S REST … APIs Experience with time-series databases (e.g., InfluxDB, kdb+, TimescaleDB) and real-time data processing. Familiarity with distributed computing and data warehousing technologies (e.g., Spark, Snowflake, Delta Lake). Strong understanding of data governance, master data management, and data quality frameworks. Solid grasp of web technologies and APIs (REST, JSON, XML, authentication protocols). Experience with DevOps practices …
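Since the listing above names Airflow among its orchestration tools, here is a minimal DAG sketch (Airflow 2.x; DAG id, task logic, and names are placeholders):

```python
# Minimal Airflow 2.x DAG sketch: a daily extract -> transform chain.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    pass  # placeholder: pull data from a source system

def transform():
    pass  # placeholder: clean and load the extracted data

with DAG(
    dag_id="daily_data_pipeline",        # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_extract >> t_transform             # run transform after extract succeeds
```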
London, South East, England, United Kingdom Hybrid / WFH Options
Robert Half
years of experience in a data engineering or similar technical role Hands-on experience with key Microsoft Azure services: Azure Data Factory, Azure Synapse Analytics, Azure Data Lake Storage, Azure SQL Database Solid understanding of data modeling, ETL/ELT, and warehousing concepts Proficiency in SQL and one or more programming languages (e.g., Python, Scala) Exposure to Microsoft Fabric … Familiarity with software testing methodologies and development team collaboration Experience working with Power BI and DAX Strong documentation, communication, and stakeholder engagement skills Preferred Qualifications: Experience with Lakehouse architecture, Delta Lake, or Databricks Exposure to Agile/Scrum working practices Microsoft certifications (e.g., Azure Data Engineer Associate) Background in consulting or professional services Understanding of data governance and …
AWS, GCP) including AWS primitives such as IAM, S3, RDS, EMR, ECS and more Advanced experience working with, and understanding the tradeoffs of, at least one of the following Data Lake table/file formats: Delta Lake, Parquet, Iceberg, Hudi Previous hands-on expertise with Spark Experience working with containerisation technologies - Docker, Kubernetes Streaming Knowledge: Experience with …
modern data platforms and engineering practices. Key competencies include: Databricks Platform Expertise: Proven experience designing and delivering data solutions using Databricks on Azure or AWS. Databricks Components: Proficient in Delta Lake, Unity Catalog, MLflow, and other core Databricks tools. Programming & Query Languages: Strong skills in SQL and Apache Spark (Scala or Python). Relational Databases: Experience with on …
Deep knowledge of ETL/ELT frameworks and orchestration tools (e.g., Airflow, Azure Data Factory, Dagster). Proficient in cloud platforms (preferably Azure) and services such as Data Lake, Synapse, Event Hubs, and Functions. Authoring reports and dashboards with either open source or commercial products (e.g. PowerBI, Plot.ly, matplotlib) Programming OOP DevOps Application development lifecycle Web technologies HTTP/… CSV, JSON, XML, Parquet) Experience with time-series databases (e.g., InfluxDB, kdb+, TimescaleDB) and real-time data processing. Familiarity with distributed computing and data warehousing technologies (e.g., Spark, Snowflake, Delta Lake). Strong understanding of data governance, master data management, and data quality frameworks. Excellent communication and stakeholder management skills. Ability to mentor junior engineers and foster a …
Engineer on cloud-based projects Strong hands-on skills with Databricks, Apache Spark, and Python or Scala Proficient in SQL and working with large-scale data environments Experience with Delta Lake, Azure Data Lake, or similar technologies Familiarity with version control, CI/CD, and infrastructure-as-code tools Ability to deliver independently in a contractor capacity …
Qualifications Required Skills & Experience: 8+ years of experience in data engineering with at least 3 years in Databricks, Spark, and distributed data systems Proficiency with Python, SQL, Spark, and Delta Lake Bachelor's Degree in a STEM field with a preference towards Computer Science and Software Engineering Experience with Natural Language Processing Due to US Government Contract Requirements …