Reading, Berkshire, United Kingdom Hybrid / WFH Options
Bowerford Associates
Degree in Computer Science, Software Engineering, or similar (applied to Data/Data Specialisation). Extensive experience in Data Engineering across Cloud and On-Prem, Big Data and Data Lake environments. Expert knowledge of data technologies, data transformation tools, and data governance techniques. Strong analytical and problem-solving abilities. Good understanding of Quality and Information Security principles. Effective communication, ability … monitoring/security is necessary. Significant AWS or Azure hands-on experience. ETL tools such as Azure Data Factory (ADF) and Databricks, or similar. Data Lakes: Azure Data Lake, Delta Lake, or Databricks Lakehouse. Certifications: AWS, Azure, or Cloudera certifications are a plus. To be considered for this role you MUST have in-depth experience … role. KEYWORDS Lead Data Engineer, Senior Data Engineer, Spark, Java, Python, PySpark, Scala, Big Data, AWS, Azure, Cloud, On-Prem, ETL, Azure Data Factory, ADF, Hadoop, HDFS, Azure Data Lake, Delta Lake. Please note that due to a high level of applications, we can only respond to applicants whose skills and qualifications are suitable for this …
Employment Type: Permanent
Salary: £75,000 - £80,000/annum + Pension, Good Holiday, Healthcare
Data Science, Analytics, and DevOps teams to align operational strategies with technical and business requirements. Optimize operational performance and cost management for services including Azure Data Factory, Azure Databricks, Delta Lake, and Azure Data Lake Storage. Serve as the domain expert in DataOps by providing strategic guidance, mentoring colleagues, and driving continuous process improvements. What you will … and driving automation of data workflows within the Microsoft Azure ecosystem. Hands-on expertise with the Azure Data Platform, including components such as Azure Data Factory, Azure Databricks, Azure Data Lake Storage, Delta Lake, Azure SQL, Purview, and APIM. Proficiency in developing CI/CD data pipelines and strong programming skills in Python, SQL, Bash, and PySpark for …
Leeds, West Yorkshire, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
for someone with strong Databricks expertise to join their team. About the role: Designing and developing robust data pipelines using Azure Data Factory, Databricks, and Synapse Analytics. Working with Delta Lake and Azure Data Lake Storage to manage and optimise large datasets. Collaborating with data analysts, engineers, and business stakeholders to deliver clean, reliable data. Supporting the … of legacy systems to a modern Azure-based architecture Ensuring best practices in data governance, security, and performance tuning Requirements: Proven experience with Azure Data Services (ADF, Synapse, Data Lake) Strong hands-on experience with Databricks (including PySpark or SQL) Solid SQL skills and understanding of data modelling and ETL/ELT processes Familiarity with Delta Lake …
principles, including data modeling, data warehousing, data integration, and data governance. Databricks Expertise: They have hands-on experience with the Databricks platform, including its various components such as Spark, Delta Lake, MLflow, and Databricks SQL. They are proficient in using Databricks for various data engineering and data science tasks. Cloud Platform Proficiency: They are familiar with cloud platforms … integration patterns. Extensive experience with big data technologies and cloud computing, specifically Azure (minimum 3+ years hands-on experience with Azure data services). Strong experience with Azure Databricks, Delta Lake, and other relevant Azure services. Active Azure Certifications: At least one of the following is required: Microsoft Certified: Azure Data Engineer Associate Microsoft Certified: Azure Data Scientist …
a high-performing and secure environment. The role reports to a project delivery lead and works closely with internal technical teams. Key Responsibilities: Design and implement Databricks Lakehouse architecture (Delta Lake, Unity Catalog, etc.) Develop ETL/ELT pipelines using Spark, Python, SQL, and Databricks workflows Integrate with Azure services and BI tools (e.g., Power BI) Optimise performance … and support CI/CD and MLOps pipelines Enable knowledge transfer through code reviews, training, and reusable templates Key Skills: In-depth experience with Databricks (Delta Lake, Unity Catalog, Lakehouse architecture). Strong knowledge of Azure services (e.g. Data Lake, Data Factory, Synapse). Solid hands-on skills in Spark, Python, PySpark, and SQL. Understanding of …
Bristol, Avon, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
on coding experience with Python or PySpark Proven expertise in building data pipelines using Azure Data Factory or Fabric Pipelines Solid experience with Azure technologies like Lakehouse Architecture, Data Lake, Delta Lake, and Azure Synapse Strong command of SQL Excellent communication and collaboration skills What's in It for You: Up to £60,000 salary depending on …
Birmingham, West Midlands, United Kingdom Hybrid / WFH Options
MYO Talent
Data Engineer/Data Engineering/Lakehouse/Delta Lake/Data Warehousing/ETL/Azure/Azure Databricks/Python/SQL/Based in the West Midlands/Solihull/Birmingham area (1 day per week), Permanent role, £50,000 - £70,000 + car/allowance + bonus. One of our leading clients is looking … + car/allowance + bonus Experience: Experience in a Data Engineer/Data Engineering role Large and complex datasets Azure, Azure Databricks Microsoft SQL Server Lakehouse, Delta Lake Data Warehousing ETL CDC Stream Processing Database Design ML Python/PySpark Azure Blob Storage Parquet Azure Data Factory Desirable: Any exposure working in a software house, consultancy, retail …
data estate, built primarily on Microsoft Azure and Databricks Key Responsibilities: Design and implement scalable, secure cloud-based data architecture (Azure & Databricks) Develop optimised data models and pipelines using Delta Lake and Azure services Define data standards, policies, and governance practices aligned with compliance Enable real-time analytics and machine learning use cases across business functions Ensure data … engineering and architecture Collaborate with internal stakeholders and third-party vendors Key Skills & Experience: Proven background designing and delivering enterprise-scale data platforms Strong knowledge of Microsoft Azure, Databricks, Delta Lake, and data warehousing Advanced data modelling and ETL/ELT optimisation experience Familiarity with regulatory frameworks such as IFRS 17, BCBS 239, and UK Data Protection Excellent …
Azure-based data solutions, with a minimum of 5 years' hands-on experience in Azure implementations. Strong technical expertise across key Azure services, including Azure Data Factory, Databricks, Data Lake, Delta Lake, Synapse Analytics, Power BI, Key Vault, Automation Account, PowerShell, SQL Database, and broader Big Data platforms. Comprehensive understanding of the Azure ecosystem and its architectural …
and Lakehouse architectures to ensure our data solutions are secure, efficient, and optimized. Key Responsibilities: Design and implement data solutions using Azure services, including Azure Databricks, ADF, and Data Lake Storage. Develop and maintain ETL/ELT pipelines to process structured and unstructured data from multiple sources. Automate loads using Databricks workflows and Jobs Develop, test and build CI … modeling, warehousing, and real-time streaming. Knowledge of developing and processing full and incremental loads. Experience of automated loads using Databricks workflows and Jobs Expertise in Azure Databricks, including Delta Lake, Spark optimizations, and MLflow. Strong experience with Azure Data Factory (ADF) for data integration and orchestration. Hands-on experience with Azure DevOps, including pipelines, repos, and infrastructure …
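Several of these ads ask for experience with "full and incremental loads". As a loose illustration of the incremental pattern (not any employer's actual stack), the core idea is a stored watermark: only rows modified after the last processed timestamp are upserted, and the watermark then advances. The sketch below uses plain Python with a hypothetical `SOURCE` list standing in for a source table:

```python
# Hypothetical source rows; in a real pipeline these would come from a
# database or Delta table, not an in-memory list.
SOURCE = [
    {"id": 1, "updated_at": "2024-01-01T10:00:00", "value": "a"},
    {"id": 2, "updated_at": "2024-01-02T09:30:00", "value": "b"},
    {"id": 3, "updated_at": "2024-01-03T12:15:00", "value": "c"},
]

def incremental_load(source, target, watermark):
    """Upsert only rows modified after the stored watermark, then advance it."""
    new_watermark = watermark
    for row in source:
        ts = row["updated_at"]
        if ts > watermark:               # ISO-8601 strings sort chronologically
            target[row["id"]] = row      # upsert keyed on the primary key
            new_watermark = max(new_watermark, ts)
    return new_watermark

target = {}
wm = incremental_load(SOURCE, target, "2024-01-01T23:59:59")
print(len(target), wm)  # rows 1 is skipped; only rows after the watermark load
```

In a Databricks setting the same watermark would typically live in a control table and the upsert would be a Delta `MERGE`, but the bookkeeping logic is the same.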
. Deep knowledge of ETL/ELT frameworks and orchestration tools (e.g., Airflow, Azure Data Factory, Dagster). Proficient in cloud platforms (preferably Azure) and services such as Data Lake, Synapse, Event Hubs, and Functions. Authoring reports and dashboards with either open source or commercial products (e.g. PowerBI, Plot.ly, matplotlib) Programming OOP DevOps Web technologies HTTP/S REST … APIs Experience with time-series databases (e.g., InfluxDB, kdb+, TimescaleDB) and real-time data processing. Familiarity with distributed computing and data warehousing technologies (e.g., Spark, Snowflake, Delta Lake). Strong understanding of data governance, master data management, and data quality frameworks. Solid grasp of web technologies and APIs (REST, JSON, XML, authentication protocols). Experience with DevOps practices …
London, South East, England, United Kingdom Hybrid / WFH Options
Robert Half
years of experience in a data engineering or similar technical role Hands-on experience with key Microsoft Azure services: Azure Data Factory Azure Synapse Analytics Azure Data Lake Storage Azure SQL Database Solid understanding of data modeling, ETL/ELT, and warehousing concepts Proficiency in SQL and one or more programming languages (e.g., Python, Scala) Exposure to Microsoft Fabric … Familiarity with software testing methodologies and development team collaboration Experience working with Power BI and DAX Strong documentation, communication, and stakeholder engagement skills Preferred Qualifications: Experience with Lakehouse architecture, Delta Lake, or Databricks Exposure to Agile/Scrum working practices Microsoft certifications (e.g., Azure Data Engineer Associate) Background in consulting or professional services Understanding of data governance and …
. Deep knowledge of ETL/ELT frameworks and orchestration tools (e.g., Airflow, Azure Data Factory, Dagster). Proficient in cloud platforms (preferably Azure) and services such as Data Lake, Synapse, Event Hubs, and Functions. Authoring reports and dashboards with either open source or commercial products (e.g. PowerBI, Plot.ly, matplotlib) Programming OOP DevOps Application development lifecycle Web technologies HTTP … CSV, JSON, XML, Parquet) Experience with time-series databases (e.g., InfluxDB, kdb+, TimescaleDB) and real-time data processing. Familiarity with distributed computing and data warehousing technologies (e.g., Spark, Snowflake, Delta Lake). Strong understanding of data governance, master data management, and data quality frameworks. Excellent communication and stakeholder management skills. Ability to mentor junior engineers and foster a …
Team Valley Trading Estate, Gateshead, Tyne and Wear, England, United Kingdom
Nigel Wright Group
modelling and downstream integration into Power BI. Other key responsibilities will include: Build scalable ETL/ELT pipelines using Azure Data Factory and Synapse Model and store data using Delta Lake Contribute to CI/CD workflows for data pipelines Support data APIs from platforms like Intelligent Office, AJ Bell etc. Document architecture, data flows and best practices …
spoke data architectures, optimising for performance, scalability, and security. Collaborate with business stakeholders, data engineers, and analytics teams to ensure solutions are fit for purpose. Implement and optimise Databricks Delta Lake, Medallion Architecture, and Lakehouse patterns for structured and semi-structured data. Ensure best practices in Azure networking, security, and federated data access. Key Skills & Experience 5+ …
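The "Medallion Architecture" named above layers data as bronze (raw), silver (cleaned and deduplicated), and gold (aggregated for consumption). As a conceptual sketch only, with plain dicts standing in for Delta tables and every name (`bronze`, `to_silver`, `to_gold`) purely illustrative:

```python
# Minimal medallion-architecture sketch: raw (bronze) records are cleaned
# into silver, then aggregated into gold. Plain Python structures stand in
# for Delta tables; this is not a specific platform API.

bronze = [  # raw ingested events, duplicates and bad rows included
    {"order_id": "A1", "amount": "10.50", "region": "UK"},
    {"order_id": "A1", "amount": "10.50", "region": "UK"},   # duplicate
    {"order_id": "B2", "amount": "bad",   "region": "UK"},   # unparseable
    {"order_id": "C3", "amount": "4.25",  "region": "FR"},
]

def to_silver(rows):
    """Deduplicate on order_id and cast amounts, dropping invalid rows."""
    seen, silver = set(), []
    for r in rows:
        if r["order_id"] in seen:
            continue
        try:
            amount = float(r["amount"])
        except ValueError:
            continue  # quarantine/drop rows that fail validation
        seen.add(r["order_id"])
        silver.append({"order_id": r["order_id"], "amount": amount, "region": r["region"]})
    return silver

def to_gold(rows):
    """Aggregate the cleaned layer into per-region revenue."""
    gold = {}
    for r in rows:
        gold[r["region"]] = gold.get(r["region"], 0.0) + r["amount"]
    return gold

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # per-region totals computed from the cleaned silver layer
```

In Databricks each layer would be its own Delta table, with the silver and gold steps expressed as Spark transformations or SQL, but the layering discipline is what the pattern actually prescribes.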
London, South East, England, United Kingdom Hybrid / WFH Options
WüNDER TALENT
solution design. Requirements Proven experience as a Data Engineer in cloud-first environments. Strong commercial knowledge of AWS services (e.g. S3, Glue, Redshift). Advanced PySpark and Databricks experience (Delta Lake, Unity Catalog, Databricks Jobs, etc.). Proficient in SQL (T-SQL/SparkSQL) and Python for data transformation and scripting. Hands-on experience with workflow orchestration tools …
unify and democratize data, analytics and AI. Databricks is headquartered in San Francisco, with offices around the globe, and was founded by the original creators of Lakehouse, Apache Spark, Delta Lake and MLflow. To learn more, follow Databricks on Twitter, LinkedIn and Facebook. Benefits At Databricks, we strive to provide comprehensive benefits and perks that meet the …
Databricks for improved traceability Implement Unity Catalog for automated data lineage Deliver backlog items through Agile sprint planning Skills & Experience Strong hands-on experience with Databricks, Fabric, Apache Spark, Delta Lake Proficient in Python, SQL, and PySpark Familiar with Azure Data Factory, Event Hub, Unity Catalog Solid understanding of data governance and enterprise architecture Effective communicator with experience …
like Informatica, Glue, Databricks and DataProc, with strong coding skills in Python, PySpark and SQL. You will have expertise in data warehousing solutions such as Snowflake, BigQuery, Lakehouse and Delta Lake, including the ability to calculate processing costs and address performance issues. A solid understanding of DevOps and infrastructure needs is also required. Job Description: Work …
native multi-tenant technology to unlock superior customer experience, development velocity & operational efficiency Lead design and implementation of a unified data platform from ingestion, processing through transformation pipelines leveraging Databricks Delta Lake Drive adoption of modern data stack for core platform, orchestration, streaming, data quality, metadata management, monitoring and governance Ensure platform scalability, reliability, and performance to support rapidly …