teams to align operational strategies with technical and business requirements. Optimize operational performance and cost management for services including Azure Data Factory, Azure Databricks, Delta Lake, and Azure Data Lake Storage. Serve as the domain expert in DataOps by providing strategic guidance, mentoring colleagues, and driving continuous … of data workflows within the Microsoft Azure ecosystem. Hands-on expertise with Azure Data Platform components including Azure Data Factory, Azure Databricks, Azure Data Lake Storage, Delta Lake, Azure SQL, Purview and APIM. Proficiency in developing CI/CD data pipelines and strong programming skills in Python …
Bradford, Yorkshire and the Humber, United Kingdom
Peregrine
data-driven initiatives. Job Specification (Technical Skills): Cloud Platforms: Expert-level proficiency in Azure (Data Factory, Databricks, Spark, SQL Database, DevOps/Git, Data Lake, Delta Lake, Power BI), with working knowledge of Azure WebApp and Networking. Conceptual understanding of Azure AI Services, ML, Unity Catalog, and … Advanced proficiency in SQL, Python, and at least one additional programming language (Java, C#, C++) is desired. Proven experience with data warehousing and data lake technologies. Solid understanding of database systems (SQL, NoSQL). Platform Architecture: Able to develop and implement data platform architecture (data lakes, data warehouses, data …
Ensure Data Security: Apply protocols and standards to secure clinical data both in motion and at rest. Shape Data Workflows: Utilize Databricks components like Delta Lake, Unity Catalog, and MLflow to ensure efficient, secure, and reliable data workflows. Key Responsibilities Data Engineering with Databricks: Design and maintain … ETL/ELT processes, and data lakes to support data analytics and machine learning. Requirements Expertise in Databricks: Proficiency with Databricks components such as Delta Lake, Unity Catalog, and MLflow. Azure Data Factory Knowledge: Experience with Azure Data Factory for data orchestration. Clinical Data Security: Understanding of …
London, South East England, United Kingdom Hybrid / WFH Options
Focus on SAP
Employment Type: Contract, Full time Start: ASAP Location: London - Hybrid Languages: English Key skills: 5+ years as a Data Engineer. Proven expertise in Databricks (including Delta Lake, Workflows, Unity Catalog). Strong command of Apache Spark, SQL, and Python. Hands-on experience with cloud platforms (AWS, Azure, or GCP … Apache Spark. Collaborate with Data Scientists, Analysts, and Product teams to understand data needs and deliver clean, reliable datasets. Optimize data workflows and storage (Delta Lake, Lakehouse architecture). Manage and monitor data pipelines in cloud environments (AWS, Azure, or GCP). Work with structured and unstructured data …
enriching, and aggregating data. Ensure data consistency and accuracy throughout the data lifecycle. Azure Databricks Implementation: Work extensively with Azure Databricks Unity Catalog, including Delta Lake, Spark SQL, and other relevant services. Implement best practices for Databricks development and deployment. Optimise Databricks workloads for performance and cost. Need …
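The cleansing, enriching, and aggregating responsibilities described in the listing above can be sketched in plain Python. This is a minimal illustration with hypothetical data and function names; a role like this would implement the same flow with Spark SQL over Delta Lake tables rather than in-memory lists:

```python
# Hypothetical cleanse -> enrich -> aggregate flow (illustration only;
# Spark/Delta Lake specifics omitted). All names and data are made up.
from collections import defaultdict

def cleanse(rows):
    # Drop records missing key fields and normalise casing/whitespace.
    return [
        {**r, "region": r["region"].strip().lower()}
        for r in rows
        if r.get("region") and r.get("amount") is not None
    ]

def enrich(rows, region_names):
    # Join in a human-readable region name from a lookup table.
    return [{**r, "region_name": region_names.get(r["region"], "unknown")}
            for r in rows]

def aggregate(rows):
    # Sum amounts per region name.
    totals = defaultdict(float)
    for r in rows:
        totals[r["region_name"]] += r["amount"]
    return dict(totals)

raw = [
    {"region": " UK ", "amount": 10.0},
    {"region": "uk", "amount": 5.0},
    {"region": None, "amount": 3.0},   # dropped by cleanse
]
result = aggregate(enrich(cleanse(raw), {"uk": "United Kingdom"}))
print(result)  # {'United Kingdom': 15.0}
```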
Coalville, Leicestershire, East Midlands, United Kingdom Hybrid / WFH Options
Ibstock PLC
Spark for data engineering and analytics. Proficient in SQL and Python/PySpark for data transformation and analysis. Experience in data lakehouse development and Delta Lake optimisation. Experience with ETL/ELT processes for integrating diverse data sources. Experience in gathering, documenting, and refining requirements from key business …
Strong experience designing and delivering data solutions in the Databricks Data Intelligence platform, either on Azure or AWS. Good working knowledge of Databricks components: Delta Lake, Unity Catalog, MLflow, etc. Expertise in SQL, Python and Spark (Scala or Python). Experience working with relational SQL databases either on premises or …
customers to comprehend their business and technical needs, to develop tailored technical architectures and solutions in the Cloud, focusing on data engineering, data lakes, lakehouses, business intelligence and machine learning/AI. Cost Optimization: You will be continuously trying to optimize run costs - both on platform level as … and supporting Big Data solutions for data lakes and data warehouses Expertise in cloud-based Big Data solutions is required - preferably with Azure Data Lake and related technological stack: ADLS Gen2, Spark/Databricks, Delta Lake, Kafka/Event Hubs, Stream Analytics, Azure Data Factory, Azure DevOps …
business use cases. Strong knowledge of data governance, data warehousing, and data security principles. Hands-on experience with modern data stacks and technologies (e.g., Delta Lake, SQL, Python, Azure/AWS/GCP). Experience aligning data capabilities with commercial strategy and business performance. Exceptional communication skills.
London, South East England, United Kingdom Hybrid / WFH Options
83zero
a MUST! Key expertise and experience we’re looking for: Data Engineering in Databricks – Spark programming with Scala, Python, SQL, and ideally experience with Delta Lake or Databricks workflows, jobs, etc. Familiarity with Azure Data Lake: experience with data ingestion and ETL/ELT frameworks Data Governance …
Stevenage, Hertfordshire, South East, United Kingdom Hybrid / WFH Options
IET
architecture, ensuring alignment with best practices and strategic goals. Drive the adoption of Azure data technologies such as Fabric, Data Factory, Databricks, Synapse, and Delta Lake. Develop and maintain scalable data pipelines, ensuring efficient data ingestion, transformation, and storage. Introduce and enhance self-service reporting capabilities for stakeholders across …
code repositories, and automation. Requirements 5+ years' experience in the Data and Analytics domain. Previous management experience is preferred. Strong expertise in Databricks (Spark, Delta Lake, Notebooks). Advanced knowledge of SQL development. Familiarity with Azure Synapse for orchestration and analytics. Working experience with Power BI for reporting …
standards and drive excellence in engineering practices. Architect and oversee the development of cloud-native data infrastructure and pipelines using Databricks, Python, PySpark, and Delta Lake. Guide the implementation of embedded analytics, headless APIs, and real-time dashboards for customer-facing platforms. Partner with Product Owners and … in data/analytics engineering, including 2+ years in a leadership or mentoring role. Strong hands-on expertise in Databricks, Spark, Python, PySpark, and Delta Live Tables. Experience designing and delivering scalable data pipelines and streaming data processing (e.g., Kafka, AWS Kinesis, or Azure Stream Analytics). Background …
team of 45 people, including Data Scientists, ML Engineers and 2 Data Engineers. Day‑to‑day you will: Monitor, optimise and rebuild ETL/Delta Lake workflows in Databricks. Migrate legacy ingestion jobs to modern, cloud‑native patterns (Azure preferred, some AWS/GCP). Collaborate with scientists …
SQL and Python Prior experience designing solutions on the Databricks Data Intelligence platform, either on Azure or AWS Good knowledge of Databricks components including Delta Lake, Unity Catalog, MLflow, etc. Experience building data pipelines and ETL processes Experience with any of the following is highly desirable: Snowflake, Kafka, Azure Data …
and building end-to-end data pipelines. Proficient in Python and/or Scala; solid understanding of SQL and distributed computing principles. Experience with Delta Lake, Lakehouse architecture, and data governance frameworks. Excellent client-facing and communication skills. Experience in Azure Data Services is desirable (e.g. Azure Data Lake, Synapse, Data Factory, Fabric).
with 3+ years leading Databricks-based solutions. Proven experience in a consulting environment delivering large-scale data platform projects. Hands-on expertise in Spark, Delta Lake, MLflow, Unity Catalog, and DBSQL. Strong proficiency in Python, SQL, and at least one major cloud platform (AWS, Azure, or GCP).
Coventry, Warwickshire, United Kingdom Hybrid / WFH Options
Cadent Gas
in SAP Datasphere or SAP BW/4HANA Advanced skills in SQL, data modelling, and data transformation Familiarity with Databricks, Apache Spark, PySpark, and Delta Lake Agile mindset with experience in DevOps and iterative delivery Excellent communication and stakeholder engagement abilities At Cadent, we're thrilled to be …
implementation experience with Microsoft Fabric (preferred), along with familiarity with Azure Synapse and Databricks. Experience in core data platform technologies and methods including Spark, Delta Lake, Medallion Architecture, pipelines, etc. Experience leading medium to large-scale cloud data platform implementations, guiding teams through technical challenges and ensuring alignment …
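For readers unfamiliar with the Medallion Architecture named in the listing above, its bronze → silver → gold layering can be sketched as follows. This is a hypothetical, plain-Python illustration; in a Databricks platform each layer would be a Delta Lake table populated by a pipeline, not an in-memory list:

```python
# Minimal Medallion Architecture sketch: bronze (raw) -> silver (validated,
# typed) -> gold (business aggregates). Data and names are hypothetical.
import json

bronze = [  # raw, as-ingested records (strings, possibly malformed)
    '{"sensor": "t1", "temp_c": "21.5"}',
    '{"sensor": "t1", "temp_c": "22.5"}',
    'not json',  # malformed row, filtered out on the way to silver
]

def to_silver(raw_records):
    # Parse and type the raw records, discarding rows that fail validation.
    silver = []
    for line in raw_records:
        try:
            rec = json.loads(line)
            silver.append({"sensor": rec["sensor"],
                           "temp_c": float(rec["temp_c"])})
        except (json.JSONDecodeError, KeyError, ValueError):
            continue  # in practice: quarantine for inspection
    return silver

def to_gold(silver):
    # Business-level aggregate: mean temperature per sensor.
    by_sensor = {}
    for rec in silver:
        by_sensor.setdefault(rec["sensor"], []).append(rec["temp_c"])
    return {s: sum(v) / len(v) for s, v in by_sensor.items()}

print(to_gold(to_silver(bronze)))  # {'t1': 22.0}
```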
expert proven to deliver fast-paced releases. You will have worked with the latest Azure data platform technologies, particularly Azure Data Factory, Azure Data Lake Storage and Azure Databricks. Hands-on experience of working in Databricks, in particular design and usage of the Delta Lake storage format.
Cardiff, South Glamorgan, United Kingdom Hybrid / WFH Options
RVU Co UK
and act as a multiplier. What we look for in you Understand, assess and effectively apply modern data architectures (dimensional model, data mesh, data lake). Experience in applying and using data observability methods effectively. Experience in modern software development practices (agile, CI/CD, DevOps, infrastructure as code … the following: Strong knowledge of SQL and Python programming. Extensive experience working within a cloud environment. Experience with big data technologies (e.g. Spark, Databricks, Delta Lake, BigQuery). Experience with alternative data technologies (e.g. DuckDB, Polars, Daft). Familiarity with eventing technologies (Event Hubs, Kafka, etc.). Deep … understanding of file formats and their behaviour, such as Parquet, Delta and Iceberg. What we offer We want to give you a great work environment; contribute back to both your personal and professional development; and give you great benefits to make your time at RVU even more enjoyable. Some …
London, South East England, United Kingdom Hybrid / WFH Options
Arthur Recruitment
Modeler/Architect London Market Insurance industry experience Using data modelling tools such as Erwin Experience with modelling methodologies including Kimball, etc. Usage of Data Lake formats such as Parquet and Delta Lake Strong SQL skills Rate: £600 - £700 P/D Outside IR35 Contract Duration: 6 months
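The Kimball methodology mentioned in the listing above refers to dimensional modelling: facts at a fixed grain joined to descriptive dimensions in a star schema. A minimal sketch, using SQLite as a stand-in for a warehouse engine (table and column names are hypothetical, loosely themed on the listing's insurance domain):

```python
# Hypothetical Kimball-style star schema: one fact table at premium-line
# grain, one dimension. SQLite used purely for illustration.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_policy (
    policy_key INTEGER PRIMARY KEY,   -- surrogate key
    line_of_business TEXT
);
CREATE TABLE fact_premium (
    policy_key INTEGER,               -- foreign key into the dimension
    premium REAL,
    FOREIGN KEY (policy_key) REFERENCES dim_policy (policy_key)
);
INSERT INTO dim_policy VALUES (1, 'Marine'), (2, 'Property');
INSERT INTO fact_premium VALUES (1, 100.0), (1, 50.0), (2, 200.0);
""")

# Typical dimensional query: aggregate the fact grain by a dimension attribute.
rows = con.execute("""
    SELECT d.line_of_business, SUM(f.premium)
    FROM fact_premium f
    JOIN dim_policy d USING (policy_key)
    GROUP BY d.line_of_business
    ORDER BY d.line_of_business
""").fetchall()
print(rows)  # [('Marine', 150.0), ('Property', 200.0)]
```

The same star-schema shape maps directly onto Parquet or Delta Lake tables in a lakehouse; only the storage engine changes.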