teams to align operational strategies with technical and business requirements. Optimize operational performance and cost management for services including Azure Data Factory, Azure Databricks, Delta Lake, and Azure Data Lake Storage. Serve as the domain expert in DataOps by providing strategic guidance, mentoring colleagues, and driving continuous … of data workflows within the Microsoft Azure ecosystem. Hands-on expertise with Azure Data Platform components including Azure Data Factory, Azure Databricks, Azure Data Lake Storage, Delta Lake, Azure SQL, Purview, and APIM. Proficiency in developing CI/CD data pipelines and strong programming skills in Python …
Bradford, Yorkshire and the Humber, United Kingdom
Peregrine
data-driven initiatives. Job Specification (Technical Skills): Cloud Platforms: Expert-level proficiency in Azure (Data Factory, Databricks, Spark, SQL Database, DevOps/Git, Data Lake, Delta Lake, Power BI), with working knowledge of Azure WebApp and Networking. Conceptual understanding of Azure AI Services, ML, Unity Catalog, and … Advanced proficiency in SQL, Python, and at least one additional programming language (Java, C#, or C++) is desired. Proven experience with data warehousing and data lake technologies. Solid understanding of database systems (SQL, NoSQL). Platform Architecture: Able to develop and implement data platform architecture (data lakes, data warehouses, data …
Ensure Data Security: Apply protocols and standards to secure clinical data both in motion and at rest. Shape Data Workflows: Utilize Databricks components such as Delta Lake, Unity Catalog, and MLflow to ensure efficient, secure, and reliable data workflows. Key Responsibilities Data Engineering with Databricks: Design and maintain … ETL/ELT processes, and data lakes to support data analytics and machine learning. Requirements Expertise in Databricks: Proficiency with Databricks components such as Delta Lake, Unity Catalog, and MLflow. Azure Data Factory Knowledge: Experience with Azure Data Factory for data orchestration. Clinical Data Security: Understanding of …
Employment Type: Contract, Full time Start: ASAP Location: London - Hybrid Languages: English Key skills: 5+ years as a Data Engineer. Proven expertise in Databricks (including Delta Lake, Workflows, Unity Catalog). Strong command of Apache Spark, SQL, and Python. Hands-on experience with cloud platforms (AWS, Azure, or GCP … Apache Spark. Collaborate with Data Scientists, Analysts, and Product teams to understand data needs and deliver clean, reliable datasets. Optimize data workflows and storage (Delta Lake, Lakehouse architecture). Manage and monitor data pipelines in cloud environments (AWS, Azure, or GCP). Work with structured and unstructured data …
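The "Delta Lake, Lakehouse architecture" requirement in listings like this usually refers to medallion-style layering (raw bronze data cleansed into silver, then aggregated into gold reporting tables). A minimal pure-Python sketch of that shape, illustrative only — real pipelines would use Spark DataFrames over Delta tables, and all field and function names here are hypothetical:

```python
# Illustrative sketch of bronze -> silver -> gold layering, pure Python.
# Real Lakehouse pipelines operate on Spark DataFrames and Delta tables;
# the record shape and names below are hypothetical.

def to_silver(bronze_rows):
    """Cleanse raw (bronze) rows: drop incomplete records, deduplicate by id."""
    seen, silver = set(), []
    for row in bronze_rows:
        if row.get("id") is None or row.get("amount") is None:
            continue  # reject malformed records
        if row["id"] in seen:
            continue  # keep the first occurrence only
        seen.add(row["id"])
        silver.append(row)
    return silver

def to_gold(silver_rows):
    """Aggregate silver rows into a reporting (gold) view: totals per region."""
    totals = {}
    for row in silver_rows:
        totals[row["region"]] = totals.get(row["region"], 0) + row["amount"]
    return totals

bronze = [
    {"id": 1, "region": "UK", "amount": 10},
    {"id": 1, "region": "UK", "amount": 10},    # duplicate record
    {"id": 2, "region": "UK", "amount": 5},
    {"id": 3, "region": "DE", "amount": 7},
    {"id": 4, "region": "DE", "amount": None},  # malformed record
]
print(to_gold(to_silver(bronze)))  # {'UK': 15, 'DE': 7}
```

The point of the layering is that each stage is a pure transform of the previous one, so a bad load can be replayed from bronze without touching sources.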
London, England, United Kingdom Hybrid / WFH Options
Focus on SAP
London, South East England, United Kingdom Hybrid / WFH Options
Focus on SAP
Apply protocols and standards to secure clinical data in motion and at rest. Shape Data Workflows: Use your expertise with Databricks components such as Delta Lake, Unity Catalog, and MLflow to ensure our data workflows are efficient, secure, and reliable. Key Responsibilities Data Engineering with Databricks: Utilize … CI/CD pipelines and manage container technologies to support a robust development environment. Requirements Expertise in Databricks: Proficiency with Databricks components such as Delta Lake, Unity Catalog, and MLflow. Azure Data Factory Knowledge: Experience with Azure Data Factory for data orchestration. Clinical Data Security: Understanding of …
/CD pipelines. Expertise in Unity Catalog for data governance and security. Proven ability to optimize Databricks data transformation workloads. Experience with Azure Data Lake, Delta Lake, and cloud-based data solutions. All profiles will be reviewed against the required skills and experience. Due to the high …
enriching, and aggregating data. Ensure data consistency and accuracy throughout the data lifecycle. Azure Databricks Implementation: Work extensively with Azure Databricks Unity Catalog, including Delta Lake, Spark SQL, and other relevant services. Implement best practices for Databricks development and deployment. Optimise Databricks workloads for performance and cost. Need …
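The "cleansing, enriching, and aggregating" work these Databricks listings describe is typically written as Spark SQL. The same transform shape can be sketched with the stdlib `sqlite3` module — illustrative only: table and column names are hypothetical, and on Databricks this SQL would run via `spark.sql()` against Delta tables:

```python
import sqlite3

# Illustrative cleanse-then-aggregate transform in stdlib sqlite3.
# On Databricks the identical SQL shape would run over Delta tables;
# raw_orders and its columns are hypothetical names.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (order_id INT, customer TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?, ?)",
    [(1, "acme", 100.0), (2, "acme", 50.0), (3, None, 999.0), (4, "globex", 25.0)],
)

# Cleanse: drop rows missing a customer. Aggregate: total spend per customer.
rows = conn.execute(
    """
    SELECT customer, SUM(amount) AS total
    FROM raw_orders
    WHERE customer IS NOT NULL
    GROUP BY customer
    ORDER BY customer
    """
).fetchall()
print(rows)  # [('acme', 150.0), ('globex', 25.0)]
```

Keeping the cleansing predicate in the SQL itself (rather than in ad-hoc driver code) is what makes such transforms reviewable and re-runnable across the data lifecycle.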
in Business Intelligence, with 5+ years in a BI leadership role in a global or matrixed organisation. Proven expertise in modern BI architecture (Data Lake, EDW, Streaming, APIs, Real-Time & Batch Processing). Demonstrated experience delivering cloud-based analytics platforms (Azure, AWS, GCP). Strong knowledge of data governance … BI will work within a modern, cloud-based BI ecosystem, including: Data Integration: Fivetran, HVR, Databricks, Apache Kafka, Google BigQuery, Google Analytics 4 Data Lake & Storage: Databricks Delta Lake, Amazon S3 Data Transformation: dbt Cloud Data Warehouse: Snowflake Analytics & Reporting: Power BI, Excel, Snowflake SQL REST API …
in building/architecting data analytic solutions. 3+ years of experience building data platforms using Azure services including Azure Databricks, Azure Data Factory, Delta Lake, Azure Data Lake Storage (ADLS), and Power BI. Solid hands-on experience with Azure Databricks - PySpark and Spark SQL coding - must have. …
Coalville, Leicestershire, East Midlands, United Kingdom Hybrid / WFH Options
Ibstock PLC
Spark for data engineering and analytics. Proficient in SQL and Python/PySpark for data transformation and analysis. Experience in data lakehouse development and Delta Lake optimisation. Experience with ETL/ELT processes for integrating diverse data sources. Experience in gathering, documenting, and refining requirements from key business …
we do Passion for data and experience working within a data-driven organization Hands-on experience with architecting, implementing, and performance tuning of: Data Lake technologies (e.g. Delta Lake, Parquet, Spark, Databricks) API & Microservices Message queues, streaming technologies, and event-driven architecture NoSQL databases and query languages …
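The "message queues, streaming technologies, and event-driven architecture" requirement above boils down to a produce/consume pattern. A minimal in-process sketch with the stdlib `queue` module — illustrative only: real deployments would use a broker such as Kafka or Azure Event Hubs, and the event shape here is hypothetical:

```python
import queue
import threading

# Illustrative producer/consumer pattern behind "message queues and
# event-driven architecture". A real system would use Kafka or Event Hubs;
# this stdlib sketch only shows the decoupling, with hypothetical payloads.
events = queue.Queue()
processed = []

def producer():
    for i in range(5):
        events.put({"event_id": i, "payload": i * i})
    events.put(None)  # sentinel: signal end of stream

def consumer():
    while True:
        event = events.get()
        if event is None:  # sentinel reached, stop consuming
            break
        processed.append(event["payload"])

t_prod = threading.Thread(target=producer)
t_cons = threading.Thread(target=consumer)
t_prod.start(); t_cons.start()
t_prod.join(); t_cons.join()
print(processed)  # [0, 1, 4, 9, 16]
```

The producer never waits on the consumer; the queue absorbs bursts, which is the same decoupling a broker provides between services at scale.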
Strong experience designing and delivering data solutions on the Databricks Data Intelligence Platform, on either Azure or AWS. Good working knowledge of Databricks components: Delta Lake, Unity Catalog, MLflow, etc. Expertise in SQL, Python, and Spark (Scala or Python). Experience working with relational SQL databases, either on-premises or …
customers to comprehend their business and technical needs, to develop tailored technical architectures and solutions in the Cloud, focusing on data engineering, data lakes, lakehouses, business intelligence and machine learning/AI. Cost Optimization: You will continuously try to optimize run costs - both at platform level as … and supporting Big Data solutions for data lakes and data warehouses Expertise in cloud-based Big Data solutions is required - preferably with Azure Data Lake and the related technology stack: ADLS Gen2, Spark/Databricks, Delta Lake, Kafka/Events Hub, Stream Analytics, Azure Data Factory, Azure DevOps …
business use cases. Strong knowledge of data governance, data warehousing, and data security principles. Hands-on experience with modern data stacks and technologies (e.g., Delta Lake, SQL, Python, Azure/AWS/GCP). Experience aligning data capabilities with commercial strategy and business performance. Exceptional communication skills. …
in Data Engineering, with a focus on cloud platforms (Azure, AWS, GCP). You have a proven track record working with Databricks (PySpark, SQL, Delta Lake, Unity Catalog). You have extensive experience in ETL/ELT development and data pipeline orchestration (Databricks Workflows, DLT, Airflow, ADF, Glue …
City of London, London, United Kingdom Hybrid / WFH Options
83zero Ltd
is a MUST! Key expertise and experience we're looking for: Data Engineering in Databricks - Spark programming with Scala, Python, SQL Ideally experience with Delta Lake, Databricks workflows, jobs, etc. Familiarity with Azure Data Lake: experience with data ingestion and ETL/ELT frameworks Data Governance experience …