teams to align operational strategies with technical and business requirements. Optimize operational performance and cost management for services including Azure Data Factory, Azure Databricks, Delta Lake, and Azure Data Lake Storage. Serve as the domain expert in DataOps by providing strategic guidance, mentoring colleagues, and driving continuous … of data workflows within the Microsoft Azure ecosystem. Hands-on expertise with Azure Data Platform components including Azure Data Factory, Azure Databricks, Azure Data Lake Storage, Delta Lake, Azure SQL, Purview, and APIM. Proficiency in developing CI/CD data pipelines and strong programming skills in Python.
Bradford, Yorkshire and the Humber, United Kingdom
Peregrine
Ensure Data Security: Apply protocols and standards to secure clinical data both in motion and at rest. Shape Data Workflows: Utilize Databricks components like Delta Lake, Unity Catalog, and MLflow to ensure efficient, secure, and reliable data workflows. Key Responsibilities Data Engineering with Databricks: Design and maintain … ETL/ELT processes, and data lakes to support data analytics and machine learning. Requirements Expertise in Databricks: Proficiency with Databricks components such as Delta Lake, Unity Catalog, and MLflow. Azure Data Factory Knowledge: Experience with Azure Data Factory for data orchestration. Clinical Data Security: Understanding of …
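Securing clinical data at rest usually starts with pseudonymising direct identifiers before records land in the lake. A minimal sketch using only the Python standard library — the field names and key handling are illustrative assumptions, not part of the listing; in practice the key would come from a vault such as Azure Key Vault:

```python
import hashlib
import hmac

# Illustrative secret for the sketch; a real deployment would fetch
# this from a managed secret store, never hard-code it.
PSEUDONYM_KEY = b"replace-with-vault-managed-secret"

def pseudonymise(value: str, key: bytes = PSEUDONYM_KEY) -> str:
    """Return a stable, keyed hash of an identifier.

    HMAC-SHA256 rather than a bare hash, so the mapping cannot be
    reversed by brute-forcing known identifiers without the key.
    """
    return hmac.new(key, value.encode("utf-8"), hashlib.sha256).hexdigest()

# Hypothetical clinical record: replace the direct identifier,
# keep the analytically useful fields.
record = {"patient_id": "NHS-123-456", "diagnosis_code": "E11.9"}
safe_record = {**record, "patient_id": pseudonymise(record["patient_id"])}
```

Because the hash is deterministic for a given key, the pseudonym still joins consistently across tables, which is what makes it usable in downstream ETL.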
Strong experience designing and delivering data solutions on the Databricks Data Intelligence Platform, either on Azure or AWS. Good working knowledge of Databricks components: Delta Lake, Unity Catalog, MLflow, etc. Expertise in SQL, Python and Spark (Scala or Python). Experience working with relational SQL databases, either on premises or …
quality, and performance. Utilise Azure Databricks and adhere to code-based deployment practices. Essential Skills: Over 3 years of experience with Databricks (including Lakehouse, Delta Lake, PySpark, Spark SQL). Strong proficiency in SQL with 5+ years of experience. Extensive experience with Azure Data Factory. Proficiency in Python programming.
SQL and Python. Prior experience designing solutions on the Databricks Data Intelligence Platform, either on Azure or AWS. Good knowledge of Databricks components including Delta Lake, Unity Catalog, MLflow, etc. Experience building data pipelines and ETL processes. Experience with any of the following is highly desirable: Snowflake, Kafka, Azure Data …
Coventry, Warwickshire, United Kingdom Hybrid / WFH Options
Cadent Gas
in SAP Datasphere or SAP BW/4HANA. Advanced skills in SQL, data modelling, and data transformation. Familiarity with Databricks, Apache Spark, PySpark, and Delta Lake. Agile mindset with experience in DevOps and iterative delivery. Excellent communication and stakeholder engagement abilities. At Cadent, we're thrilled to be …
expert proven to deliver fast-paced releases. You will have worked with the latest Azure data platform technologies, particularly Azure Data Factory, Azure Data Lake Storage and Azure Databricks. Hands-on experience of working in Databricks, in particular the design and usage of the Delta Lake storage format.
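The Delta Lake storage format mentioned here is, at its core, a directory of Parquet data files plus an ordered JSON transaction log (`_delta_log/`) that records which files make up each table version. The toy stand-in below, using only the Python standard library, illustrates that log-replay idea; real Delta commits carry far richer metadata, and in practice you would use Databricks or the `delta-spark` package rather than touching the log directly:

```python
import json
import tempfile
from pathlib import Path

def commit(log_dir: Path, version: int, added_files: list[str]) -> None:
    """Append one commit: a zero-padded, numbered JSON file listing added data files."""
    entry = [{"add": {"path": p}} for p in added_files]
    (log_dir / f"{version:020d}.json").write_text(json.dumps(entry))

def current_files(log_dir: Path) -> list[str]:
    """Replay the log in commit order to reconstruct the latest table snapshot."""
    files: list[str] = []
    for commit_file in sorted(log_dir.glob("*.json")):
        for action in json.loads(commit_file.read_text()):
            if "add" in action:
                files.append(action["add"]["path"])
    return files

with tempfile.TemporaryDirectory() as tmp:
    log = Path(tmp) / "_delta_log"
    log.mkdir()
    commit(log, 0, ["part-0000.parquet"])
    commit(log, 1, ["part-0001.parquet"])
    print(current_files(log))  # both data files, in commit order
```

Replaying an append-only log is what gives Delta its ACID commits and time travel: reading up to an earlier commit number yields an earlier table version.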
Cardiff, South Glamorgan, United Kingdom Hybrid / WFH Options
RVU Co UK
and act as a multiplier. What we look for in you Understand, assess and effectively apply modern data architectures (dimensional model, data mesh, data lake). Experience in applying and using data observability methods effectively. Experience in modern software development practices (agile, CI/CD, DevOps, infrastructure as code … the following: Strong knowledge of SQL and Python programming. Extensive experience working within a cloud environment. Experience with big data technologies (e.g. Spark, Databricks, Delta Lake, BigQuery). Experience with alternative data technologies (e.g. DuckDB, Polars, Daft). Familiarity with eventing technologies (Event Hubs, Kafka, etc.). Deep … understanding of file formats and their behaviour, such as Parquet, Delta and Iceberg. What we offer We want to give you a great work environment; contribute back to both your personal and professional development; and give you great benefits to make your time at RVU even more enjoyable. Some …
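The data observability methods this listing asks for usually reduce to cheap, automated assertions on volume, freshness, and null rates at each pipeline stage. A minimal sketch in plain Python — the column names and thresholds are invented for illustration; on Databricks the same checks would run over Spark DataFrames or through a dedicated framework:

```python
def null_rate(rows: list[dict], column: str) -> float:
    """Fraction of rows where `column` is missing or None."""
    if not rows:
        return 0.0
    missing = sum(1 for r in rows if r.get(column) is None)
    return missing / len(rows)

def check_batch(rows: list[dict], min_rows: int, max_null_rate: float, key: str) -> list[str]:
    """Return human-readable failures; an empty list means the batch passes."""
    failures = []
    if len(rows) < min_rows:
        failures.append(f"volume: expected >= {min_rows} rows, got {len(rows)}")
    rate = null_rate(rows, key)
    if rate > max_null_rate:
        failures.append(f"nulls: {key} null rate {rate:.1%} exceeds {max_null_rate:.1%}")
    return failures

# Hypothetical batch: one of three prices is null (33%), so a 25% cap fails.
batch = [{"id": 1, "price": 9.99}, {"id": 2, "price": None}, {"id": 3, "price": 4.5}]
print(check_batch(batch, min_rows=2, max_null_rate=0.25, key="price"))
```

Returning a list of failure messages, rather than raising on the first one, lets a scheduler log every broken expectation for a batch in a single run.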
Microsoft Fabric. * Leading QA efforts across agile delivery squads. * Developing automated tests for: - PySpark notebooks in Fabric or Databricks - ETL pipelines and transformation logic - Delta tables and multi-layered Lakehouse architectures * Embedding testing into CI/CD flows using Azure DevOps or GitHub Actions. * Creating reusable, parameter-driven test … QA leadership within modern data platforms. * Strong working knowledge of: - Microsoft Fabric (Lakehouses, Pipelines, Notebooks, Power BI) - Azure tools including ADF, Synapse, Data Lake, and Key Vault - Databricks or Microsoft Fabric Notebooks using PySpark * Experience building automation frameworks (e.g., Pytest, Nutter, Great Expectations). * Familiarity with Medallion Architecture … and Delta Lake. * Skilled in CI/CD integration (Azure DevOps or GitHub Actions) and Git version control. * Comfortable working with various data types - structured, semi-structured, and unstructured. Nice-to-Haves: * Understanding of tools like Microsoft Purview, Unity Catalog, or other governance platforms. * Experience testing Power BI reports …
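The automated-testing pattern this role describes — asserting on the output of a transformation given a small, hand-built input — looks the same whether the logic lives in a PySpark notebook or plain Python. A pytest-style sketch in plain Python so it runs anywhere; the bronze-to-silver cleanup rules here are invented for illustration, not taken from the listing:

```python
def bronze_to_silver(rows: list[dict]) -> list[dict]:
    """Toy silver-layer transformation: drop rows without an id,
    trim name whitespace, and normalise country codes to upper case."""
    silver = []
    for row in rows:
        if row.get("id") is None:
            continue  # reject records that fail the quality gate
        silver.append({
            "id": row["id"],
            "name": str(row.get("name", "")).strip(),
            "country": str(row.get("country", "")).upper(),
        })
    return silver

def test_bronze_to_silver_drops_and_normalises():
    bronze = [
        {"id": 1, "name": "  Ada ", "country": "gb"},
        {"id": None, "name": "ghost", "country": "us"},  # should be dropped
    ]
    assert bronze_to_silver(bronze) == [{"id": 1, "name": "Ada", "country": "GB"}]

test_bronze_to_silver_drops_and_normalises()  # pytest would collect this automatically
```

Keeping the transformation a pure function of its input rows is what makes it testable in CI without a live cluster; the same function can then be applied inside a notebook over Spark-collected data.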