O, D365 F&O, Dynamics 365 Finance & Operations, D365 Finance & Operations, BI Developer, Business Intelligence Developer, ETL, SQL, BI, Business Intelligence, SSRS, PowerBI, Data Lake, Dataverse – London & Remote - £45-55k plus benefits Our client, a large end-user organisation, is looking for a D365 F&O BI Developer … working with enterprise data warehouse/BI projects. Experience working with Data Analysis Expressions (DAX) Extensive experience with SQL Experience with Dataverse Data Lake Experience Main Responsibilities: Dynamics Specific Development, Maintenance, and Support: Developing, maintaining, and providing support for custom analytics and reporting solutions within the Dynamics … Azure Portal to support data storage, processing, and analytics. Azure Synapse Serverless Leveraging Azure Synapse Serverless for scalable and cost-effective data analytics. Data Lake, Lakehouse and Delta Lake Experience Familiarity with Data Lake and Delta Lake technologies for storing and processing …
teams to align operational strategies with technical and business requirements. Optimize operational performance and cost management for services including Azure Data Factory, Azure Databricks, Delta Lake, and Azure Data Lake Storage. Serve as the domain expert in DataOps by providing strategic guidance, mentoring colleagues, and driving continuous … of data workflows within the Microsoft Azure ecosystem. Hands-on expertise with Azure Data Platform components including Azure Data Factory, Azure Databricks, Azure Data Lake Storage, Delta Lake, Azure SQL, Purview and APIM. Proficiency in developing CI/CD data pipelines and strong programming skills in Python …
data-driven initiatives. Job Specification (Technical Skills): Cloud Platforms: Expert-level proficiency in Azure (Data Factory, Databricks, Spark, SQL Database, DevOps/Git, Data Lake, Delta Lake, Power BI), with working knowledge of Azure WebApp and Networking. Conceptual understanding of Azure AI Services, ML, Unity Catalog, and … Advanced proficiency in SQL, Python, and at least one additional programming language (Java, C#, C++) is desired. Proven experience with data warehousing and data lake technologies. Solid understanding of database systems (SQL, NoSQL). Platform Architecture: Able to develop and implement data platform architecture (data lakes, data warehouses, data …
Ensure Data Security: Apply protocols and standards to secure clinical data both in-motion and at-rest. Shape Data Workflows: Utilize Databricks components like Delta Lake, Unity Catalog, and MLflow to ensure efficient, secure, and reliable data workflows. Key Responsibilities Data Engineering with Databricks: Design and maintain … ETL/ELT processes, and data lakes to support data analytics and machine learning. Requirements Expertise in Databricks: Proficiency with Databricks components such as Delta Lake, Unity Catalog, and MLflow. Azure Data Factory Knowledge: Experience with Azure Data Factory for data orchestration. Clinical Data Security: Understanding of …
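For illustration, a minimal PySpark sketch of the kind of Delta Lake workflow this role describes; the source path, column names, and the table name clinical.silver.patient_events are hypothetical, and the sketch assumes a Databricks runtime where Delta Lake and Unity Catalog are available.

```python
# Minimal Delta Lake ETL step (illustrative only; all names are hypothetical).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("clinical-etl-sketch").getOrCreate()

# Read raw clinical events landed in the lake (path is illustrative).
raw = spark.read.json("/mnt/raw/clinical_events/")

# Basic cleansing: drop records without a patient identifier and
# normalise the event timestamp.
clean = (
    raw.filter(F.col("patient_id").isNotNull())
       .withColumn("event_ts", F.to_timestamp("event_ts"))
)

# Persist as a Delta table registered in Unity Catalog.
(clean.write
      .format("delta")
      .mode("overwrite")
      .saveAsTable("clinical.silver.patient_events"))
```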
Employment Type: Contract, Full time Start: ASAP Location: London - Hybrid Languages: English Key skills: 5+ years as a Data Engineer. Proven expertise in Databricks (including Delta Lake, Workflows, Unity Catalog). Strong command of Apache Spark, SQL, and Python. Hands-on experience with cloud platforms (AWS, Azure, or GCP … Apache Spark. Collaborate with Data Scientists, Analysts, and Product teams to understand data needs and deliver clean, reliable datasets. Optimize data workflows and storage (Delta Lake, Lakehouse architecture). Manage and monitor data pipelines in cloud environments (AWS, Azure, or GCP). Work with structured and unstructured data …
Apply protocols and standards to secure clinical data in-motion and at-rest. Shape Data Workflows: Use your expertise with Databricks components such as Delta Lake, Unity Catalog, and MLflow to ensure our data workflows are efficient, secure, and reliable. Key Responsibilities Data Engineering with Databricks: Utilize … CI/CD pipelines and manage container technologies to support a robust development environment. Requirements Expertise in Databricks: Proficiency with Databricks components such as Delta Lake, Unity Catalog, and MLflow. Azure Data Factory Knowledge: Experience with Azure Data Factory for data orchestration. Clinical Data Security: Understanding of …
clients and internal teams to deliver scalable, efficient data solutions tailored to business needs. Key Responsibilities Develop ETL/ELT pipelines with Databricks and Delta Lake Integrate and process data from diverse sources Collaborate with data scientists, architects, and analysts Optimize performance and manage Databricks clusters Build cloud … pipelines Document architecture and processes What We’re Looking For Required: 5+ years in data engineering with hands-on Databricks experience Proficient in Databricks, Delta Lake, Spark, Python, SQL Cloud experience (Azure preferred, AWS/GCP a plus) Strong problem-solving and communication skills Databricks Champion …
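As a sketch of the ETL/ELT work outlined above, the snippet below shows an incremental upsert into a Delta table using the Delta Lake MERGE API; the staging path and the table sales.silver.orders are hypothetical, and a Databricks environment with Delta Lake (the delta-spark package) is assumed.

```python
# Illustrative incremental load: MERGE a staged batch into a Delta table.
from pyspark.sql import SparkSession
from delta.tables import DeltaTable

spark = SparkSession.builder.getOrCreate()

# New batch of records landed by an upstream ingestion job (hypothetical path).
updates = spark.read.parquet("/mnt/staging/orders/")

target = DeltaTable.forName(spark, "sales.silver.orders")

# Upsert keyed on order_id: update existing orders, insert new ones.
(target.alias("t")
       .merge(updates.alias("s"), "t.order_id = s.order_id")
       .whenMatchedUpdateAll()
       .whenNotMatchedInsertAll()
       .execute())
```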
that powers high-impact analytics and machine learning solutions. Key Responsibilities Engineer and maintain modern data platforms with a strong focus on Databricks, including Delta Lake, Apache Spark, and MLflow Build and optimise CI/CD pipelines, infrastructure-as-code (IaC), and cloud integrations (Azure preferred, AWS/… 7+ years of experience in platform engineering, DevOps, or cloud infrastructure, with a focus on data platforms Advanced hands-on experience with Databricks, Spark, Delta Lake, and Python Proficient with Azure services (Data Factory, Storage, DevOps) and experience with IaC tools (Terraform, Bicep, ARM) Experience supporting data pipelines …
Potters Bar, Hertfordshire, South East, United Kingdom
Canada Life Group (UK) Ltd (The)
hands-on technical role responsible for designing, developing, and maintaining data pipelines within the IT department. The pipelines will be realised in a modern lake environment and the engineer will collaborate in cross-functional teams to gather requirements and develop the conceptual data models. This role plays a crucial … scalability, and efficiency. Highly Desirable: Experience with Informatica ETL, Hyperion Reporting, and intermediate/advanced PL/SQL. Desirable Experience in a financial corporation Lakehouse/Delta Lake and Snowflake Experience with Spark clusters, both elastic permanent and transitory clusters Familiarity with data governance, data security …
enriching, and aggregating data. Ensure data consistency and accuracy throughout the data lifecycle. Azure Databricks Implementation: Work extensively with Azure Databricks Unity Catalog, including Delta Lake, Spark SQL, and other relevant services. Implement best practices for Databricks development and deployment. Optimise Databricks workloads for performance and cost. Need …
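To illustrate the workload-optimisation side of the role, here is a short sketch of routine Delta table maintenance run from a Databricks notebook; the table finance.gold.transactions and the customer_id column are hypothetical.

```python
# Routine Delta maintenance for performance and cost (illustrative names).
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Compact small files and co-locate rows that are frequently filtered on customer_id.
spark.sql("OPTIMIZE finance.gold.transactions ZORDER BY (customer_id)")

# Remove data files no longer referenced by the table (default retention applies).
spark.sql("VACUUM finance.gold.transactions")
```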