O, D365 F&O, Dynamics 365 Finance & Operations, D365 Finance & Operations, BI Developer, Business Intelligence Developer, ETL, SQL, BI, Business Intelligence, SSRS, Power BI, Data Lake, Dataverse – London & Remote - £45-55k plus benefits Our client, a large end-user organisation, is looking for a D365 F&O BI Developer … working with enterprise data warehouse/BI projects. Experience working with Data Analysis Expressions (DAX). Extensive experience with SQL. Experience with Dataverse. Data Lake experience. Main Responsibilities: Dynamics-Specific Development, Maintenance, and Support: Developing, maintaining, and providing support for custom analytics and reporting solutions within the Dynamics … Azure Portal to support data storage, processing, and analytics. Azure Synapse Serverless: Leveraging Azure Synapse Serverless for scalable and cost-effective data analytics. Data Lake, Lakehouse and Delta Lake Experience: Familiarity with Data Lake and Delta Lake technologies for storing and processing …
data-driven initiatives. Job Specification (Technical Skills): Cloud Platforms: Expert-level proficiency in Azure (Data Factory, Databricks, Spark, SQL Database, DevOps/Git, Data Lake, Delta Lake, Power BI), with working knowledge of Azure WebApp and Networking. Conceptual understanding of Azure AI Services, ML, Unity Catalog, and … Advanced proficiency in SQL, Python, and at least one additional programming language (Java, C#, C++) is desired. Proven experience with data warehousing and data lake technologies. Solid understanding of database systems (SQL, NoSQL). Platform Architecture: Able to develop and implement data platform architecture (data lakes, data warehouses, data …
Employment Type: Contract, Full time. Start: ASAP. Location: London - Hybrid. Languages: English. Key skills: 5+ years as a Data Engineer. Proven expertise in Databricks (including Delta Lake, Workflows, Unity Catalog). Strong command of Apache Spark, SQL, and Python. Hands-on experience with cloud platforms (AWS, Azure, or GCP … Apache Spark. Collaborate with Data Scientists, Analysts, and Product teams to understand data needs and deliver clean, reliable datasets. Optimize data workflows and storage (Delta Lake, Lakehouse architecture). Manage and monitor data pipelines in cloud environments (AWS, Azure, or GCP). Work with structured and unstructured data …
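For illustration only, a minimal sketch of the kind of Databricks/Delta Lake pipeline step this role describes, written in PySpark. The landing path, table and column names are assumptions, not taken from the advert:

```python
from pyspark.sql import SparkSession, functions as F

# On Databricks a SparkSession is provided; getOrCreate() keeps the sketch runnable elsewhere
spark = SparkSession.builder.getOrCreate()

# Ingest raw data, apply a light cleaning step, and persist it as a Delta table
raw = spark.read.json("/mnt/landing/orders/")              # hypothetical landing path
clean = (
    raw.dropDuplicates(["order_id"])                       # de-duplicate on a business key
       .withColumn("ingested_at", F.current_timestamp())   # simple audit column
)
clean.write.format("delta").mode("overwrite").saveAsTable("bronze.orders")

# Typical Delta maintenance when optimising workflows and storage (Databricks SQL)
spark.sql("OPTIMIZE bronze.orders ZORDER BY (order_id)")
```

In practice a step like this would sit inside a Databricks Workflow or scheduled job, with monitoring handled in the chosen cloud platform (AWS, Azure, or GCP).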
Apply protocols and standards to secure clinical data in-motion and at-rest. Shape Data Workflows: Use your expertise with Databricks components such as Delta Lake, Unity Catalog, and MLflow to ensure our data workflows are efficient, secure, and reliable. Key Responsibilities: Data Engineering with Databricks: Utilize … CI/CD pipelines and manage container technologies to support a robust development environment. Requirements: Expertise in Databricks: Proficiency with Databricks components such as Delta Lake, Unity Catalog, and MLflow. Azure Data Factory Knowledge: Experience with Azure Data Factory for data orchestration. Clinical Data Security: Understanding of …
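As a rough illustration of the MLflow element of that stack, a minimal tracking snippet of the sort used on Databricks; the experiment path, parameter and metric names below are placeholders, not taken from the advert:

```python
import mlflow

# Hypothetical experiment path in a Databricks workspace
mlflow.set_experiment("/Shared/clinical-data-quality")

with mlflow.start_run(run_name="baseline-checks"):
    mlflow.log_param("source_table", "catalog.schema.patients")  # placeholder table name
    mlflow.log_metric("row_count", 125000)
    mlflow.log_metric("null_rate", 0.012)
```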
/CD pipelines. Expertise in Unity Catalog for data governance and security. Proven ability to optimize Databricks data transformation workloads. Experience with Azure Data Lake, Delta Lake, and cloud-based data solutions. All profiles will be reviewed against the required skills and experience. Due to the high …
enriching, and aggregating data. Ensure data consistency and accuracy throughout the data lifecycle. Azure Databricks Implementation: Work extensively with Azure Databricks Unity Catalog, including Delta Lake, Spark SQL, and other relevant services. Implement best practices for Databricks development and deployment. Optimise Databricks workloads for performance and cost. Need …
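Purely as a sketch of the Unity Catalog / Spark SQL work mentioned above, assuming a Databricks workspace with Unity Catalog enabled; the three-part catalog.schema.table names are placeholders:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Aggregate a governed Delta table with Spark SQL and write the result back to Unity Catalog
daily = spark.sql("""
    SELECT order_date,
           COUNT(*)    AS orders,
           SUM(amount) AS revenue
    FROM   main.sales.orders
    GROUP  BY order_date
""")
daily.write.format("delta").mode("overwrite").saveAsTable("main.sales.daily_orders")
```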
in Business Intelligence, with 5+ years in a BI leadership role in a global or matrixed organisation. Proven expertise in modern BI architecture (Data Lake, EDW, Streaming, APIs, Real-Time & Batch Processing). Demonstrated experience delivering cloud-based analytics platforms (Azure, AWS, GCP). Strong knowledge of data governance … BI will work within a modern, cloud-based BI ecosystem, including: Data Integration: Fivetran, HVR, Databricks, Apache Kafka, Google BigQuery, Google Analytics 4. Data Lake & Storage: Databricks Delta Lake, Amazon S3. Data Transformation: dbt Cloud. Data Warehouse: Snowflake. Analytics & Reporting: Power BI, Excel, Snowflake SQL. REST API …
in building/architecting data analytic solutions. 3 years of experience in building data platforms using Azure services including Azure Databricks, Azure Data Factory, Delta Lake, Azure Data Lake (ADLS), Power BI. Solid hands-on experience with Azure Databricks - PySpark coding and Spark SQL coding - must have. …
we do. Passion for data and experience working within a data-driven organization. Hands-on experience with architecting, implementing, and performance tuning of: Data Lake technologies (e.g. Delta Lake, Parquet, Spark, Databricks); API & Microservices; Message queues, streaming technologies, and event-driven architecture; NoSQL databases and query languages …
business use cases. Strong knowledge of data governance, data warehousing, and data security principles. Hands-on experience with modern data stacks and technologies (e.g., Delta Lake, SQL, Python, Azure/AWS/GCP). Experience aligning data capabilities with commercial strategy and business performance. Exceptional communication skills. …
City of London, London, United Kingdom Hybrid / WFH Options
83zero Ltd
is a MUST! Key expertise and experience we're looking for: Data Engineering in Databricks - Spark programming with Scala, Python, SQL; ideally experience with Delta Lake, Databricks workflows, jobs, etc. Familiarity with Azure Data Lake: experience with data ingestion and ETL/ELT frameworks. Data Governance experience …
complex data concepts to non-technical stakeholders. Preferred Skills: Experience with insurance platforms such as Guidewire, Duck Creek, or legacy PAS systems. Knowledge of Delta Lake, Apache Spark, and data pipeline orchestration tools. Exposure to Agile delivery methodologies and tools like JIRA, Confluence, or Azure DevOps. Understanding of …
to define, develop, and deliver impactful data products to both internal stakeholders and end customers. Responsibilities: Design and implement scalable data pipelines using Databricks, Delta Lake, and Lakehouse architecture. Build and maintain a customer-facing analytics layer, integrating with tools like Power BI, Tableau, or Metabase. Optimise ETL processes …
performance, scalability, and security. Collaborate with business stakeholders, data engineers, and analytics teams to ensure solutions are fit for purpose. Implement and optimise Databricks Delta Lake, Medallion Architecture, and Lakehouse patterns for structured and semi-structured data. Ensure best practices in Azure networking, security, and federated data access …