Azure Portal to support data storage, processing, and analytics. Azure Synapse Serverless: leveraging Azure Synapse Serverless for scalable and cost-effective data analytics. Data Lake, Lakehouse and Delta Lake experience: familiarity with Data Lake and Delta Lake technologies for storing and processing …/BI projects. A relevant number of years' experience working with Data Analysis Expressions (DAX). Extensive experience with SQL. Experience with Dataverse. Data Lake experience. Experience with the following tools: Microsoft Power BI, Synapse, Paginated Report Builder, Power Platform.
of Databricks. Coding experience with both Python and SQL. Experience working with Azure Data Factory for creating ETL solutions. Experience working with Azure Data Lake and Delta Lake storage. This is just a brief overview of the role; for the full information, simply apply to the role.
Basingstoke, Hampshire, United Kingdom Hybrid / WFH Options
Nigel Frank International
of Databricks. Coding experience with both Python and SQL. Experience working with Azure Data Factory for creating ETL solutions. Experience working with Azure Data Lake and Delta Lake storage. This is just a brief overview of the role; for the full information, simply apply to the role.
Southampton, Hampshire, United Kingdom Hybrid / WFH Options
Nigel Frank International
of Databricks. Coding experience with both Python and SQL. Experience working with Azure Data Factory for creating ETL solutions. Experience working with Azure Data Lake and Delta Lake storage. This is just a brief overview of the role; for the full information, simply apply to the role.
Swindon, Wiltshire, United Kingdom Hybrid / WFH Options
Nigel Frank International
of Databricks. Coding experience with both Python and SQL. Experience working with Azure Data Factory for creating ETL solutions. Experience working with Azure Data Lake and Delta Lake storage. This is just a brief overview of the role; for the full information, simply apply to the role.
Cheltenham, Gloucestershire, United Kingdom Hybrid / WFH Options
Nigel Frank International
of Databricks. Coding experience with both Python and SQL. Experience working with Azure Data Factory for creating ETL solutions. Experience working with Azure Data Lake and Delta Lake storage. This is just a brief overview of the role; for the full information, simply apply to the role.
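The listings above repeatedly ask for experience building ETL solutions with Azure Data Factory. As a conceptual illustration only, here is a minimal, dependency-free Python sketch of the extract-transform-load pattern such pipelines orchestrate; all function names and data are hypothetical, not Azure Data Factory APIs.

```python
# Minimal ETL sketch: the extract -> transform -> load pattern that a
# data-factory-style pipeline typically orchestrates. All names and data
# here are hypothetical illustrations.

def extract(rows):
    """Pull raw records from a source (here: an in-memory list)."""
    return list(rows)

def transform(records):
    """Clean and reshape: drop incomplete rows, normalise casing."""
    return [
        {"id": r["id"], "city": r["city"].strip().title()}
        for r in records
        if r.get("id") is not None and r.get("city")
    ]

def load(records, sink):
    """Append transformed records to a destination (here: a dict keyed by id)."""
    for r in records:
        sink[r["id"]] = r
    return sink

raw = [
    {"id": 1, "city": "  basingstoke "},
    {"id": 2, "city": "SOUTHAMPTON"},
    {"id": None, "city": "swindon"},   # incomplete: filtered out by transform
]
warehouse = {}
load(transform(extract(raw)), warehouse)
print(warehouse)
```

In a real Azure Data Factory pipeline the extract, transform, and load stages would be activities (copy activities, Databricks notebooks, etc.) rather than local functions, but the staged shape of the work is the same.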
Newcastle Upon Tyne, Tyne and Wear, North East, United Kingdom
Greggs Plc
growth for all data, analytics and AI services. Your involvement will span the entire data lifecycle: from orchestrating data ingestion into our Azure Data Lake, designing and developing sophisticated data models for analytics, to deploying Azure Logic Apps and Function Apps for process automation. With tools like Databricks, Data … Design: Develop and implement data ingestion solutions on the Azure platform, ensuring scalability, reliability, and performance, leveraging technologies such as Azure Data Factory, Databricks, Delta Lake, and Python. Data Integration: Design and maintain robust ETL/ELT processes to integrate data from various sources into Azure data services. … into this role if you: Can demonstrate advanced knowledge of cloud services, preferably the broader Fabric or Databricks suite of products, or Azure Data Lake, Azure Data Factory, Azure Logic Apps and Azure Function Apps, alongside a skillset incorporating data modelling techniques and the comprehension, documentation and communication of business data requirements.
processing systems. Cloud Migration: Migrate existing on-premises databases to the Azure Cloud. Pipeline Management: Develop and manage data pipelines using Azure Data Factory, Delta Lake, and Spark, ensuring secure, reliable, and accessible data sets. Scalable Solutions: Design and implement scalable data solutions leveraging Azure Cloud architecture. Data …
Newcastle Upon Tyne, Tyne and Wear, North East, United Kingdom
Greggs Plc
growth for all data, analytics and AI services. Your involvement will span the entire data lifecycle: from orchestrating data ingestion into our Azure Data Lake, designing and developing sophisticated data models for analytics, to deploying Azure Logic Apps and Function Apps for process automation. With tools like Databricks, Data … Design: Develop and implement data ingestion solutions on the Azure platform, ensuring scalability, reliability, and performance, leveraging technologies such as Azure Data Factory, Databricks, Delta Lake, and Python. Agile Delivery: Working in squads utilising the Agile methodology, you will deliver work from inception to completion, managing timelines, resources … data warehousing and SQL scripting. Can demonstrate advanced knowledge of cloud services, preferably the broader Fabric or Databricks suite of products, or Azure Data Lake, Azure Data Factory, Azure Logic Apps and Azure Function Apps, alongside a skillset incorporating data modelling techniques and the comprehension, documentation and communication of business data requirements.
Leeds, West Yorkshire, United Kingdom Hybrid / WFH Options
VIQU Limited
AWS. Ensure data quality and integrity across all stages of data processing. Develop and maintain documentation for data engineering processes. Utilise Databricks features, including Delta Live Tables (DLT), to simplify streaming and ensure data quality. Work with asset bundles in Databricks for streamlined data operations. Key Skills: Proven experience …/PySpark. Hands-on experience with AWS services related to data storage and processing (e.g., S3, Redshift, Glue). In-depth knowledge of Databricks Delta Lake and Delta Live Tables. Familiarity with Databricks Asset Bundles (DABs) for streamlining the development of complex data and analytics for the Databricks …
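Several of these listings name Delta Lake, whose headline feature over a plain data lake is transactional upserts (the `MERGE` operation). As a plain-Python conceptual sketch only, not the Delta Lake API, the merge semantics look like this: rows whose key matches an existing row are updated, and unmatched rows are inserted.

```python
# Conceptual sketch of Delta-Lake-style MERGE (upsert) semantics:
# match on a key, update matched rows, insert new ones. Plain Python
# for illustration only; a real Delta MERGE runs transactionally on Spark.

def merge_upsert(target, updates, key="id"):
    """Apply MERGE-style upsert: overwrite rows whose key matches, insert the rest."""
    by_key = {row[key]: row for row in target}
    for row in updates:
        by_key[row[key]] = row   # matched key -> update; new key -> insert
    return sorted(by_key.values(), key=lambda r: r[key])

target = [{"id": 1, "qty": 5}, {"id": 2, "qty": 3}]
updates = [{"id": 2, "qty": 7}, {"id": 3, "qty": 1}]
merged = merge_upsert(target, updates)
print(merged)  # id 2 is updated, id 3 is inserted, id 1 is untouched
```

Delta Live Tables, also mentioned above, layers declarative pipeline definitions and data-quality expectations on top of tables maintained with exactly this kind of incremental merge.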