O, D365 F&O, Dynamics 365 Finance & Operations, D365 Finance & Operations, BI Developer, Business Intelligence Developer, ETL, SQL, BI, Business Intelligence, SSRS, Power BI, Data Lake, Dataverse – London & Remote - £45-55k plus benefits Our client, a large end user organisation, is looking for a D365 F&O BI Developer … working with enterprise data warehouse/BI projects. Experience working with Data Analysis Expressions (DAX) Extensive experience with SQL Experience with Dataverse Data Lake Experience Main Responsibilities: Dynamics Specific Development, Maintenance, and Support: Developing, maintaining, and providing support for custom analytics and reporting solutions within the Dynamics … Azure Portal to support data storage, processing, and analytics. Azure Synapse Serverless Leveraging Azure Synapse Serverless for scalable and cost-effective data analytics. Data Lake, Lakehouse and Delta Lake Experience Familiarity with Data Lake and Delta Lake technologies for storing and processing …
teams to align operational strategies with technical and business requirements. Optimize operational performance and cost management for services including Azure Data Factory, Azure Databricks, Delta Lake, and Azure Data Lake Storage. Serve as the domain expert in DataOps by providing strategic guidance, mentoring colleagues, and driving continuous … of data workflows within the Microsoft Azure ecosystem. Hands-on expertise with Azure Data Platform components including Azure Data Factory, Azure Databricks, Azure Data Lake Storage, Delta Lake, Azure SQL, Purview and APIM. Proficiency in developing CI/CD data pipelines and strong programming skills in Python …
ability to develop data pipelines (ETL/ELT) A strong desire to learn and adapt to new technologies and languages Experience working with Databricks Delta Lake Proficiency in Microsoft Azure cloud technologies What will be your key responsibilities? Collaborate in hands-on development using Python, PySpark, and other … to create and maintain data assets and reports for insights Manage data models and pipelines within a cloud environment, utilizing technologies like Databricks, Spark, Delta Lake, and SQL Contribute to maintaining and enhancing our technology stack, including Python, PySpark, Logic Apps, Azure Functions, ADLS, Django, and ReactJs Support …
Ensure Data Security: Apply protocols and standards to secure clinical data both in-motion and at-rest. Shape Data Workflows: Utilize Databricks components like Delta Lake, Unity Catalog, and MLflow to ensure efficient, secure, and reliable data workflows. Key Responsibilities Data Engineering with Databricks: Design and maintain … ETL/ELT processes, and data lakes to support data analytics and machine learning. Requirements Expertise in Databricks: Proficiency with Databricks components such as Delta Lake, Unity Catalog, and MLflow. Azure Data Factory Knowledge: Experience with Azure Data Factory for data orchestration. Clinical Data Security: Understanding of …
Apply protocols and standards to secure clinical data in-motion and at-rest. Shape Data Workflows: Use your expertise with Databricks components such as Delta Lake, Unity Catalog, and MLflow to ensure our data workflows are efficient, secure, and reliable. Key Responsibilities Data Engineering with Databricks: Utilize … CI/CD pipelines and manage container technologies to support a robust development environment. Requirements Expertise in Databricks: Proficiency with Databricks components such as Delta Lake, Unity Catalog, and MLflow. Azure Data Factory Knowledge: Experience with Azure Data Factory for data orchestration. Clinical Data Security: Understanding of …
unlock real impact. 🌟 What you'll do Build robust data pipelines using Python, PySpark, and cloud-native tools Engineer scalable data models with Databricks, Delta Lake, and Azure tech Collaborate with analysts, scientists, and fellow engineers to deliver insights Drive agile DevOps practices and continuous improvement Stay curious … re looking for Proven experience as a Data Engineer in cloud environments (Azure ideal) Proficiency in Python, SQL, Spark, Databricks Familiarity with Hadoop, NoSQL, Delta Lake Bonus: Azure Functions, Logic Apps, Django, CI/CD tools 💼 What you’ll get from Mars A competitive salary & bonus Hybrid working …
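The roles above centre on building PySpark pipelines that land data in Delta Lake on Azure. A minimal sketch of that pattern follows; it assumes a Databricks-style Spark environment with the Delta format available, and every path, column, and name in it is hypothetical rather than taken from any posting.

```python
# Illustrative only: a minimal PySpark pipeline writing a Delta Lake table.
# Assumes a Spark environment where the Delta format is available (e.g. Databricks);
# all paths, columns, and names below are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("example-pipeline").getOrCreate()

# Ingest raw CSV files from a data lake landing zone (hypothetical path).
raw = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv("/mnt/landing/sales/*.csv")
)

# Light cleansing and enrichment before persisting.
cleaned = (
    raw.dropDuplicates()
       .withColumn("load_date", F.current_date())
)

# Persist as a partitioned Delta table for downstream analytics.
(
    cleaned.write
    .format("delta")
    .mode("overwrite")
    .partitionBy("load_date")
    .save("/mnt/curated/sales_delta")
)
```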
/CD pipelines. Expertise in Unity Catalog for data governance and security. Proven ability to optimize Databricks data transformation workloads. Experience with Azure Data Lake, Delta Lake, and cloud-based data solutions. All profiles will be reviewed against the required skills and experience. Due to the high …
hands-on technical role responsible for designing, developing, and maintaining data pipelines within the IT department. The pipelines will be realised in a modern lake environment and the engineer will collaborate in cross-functional teams to gather requirements and develop the conceptual data models. This role plays a crucial … scalability, and efficiency. Highly Desirable: Experience with Informatica ETL, Hyperion Reporting, and intermediate/advanced PL/SQL. Desirable Experience in a financial corporation Lakehouse/Delta Lake and Snowflake Experience with Spark clusters, both elastic permanent and transitory clusters Familiarity with data governance, data security …
enriching, and aggregating data. Ensure data consistency and accuracy throughout the data lifecycle. Azure Databricks Implementation: Work extensively with Azure Databricks Unity Catalog, including Delta Lake, Spark SQL, and other relevant services. Implement best practices for Databricks development and deployment. Optimise Databricks workloads for performance and cost. Need …
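For the Databricks implementation work described here, namely Spark SQL over Unity Catalog-governed Delta tables, a minimal sketch might look like the following; the catalog, schema, and table names are invented for illustration and assume a Databricks workspace with Unity Catalog enabled.

```python
# Illustrative sketch: querying and maintaining a Delta table with Spark SQL
# inside a Databricks notebook. The three-level names (catalog.schema.table)
# follow the Unity Catalog convention; all names here are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Aggregate cleansed data into a reporting dataset (hypothetical table names).
daily_totals = spark.sql("""
    SELECT order_date,
           SUM(amount) AS total_amount
    FROM   analytics.sales.orders_cleaned
    GROUP  BY order_date
""")

# Persist the aggregate as a managed Delta table for BI consumption.
(
    daily_totals.write
    .format("delta")
    .mode("overwrite")
    .saveAsTable("analytics.sales.daily_order_totals")
)
```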
in building/architecting data analytic solutions. 3 years of experience in building data platforms using Azure services including Azure Databricks, Azure Data Factory, Delta Lake, Azure Data Lake (ADLS), Power BI. Solid hands-on experience with Azure Databricks - PySpark coding and Spark SQL coding - Must have. …
in Business Intelligence, with 5+ years in a BI leadership role in a global or matrixed organisation. Proven expertise in modern BI architecture (Data Lake, EDW, Streaming, APIs, Real-Time & Batch Processing). Demonstrated experience delivering cloud-based analytics platforms (Azure, AWS, GCP). Strong knowledge of data governance … BI will work within a modern, cloud-based BI ecosystem, including: Data Integration: Fivetran, HVR, Databricks, Apache Kafka, Google BigQuery, Google Analytics 4 Data Lake & Storage: Databricks Delta Lake, Amazon S3 Data Transformation: dbt Cloud Data Warehouse: Snowflake Analytics & Reporting: Power BI, Excel, Snowflake SQL REST API …
to influence others Skills and Abilities Platforms & Tools Cloud Computing platforms (ADLS Gen2), Microsoft Stack (Synapse, Databricks, Fabric, Profisee), Azure Service Bus, Power BI, Delta Lake, Azure DevOps, Azure Monitor, Azure Data Factory, SQL Server, Azure Data Lake Storage, Azure App Service, Azure ML is a plus Languages: Python, SQL, T-SQL …
we do Passion for data and experience working within a data-driven organization Hands-on experience with architecting, implementing, and performance tuning of: Data Lake technologies (e.g. Delta Lake, Parquet, Spark, Databricks) API & Microservices Message queues, streaming technologies, and event-driven architecture NoSQL databases and query languages …
Strong experience designing and delivering data solutions in the Databricks Data Intelligence Platform, either on Azure or AWS. Good working knowledge of Databricks components: Delta Lake, Unity Catalog, MLflow, etc. Expertise in SQL, Python and Spark (Scala or Python). Experience working with relational SQL databases either on premises or …
complex data concepts to non-technical stakeholders. Preferred Skills: Experience with insurance platforms such as Guidewire, Duck Creek, or legacy PAS systems. Knowledge of Delta Lake, Apache Spark, and data pipeline orchestration tools. Exposure to Agile delivery methodologies and tools like JIRA, Confluence, or Azure DevOps. Understanding of …
City of London, London, United Kingdom Hybrid / WFH Options
83zero Ltd
is a MUST! Key expertise and experience we're looking for: Data Engineering in Databricks - Spark programming with Scala, Python, SQL Ideally experience with Delta Lake or Databricks workflows, jobs, etc. Familiarity with Azure Data Lake: experience with data ingestion and ETL/ELT frameworks Data Governance experience …
Perform data cleaning, validation, and enrichment. Ensure data quality and consistency. Azure Databricks Implementation: Work with Azure Databricks Unity Catalog, including Delta Lake, Spark SQL, and other related services. Follow best practices for Databricks development and deployment. Contribute to optimising Databricks workloads. … Integration: Integrate data from various sources, including relational databases, APIs, and file systems. Work with different data formats (e.g., CSV, JSON, Parquet, Delta). Ensure data is readily available for analysis and modelling. Data should be accessible downstream to build dashboards and interactive reports. …
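A minimal sketch of the multi-format ingestion this listing describes, landing CSV, JSON, and Parquet sources as Delta tables for downstream dashboards, is shown below; it assumes a Spark environment with the Delta format available, and all lake paths are hypothetical.

```python
# Illustrative sketch of multi-format ingestion into Delta Lake.
# Assumes a Spark environment with the Delta format available (e.g. Databricks);
# all paths are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Source data arrives in several formats from different upstream systems.
orders_csv = spark.read.option("header", "true").csv("/mnt/raw/orders/*.csv")
events_json = spark.read.json("/mnt/raw/events/*.json")
customers_parquet = spark.read.parquet("/mnt/raw/customers/")

# Land each source as a Delta table so downstream dashboards and reports
# can query a single, consistent format.
orders_csv.write.format("delta").mode("append").save("/mnt/bronze/orders")
events_json.write.format("delta").mode("append").save("/mnt/bronze/events")
customers_parquet.write.format("delta").mode("overwrite").save("/mnt/bronze/customers")
```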