… a MUST! Key expertise and experience we’re looking for: Data Engineering in Databricks – Spark programming with Scala, Python, and SQL; ideally experience with Delta Lake or Databricks workflows, jobs, etc. Familiarity with Azure Data Lake; experience with data ingestion and ETL/ELT frameworks; Data Governance …
London, South East England, United Kingdom – Hybrid / WFH Options
83zero
… complex data concepts to non-technical stakeholders. Preferred Skills: Experience with insurance platforms such as Guidewire, Duck Creek, or legacy PAS systems. Knowledge of Delta Lake, Apache Spark, and data pipeline orchestration tools. Exposure to Agile delivery methodologies and tools like JIRA, Confluence, or Azure DevOps. Understanding of …
London, South East England, United Kingdom – Hybrid / WFH Options
Undisclosed
… to define, develop, and deliver impactful data products to both internal stakeholders and end customers. Responsibilities: Design and implement scalable data pipelines using Databricks, Delta Lake, and Lakehouse architecture. Build and maintain a customer-facing analytics layer, integrating with tools like Power BI, Tableau, or Metabase. Optimise ETL processes …
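The "scalable data pipelines using Databricks, Delta Lake" requirement above usually implies familiarity with upsert (MERGE) semantics. As an illustration only — not part of the listing — here is a minimal, framework-free Python sketch of the idempotent upsert behaviour that Delta Lake's MERGE INTO provides, using plain dicts in place of Delta tables:

```python
# Illustrative only: models the idempotent upsert semantics of Delta Lake's
# MERGE INTO with plain Python dicts (no Spark/Databricks dependency).
# Rows are keyed by a primary key, as a Delta merge condition would be.

def merge_upsert(target: dict, updates: list, key: str) -> dict:
    """Apply MERGE-style semantics: matched keys are updated,
    unmatched source rows are inserted."""
    merged = dict(target)  # leave the original "table" untouched
    for row in updates:
        merged[row[key]] = row  # update if key exists, insert otherwise
    return merged

# A tiny target table and an incremental batch of source rows.
target = {1: {"id": 1, "status": "old"}, 2: {"id": 2, "status": "old"}}
batch = [{"id": 2, "status": "new"}, {"id": 3, "status": "new"}]

result = merge_upsert(target, batch, key="id")
# Re-applying the same batch changes nothing: the operation is idempotent.
assert merge_upsert(result, batch, key="id") == result
```

In real Delta Lake the same effect comes from `MERGE INTO target USING source ON target.id = source.id` with matched/not-matched clauses; the dict version only conveys the semantics.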
… performance, scalability, and security. Collaborate with business stakeholders, data engineers, and analytics teams to ensure solutions are fit for purpose. Implement and optimise Databricks Delta Lake, Medallion Architecture, and Lakehouse patterns for structured and semi-structured data. Ensure best practices in Azure networking, security, and federated data access …
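Medallion Architecture, named in the listing above, layers data as bronze (raw), silver (cleaned/validated), and gold (aggregated, business-ready). A minimal stdlib-Python sketch of that flow — illustrative only, with invented field names and no Databricks dependency:

```python
# Illustrative sketch of Medallion layering: bronze (raw as ingested) ->
# silver (typed and validated) -> gold (aggregated). Field names invented.

raw_bronze = [
    {"customer": "a", "amount": "10.5"},
    {"customer": "b", "amount": "oops"},   # malformed record, dropped at silver
    {"customer": "a", "amount": "4.5"},
]

def to_silver(rows):
    """Clean and type the raw layer, discarding unparsable rows."""
    silver = []
    for row in rows:
        try:
            silver.append({"customer": row["customer"],
                           "amount": float(row["amount"])})
        except ValueError:
            continue  # in practice, quarantine rather than silently drop
    return silver

def to_gold(rows):
    """Aggregate the silver layer into per-customer totals."""
    totals = {}
    for row in rows:
        totals[row["customer"]] = totals.get(row["customer"], 0.0) + row["amount"]
    return totals

gold = to_gold(to_silver(raw_bronze))
# gold == {"a": 15.0}
```

In a Databricks implementation each layer would typically be a Delta table and each arrow a Spark job or workflow task; the sketch only shows the progressive-refinement idea.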
… and AI. Databricks is headquartered in San Francisco, with offices around the globe, and was founded by the original creators of the Lakehouse, Apache Spark, Delta Lake, and MLflow. Benefits: At Databricks, we strive to provide comprehensive benefits and perks that meet the needs of all of our employees. …
… SQL and Python. Prior experience designing solutions on the Databricks Data Intelligence Platform, on either Azure or AWS. Good knowledge of Databricks components including Delta Lake, Unity Catalog, MLflow, etc. Experience building data pipelines and ETL processes. Experience with any of the following is highly desirable: Snowflake, Kafka, Azure Data …
… with 3+ years leading Databricks-based solutions. Proven experience in a consulting environment delivering large-scale data platform projects. Hands-on expertise in Spark, Delta Lake, MLflow, Unity Catalog, and DBSQL. Strong proficiency in Python, SQL, and at least one major cloud platform (AWS, Azure, or GCP). …
… implementation experience with Microsoft Fabric (preferred), along with familiarity with Azure Synapse and Databricks. Experience in core data platform technologies and methods including Spark, Delta Lake, Medallion Architecture, pipelines, etc. Experience leading medium to large-scale cloud data platform implementations, guiding teams through technical challenges and ensuring alignment …
… they scale their team and client base. Key Responsibilities: Architect and implement end-to-end, scalable data and AI solutions using the Databricks Lakehouse (Delta Lake, Unity Catalog, MLflow). Design and lead the development of modular, high-performance data pipelines using Apache Spark and PySpark. Champion the …
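"Modular, high-performance data pipelines" as in the listing above are commonly built by composing small, independently testable transformation steps. A hedged, framework-free Python sketch of that composition pattern (step names are invented; in PySpark the same idea is usually expressed with `DataFrame.transform`):

```python
from functools import reduce

# Illustrative composition pattern for modular pipelines: each step is a
# small pure function over a list of records, chained in declared order.
# A real Databricks pipeline would apply the same pattern to DataFrames.

def drop_nulls(records):
    """Step 1: filter out records with a missing value."""
    return [r for r in records if r.get("value") is not None]

def scale(records):
    """Step 2: double each value (stand-in for a business transformation)."""
    return [{**r, "value": r["value"] * 2} for r in records]

def pipeline(records, steps):
    """Apply each transformation step in sequence."""
    return reduce(lambda acc, step: step(acc), steps, records)

data = [{"value": 1}, {"value": None}, {"value": 3}]
out = pipeline(data, [drop_nulls, scale])
# out == [{"value": 2}, {"value": 6}]
```

Keeping each step a pure function is what makes the pipeline modular: steps can be unit-tested in isolation and reordered or reused across jobs.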
South West London, South East England, United Kingdom
OptumUK
… for you to demonstrate include: Desire to expand your cloud/platform engineering capabilities. Experience working with Big Data. Experience with data storage technologies: Delta Lake, Iceberg, Hudi. Knowledge and understanding of Apache Spark, Databricks, or Hadoop. Ability to take business requirements and translate these into tech specifications …