Optimise data pipelines for performance, efficiency, and cost-effectiveness. Implement data quality checks and validation rules within data pipelines.

Data Transformation & Processing: Implement complex data transformations using Spark (PySpark or Scala) and other relevant technologies. Develop and maintain data processing logic for cleaning, enriching, and aggregating data. Ensure data consistency and accuracy throughout the data lifecycle.

Azure Databricks Implementation: Work extensively with Azure Databricks Unity Catalog, including Delta Lake, Spark SQL, and other relevant services. Implement best practices for Databricks development and deployment. Optimise Databricks workloads for performance and cost. Proficiency in programming languages such as SQL, Python, R, YAML, and JavaScript is required.

Data Integration: Integrate data from various sources, including relational databases, APIs, and streaming data sources. Implement data integration patterns and best practices. Work with API developers to ensure seamless data exchange.

Data Quality & Governance: Hands-on experience using Azure Purview for data governance.
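To make the transformation and data-quality duties above concrete, here is a minimal PySpark sketch; the table names, columns, and validation rule are hypothetical, invented for illustration rather than taken from the posting:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("order-cleaning").getOrCreate()

# Hypothetical source table; any Unity Catalog table would do here.
raw = spark.read.table("raw.orders")

# Cleaning: drop exact duplicates and rows missing mandatory keys.
cleaned = (
    raw.dropDuplicates(["order_id"])
       .filter(F.col("order_id").isNotNull() & F.col("amount").isNotNull())
)

# Enriching: derive an order-date column from the raw timestamp.
enriched = cleaned.withColumn("order_date", F.to_date("order_ts"))

# Simple validation rule: fail fast if negative amounts slip through.
bad_rows = enriched.filter(F.col("amount") < 0).count()
if bad_rows > 0:
    raise ValueError(f"{bad_rows} rows failed the amount >= 0 check")

# Aggregating: daily revenue per customer, written back as a Delta table.
daily = enriched.groupBy("customer_id", "order_date").agg(
    F.sum("amount").alias("daily_revenue")
)
daily.write.format("delta").mode("overwrite").saveAsTable("curated.daily_revenue")
```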
Clear communication and the capacity to articulate technical choices effectively are crucial.

Must-Have Skills:
- 3+ years Databricks (Lakehouse, Delta Lake, PySpark, Spark SQL) — see the illustrative sketch below
- 5+ years SQL
- Python
- Azure
- Excellent client-facing communication skills
- Experience deploying Databricks pipelines
- Experience provisioning Databricks as code
Advertised by Realm across multiple South East England locations, including Portsmouth, Oxford, South West London, and Crawley (United Kingdom).
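As a hedged illustration of the Lakehouse, Delta Lake, and Spark SQL skills in the list above — a minimal upsert sketch; the catalog and table names are invented for the example:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("lakehouse-merge").getOrCreate()

# Hypothetical Unity Catalog tables; names are illustrative only.
updates = spark.read.table("staging.customer_updates")
updates.createOrReplaceTempView("customer_updates")

# Delta Lake upsert expressed in Spark SQL: update matching rows,
# insert new ones. MERGE INTO is a core Delta Lake operation.
spark.sql("""
    MERGE INTO lakehouse.customers AS t
    USING customer_updates AS s
    ON t.customer_id = s.customer_id
    WHEN MATCHED THEN UPDATE SET *
    WHEN NOT MATCHED THEN INSERT *
""")
```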
Use appropriate tools to manage the platform, ensuring resilience and optimal performance are maintained.

Data Integration and Transformation: Integrate and transform data from multiple organisational SQL databases and SaaS applications using end-to-end, dependency-based data pipelines to establish an enterprise source of truth. Create ETL and ELT processes using Azure Databricks, ensuring audit-ready financial data pipelines and secure data exchange via Databricks Delta Sharing and SQL Warehouse endpoints.

Governance and Compliance: Ensure compliance with information security standards in our highly regulated financial landscape by implementing Databricks Unity Catalog for governance, data quality monitoring, and ADLS architecture.

Proven experience of ETL/ELT, including Lakehouse, pipeline design, and batch/stream processing. Strong working knowledge of programming languages, including Python, SQL, PowerShell, PySpark, and Spark SQL. Good working knowledge of data warehouse and data mart architectures. Good experience in Data Governance, including Unity Catalog.
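To make the "audit-ready" ELT description concrete, a minimal sketch, assuming a JDBC-reachable SQL source; the connection string, table names, and job name are placeholders, not details from the posting:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("finance-elt").getOrCreate()

# Extract: read one of the organisational SQL databases over JDBC.
# URL, table, and credentials are hypothetical; secrets should come
# from a secret scope rather than being hard-coded.
ledger = (
    spark.read.format("jdbc")
    .option("url", "jdbc:sqlserver://finance-db:1433;databaseName=ledger")
    .option("dbtable", "dbo.transactions")
    .option("user", "etl_user")
    .option("password", "<from-secret-scope>")
    .load()
)

# Load with audit columns so the pipeline stays audit-ready:
# every row records when and by which job it was ingested.
audited = (
    ledger.withColumn("ingested_at", F.current_timestamp())
          .withColumn("ingested_by", F.lit("finance-elt-job"))
)
audited.write.format("delta").mode("append").saveAsTable("bronze.transactions")
```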
La Fosse — London or Slough, South East England, United Kingdom (hybrid / WFH options)
Work with technical and non-technical teams. Troubleshoot issues and support wider team adoption of the platform.

What You’ll Bring:
- Proficiency in Python, PySpark, Spark SQL, or Java
- Experience with cloud tools (Lambda, S3, EKS, IAM)
- Knowledge of Docker, Terraform, GitHub Actions
- Understanding of data quality frameworks (a brief validation sketch follows below)
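As a hedged illustration of the data-quality point above — a minimal, framework-free validation helper in PySpark; the rule set, function name, and table are invented for the example:

```python
from pyspark.sql import SparkSession, functions as F, DataFrame

spark = SparkSession.builder.appName("dq-checks").getOrCreate()

def run_checks(df: DataFrame, not_null: list[str], unique_key: str) -> list[str]:
    """Return human-readable failures; an empty list means the frame passed."""
    failures = []
    for col in not_null:
        nulls = df.filter(F.col(col).isNull()).count()
        if nulls:
            failures.append(f"{col}: {nulls} null values")
    dupes = df.groupBy(unique_key).count().filter("count > 1").count()
    if dupes:
        failures.append(f"{unique_key}: {dupes} duplicate keys")
    return failures

# Hypothetical usage against an ingested table.
events = spark.read.table("bronze.events")
problems = run_checks(events, not_null=["event_id", "event_ts"], unique_key="event_id")
if problems:
    raise ValueError("Data quality checks failed: " + "; ".join(problems))
```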
Hybrid role on a digital Google Cloud transformation programme. Proficiency in programming languages such as Python, PySpark, and Java; develop ETL processes for data ingestion and preparation. Key technologies: Spark SQL, Cloud Run, Dataflow, Cloud Storage, BigQuery, Data Studio, Unix/Linux, version control tools (Git, GitHub), and automated deployment tools on Google Cloud Platform.
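For illustration, a minimal ingestion-and-preparation sketch, assuming the spark-bigquery connector is available on the cluster; the project, dataset, table, and bucket names are made up for the example:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("gcp-etl").getOrCreate()

# Extract: read a BigQuery table via the spark-bigquery connector.
clicks = (
    spark.read.format("bigquery")
    .option("table", "my-project.raw.clickstream")
    .load()
)

# Prepare: basic cleaning plus a daily aggregate, expressed in Spark SQL.
clicks.createOrReplaceTempView("clicks")
daily = spark.sql("""
    SELECT user_id, DATE(event_ts) AS event_date, COUNT(*) AS events
    FROM clicks
    WHERE user_id IS NOT NULL
    GROUP BY user_id, DATE(event_ts)
""")

# Load: write the prepared table back to BigQuery. The connector stages
# writes through a GCS bucket, supplied here as a placeholder.
(
    daily.write.format("bigquery")
    .option("table", "my-project.curated.daily_clicks")
    .option("temporaryGcsBucket", "my-temp-bucket")
    .mode("overwrite")
    .save()
)
```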