Permanent Spark SQL Jobs in Hampshire

7 of 7 Permanent Spark SQL Jobs in Hampshire

Data Engineer

Southampton, South East England, United Kingdom
Realm
… deployment. Clear communication and the capacity to articulate technical choices effectively are crucial.
Must-Have Skills:
- 3+ years Databricks (Lakehouse, Delta Lake, PySpark, Spark SQL)
- 5+ years SQL
- Python
- Azure
- Excellent client-facing communication skills
- Experience deploying Databricks pipelines
- Experience provisioning Databricks as code …
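The Spark SQL skill named above amounts to expressing transformations as SQL over tables. As a minimal, Spark-free sketch of that kind of query (sqlite3 stands in for the Spark SQL engine so it runs anywhere; the table and column names are invented for illustration):

```python
import sqlite3

# Hypothetical sketch: the kind of aggregation a Databricks engineer
# would express in Spark SQL over a Lakehouse table. sqlite3 stands in
# for the Spark SQL engine; schema and data are invented.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("south_east", 120.0), ("south_east", 80.0), ("london", 50.0)],
)

# In Databricks this query text would be passed to spark.sql(...) instead.
rows = conn.execute(
    "SELECT region, SUM(amount) AS total "
    "FROM orders GROUP BY region ORDER BY total DESC"
).fetchall()
print(rows)  # [('south_east', 200.0), ('london', 50.0)]
```

The query itself is portable; only the engine binding changes between this sketch and a Databricks notebook.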

Data Engineer

Basingstoke, South East England, United Kingdom
Realm
… deployment. Clear communication and the capacity to articulate technical choices effectively are crucial.
Must-Have Skills:
- 3+ years Databricks (Lakehouse, Delta Lake, PySpark, Spark SQL)
- 5+ years SQL
- Python
- Azure
- Excellent client-facing communication skills
- Experience deploying Databricks pipelines
- Experience provisioning Databricks as code …

Data Engineer

Portsmouth, Hampshire, South East England, United Kingdom
Realm
… deployment. Clear communication and the capacity to articulate technical choices effectively are crucial.
Must-Have Skills:
- 3+ years Databricks (Lakehouse, Delta Lake, PySpark, Spark SQL)
- 5+ years SQL
- Python
- Azure
- Excellent client-facing communication skills
- Experience deploying Databricks pipelines
- Experience provisioning Databricks as code …

Data Engineer

Southampton, South East England, United Kingdom
CACTUS
… extraction, transformation, and loading (ETL) processes.
- Ensure data pipelines are efficient, reliable, and scalable.
Data Transformation & Processing:
- Implement data transformations using Spark (PySpark or Scala) and other relevant tools.
- Perform data cleaning, validation, and enrichment.
- Ensure data quality and consistency.
Azure Databricks Implementation:
- … Work with Azure Databricks Unity Catalog, including Delta Lake, Spark SQL, and other related services.
- Follow best practices for Databricks development and deployment.
- Contribute to optimising Databricks workloads.
- Program in languages such as SQL, Python, R, YAML and …
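The "data cleaning, validation, and enrichment" step above reduces to applying row-level rules before load. A minimal stdlib sketch of that shape (in practice this would be PySpark DataFrame operations on Databricks; the field names and rules here are invented for illustration):

```python
# Hypothetical sketch of cleaning, validation, and enrichment rules.
# Production code would run these as PySpark transformations; the
# row-level logic is the same. Field names and rules are invented.
def clean(records):
    valid, rejected = [], []
    for rec in records:
        email = (rec.get("email") or "").strip().lower()  # cleaning
        if "@" not in email or rec.get("age", -1) < 0:    # validation
            rejected.append(rec)
            continue
        # enrichment: derive a field absent from the source system
        valid.append({**rec, "email": email, "domain": email.split("@")[1]})
    return valid, rejected

raw = [
    {"email": "  Ada@Example.COM ", "age": 36},
    {"email": "not-an-email", "age": 20},
    {"email": "bob@test.org", "age": -1},
]
valid, rejected = clean(raw)
print(len(valid), len(rejected))  # 1 2
```

Keeping rejected rows, rather than silently dropping them, is what makes the "data quality and consistency" requirement auditable.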

Data Engineer

Basingstoke, South East England, United Kingdom
CACTUS
… extraction, transformation, and loading (ETL) processes.
- Ensure data pipelines are efficient, reliable, and scalable.
Data Transformation & Processing:
- Implement data transformations using Spark (PySpark or Scala) and other relevant tools.
- Perform data cleaning, validation, and enrichment.
- Ensure data quality and consistency.
Azure Databricks Implementation:
- … Work with Azure Databricks Unity Catalog, including Delta Lake, Spark SQL, and other related services.
- Follow best practices for Databricks development and deployment.
- Contribute to optimising Databricks workloads.
- Program in languages such as SQL, Python, R, YAML and …

Data Engineer

Portsmouth, Hampshire, South East England, United Kingdom
CACTUS
… extraction, transformation, and loading (ETL) processes.
- Ensure data pipelines are efficient, reliable, and scalable.
Data Transformation & Processing:
- Implement data transformations using Spark (PySpark or Scala) and other relevant tools.
- Perform data cleaning, validation, and enrichment.
- Ensure data quality and consistency.
Azure Databricks Implementation:
- … Work with Azure Databricks Unity Catalog, including Delta Lake, Spark SQL, and other related services.
- Follow best practices for Databricks development and deployment.
- Contribute to optimising Databricks workloads.
- Program in languages such as SQL, Python, R, YAML and …

Senior Data Engineer

Southampton, Hampshire, United Kingdom
Aztec
… tools to manage the platform, ensuring resilience and optimal performance are maintained.
Data Integration and Transformation: Integrate and transform data from multiple organisational SQL databases and SaaS applications using end-to-end dependency-based data pipelines to establish an enterprise source of truth. Create ETL and ELT processes … using Azure Databricks, ensuring audit-ready financial data pipelines and secure data exchange with Databricks Delta Sharing and SQL Warehouse endpoints.
Governance and Compliance: Ensure compliance with information security standards in our highly regulated financial landscape by implementing Databricks Unity Catalog for governance, data quality monitoring, and ADLS … architecture.
- Proven experience of ETL/ELT, including Lakehouse, pipeline design, and batch/stream processing.
- Strong working knowledge of programming languages, including Python, SQL, PowerShell, PySpark, and Spark SQL.
- Good working knowledge of data warehouse and data mart architectures.
- Good experience in Data Governance, including Unity Catalog …
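"End-to-end dependency-based data pipelines" means each step runs only after the tables it reads from have been built. A minimal sketch of that ordering using a topological sort (the step names are invented, and in practice an orchestrator such as Databricks Workflows would own this scheduling):

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline graph: each step maps to the steps it depends on.
# Step names are invented for illustration; a real platform would
# delegate this ordering to an orchestrator like Databricks Workflows.
deps = {
    "staging_sql": set(),                          # extract from SQL databases
    "staging_saas": set(),                         # extract from SaaS applications
    "conformed": {"staging_sql", "staging_saas"},  # merge into one model
    "finance_mart": {"conformed"},                 # publish the source of truth
}

# static_order() yields a run order in which every step follows its inputs.
order = list(TopologicalSorter(deps).static_order())
print(order)
```

The same graph also tells the orchestrator which steps can run in parallel: here both staging extracts have no dependencies on each other.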
Employment Type: Permanent
Salary: GBP Annual