Permanent Spark SQL Jobs in Kent

4 of 4 Permanent Spark SQL Jobs in Kent

Data Engineer

Dartford, South East England, United Kingdom
Realm
… deployment. Clear communication and the capacity to articulate technical choices effectively are crucial.

Must-Have Skills:
- 3+ years Databricks (Lakehouse, Delta Lake, PySpark, Spark SQL)
- 5+ years SQL
- Python
- Azure
- Excellent client-facing communication skills
- Experience deploying Databricks pipelines
- Experience provisioning Databricks as code
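For illustration only, a minimal PySpark sketch of the kind of Lakehouse work this role names: a Spark SQL transformation over a raw Delta table, written back as a curated Delta table. The paths, table names, and columns are placeholders, not taken from the listing.

```python
# Sketch of a Lakehouse-style Spark SQL transformation (hypothetical names).
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("lakehouse-example").getOrCreate()

# Register a raw Delta source for SQL access (path is illustrative only).
raw = spark.read.format("delta").load("/mnt/raw/orders")
raw.createOrReplaceTempView("orders_raw")

# Spark SQL transformation: aggregate daily order totals per customer.
daily_totals = spark.sql("""
    SELECT order_date,
           customer_id,
           SUM(order_amount) AS total_amount
    FROM orders_raw
    GROUP BY order_date, customer_id
""")

# Persist the curated result back to the lakehouse as a Delta table.
daily_totals.write.format("delta").mode("overwrite") \
    .saveAsTable("curated.daily_order_totals")
```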

Data Engineer

Maidstone, South East England, United Kingdom
Realm
… deployment. Clear communication and the capacity to articulate technical choices effectively are crucial.

Must-Have Skills:
- 3+ years Databricks (Lakehouse, Delta Lake, PySpark, Spark SQL)
- 5+ years SQL
- Python
- Azure
- Excellent client-facing communication skills
- Experience deploying Databricks pipelines
- Experience provisioning Databricks as code

Data Engineer

Dartford, South East England, United Kingdom
CACTUS
… extraction, transformation, and loading (ETL) processes.
- Ensure data pipelines are efficient, reliable, and scalable.

Data Transformation & Processing:
- Implement data transformations using Spark (PySpark or Scala) and other relevant tools.
- Perform data cleaning, validation, and enrichment.
- Ensure data quality and consistency.

Azure Databricks Implementation:
- … Work with Azure Databricks Unity Catalog, including Delta Lake, Spark SQL, and other related services.
- Follow best practices for Databricks development and deployment.
- Contribute to optimising Databricks workloads.
- Program using languages such as SQL, Python, R, YAML and …
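For illustration only, a minimal PySpark sketch of the cleaning, validation, and enrichment step this listing describes. The schema, paths, and reference table are assumptions, not part of the advertised role.

```python
# Sketch of a cleaning/validation/enrichment ETL step (hypothetical schema).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("etl-cleaning-example").getOrCreate()

# Illustrative source: a raw customer feed landed in the lakehouse.
customers = spark.read.format("delta").load("/mnt/raw/customers")

cleaned = (
    customers
    # Cleaning: drop rows missing the key and normalise string noise.
    .dropna(subset=["customer_id"])
    .withColumn("email", F.lower(F.trim(F.col("email"))))
    # Validation: flag rows with a plausible email address.
    .withColumn("is_valid_email",
                F.col("email").rlike(r"^[^@\s]+@[^@\s]+\.[^@\s]+$"))
)

valid = cleaned.filter("is_valid_email")
rejected = cleaned.filter("NOT is_valid_email")

# Enrichment: join a reference table to add region information.
regions = spark.read.table("reference.postcode_regions")
enriched = valid.join(regions, on="postcode", how="left")

# Write curated and quarantined outputs as Delta tables.
enriched.write.format("delta").mode("overwrite").saveAsTable("curated.customers")
rejected.write.format("delta").mode("overwrite").saveAsTable("quarantine.customers_rejected")
```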

Data Engineer

Maidstone, South East England, United Kingdom
CACTUS
… extraction, transformation, and loading (ETL) processes.
- Ensure data pipelines are efficient, reliable, and scalable.

Data Transformation & Processing:
- Implement data transformations using Spark (PySpark or Scala) and other relevant tools.
- Perform data cleaning, validation, and enrichment.
- Ensure data quality and consistency.

Azure Databricks Implementation:
- … Work with Azure Databricks Unity Catalog, including Delta Lake, Spark SQL, and other related services.
- Follow best practices for Databricks development and deployment.
- Contribute to optimising Databricks workloads.
- Program using languages such as SQL, Python, R, YAML and …