PySpark Jobs in Buckinghamshire

9 of 9 PySpark Jobs in Buckinghamshire

Senior Big Data Engineer (Databricks) - RELOCATION TO ABU DHABI

Milton Keynes, South East England, United Kingdom
SoftServe
… cloud providers and collaborate with many technology partners like AWS, GCP, Microsoft, Databricks, Snowflake, Confluent, and others. If you are: experienced with Python/PySpark; proficient working with the Databricks Lakehouse architecture and principles; having 2+ years of designing data models, building ETL pipelines, and wrangling data to solve business …
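For illustration only, a minimal sketch of the kind of PySpark ETL step this role describes: read raw records, tidy them, and write a Delta table. The paths, column names, and the Databricks/Delta environment are assumptions, not details from the listing.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Hypothetical landing path holding raw JSON order records.
raw = spark.read.json("/mnt/raw/orders/")

cleaned = (
    raw
    .dropDuplicates(["order_id"])                     # basic de-duplication
    .withColumn("order_date", F.to_date("order_ts"))  # derive a date column
    .filter(F.col("amount") > 0)                      # drop invalid rows
)

# Write to a hypothetical "silver" layer as a Delta table
# (assumes Delta Lake is available, as it is on Databricks).
cleaned.write.format("delta").mode("overwrite").save("/mnt/lakehouse/silver/orders")
```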

Senior Big Data Engineer (Databricks) - RELOCATION TO ABU DHABI

High Wycombe, South East England, United Kingdom
SoftServe
… cloud providers and collaborate with many technology partners like AWS, GCP, Microsoft, Databricks, Snowflake, Confluent, and others. If you are: experienced with Python/PySpark; proficient working with the Databricks Lakehouse architecture and principles; having 2+ years of designing data models, building ETL pipelines, and wrangling data to solve business …

Data Engineer - Microsoft Fabric

Milton Keynes, South East England, United Kingdom
Agile
… and the broader Azure ecosystem. Requirements: proven experience as a Data Engineer working with Microsoft Fabric or related Azure data services; knowledge of using PySpark in notebooks for data analysis and manipulation; strong proficiency with SQL and data modelling; experience with modern ELT/ETL tools within the Microsoft …
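As a rough illustration of "PySpark in notebooks for data analysis and manipulation", here is a notebook-style sketch. The table and column names are hypothetical, and in a Microsoft Fabric or Databricks notebook the Spark session is normally pre-created.

```python
from pyspark.sql import SparkSession, functions as F

# In a Fabric (or Databricks) notebook `spark` already exists; this line
# just keeps the sketch self-contained if run elsewhere.
spark = SparkSession.builder.getOrCreate()

sales = spark.read.table("lakehouse.sales")   # hypothetical lakehouse table

monthly = (
    sales
    .withColumn("month", F.date_trunc("month", F.col("sale_date")))
    .groupBy("month", "region")
    .agg(
        F.sum("amount").alias("total_amount"),
        F.countDistinct("customer_id").alias("customers"),
    )
    .orderBy("month")
)

monthly.show(12, truncate=False)
```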

Data Engineer - Microsoft Fabric

High Wycombe, South East England, United Kingdom
Agile
… and the broader Azure ecosystem. Requirements: proven experience as a Data Engineer working with Microsoft Fabric or related Azure data services; knowledge of using PySpark in notebooks for data analysis and manipulation; strong proficiency with SQL and data modelling; experience with modern ELT/ETL tools within the Microsoft …

Azure Data Engineer

Milton Keynes, Buckinghamshire, UK
Kinetech
… relational databases. Experience of Azure Fabric and its use in data engineering and data management. A high degree of proficiency with tools like Terraform, PySpark, and Databricks. Understanding of data migration concepts, including mapping, transformation, cleansing, and validation. Strong attention to detail and problem-solving ability. Must be comfortable …
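A hedged sketch of the cleansing-and-validation part of a data migration in PySpark, in the spirit of the skills listed; the source file, columns, and rules are hypothetical examples rather than anything from the listing.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("customer_migration").getOrCreate()

# Hypothetical source extract for the migration.
src = spark.read.option("header", True).csv("/mnt/migration/customers.csv")

# Cleansing: normalise a couple of fields.
cleansed = (
    src
    .withColumn("email", F.lower(F.trim("email")))
    .withColumn("postcode", F.upper(F.trim("postcode")))
)

# Validation: flag rows that fail basic rules, then split valid from rejected.
validated = cleansed.withColumn(
    "is_valid",
    F.col("customer_id").isNotNull()
    & F.col("email").rlike(r"^[^@\s]+@[^@\s]+\.[^@\s]+$"),
)

valid = validated.filter(F.col("is_valid"))
rejected = validated.filter(~F.col("is_valid"))

print(f"valid={valid.count()}, rejected={rejected.count()}")
```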

Azure Data Engineer

Milton Keynes, England, United Kingdom
Kinetech
… relational databases. Experience of Azure Fabric and its use in data engineering and data management. A high degree of proficiency with tools like Terraform, PySpark, and Databricks. Understanding of data migration concepts, including mapping, transformation, cleansing, and validation. Strong attention to detail and problem-solving ability. Must be comfortable …

Azure Data Engineer

High Wycombe, South East England, United Kingdom
Kinetech
… relational databases. Experience of Azure Fabric and its use in data engineering and data management. A high degree of proficiency with tools like Terraform, PySpark, and Databricks. Understanding of data migration concepts, including mapping, transformation, cleansing, and validation. Strong attention to detail and problem-solving ability. Must be comfortable …

Data Engineer

Milton Keynes, South East England, United Kingdom
Realm
… ready for deployment. Clear communication and the capacity to articulate technical choices effectively are crucial. Must-have skills: 3+ years Databricks (Lakehouse, Delta Lake, PySpark, Spark SQL); 5+ years SQL; Python; Azure; excellent client-facing communication skills; experience deploying Databricks pipelines; experience provisioning Databricks as code. Nice to have: …
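By way of example, a small sketch mixing PySpark with Spark SQL over a Delta table, along the lines of the Databricks skills listed; the paths, the "gold" schema, and a Delta-enabled Databricks workspace are assumptions.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("events_report").getOrCreate()

# Hypothetical Delta table in the silver layer (Delta Lake assumed available,
# as it is on Databricks).
events = spark.read.format("delta").load("/mnt/lakehouse/silver/events")
events.createOrReplaceTempView("events")

# Spark SQL over the same data: a simple daily roll-up.
daily = spark.sql("""
    SELECT event_date,
           event_type,
           COUNT(*) AS event_count
    FROM events
    GROUP BY event_date, event_type
    ORDER BY event_date
""")

# Hypothetical "gold" schema, assumed to already exist in the catalog.
daily.write.format("delta").mode("overwrite").saveAsTable("gold.daily_events")
```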

Data Engineer

High Wycombe, South East England, United Kingdom
Realm
… ready for deployment. Clear communication and the capacity to articulate technical choices effectively are crucial. Must-have skills: 3+ years Databricks (Lakehouse, Delta Lake, PySpark, Spark SQL); 5+ years SQL; Python; Azure; excellent client-facing communication skills; experience deploying Databricks pipelines; experience provisioning Databricks as code. Nice to have: …