PySpark Jobs in Chester

4 of 4 PySpark Jobs in Chester

Data Engineer - Microsoft Fabric

Chester, England, United Kingdom
JR United Kingdom
… and the broader Azure ecosystem.

Requirements:
- Proven experience as a Data Engineer working with Microsoft Fabric or related Azure data services
- Knowledge of using PySpark in notebooks for data analysis and manipulation
- Strong proficiency with SQL and data modelling
- Experience with modern ELT/ETL tools within the Microsoft …

Senior Data Architect

Chester, England, United Kingdom
JR United Kingdom
… at scale.

Requirements:
- Hands-on expertise in core GCP data services such as BigQuery, Composer, Dataform, Dataproc, and Pub/Sub
- Strong programming skills in PySpark, Python, and SQL
- Proficiency in ETL processes, data mining, and data storage principles
- Experience with BI and data visualisation tools, such as Looker or …

Founding Machine Learning Engineer

Chester, England, United Kingdom
Hybrid / WFH Options
JR United Kingdom
Requirements:
- Strong hands-on experience with ML frameworks (PyTorch, TensorFlow, Keras)
- Proficiency in Python and C/C++
- Experience with scalable data tools (e.g., PySpark, Kubernetes, Databricks, Apache Arrow)
- Proven ability to manage GPU-intensive data processing jobs
- 4+ years of applied research or industry experience
- Creative problem …

Data Engineer, Python, PySpark, and SQL, AWS

Chester, England, United Kingdom
JR United Kingdom
Client: Athsai
Location: Chester, United Kingdom
Job Category: Other
EU work permit required: Yes
Job Views: 3
Posted: 06.06.2025
Expiry Date: 21.07.2025

Job Description: … a crucial role in designing and implementing robust data pipelines, ensuring the integrity and accessibility of data across various platforms.

Required Skills:
- Proficient in PySpark and AWS
- Strong experience in designing, implementing, and debugging ETL pipelines
- Expertise in Python, PySpark, and SQL
- In-depth knowledge of Spark and …