analysts and stakeholders to translate business needs into technical solutions. Maintain clear documentation and contribute to internal best practices.

Requirements
- Strong hands-on experience with PySpark (RDDs, DataFrames, Spark SQL).
- Proven ability to build and optimise ETL pipelines and dataflows.
- Familiarity with Microsoft Fabric or similar lakehouse/data platform environments.
- Experience with Git, CI …
business stakeholders to translate requirements into technical solutions. Create, maintain, and update documentation and internal knowledge repositories.

Your Profile
Essential skills/knowledge/experience:
- Ability to write Spark code for large-scale data processing, including RDDs, DataFrames, and Spark SQL.
- Hands-on experience with lakehouses, dataflows, pipelines, and semantic models.
- Ability to build ETL workflows.