joining data from various sources. About the role: The ideal candidate should be adept in ETL tools such as Informatica, Glue, Databricks, and DataProc, with strong coding skills in Python/PySpark and SQL. Expertise in data warehousing solutions such as Snowflake, BigQuery, Lakehouse, and Delta Lake is essential, including the ability to calculate processing costs and address …
team environment. Desired, but not essential, skills: Building and deploying cloud applications. Web API development. Working with financial data. Linux/Unix experience. Containerised development. Analytics libraries such as Pandas, PySpark, DuckDB. Basic knowledge of Docker, Podman, Kubernetes. Experience with Infrastructure-as-Code tools like Terraform. Proficiency with Git. Proficiency with CI/CD automation using GitLab …
experience with Databricks (Delta Lake, Unity Catalog, Lakehouse architecture). Strong knowledge of Azure services (e.g. Data Lake, Data Factory, Synapse). Solid hands-on skills in Spark, Python, PySpark, and SQL. Understanding of data modelling, governance, and BI integration. Familiarity with CI/CD, Git, and Infrastructure as Code (e.g. Terraform). Excellent communication and mentoring skills. Desirable …