quality, and performance. Utilise Azure Databricks and adhere to code-based deployment practices.
Essential Skills:
- Over 3 years of experience with Databricks (including Lakehouse, Delta Lake, PySpark, Spark SQL)
- Strong proficiency in SQL with 5+ years of experience
- Extensive experience with Azure Data Factory
- Proficiency in Python programming
SQL and Python
- Prior experience designing solutions on the Databricks Data Intelligence Platform, either on Azure or AWS
- Good knowledge of Databricks components including Delta Lake, Unity Catalog, MLflow, etc.
- Experience building data pipelines and ETL processes
- Experience with any of the following is highly desirable: Snowflake, Kafka, Azure Data
and building end-to-end data pipelines.
- Proficient in Python and/or Scala; solid understanding of SQL and distributed computing principles
- Experience with Delta Lake, Lakehouse architecture, and data governance frameworks
- Excellent client-facing and communication skills
- Experience with Azure Data Services is desirable (e.g. Azure Data Lake, Synapse, Data Factory, Fabric)
This role is ideal for someone who enjoys blending technical precision with innovation. You'll:
- Build and manage ML pipelines in Databricks using MLflow, Delta Lake, Spark, and Mosaic AI
- Train and deploy generative models (LLMs, GANs, VAEs) for NLP, content generation, and synthetic data
- Architect scalable solutions