data and analytic solutions. Developing data processing pipelines in Python for Databricks, including many of the following technologies: Spark, Delta, Delta Live Tables, PyTest, Great Expectations (or similar). Building and orchestrating data and analytical processing for streaming data with technologies such as Kafka, AWS Kinesis or Azure …
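As an illustration of the kind of Databricks streaming pipeline described in that listing, here is a minimal sketch, assuming a Spark environment with the Kafka connector available; the broker address, topic name and storage paths are hypothetical placeholders, not taken from the listing.

```python
from pyspark.sql import SparkSession, functions as F

# Minimal Kafka -> Delta streaming sketch. On Databricks a SparkSession is
# usually provided; getOrCreate() is used here so the snippet is self-contained.
spark = SparkSession.builder.appName("events-ingest").getOrCreate()

raw = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
    .option("subscribe", "events")                      # hypothetical topic
    .load()
)

# Cast the Kafka key/value bytes to strings and keep the event timestamp.
parsed = raw.select(
    F.col("key").cast("string"),
    F.col("value").cast("string").alias("payload"),
    F.col("timestamp"),
)

# Append into a Delta table, with a checkpoint location for recovery.
query = (
    parsed.writeStream
    .format("delta")
    .option("checkpointLocation", "/tmp/checkpoints/events")  # placeholder path
    .outputMode("append")
    .start("/tmp/delta/events")                               # placeholder path
)
```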
for AI/ML models using tools like MosaicML. Working with semantic layers and advanced data modelling techniques. Implementing automated testing frameworks like Great Expectations or PyTest. Ready to lead the charge in a company that values innovation, autonomy and real impact? Let’s chat. Apply …
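A minimal PyTest sketch for the kind of automated testing this listing asks for, assuming a local Spark session; the clean_customers transformation and its column names are hypothetical examples, not taken from the listing.

```python
import pytest
from pyspark.sql import SparkSession


# Hypothetical transformation under test: drop rows with null ids, then de-duplicate.
def clean_customers(df):
    return df.dropna(subset=["customer_id"]).dropDuplicates(["customer_id"])


@pytest.fixture(scope="session")
def spark():
    # Small local session so the test suite runs without a cluster.
    return SparkSession.builder.master("local[1]").appName("tests").getOrCreate()


def test_clean_customers_removes_nulls_and_duplicates(spark):
    rows = [(1, "a"), (1, "a"), (None, "b")]
    df = spark.createDataFrame(rows, ["customer_id", "name"])

    result = clean_customers(df)

    assert result.count() == 1
    assert result.first()["customer_id"] == 1
```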
Experience with cloud-based data platforms, preferably Azure Databricks, Azure Data Factory, or Synapse Analytics. Basic experience with test automation frameworks (e.g., PyTest, Great Expectations, dbt tests, or similar). Proficiency in SQL and scripting languages (e.g., Python, Scala) for test automation. Understanding of CI/CD …
with cloud-based data platforms, particularly Azure Databricks, Azure Data Factory, and Synapse Analytics. Hands-on experience with test automation frameworks (e.g., PyTest, Great Expectations, dbt tests, or similar). Proficiency in SQL and scripting languages (e.g., Python, Scala) for test automation. Experience in CI/CD …
communication skills with both technical and business customers • Experience with CI/CD pipelines for data transformations • Experience with data quality and testing tools (Great Expectations, dbt tests) What will help you on the job • Experience working within an agile team and familiarity with JIRA • Degree in a …
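A small data-quality sketch for the tools named in this listing, using the legacy Pandas API of Great Expectations (pre-1.0 releases; newer versions use a context-based fluent API instead); the table and column names are hypothetical.

```python
import pandas as pd
import great_expectations as ge  # legacy (pre-1.0) API assumed

# Hypothetical orders data standing in for a real pipeline output.
orders = pd.DataFrame({"order_id": [1, 2, 3], "amount": [10.0, 25.5, 7.2]})
dataset = ge.from_pandas(orders)

# Declare simple expectations and check that both pass.
ids_ok = dataset.expect_column_values_to_not_be_null("order_id")
amounts_ok = dataset.expect_column_values_to_be_between("amount", min_value=0, max_value=10_000)

assert ids_ok.success and amounts_ok.success
```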
Document test results, defects, and quality metrics. Preferred qualifications: Experience with PySpark or notebooks in Databricks. Exposure to Azure DevOps, unit testing frameworks, or Great Expectations for data testing. Knowledge of data warehousing or medallion architecture (bronze, silver, gold layers). Experience with data visualization tools (e.g., Power …
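A brief sketch of the bronze-to-silver step in the medallion architecture mentioned above, assuming Delta tables at hypothetical lake paths; the gold aggregation layer is omitted.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("medallion-demo").getOrCreate()

# Bronze: raw events landed as-is from the source system.
bronze = spark.read.format("delta").load("/lake/bronze/events")  # placeholder path

# Silver: de-duplicated, typed, and filtered records ready for modelling.
silver = (
    bronze
    .dropDuplicates(["event_id"])
    .withColumn("event_ts", F.to_timestamp("event_ts"))
    .filter(F.col("event_id").isNotNull())
)

silver.write.format("delta").mode("overwrite").save("/lake/silver/events")  # placeholder path
```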