- Experience with real-time analytics from telemetry and event-based streaming (e.g., Kafka)
- Experience managing operational data stores with high availability, performance, and scalability
- Expertise in data lakes, lakehouses, Apache Iceberg, and data mesh architectures
- Proven ability to build, deliver, and support modern data platforms at scale
- Strong knowledge of data governance, data quality, and data cataloguing
- Experience with …
- … and managing cloud infrastructure as code
- Proficiency in programming languages such as Python, Spark, SQL
- Strong experience with SQL databases
- Expertise in data pipeline and workflow management tools (e.g., Apache Airflow, ADF)
- Experience with cloud platforms (Azure preferred) and related data services
- Excellent problem-solving skills and attention to detail
- Inclusive and curious; continuously seeks to build knowledge and …
- … of CI/CD tools and technologies (e.g., Git, GitLab, Jenkins, GCP, AWS)
- Knowledge of containerisation and microservice architecture
- Ability to develop dashboard UIs for publishing performance (e.g., Grafana, Apache Superset)
- Exposure to safety certification standards and processes

We provide:
- Competitive salary, benchmarked against the market and reviewed annually
- Company share programme
- Hybrid and/or flexible work …
- … in a hybrid environment requiring clear and effective communication
- Strong engineering fundamentals with a passion for simplicity and precision

Ideal, But Not Required:
- Experience with database technologies (Postgres, DynamoDB, Apache Iceberg)
- Experience with serverless technologies (e.g., Lambda)

Required Experience:
- Prior industry experience with Python
- Prior industry experience with public cloud providers (preferably AWS)

Our Offer: Work with …