No problem.
* Docker & Linux - smooth setup and deployment
* AI/ML - real models, not just slide decks
* Git - because version control is second nature

Bonus Skills (nice to have):
* Apache Spark
* Kubernetes
* Prefect, Airflow, or similar
* Torch, vLLM, or LLM-related tooling
* pydantic.ai or similar
* Shell scripting

Team
We believe in flat hierarchies …
the development of internal ML platforms, and help identify high-impact opportunities for rapid capability delivery. You'll also be responsible for building robust MLOps pipelines using tools like Airflow and MLflow, ensuring scalable and automated model deployment and governance.

What you'll need to succeed
Essential Skills:
* Bachelor's degree (2:1 or above) in science, engineering, mathematics … or computer science
* Strong Python development skills
* Practical experience with machine learning or deep learning in scientific/engineering contexts
* Familiarity with MLOps tools and practices (Airflow, MLflow) and containerisation (e.g., Docker)
* Passion for working with real-world datasets and data visualisation
* Interest in materials discovery, computer vision, big data, and optimisation techniques
* Excellent communication and collaboration skills
* Problem …
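The MLOps-pipeline responsibility described in the listing above boils down to running tasks in dependency order. Below is a minimal, dependency-free Python sketch of that idea; the task names (`extract`, `train`, `evaluate`, `deploy`) are invented for illustration, and a real setup would express the same structure as an Airflow DAG rather than hand-rolled code.

```python
# Dependency-free sketch of a DAG-style ML pipeline: named tasks with
# explicit upstream dependencies, executed so every upstream task
# finishes before its downstream task starts. Orchestrators such as
# Airflow express and schedule this same structure for you.

def run_pipeline(tasks, deps):
    """tasks: name -> callable; deps: name -> list of upstream names."""
    done, order = set(), []

    def run(name):
        if name in done:
            return
        for upstream in deps.get(name, []):
            run(upstream)  # ensure upstream tasks finish first
        tasks[name]()
        done.add(name)
        order.append(name)

    for name in tasks:
        run(name)
    return order

if __name__ == "__main__":
    log = []
    tasks = {
        "extract": lambda: log.append("extract"),
        "train": lambda: log.append("train"),
        "evaluate": lambda: log.append("evaluate"),
        "deploy": lambda: log.append("deploy"),
    }
    deps = {"train": ["extract"], "evaluate": ["train"], "deploy": ["evaluate"]}
    # Tasks run respecting dependencies:
    print(run_pipeline(tasks, deps))  # ['extract', 'train', 'evaluate', 'deploy']
```

A scheduler adds retries, backfills, and monitoring on top of this ordering; the ordering itself is the core contract of a pipeline.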
concepts in simple, actionable terms.
* Collaborate with the central data engineering team managing the cloud infrastructure (BigQuery, GCP).
* Coordinate with third-party partners responsible for pipeline orchestration and Airflow maintenance.
* Drive standardisation across naming conventions, definitions, and documentation to strengthen governance and data discoverability.
* Influence the organisation’s data roadmap, ensuring it aligns with evolving business priorities.
* Take … cloud data warehouses (ideally BigQuery).
* Proficiency in modular data modelling frameworks such as Dataform or dbt, with an emphasis on testing and documentation.
* Experience with orchestration tools (e.g. Airflow) for managing data pipelines.
* Familiarity with modern data stacks, version control (Git), and CI/CD best practices.
* Strong ability to validate and interrogate large datasets, with a high …
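The emphasis on testing in frameworks like dbt comes down to a few generic column checks (dbt ships `not_null` and `unique` as built-in tests). As an illustrative sketch, not tied to any specific stack, the same checks over plain Python dicts look like this; the `orders` rows are invented sample data:

```python
# Sketch of the dataset validation that dbt codifies as schema tests:
# not_null and unique checks on a column, here over plain Python dicts.

def check_not_null(rows, column):
    """True if no row has a missing value in the column."""
    return all(row.get(column) is not None for row in rows)

def check_unique(rows, column):
    """True if no two rows share a value in the column."""
    values = [row.get(column) for row in rows]
    return len(values) == len(set(values))

orders = [
    {"order_id": 1, "customer": "a"},
    {"order_id": 2, "customer": "b"},
    {"order_id": 2, "customer": None},  # duplicate id, null customer
]
print(check_not_null(orders, "customer"))  # False
print(check_unique(orders, "order_id"))    # False
```

In dbt the equivalent checks are declared in YAML next to the model, so they run and are documented alongside every build.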
them into SDKs, and use them as the foundation for first-class integrations with platforms like Snowflake, Databricks, and BigQuery, as well as orchestration and streaming technologies such as Airflow, Kafka, and dbt. You will ensure these APIs and SDKs are designed with a best-in-class developer experience - consistent, intuitive, well-documented, and secure. Your work will be … ideally with a focus on APIs, SDKs, data platforms, data integration, or enterprise SaaS.
* Data platform knowledge: Strong familiarity with data warehouses/lakehouses (Snowflake, Databricks, BigQuery), orchestration tools (Airflow, Prefect), streaming (Kafka, Flink), and transformation (dbt).
* Technical proficiency: Solid understanding of REST/GraphQL APIs, SDK development, authentication/authorization standards (OAuth, SSO), and best practices in …
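The "consistent, intuitive, secure" SDK goal above usually means a thin client that centralises request construction, so every call gets the same base URL handling and auth headers. A hypothetical stdlib-only sketch (the base URL, endpoint path, and bearer-token scheme are all illustrative, not any real product's API):

```python
# Hypothetical SDK-client pattern: one place builds every request, so
# auth and content negotiation are applied consistently across all
# integrations. Uses only the standard library's urllib.request.
from urllib.request import Request

class ApiClient:
    def __init__(self, base_url, token):
        self.base_url = base_url.rstrip("/")
        self.token = token

    def build_request(self, path, method="GET"):
        # Every request gets the same auth and Accept headers.
        return Request(
            f"{self.base_url}/{path.lstrip('/')}",
            method=method,
            headers={
                "Authorization": f"Bearer {self.token}",
                "Accept": "application/json",
            },
        )

client = ApiClient("https://api.example.com/v1", token="secret")
req = client.build_request("datasets/42")
print(req.full_url)                     # https://api.example.com/v1/datasets/42
print(req.get_header("Authorization"))  # Bearer secret
```

Keeping request construction in one method is what makes the resulting SDK feel consistent: new endpoints inherit auth, headers, and URL normalisation for free.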
Karlsruhe, Baden-Württemberg, Germany Hybrid / WFH Options
Cinemo GmbH
.000 € per year

Requirements:
* Minimum 1 to 2 years of proven experience in MLOps, including end-to-end machine learning lifecycle management
* Familiarity with MLOps tools like MLflow, Airflow, Kubeflow, or custom-implemented solutions
* Experience designing and managing CI/CD pipelines for machine learning projects, with experience in CI/CD tools (e.g., GitHub Actions, Bitbucket Pipelines … workflows
* Automate repetitive and manual processes involved in machine learning operations to improve efficiency
* Implement and manage in-cloud MLOps solutions, leveraging Terraform for infrastructure as code

Technologies: Airflow, AWS, Bitbucket, CI/CD, Cloud, Embedded, GitHub, Support, Kubeflow, Machine Learning, Mobile, Python, Terraform, C++, DevOps

More: Cinemo is a global provider of highly innovative infotainment products that …
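One concrete example of the "automate repetitive ML processes in CI/CD" requirement above is a gate script that a pipeline (e.g. a GitHub Actions job) runs before deployment. This is a hedged sketch: the metric name, numbers, and tolerance are invented for illustration, not from any real project.

```python
# Illustrative CI gate for an ML pipeline: block deployment when a
# candidate model's evaluation metric regresses past a tolerance
# relative to the current baseline. A CI job would run this script
# and fail the build on a False result.

def passes_gate(candidate_acc, baseline_acc, max_regression=0.01):
    """Allow regressions up to max_regression; block anything worse."""
    return candidate_acc >= baseline_acc - max_regression

print(passes_gate(0.91, 0.90))  # True: candidate improves on baseline
print(passes_gate(0.85, 0.90))  # False: regression exceeds tolerance
```

Automating this check turns a manual "eyeball the metrics" step into a repeatable pipeline stage, which is the point of the lifecycle-management experience the ad asks for.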