You will be the Product Manager in the Developer Ecosystem team at Databricks in Amsterdam. You will be responsible for enhancing the customer developer experience across the inner and outer development loops, ensuring seamless integration with tools, CI/CD workflows, and automation frameworks. Our customers require enterprise-grade software best practices for their teams, and this team delivers on … You have a deep understanding of developer workflows, cloud infrastructure, CI/CD best practices, and the evolving landscape of data and AI development, and you use it to drive meaningful impact across Databricks' developer community. Specifically, as a Product Manager in the Developer Ecosystem team you will work on: Strategy & Roadmap: Define and execute the vision for the developer ecosystem, prioritizing features and … integrations that improve developer velocity, collaboration, and productivity. Developer Tooling & Experience: Drive improvements in Databricks Asset Bundles (DABs), IDE integrations, CLI and SDK enhancements, CI/CD workflows, automation frameworks, and testing tools to enhance the inner-loop development experience. Integrations & Extensibility: Ensure Databricks integrates seamlessly with third-party developer tools (GitHub, GitLab, Jenkins, Terraform, etc.), enabling efficient DevOps and …
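To make the CI/CD angle of this role concrete, here is an illustrative sketch of the kind of pre-deploy check a pipeline might run against a bundle-style configuration before shipping it to a workspace. The required keys and error messages are assumptions for illustration only, not the actual Databricks Asset Bundle schema.

```python
# Hypothetical pre-deploy validation step for a DAB-style config dict.
# The schema checked here (bundle.name, targets.*.workspace) is an
# invented simplification, not the real DAB specification.

def validate_bundle(config: dict) -> list[str]:
    """Return a list of human-readable problems; an empty list means the config passes."""
    errors = []
    if not config.get("bundle", {}).get("name"):
        errors.append("missing bundle.name")
    targets = config.get("targets", {})
    if not targets:
        errors.append("no deployment targets defined")
    for name, target in targets.items():
        if "workspace" not in target:
            errors.append(f"target '{name}' has no workspace block")
    return errors

# Example: one complete target and one incomplete target.
config = {
    "bundle": {"name": "etl-pipelines"},
    "targets": {
        "dev": {"workspace": {"host": "https://example.cloud.databricks.com"}},
        "prod": {},  # missing workspace -> flagged
    },
}
print(validate_bundle(config))  # -> ["target 'prod' has no workspace block"]
```

A real pipeline would run `databricks bundle validate` instead; the point is that inner-loop tooling surfaces configuration errors before deployment rather than after.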
RDQ426R402 At Databricks, we are passionate about enabling data teams to solve the world's toughest problems - from making the next mode of transportation a reality to accelerating the development of medical breakthroughs. We do this by building and running the world's best data and AI infrastructure platform so our customers can use deep data insights to improve their … of virtual machines. And we're only getting started. As a Product Manager on the Lakeflow Jobs team, you will shape the future of data-aware orchestration on the Databricks Data Intelligence Platform. Jobs is Databricks' built-in orchestrator, managing hundreds of millions of workloads every week with 99.95% reliability. It powers ETL, AI/ML, BI, and streaming workloads … major orchestration tools (Airflow, Dagster, Prefect, etc.) Technical skills: understanding of data pipelines (Spark, dbt, Lakeflow Pipelines) Strong data analysis and operationalization skills (SQL, Python, building operational dashboards) About Databricks Databricks is the data and AI company. More than 10,000 organizations worldwide - including Comcast, Condé Nast, Grammarly, and over 50% of the Fortune 500 - rely on the Databricks Data …
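The core scheduling problem an orchestrator like Jobs (or Airflow, Dagster, Prefect) solves is running tasks only after their upstream dependencies complete. A minimal sketch using Python's standard-library `graphlib`, with invented task names — this is not the Lakeflow Jobs API:

```python
# Dependency-ordered task scheduling: each key maps to the set of tasks
# it depends on. TopologicalSorter yields an order that respects every edge.
from graphlib import TopologicalSorter

dag = {
    "ingest": set(),
    "clean": {"ingest"},
    "features": {"clean"},
    "train": {"features"},
    "report": {"clean"},
}

order = list(TopologicalSorter(dag).static_order())
print(order)  # e.g. ['ingest', 'clean', 'features', 'report', 'train']
```

"Data-aware" orchestration extends this idea: instead of only tracking task completion, the scheduler also tracks whether the data a task produces has actually changed, and triggers downstream work accordingly.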
Familiarity with data security, privacy, and compliance frameworks ● Exposure to machine learning pipelines, MLOps, or AI-driven data products ● Experience with big data platforms and technologies such as EMR, Databricks, Kafka, Spark ● Exposure to AI/ML concepts and collaboration with data science or AI teams ● Experience integrating data solutions with AI/ML platforms or supporting AI-driven analytics …
have experience working with some of the following technologies: Power BI, Power Apps, Blob Storage, Synapse, Azure Data Factory (ADF), IoT Hub, SQL Server, Azure Data Lake Storage, Azure Databricks, Purview, Power Platform, Python …
/C++; 2-3 years of professional experience. Cloud certification (Beginner and/or Associate Developer certificate for Azure) with a focus on Azure Kubernetes, Docker, dashboarding, and Azure Databricks. Role and responsibilities: Engage deeply with business teams to identify opportunities and translate the needs into innovative and practical AI/ML solutions. Design, build, and deploy state-of-the-art AI/ML …
skills; comfortable influencing C-suite clients. Advantageous competencies (but not essential): Exposure to AI/ML, NLP or advanced modelling Exposure to modern data stack tools (e.g. Snowflake, Databricks). Experience managing P&L, setting go-to-market strategy or building consulting practices. Exposure to behavioural data sources (e.g. Google or Adobe Analytics). Benefits At Interpath, our people …
learning and Data Science applications Ability to use a wide variety of open-source technologies Knowledge and experience using at least one Data Platform technology such as Quantexa, Palantir and Databricks Knowledge of test automation frameworks and ability to automate testing within the pipeline To discuss this or wider Technology roles with our recruitment team, all you need to do is …
Leeds, England, United Kingdom Hybrid/Remote Options
KPMG UK
Job Title: Solutions Architect (Databricks Expert) Rate: Competitive Location: Europe Contract Length: 6-12 months A consultancy client of ours has secured a project requiring a Databricks expert. This is an exciting opportunity to work on cutting-edge AI projects and build cloud-based systems that deliver real impact. Databricks Solution Architect Key Responsibilities: Architect and optimise scalable, high-performance … AI and data solutions leveraging Azure Databricks and modern data platform technologies. Serve as a subject matter expert on Databricks architecture, performance tuning, and best practices to enable advanced analytics and machine learning use cases. Partner with data engineering, BI, analytics, and AI/ML teams to design robust, reusable, and production-grade data pipelines and model deployment frameworks. Champion … the adoption of Databricks capabilities including Delta Lake, Unity Catalog, and MLflow, ensuring alignment with enterprise AI strategy. Lead the migration of legacy ETL and data processing workflows to modern, Databricks-native architectures that support AI-driven initiatives. Enforce data quality, governance, lineage, and security standards to maintain a trusted and compliant AI data ecosystem. Mentor and uplift teams, promoting …
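Migrating legacy ETL to Delta Lake typically centres on its `MERGE INTO` (upsert) capability: matched rows are updated, unmatched rows inserted, in a single atomic operation. A plain-Python sketch of those semantics over dictionaries — the table shape and key column are invented for illustration:

```python
# Toy model of Delta-style MERGE semantics: rows are dicts keyed by an id.
# Matched keys are overwritten by the incoming row; new keys are inserted.

def merge_upsert(target: dict, updates: list[dict], key: str = "id") -> dict:
    """Return a new table with each update row applied: update-on-match, insert-on-miss."""
    merged = dict(target)
    for row in updates:
        merged[row[key]] = row
    return merged

target = {1: {"id": 1, "status": "active"}, 2: {"id": 2, "status": "active"}}
updates = [{"id": 2, "status": "churned"}, {"id": 3, "status": "active"}]
merged = merge_upsert(target, updates)
print(sorted(merged))  # -> [1, 2, 3]
```

On Databricks itself this would be a single `MERGE INTO target USING updates ON target.id = updates.id` statement, with Delta handling transactionality and file compaction.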
Key Responsibilities End-to-end development of AI/ML solutions. MLOps practices: CI/CD, model monitoring, retraining. Use of open-source and enterprise tools (LangChain, Azure OpenAI, Databricks). Generative AI features: embeddings, RAG, AI agents. Clean, testable code with modern engineering practices. Align with enterprise architecture and governance. Collaborate with architects and stakeholders. Lifecycle management of models. …
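The RAG feature mentioned above reduces, at its core, to a retrieval step: embed the query, rank documents by cosine similarity, and pass the best matches to the model. A minimal stdlib sketch — the 3-dimensional "embeddings" are hand-made toys, where a real system would use an embedding model (e.g. via Azure OpenAI) with hundreds of dimensions:

```python
# Minimal RAG retrieval step: cosine similarity between a query embedding
# and toy document embeddings, then rank. Vectors are invented for the demo.
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

docs = {
    "invoice policy": [0.9, 0.1, 0.0],
    "holiday schedule": [0.0, 0.2, 0.9],
    "expense claims": [0.8, 0.3, 0.1],
}
query_embedding = [1.0, 0.2, 0.0]  # pretend this came from an embedding model

ranked = sorted(docs, key=lambda d: cosine(query_embedding, docs[d]), reverse=True)
print(ranked[0])  # -> 'invoice policy'
```

Production systems swap the linear scan for a vector index and wrap the whole flow in an orchestration framework such as LangChain, but the ranking logic is the same.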
inside IR35 Location: Remote Clearance: active SC Key Responsibilities: Design, develop, and optimize data pipelines using Microsoft Fabric and Azure services (e.g., Azure Data Factory, Azure Synapse Analytics, Azure Databricks). Build and maintain scalable, high-performance data architectures to support analytics, reporting, and machine learning workloads. Implement data ingestion, ETL/ELT processes, and data warehousing solutions. Collaborate with …
Coventry, West Midlands, United Kingdom Hybrid/Remote Options
Coventry Building Society
development of data products. Desirable experience would be to have: Experience of working in an Agile team, preferably SAFe. Experience in specific tooling: Qlik Replicate/Qlik Compose/Databricks/Informatica/SAS An understanding of data modelling methodology (Kimball, Data Vault, Lakehouse) Understanding of Data Science, AI and Machine Learning ways of working About us We're one of …
Products, Retail, Telecom or Financial Services industries. Applied knowledge of supply chain and associated data, e.g. procurement, manufacturing, logistics Good experience working with data (Python/PySpark/Databricks) in a cloud-based data systems environment (ideally Azure). Experience developing with agile software development methodologies and principles such as DevOps, CI/CD, and unit testing. Comfortable working …