of AI/ML concepts: supervised and unsupervised learning, GenAI, LLMs, embeddings, MLOps, vector search. Experience designing solutions using tools such as Azure ML, AWS SageMaker, Google Vertex AI, Databricks, LangChain, and Hugging Face. Ability to develop architecture artefacts (e.g., HLDs), estimate effort and cost, and contribute to proposal writing. Strong storytelling and presentation skills; able to build confidence with …
The Product Security Team's mission is to left-shift SDLC (Security Development Lifecycle) processes for all code written at Databricks (whether for customer use or for supporting customers internally), reducing the likelihood of introducing new vulnerabilities into production and minimizing the count and impact of externally identified vulnerabilities on Databricks services. You will be an individual contributor on the product … security team at Databricks, managing SDLC functions for features and products within Databricks. This includes, but is not limited to, security design reviews, threat models, manual code reviews, exploit writing and exploit chain creation. You will also support incident response (IR) and vulnerability reward (VRP) programs when there is a vulnerability report or a product security incident. You will work with a global … Work on DAST tools and related automation for auto-assessment and defect filing. Maintain the automation framework and add new features as needed to support the security compliance programs Databricks may pursue - FedRAMP, PCI, HIPAA, etc. Prioritize security from a risk-management perspective rather than an absolute, textbook one. Help develop and implement security processes to improve …
Experience with network-centric datasets (fiber, GPON, ethernet, Wi-Fi telemetry). • Exposure to streaming technologies (Kafka, Event Hubs) and real-time analytics. • Knowledge of Machine Learning Ops (MLflow, Databricks). Deadline: ASAP. Contract Type: Full Time. Location: London. …
quality assurance, test automation, or data validation. Experience in testing data pipelines, ETL/ELT workflows, and big data environments. Familiarity with Azure data platforms, such as Databricks, Azure Data Factory, Synapse Analytics, or ADLS. Proficiency in SQL and scripting languages (e.g., Python, Scala) for data validation and test automation. Experience with test automation frameworks (e.g., Great …
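By way of illustration, a minimal sketch of the kind of automated data validation such a role involves, using pytest and pandas; the table, column names, and rules below are assumptions, not taken from the listing:

```python
# Illustrative pytest-style validation of a pipeline output table.
# Table name, columns, and rules are assumptions for this sketch.
import pandas as pd
import pytest


@pytest.fixture
def orders() -> pd.DataFrame:
    # In a real suite this would read the pipeline's output
    # (e.g. from ADLS or a Databricks table) rather than a literal.
    return pd.DataFrame(
        {
            "order_id": [1, 2, 3],
            "amount": [10.0, 25.5, 7.2],
            "currency": ["GBP", "GBP", "EUR"],
        }
    )


def test_primary_key_is_unique(orders: pd.DataFrame) -> None:
    assert orders["order_id"].is_unique


def test_no_nulls_in_required_columns(orders: pd.DataFrame) -> None:
    assert orders[["order_id", "amount", "currency"]].notna().all().all()


def test_amounts_are_positive(orders: pd.DataFrame) -> None:
    assert (orders["amount"] > 0).all()
```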
need to have: Proven experience delivering AI/ML solutions in production, with a strong focus on large language models (LLMs) and generative AI. Hands-on expertise with Azure Databricks, including data engineering, model development, and orchestration of ML workflows. Practical experience using LLMs and RAG, including prompt design, vector databases, model deployment, and integration into applications. Strong Python development …
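For context, a minimal, library-agnostic sketch of the RAG pattern this listing refers to: embed the question, retrieve the most similar documents, and prompt an LLM with them. The embed and generate callables are placeholders for whichever embedding model and LLM endpoint a given platform provides (e.g. Azure OpenAI or Databricks model serving), so this is a sketch under those assumptions, not a definitive implementation:

```python
# Minimal retrieval-augmented generation (RAG) plumbing, kept library-agnostic.
from typing import Callable, List
import numpy as np


def top_k(query_vec: np.ndarray, doc_vecs: np.ndarray, docs: List[str], k: int = 3) -> List[str]:
    # Cosine similarity between the query vector and each document embedding.
    sims = doc_vecs @ query_vec / (
        np.linalg.norm(doc_vecs, axis=1) * np.linalg.norm(query_vec) + 1e-9
    )
    return [docs[i] for i in np.argsort(sims)[::-1][:k]]


def answer(question: str,
           docs: List[str],
           embed: Callable[[str], np.ndarray],
           generate: Callable[[str], str]) -> str:
    # In practice the document embeddings would be precomputed and held in a vector store.
    doc_vecs = np.stack([embed(d) for d in docs])
    context = "\n".join(top_k(embed(question), doc_vecs, docs))
    prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
    return generate(prompt)
```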
data. Basic understanding of common machine learning techniques and ability to implement them in Python. Exposure to the Azure ecosystem (e.g., Azure Machine Learning, Azure Blob Storage, Azure Synapse, or Databricks on Azure) is desirable. Familiarity with version control and collaborative coding workflows (e.g., Git, Azure DevOps). Experience with Power BI or similar tools to produce interactive dashboards or visualisations. …
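As an aside, implementing a "common machine learning technique" in Python typically looks something like this small scikit-learn example; the dataset and model choice here are illustrative only:

```python
# Train/test split and a simple supervised classifier with scikit-learn.
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

model = LogisticRegression(max_iter=5000)
model.fit(X_train, y_train)

print(f"Held-out accuracy: {accuracy_score(y_test, model.predict(X_test)):.3f}")
```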
Databricks is seeking a strategic and dynamic Senior Partner Marketing professional with experience in high-growth SaaS companies to lead partner marketing initiatives across the EMEA region. You will be responsible for developing, executing, and optimizing complex partner marketing strategies across the EMEA region to drive joint pipeline, revenue, and customer success by collaborating with a diverse ecosystem of partners … and other communication events. Leverage the global marketing team to equip partners with marketing resources (campaigns in a box), enablement tools, and training that empower them to effectively promote Databricks offerings in the market. Ensure consistent messaging and positioning in all partner-led activities. Pipeline Development and ROI: Identify pipeline opportunities and drive regional initiatives through co-marketing efforts. Manage … with the ability to influence cross-functional teams. Education: Bachelor's degree in Marketing, Business, or a related field required; Master's degree or MBA is a plus. About Databricks: Databricks is the data and AI company. More than 10,000 organizations worldwide - including Comcast, Condé Nast, Grammarly, and over 50% of the Fortune 500 - rely on the Databricks Data …
Want to help solve the world's toughest problems with data and AI? This is what we do every day at Databricks. Databricks operates at the leading edge of the Data and AI space. Our customers turn to us to lead the accelerated innovation their businesses need to gain first-mover advantage in today's ultra-competitive landscape. As a … Start-up Account Executive at Databricks, you will focus on acquiring new logos to grow our presence in South Africa. As a successful candidate, you are a self-starter who understands how to generate pipeline and manage full sales cycles. You have experience with consumption-based business models, you know how to sell innovation and value to new prospective customers, identify … customer details including use case, purchase time frames, next steps, and forecasting in Salesforce Identify new use case opportunities and showcase value to prospects Promote the value of the Databricks Data Intelligence Platform Ensure 100% satisfaction among all customers What we look for: Experience penetrating new accounts, executing on first use cases, growing initial consumption and closing deals in a …
Databricks is hiring an experienced IT Support Specialist to help scale and optimise our business processes, working with users globally to improve productivity and provide in-person service at our London office by resolving an array of technical issues. You will be a vital member of the IT Support team and ensure the best possible user experience is provided in … when dealing with users who may be frustrated or stressed due to technical issues. Empathy: Understanding and acknowledging users' concerns and frustrations and showing compassion toward their situation. About Databricks: Databricks is the data and AI company. More than 10,000 organizations worldwide - including Comcast, Condé Nast, Grammarly, and over 50% of the Fortune 500 - rely on the Databricks Data … Intelligence Platform to unify and democratize data, analytics and AI. Databricks is headquartered in San Francisco, with offices around the globe and was founded by the original creators of Lakehouse, Apache Spark, Delta Lake and MLflow. To learn more, follow Databricks on Twitter, LinkedIn and Facebook. Benefits: At Databricks, we strive to provide comprehensive benefits and perks that meet …
Want to help solve the world's toughest problems with data and AI? This is what we do every day at Databricks. Databricks operates at the leading edge of the Data and AI space. Our customers turn to us to lead the accelerated innovation their businesses need to gain first-mover advantage in today's ultra-competitive landscape. We are … looking for a creative, delivery-oriented Named Enterprise Account Executive to maximise the phenomenal market opportunity that exists for Databricks in Qatar. As an Account Executive, you know how to sell innovation and value to existing customers, identify new use cases and grow consumption, and can guide deals forward to compress decision cycles. You love understanding a product in depth … details including use case, purchase time frames, next steps, and forecasting in Salesforce Identify new use case opportunities and showcase value to existing customers Promote the value of the Databricks Data Intelligence Platform Orchestrate and utilise our field engineering teams to ensure valuable outcomes for clients Build and demonstrate value with all engagements to guide successful negotiations to close point …
practices. Collaborate with cross-functional teams to translate business needs into technical solutions. Core Skills: Cloud & Platforms: Azure, AWS, SAP. Data Engineering: ELT, Data Modeling, Integration, Processing. Tech Stack: Databricks (PySpark, Unity Catalog, DLT, Streaming), ADF, SQL, Python, Qlik. DevOps: GitHub Actions, Azure DevOps, CI/CD pipelines. …
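To give a flavour of the Databricks portion of that stack, here is a hedged PySpark ELT sketch: read raw data, apply a transformation, and write a Delta table. The paths, column names, and target table are assumptions for illustration, and in a Databricks workspace the table would typically be governed through Unity Catalog:

```python
# Sketch of a simple ELT step with PySpark and Delta; all names are illustrative.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # provided automatically in a Databricks notebook

raw = spark.read.json("/mnt/raw/orders/")   # hypothetical landing path

cleaned = (
    raw.filter(F.col("amount").isNotNull())
       .withColumn("order_date", F.to_date("order_timestamp"))
       .withColumn("amount_gbp", F.col("amount") * F.col("fx_rate_to_gbp"))
)

(cleaned.write
        .format("delta")
        .mode("overwrite")
        .partitionBy("order_date")
        .saveAsTable("analytics.orders_cleaned"))  # hypothetical target table
```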
or Azure DevOps. Excellent communication skills and stakeholder engagement capabilities. Nice to Have: Familiarity with data visualization tools (e.g., Power BI, Tableau). Exposure to cloud platforms (e.g., Databricks, Snowflake). Understanding of data governance, lineage, and metadata management. Experience with data cataloguing and quality frameworks. …
supporting automated workflows in Alteryx Designer. Experience deploying workflows to the Production Gallery. Knowledge of database fundamentals, data design, SQL, and data warehouse concepts is advantageous. Exposure to Power BI, Databricks, Microsoft Azure, and Profisee is a plus. Knowledge of JSON, Python, XML, and R is beneficial. Experience with non-relational databases and unstructured data is advantageous. Familiarity with Azure DevOps …
in: Data modeling and database design (SQL & NoSQL) Cloud-based architecture on Azure (required), and familiarity with AWS Microsoft Fabric and Logic Apps Azure-based machine learning workflows and Databricks on Azure Designing and optimizing data lakes, warehouses, and pipelines Experience implementing data governance, security standards, and compliance practices. Strong understanding of metadata management, data lineage, and data quality frameworks. …
R, or Spark for data insights Databricks/DataIQ SQL for data access and processing (PostgreSQL preferred, but general SQL knowledge is important) Latest Data Science platforms (e.g., Databricks, Dataiku, AzureML, SageMaker) and frameworks (e.g., TensorFlow, MXNet, scikit-learn) Software engineering practices (coding standards, unit testing, version control, code review) Hadoop distributions (Cloudera, Hortonworks), NoSQL databases (Neo4j, Elastic), streaming …
North West London, London, United Kingdom Hybrid / WFH Options
Anson Mccade
Key Experience Required: Proven experience as a Data Solution Architect on complex, multi-disciplinary consulting engagements Deep knowledge of Kafka, Confluent, and event-driven architecture Hands-on experience with Databricks, Unity Catalog, and Lakehouse architectures Strong architectural understanding across AWS, Azure, GCP, and Snowflake Familiarity with Apache Spark, SQL/NoSQL databases, and programming (Python, R, Java) Knowledge of data …
firm Modeling data for a civil service department replacing a legacy HR system. Experience and qualifications - Technical: 3+ years' experience in data or software engineering Knowledge of Python, SQL, Databricks, Snowflake, and major cloud platforms (AWS/Azure/GCP) Ability to learn quickly and adapt to new technologies and sectors Understanding of data engineering best practices and system design …
will specialise in Azure Integration Services, Data Engineering in Fabric, and DevOps. You will work directly with leading Azure tools including Azure Data Factory, Logic Apps, Fabric, Power BI, Databricks, Azure DevOps, OpenAI, and more. Your role will involve creating robust, scalable cloud-native solutions, automating deployments, and ensuring seamless integrations. Our culture is collaborative, fast-paced, and centered around …
Experience with machine learning frameworks and libraries such as TensorFlow, PyTorch, Scikit-Learn, etc. Experience developing Python APIs using tools such as FastAPI. Knowledge of database technologies (SQL, MongoDB, Databricks) and data pipeline tools. Familiar with ML CI/CD pipelines for development, testing, versioning, and task automation. Familiar with cloud platforms (AWS, GCP, Azure) and containerization (Docker, Kubernetes). …
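As a rough illustration of a Python API for model serving built with FastAPI, the following sketch uses a placeholder prediction function; the route, schema, and feature names are assumptions rather than anything specified by the listing:

```python
# Minimal FastAPI service exposing a prediction endpoint; model logic is a stand-in.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()


class Features(BaseModel):
    age: float
    income: float


def predict(features: Features) -> float:
    # Placeholder for a real model loaded at startup (e.g. from MLflow or a pickle file).
    return 0.5 * features.age + 0.001 * features.income


@app.post("/predict")
def predict_endpoint(features: Features) -> dict:
    return {"score": predict(features)}

# Run locally with: uvicorn app:app --reload   (assuming this file is saved as app.py)
```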
contribute to internal projects alongside our central data team. Experience & skills you may have: Strong working knowledge of SQL, Python, Spark, and experience with cloud infrastructure (ideally AWS and Databricks) Hands-on experience with building data pipelines in Spark within live production environments Coding transformation processes using tools such as dbt Implementing data quality checks and validation Applying CI/…
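For illustration, data quality checks inside a Spark pipeline can be as simple as the gate sketched below; the rule set and column names are assumptions, and in practice such rules often live in dbt tests or a dedicated validation framework rather than hand-rolled code:

```python
# Illustrative in-pipeline data quality gate for a Spark DataFrame.
from typing import List

from pyspark.sql import DataFrame, functions as F


def validate(df: DataFrame, key: str, required: List[str]) -> None:
    total = df.count()
    if total == 0:
        raise ValueError("Data quality check failed: dataframe is empty")

    if df.select(key).distinct().count() != total:
        raise ValueError(f"Data quality check failed: duplicate values in key column '{key}'")

    for col in required:
        nulls = df.filter(F.col(col).isNull()).count()
        if nulls:
            raise ValueError(f"Data quality check failed: {nulls} null(s) in '{col}'")


# Example (column names hypothetical):
# validate(cleaned_df, key="order_id", required=["order_id", "amount", "order_date"])
```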
languages such as Python or R, with extensive experience with LLMs, ML algorithms, and models. Experience with cloud services like Azure ML Studio, Azure Functions, Azure Pipelines, MLflow, Azure Databricks, etc., is a plus. Experience working in Azure/Microsoft environments is considered a real plus. Proven understanding of data science methods for analyzing and making sense of research data …
SKILLS) Bachelor's or Master's degree in Computer Science, Engineering, or relevant hands-on experience in data engineering Strong hands-on knowledge of data platforms and tools, including Databricks, Spark, and SQL Experience designing and implementing data pipelines and ETL processes Good knowledge of MLOps principles and best practices to deploy, monitor and maintain machine learning models in …
e.g., Pandas, NumPy) and deep expertise in SQL for building robust data extraction, transformation, and analysis pipelines. Hands-on experience with big data processing frameworks such as Apache Spark, Databricks, or Snowflake, with a focus on scalability and performance optimization Familiarity with graph databases (e.g., Neo4j, Memgraph) or search platforms (e.g., Elasticsearch, OpenSearch) to support complex data relationships and querying …
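As a small worked example of combining SQL extraction with a pandas transformation of the kind described above, the sketch below uses an in-memory SQLite table as a stand-in for a real warehouse source; the table and column names are purely illustrative:

```python
# Extract with SQL, then transform/aggregate with pandas.
import sqlite3

import pandas as pd

conn = sqlite3.connect(":memory:")
conn.executescript(
    """
    CREATE TABLE events (user_id INTEGER, event_type TEXT, value REAL);
    INSERT INTO events VALUES (1, 'click', 0.0), (1, 'purchase', 12.5), (2, 'purchase', 7.0);
    """
)

df = pd.read_sql("SELECT user_id, event_type, value FROM events", conn)

summary = (
    df[df["event_type"] == "purchase"]
    .groupby("user_id", as_index=False)["value"]
    .sum()
    .rename(columns={"value": "total_spend"})
)
print(summary)
```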