at least $6M in new bookings. Experience co-owning and driving proposals and SOWs of $1M+ (help from Ness is a bonus). Experience selling alliance partnership services (Salesforce, Databricks, AWS, Confluent). Willing to travel to France monthly and as needed. Willing to commit at least 3 years to the role, with a view to greater responsibilities at Ness. More ❯
business needs our type of product, you'll work with a variety of new clients and industries as Zip scales. Current clients include OpenAI, Coinbase, Snowflake, Notion, Canva, Samsara, Databricks and many more! Your Role We're looking for a Senior Solutions Consultant - Managed Services with experience in NetSuite, Sage, Coupa or QuickBooks implementation services to lead post-implementation services More ❯
business needs our type of product, you'll work with a variety of new clients and industries as Zip scales. Current clients include OpenAI, Coinbase, Snowflake, Notion, Canva, Samsara, Databricks, etc. You Will Lead onboarding for new customers, with a heavy emphasis on understanding requirements and creatively configuring the product to solve their problems Responsible for leading the end-to More ❯
Southeast Asia, Australia, and New Zealand-including many of the world's largest Fortune 1000 and Global 2000 companies. With strong global momentum, a growing partner ecosystem including SentinelOne, Databricks, and Google Cloud, and a major fundraise on the horizon, we're scaling quickly toward long-term growth and IPO readiness. Join us as we define the future of SaaS More ❯
to make the biggest commercial impact in our business. Technical Acumen & Innovation: Demonstrate deep technical skill and hands-on experience with modern cloud-based data platforms and databases (e.g., Databricks, BigQuery, Snowflake). You'll be expected to understand and deploy their best features to achieve optimal outcomes. AI Enablement: Play a crucial role in enabling AI and machine learning More ❯
focused on designing modern, scalable, and secure data platforms for enterprise clients. You'll play a key role in shaping data architecture across the full Azure stack- including Azure Databricks and Azure Data Factory (ADF) -and will guide engineering teams in delivering robust, future-proof solutions using lakehouse and medallion architecture principles . Key Responsibilities Design end-to-end data … architectures using Azure services, including Azure Databricks, ADF, Synapse Analytics , and Data Lake Storage Define scalable data models and implement architectural patterns such as lakehouse and medallion Lead technical solution design during client engagements, from discovery to delivery Establish and enforce data governance, modelling, and lifecycle standards Support engineering and DevOps teams with guidance on best practices, CI/CD … and infrastructure-as-code Requirements 7+ years in data architecture or senior engineering roles Strong hands-on experience with Azure Databricks and Azure Data Factory Proficient in SQL, Python , and Spark Expertise in data modelling and architectural patterns for analytics (e.g., lakehouse, medallion, dimensional modelling) Solid understanding of cloud security, private networking, GDPR, and PII compliance Excellent communication skills with More ❯
London, South East, England, United Kingdom Hybrid / WFH Options
Randstad Technologies
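The lakehouse and medallion architecture named in the role above layers data as bronze (raw as ingested), silver (cleaned and conformed) and gold (aggregated for analytics). In Azure Databricks each layer would be a Delta table transformed with PySpark; the sketch below illustrates only the layering logic, with plain Python structures standing in for DataFrames and all record fields hypothetical.

```python
# Minimal sketch of medallion (bronze/silver/gold) layering.
# In Databricks each layer would be a Delta table written with PySpark;
# plain dicts stand in for DataFrames here. Field names are illustrative.

def to_silver(bronze_rows):
    """Clean raw bronze records: drop rows missing keys, normalise types."""
    silver = []
    for row in bronze_rows:
        if row.get("order_id") is None or row.get("amount") is None:
            continue  # a real pipeline would quarantine bad records instead
        silver.append({
            "order_id": str(row["order_id"]),
            "region": (row.get("region") or "unknown").lower(),
            "amount": float(row["amount"]),
        })
    return silver

def to_gold(silver_rows):
    """Aggregate cleaned records into an analytics-ready summary by region."""
    totals = {}
    for row in silver_rows:
        totals[row["region"]] = totals.get(row["region"], 0.0) + row["amount"]
    return totals

bronze = [
    {"order_id": 1, "region": "EMEA", "amount": "120.5"},
    {"order_id": 2, "region": None, "amount": "80"},
    {"order_id": None, "region": "APAC", "amount": "55"},  # dropped in silver
]
gold = to_gold(to_silver(bronze))
print(gold)  # {'emea': 120.5, 'unknown': 80.0}
```

The design point of the pattern is that each layer is independently queryable and replayable: gold datasets can be rebuilt from silver without re-ingesting raw sources.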
Summary: Join a team building a modern Azure-based data platform. This hands-on engineering role involves designing and developing scalable, automated data pipelines using tools like Data Factory, Databricks, Synapse, and Logic Apps. You'll work across the full data lifecycle - from ingestion to transformation and delivery - enabling smarter, faster insights. Key Responsibilities: * Develop and maintain data pipelines using … Collaborate with cross-functional teams in an agile environment. Collaboration With: * Data Engineers, Architects, Product Owners, Test Analysts, and BI Teams. Skills & Experience: * Proficiency in Azure tools (Data Factory, Databricks, Synapse, etc.). * Strong SQL and experience with data warehousing (Kimball methodology). * Programming skills in Python, Scala, or PySpark. * Familiarity with Power BI, SharePoint, and data integration technologies. * Understanding More ❯
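The Kimball data-warehousing methodology this role asks for models data as dimension tables (keyed by surrogate keys) joined to fact tables. A core load step is resolving each incoming business key to its surrogate key before writing facts; the sketch below shows that step in plain Python, with all table and column names illustrative rather than taken from the role.

```python
# Sketch of a Kimball-style load step: resolve business keys to surrogate
# keys in a dimension, then build fact rows that reference them.
# Table/column names are illustrative.

def upsert_dimension(dim, business_key):
    """Return the surrogate key for a business key, inserting it if new."""
    if business_key not in dim:
        dim[business_key] = len(dim) + 1  # next surrogate key
    return dim[business_key]

def load_facts(staged_rows, customer_dim):
    """Build fact rows, replacing business keys with surrogate keys."""
    facts = []
    for row in staged_rows:
        facts.append({
            "customer_sk": upsert_dimension(customer_dim, row["customer_id"]),
            "amount": row["amount"],
        })
    return facts

customer_dim = {}
staged = [
    {"customer_id": "C-100", "amount": 10.0},
    {"customer_id": "C-200", "amount": 25.0},
    {"customer_id": "C-100", "amount": 5.0},  # same customer, same surrogate key
]
facts = load_facts(staged, customer_dim)
print([f["customer_sk"] for f in facts])  # [1, 2, 1]
```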
in complex, enterprise-scale environments Proven ability to deploy data products through development stages into production, with monitoring and observability tools Deep technical expertise in PySpark, SQL, Java, Spark, Databricks, dbt, AWS, and Azure Familiarity with European jurisdictions and global reporting requirements Experience with orchestration and CI/CD tools such as Airflow, Databricks Workflows, and Azure DevOps Strong problem More ❯
Sheffield, South Yorkshire, Yorkshire, United Kingdom Hybrid / WFH Options
Fdo Consulting Limited
Databricks Data Engineer (roles available at Lead, Senior and Mid Level), £ 50000 - 85000 + benefits. SQL, ETL, Data Warehousing, Databricks etc. Home Based with one day a month at the office in Nottingham. Strong commercial knowledge of Databricks is required for this role. Expanding SaaS product company are looking for a number of Data Engineers as they continue to grow. … Data Engineers and BAs to understand the data needs of the business. Skills Required Include - Previous experience as a Data Engineer in a delivery focused environment. Excellent knowledge of Databricks Experience analysing complex business problems and designing workable technical solutions. Excellent knowledge of the SDLC, including testing and delivery in an agile environment. Excellent knowledge of SQL and ETL Experience … expand its data team. In these roles you will use your technical and people skills to help the data team develop further. Strong, hands-on Databricks skills are mandatory for this role. This role is home based with one day a month at their office in Nottingham. Salary is in the range £ 50000 - 85000 + benefits More ❯
West London, London, England, United Kingdom Hybrid / WFH Options
Young's Employment Services Ltd
ongoing development and enhancement of the Data Platform. Design, implement, and oversee scalable data pipelines and ETL/ELT processes within MS Fabric, leveraging expertise in Azure Data Factory, Databricks, and other Azure services. Advocate for engineering best practices and ensure long-term sustainability of systems. Integrate principles of data quality, observability, and governance throughout all processes. Participate in recruiting … Engineer, Tech Lead, Data Engineering Manager etc. Proven success with modern data infrastructure: distributed systems, batch and streaming pipelines Hands-on knowledge of tools such as Apache Spark, Kafka, Databricks, DBT or similar Experience building, defining, and owning data models, data lakes, and data warehouses Programming proficiency in Python, Pyspark, Scala or Java. Experience operating in a cloud-native environment More ❯
Python, dbt, and modern data warehousing platforms to turn raw inputs into high-quality, trusted outputs. Own the data layer: Manage and optimise our cloud-based data warehouse (Redshift, Databricks), ensuring our data infrastructure is scalable, cost-effective, and resilient. Drive decision-making with clean data : Develop gold-standard datasets and dashboards using Amazon Quicksight to support reporting and analytics … years of experience in data engineering, analytics engineering, or a similar technical data role Advanced skills in SQL and Python Proven experience with cloud data warehousing platforms like Redshift, Databricks, BigQuery, or Snowflake Experience building and maintaining ELT pipelines using dbt in a production environment Strong understanding of data modelling principles (e.g., bronze/silver/gold layer design) Hands More ❯
decision making for Cox Automotive. You'll collaborate with a talented team, using open-source tools such as R, Python, and Spark, data visualisation tools like Power BI, and the Databricks data platform. Key Responsibilities: Develop and implement analytics strategies that provide actionable insights for our business and clients. Apply the scientific method to create robust, reproducible solutions. Collaborate with stakeholders … seamlessly with team members and external clients. Proficiency in R or Python. Solid understanding of SQL; experience working with Spark (Java, Python, or Scala variants) and cloud platforms like Databricks is a plus. Strong statistical knowledge, including hypothesis testing, confidence intervals, and A/B testing. Ability to understand and communicate the commercial impact of data activities. Why Join Us More ❯
Data Engineer (Databricks) - Leeds Our client is a global innovator and world leader with a highly recognizable name within technology. They are looking for Data Engineers with significant Databricks experience to join an exceptional Agile engineering team. We are seeking a Data Engineer with strong Python, PySpark, and SQL experience, a clear understanding of Databricks, and a passion for Data More ❯
We're seeking an experienced Databricks Engineer to help design and deliver strategic data solutions across enterprise platforms. This role will support both proof-of-concept work and the development of a Strategic Operational Data Store (ODS), with a focus on performance, scalability, and governance. This role is 3 days a week onsite in either Newcastle, London or Edinburgh and … falls outside IR35. Key Responsibilities Lead Databricks platform evaluation, including performance benchmarking and stress testing Collaborate with Databricks Professional Services for architectural assurance Design and document scalable data models and integration patterns Support enterprise integration, including Salesforce data and unified client ID strategies Migrate Azure Data Factory flows to Databricks for improved traceability Implement Unity Catalog for automated data lineage … Deliver backlog items through Agile sprint planning Skills & Experience Strong hands-on experience with Databricks, Fabric, Apache Spark, Delta Lake Proficient in Python, SQL, and PySpark Familiar with Azure Data Factory, Event Hub, Unity Catalog Solid understanding of data governance and enterprise architecture Effective communicator with experience engaging stakeholders Desirable Experience with Salesforce data integration Prior involvement in platform PoCs More ❯
Who we are in a nutshell. At BES Group, we pride ourselves on being the leading end-to-end risk management solutions provider in the UK and Ireland. That means it’s our job to help keep our customers’ assets More ❯
Birmingham, West Midlands, United Kingdom Hybrid / WFH Options
MYO Talent
Data Engineer/Data Engineering/Lakehouse/Delta Lake/Data Warehousing/ETL/Azure/Azure Databricks/Python/SQL/Based in the West Midlands/Solihull/Birmingham area (1 day per week), Permanent role, £50,000 - 70,000 + car/allowance + bonus. One of our leading clients is looking to recruit … Permanent role Salary £50,000 - 70,000 + car/allowance + bonus Experience: Experience in a Data Engineer/Data Engineering role Large and complex datasets Azure, Azure Databricks Microsoft SQL Server Lakehouse, Delta Lake Data Warehousing ETL CDC Stream Processing Database Design ML Python/PySpark Azure Blob Storage Parquet Azure Data Factory Desirable: Any exposure working in More ❯
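The CDC (change data capture) and stream-processing experience listed above boils down to replaying a feed of insert/update/delete events onto a keyed target table. On Databricks this is typically done with a Delta MERGE; the sketch below shows only the apply logic, with a dict keyed by primary key standing in for the target table and the event shape illustrative.

```python
# Sketch of change-data-capture (CDC) apply logic: replay a change feed
# of insert/update/delete events onto a keyed target table. In Databricks
# this would be a Delta MERGE; a dict keyed by primary key stands in here.
# The event shape is illustrative.

def apply_cdc(target, events):
    """Apply CDC events in order; last writer wins for each key."""
    for ev in events:
        key = ev["id"]
        if ev["op"] in ("insert", "update"):
            target[key] = ev["data"]
        elif ev["op"] == "delete":
            target.pop(key, None)
    return target

table = {}
feed = [
    {"op": "insert", "id": 1, "data": {"status": "new"}},
    {"op": "update", "id": 1, "data": {"status": "active"}},
    {"op": "insert", "id": 2, "data": {"status": "new"}},
    {"op": "delete", "id": 2, "data": None},
]
apply_cdc(table, feed)
print(table)  # {1: {'status': 'active'}}
```

Event ordering matters here: applying the same feed out of order would leave a different final table, which is why production CDC pipelines key on sequence numbers or log offsets.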
Engineer to join on a contract basis to support major digital transformation projects with Tier 1 banks. You'll help design and build scalable, cloud-based data solutions using Databricks , Python , Spark , and Kafka -working on both greenfield initiatives and enhancing high-traffic financial applications. Key Skills & Experience: Strong hands-on experience with Databricks , Delta Lake , Spark Structured Streaming , and More ❯
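The Spark Structured Streaming work described above typically reads a Kafka feed and aggregates events over tumbling time windows before writing to Delta Lake. The sketch below simulates just the tumbling-window bucketing in plain Python; the 60-second window size and event timestamps are illustrative, not taken from the role.

```python
# Sketch of the tumbling-window aggregation Spark Structured Streaming
# performs over a Kafka feed: bucket events into fixed, non-overlapping
# windows and count per window. Window size and timestamps are illustrative.

WINDOW_SECONDS = 60

def window_counts(events):
    """Count events per tumbling window, keyed by window start time."""
    counts = {}
    for ev in events:
        # Floor the timestamp to the start of its window.
        window_start = (ev["ts"] // WINDOW_SECONDS) * WINDOW_SECONDS
        counts[window_start] = counts.get(window_start, 0) + 1
    return counts

events = [{"ts": 5}, {"ts": 59}, {"ts": 61}, {"ts": 130}]
print(window_counts(events))  # {0: 2, 60: 1, 120: 1}
```

In real streaming jobs the same bucketing is expressed declaratively (e.g. grouping by a window over the event-time column), with watermarks deciding when a window is complete and can be emitted.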
Job Title: Databricks Architect Location : Nottingham Contract : Perm Recruiter: Noaman Hussain About the role Boots and the No7 Beauty company recognise the value of data as a strategic asset and have invested in creating an independent Data Office. Our Data Office supports the business with data platforms, management, and usage to drive commercial value and achieve strategic … science and analytics community within Boots. We're looking for someone to take the next step in their career and join us as a Platform Architect focusing on our Databricks technology offerings. Platform Architecture is a broad discipline so we are looking for candidates from a range of backgrounds from technical to architectural. As Databricks Architect you will: Ensure our … Databricks offering supports projects and teams in delivering value from data. Lead the definition, design and implementation of new capabilities within Databricks Work with other Solution Architects in the team to ensure new requirements are cost-effective, deliverable and supportable. Work with technology and engineering partners and with internal teams to deliver platform capabilities, allowing us to maximise the value More ❯
Data Engineer - AWS, Databricks & Pyspark Contract Role - Data Engineer Location: Hybrid (1 day per month onsite in Harrow, London) Rate: £350 per day (Outside IR35) Duration: 6 months A client of mine is looking for a Data Engineer to help maintain and enhance their existing cloud-based data platform. The core migration to a Databricks Delta Lakehouse on AWS has … Key Responsibilities: - Maintain and optimise existing ETL pipelines to support reporting and analytics - Assist with improvements to performance, scalability, and cost-efficiency across the platform - Work within the existing Databricks environment to develop new data solutions as required - Collaborate with analysts, data scientists, and business stakeholders to deliver clean, usable datasets - Contribute to good data governance, CI/CD workflows … and engineering standards - Continue developing your skills in PySpark, Databricks, and AWS-based tools Tech Stack Includes: - Databricks (Delta Lake, PySpark) - AWS - CI/CD tooling (Git, DevOps pipelines) - Cloud-based data warehousing and analytics tools If you're a mid to senior level Data Engineer, feel free to apply or send your CV. Data Engineer - AWS, Databricks & Pyspark More ❯
/ETL ingestion pipelines to handle data movement and transformation from structured and unstructured data sources. Experience with the Azure cloud platform including: Data Ingestion: Azure Data Factory (ADF), Databricks, Logic Apps, Azure Functions Databricks platform - including managing, developing, and deploying workflows, jobs, and notebooks. Proven experience in modelling data in a data warehouse using Inmon or Kimball approaches. Experience More ❯