City of London, London, United Kingdom Hybrid/Remote Options
Az-Tec Talent
and optimise ETL/ELT solutions using best practices in data modelling and architecture. Collaborate with client teams to understand requirements and design tailored data solutions. Work across Snowflake, Databricks, AWS, Azure, or similar environments. Support internal projects focused on capability development, data tooling, and process improvement. Engage in stakeholder discussions to influence technical direction and project outcomes. What You … years of data engineering experience, including at least 2 years hands-on with modern data platforms. Strong proficiency in SQL, data modelling, and query optimisation. Practical experience with Snowflake, Databricks, AWS Redshift, or Microsoft Fabric. Solid understanding of ETL/ELT pipelines and data warehousing principles. Strong communication and problem-solving skills. Ability to work both independently and … Desirable: Consulting experience or client-facing delivery background. Familiarity with tools such as dbt, Fivetran, Matillion, or similar. Programming skills in Python, Java, or Scala. Cloud certifications (SnowPro, Databricks Certified, AWS/Azure/GCP). Knowledge of DataOps, CI/CD, and infrastructure-as-code concepts. What’s on Offer Hybrid working model (2–3 days per week More ❯
Luton, England, United Kingdom Hybrid/Remote Options
easyJet
more than 90 million passengers this year, we employ over 10,000 people. It’s big-scale stuff and we’re still growing. Job Purpose With a big investment into Databricks and a large amount of interesting data, this is the chance for you to come and be part of an exciting transformation in the way we store, analyse and use … engineering practices (e.g. TDD, CI/CD). • Experience with Apache Spark or any other distributed data programming frameworks. • Comfortable writing efficient SQL and debugging on cloud warehouses like Databricks SQL or Snowflake. • Experience with cloud infrastructure like AWS or Azure. • Experience with Linux and containerisation (e.g. Docker, shell scripting). • Understanding of Data modelling and Data Cataloguing principles. • Understanding of … end monitoring, quality checks, lineage tracking and automated alerts to ensure reliable and trustworthy data across the platform. • Experience of building a data transformation framework with dbt. • Familiarity with Databricks as a data and AI platform or the Lakehouse Architecture. What you’ll get in return • Competitive base salary • Up to 20% bonus • 25 days holiday • BAYE, SAYE & Performance share More ❯
Luton, England, United Kingdom Hybrid/Remote Options
easyJet
more than 90 million passengers this year, we employ over 10,000 people. It’s big-scale stuff and we’re still growing. Job Purpose With a big investment into Databricks, and with a large amount of interesting data, this is the chance for you to come and be part of an exciting transformation in the way we store, analyse and … solutions. Job Accountabilities Develop robust, scalable data pipelines to serve the easyJet analyst and data science community. Highly competent hands-on experience with relevant Data Engineering technologies, such as Databricks, Spark, Spark API, Python, SQL Server, Scala. Work with data scientists, machine learning engineers and DevOps engineers to develop and deploy machine learning models and algorithms aimed at addressing … workflow and knowledge of when and how to use dedicated hardware. Significant experience with Apache Spark or any other distributed data programming frameworks (e.g. Flink, Hadoop, Beam) Familiarity with Databricks as a data and AI platform or the Lakehouse Architecture. Experience with data quality and/or data lineage frameworks like Great Expectations, dbt data quality, OpenLineage or Marquez More ❯
City of London, London, United Kingdom Hybrid/Remote Options
Datatech Analytics
what’s possible with data and AI. What You’ll Do Design & build high-performance ETL/ELT pipelines in modern cloud environments (including Azure, AWS, GCP, Snowflake or Databricks). Lead CI/CD automation, environment versioning, and production deployments for data products. Integrate AI and ML outputs into scalable, automated data workflows. Implement monitoring, alerting, and data quality … tools and cloud platforms to turn data into something powerful. You’ll bring: 3+ years’ experience in data engineering or cloud platform development (including Azure, AWS, GCP, Snowflake or Databricks) Strong proficiency in SQL and Python. Proven experience with CI/CD tools, DevOps, and automation practices. Experience of taking a ML Model to production. Solid understanding of data modelling More ❯
and explore opportunities for use within the organization Knowledge and Requirements: Five to seven years of experience in Data Warehousing, Data Integration or Data Engineering projects Experience in Azure Databricks, Informatica PowerCenter, IICS, Cognos, Netezza Performance Server Experience working within the Azure ecosystem Experienced in any of these analytical platforms - Power BI, Azure ML, Databricks or Synapse Proficient in SQL Experience using Python More ❯
to design and build the data solutions needed to measure, report and analyse the performance of the business and its operations. Our focus lies in technologies such as Azure, Databricks, Power BI, and ADF. We actively explore and adopt new technologies to innovate and enhance our processes. We are looking for people who want to learn new skills, get hands … with the latest technologies and become very knowledgeable of BIS systems. Successful candidates will join a great team with a real mix of experience, knowledge and skills. Essential Skills Databricks or similar and Azure Data Factory Languages: SQL, Python Data Platform: Azure data lake and SQL Server RDBMS ETL: SQL Server Integration Services and ADF Experienced in effectively working in More ❯
Contract type: Permanent The team and the opportunity Are you ready to shape how organisations use data to make smarter decisions? At EY Ireland, we're looking for a Databricks expert to help design and deliver modern data platforms that drive real business impact. This is your chance to work with cutting-edge technologies, influence strategic decisions, and collaborate with … passionate about data architecture and want to make a difference, we'd love to hear from you. Your key responsibilities Designing scalable, secure, and high-performing data architectures using Databricks and cloud-native tools Leading the development of data models, pipelines, and integration frameworks that support analytics and AI Collaborating with clients and internal teams to understand business needs and … problem-solving Ideally, you will also have Familiarity with data security, privacy, and compliance frameworks Knowledge of data mesh, data fabric, or other emerging architecture patterns Professional certification in Databricks What We Look For: We are looking for candidates who are: Innovative and agile, with a purpose-driven mindset. Inclusive and able to work effectively in diverse teams. Passionate about More ❯
Role: Senior Databricks AI Platform SRE. Bill Rate: $87/hour C2C. Location: Alpharetta, GA. Duration: 12+ months/long-term. Interview Criteria: Telephonic + Zoom. Direct Client Requirement. Job Description: We are looking for a Senior Databricks AI Platform SRE to join our Platform SRE team. This role will be critical in designing, building, and optimizing a scalable, secure, and developer-friendly Databricks platform … cloud architects to automate infrastructure, enforce best practices, and streamline the end-to-end ML lifecycle using modern cloud-native technologies. Responsibilities: Design and implement secure, scalable, and automated Databricks environments to support AI/ML workloads. Develop infrastructure-as-code (IaC) solutions using Terraform for provisioning Databricks, cloud resources, and network configurations. Build automation and self-service capabilities using … workspace provisioning, orchestration and monitoring. Collaborate with data science and ML teams to define compute requirements, governance policies, and efficient workflows across dev/qa/prod environments. Integrate Databricks offering with cloud-native services on Azure/AWS. Champion CI/CD and GitOps for managing ML infrastructure and configurations. Ensure compliance with enterprise security and data governance policies More ❯
Edinburgh, Roxburgh's Court, City of Edinburgh, United Kingdom
Bright Purple
Machine Learning Engineer (Databricks) - £60 - £70k Edinburgh Hybrid - 2 days onsite I’m on the lookout for an MLOps Engineer who can truly bridge the gap between Data Engineering and Data Science. This role is all about leveraging Databricks and Python to design, build, and scale data models that drive genuine business impact. You’ll be joining a scaling B2B … a hands-on approach to data science, analytics, and ML solutions. • Continuously optimise data workflows for performance, reliability, and scalability. What you’ll need: • Proven hands-on experience with Databricks, Python, PySpark, and SQL. • Machine learning experience in a cloud environment (AWS, Azure, or GCP). • Strong understanding of ML libraries such as scikit-learn, TensorFlow, or MLflow. • Solid background More ❯
Requirements 10+ years in data architecture/engineering, 3+ in a lead role Expertise in Snowflake, Databricks, MDM, CDPs, BI tools (e.g., Looker) Strong in data modeling, streaming, AI/ML pipelines, and governance Key Responsibilities Design scalable, secure data platforms (SQL/NoSQL, lakehouse, data mesh) Build real-time pipelines, semantic layers, and feature stores Implement governance, privacy, and More ❯
APIs. Proficiency with Kafka and distributed streaming systems. Solid understanding of SQL and data modeling. Experience with containerization (Docker) and orchestration (Kubernetes). Working knowledge of Flink, Spark, or Databricks for data processing. Familiarity with AWS services (ECS, EKS, S3, Lambda, etc.). Basic scripting in Python for automation or data manipulation. More ❯
EC4N 6JD, Vintry, United Kingdom Hybrid/Remote Options
Syntax Consultancy Ltd
Databricks Engineer (SC Cleared) London (Hybrid) 6 Month Contract £(Apply online only)/day (Inside IR35) Databricks Engineer needed with active SC Security Clearance for 6 Month Contract based in Central London (Hybrid). Developing a cutting-edge Azure Databricks platform for economic data modelling, analysis, and forecasting. Start ASAP in Nov/Dec 2025. Hybrid Working - 2 days/… Central London office. A chance to work with a leading global IT and Digital transformation business specialising in Government projects: In-depth Data Engineering + strong hands-on Azure Databricks expertise. Azure Data Services, Azure Data Factory, Azure Blob Storage + Azure SQL Database. Designing, developing, building + optimising data pipelines, implementing data transformations, ensuring data quality and reliability. Deep More ❯
API's fix and enhancement for projects and will involve some stakeholder management. Key skills: · Machine Learning experience · Microservices architecture · Realtime Services/API · Python · Docker · Kubernetes · Jenkins · MLFlow · Databricks Ideally, you’ll have experience within the payments industry, but we’re looking for the skill set over the industry, so experience and understanding of data pipelines and large scale More ❯
platform from the ground up. - Hands-on coding appetite - this isn't a pure management role - Team leadership experience - comfortable managing and mentoring engineers - Advanced skills: SQL, Python, PySpark, Databricks, Azure stack - Enjoys building from scratch - thrives in low-maturity, high-potential environments. If this role sounds appealing to you apply now to learn more about the client DVF are More ❯
and data warehousing solutions using Python and libraries such as Pandas, NumPy, PySpark, etc. You will also have a number of years' hands-on experience with cloud services, especially Databricks, for building and managing scalable data pipelines. ETL process expertise is essential. Proficiency in working with Snowflake or similar cloud-based data warehousing solutions is also essential. Experience in data More ❯
experience with test automation frameworks such as Playwright, SpecFlow, or similar Deep understanding of Agile methodologies and Test-Driven Development (TDD) Strong data testing expertise using SQL, Snowflake, or Databricks Proven ability to write, execute, and maintain complex automated test suites Solid background in the Asset Management domain Experience with Aladdin or similar enterprise asset management platforms. Location: Hybrid More ❯
West End, England, United Kingdom Hybrid/Remote Options
Henderson Scott
ID) Azure Networking and Security Azure DevOps and CI/CD pipelines Azure Monitor, Log Analytics, and Policy RBAC/IAM and cloud governance best practices Familiarity with Azure Databricks , Dynamics 365 , and Azure Service Bus is highly desirable Strong knowledge of cloud compliance, security controls , and automation practices Experience collaborating with Architecture, Security, and Development teams within cloud environments More ❯
Leeds, West Yorkshire, England, United Kingdom Hybrid/Remote Options
Tenth Revolution Group
Head of Data Engineering - Azure & Databricks - Hybrid - Up to £100,000 A forward-thinking and nationally recognised organisation, known for its commitment to innovation and data-driven decision-making, is seeking a Head of Data Engineering to lead its growing data function. With a strong culture of collaboration, investment in cutting-edge technology, and a clear roadmap for digital transformation More ❯