closely with IT Operations, business stakeholders, and the broader Data & Innovation team to deliver impactful solutions. Skills and Experience Significant experience in data engineering and system development, in particular Databricks, Microsoft technologies (Azure, Power Platform, M365) and integrations. Experience working with a broad range of SaaS applications such as Workday and Monday.com is desirable. Proven ability to lead and mentor technical …
and planning, through deployment to ongoing support and maintenance. Why should you work at Microsoft Business Group? Multi Award Winning Business Official and Award-Winning Microsoft Partner Award Winning Databricks Partner Official GitHub Partner Cloud Native Organisation Growing International and Diverse teams Hugely collaborative environment, where ideas and knowledge sharing are actively encouraged Private Medical Insurance Employee Assistance Program Income …
architecture and enterprise data. Proficient with programming languages (C#, ASP.NET, .NET Core, SQL). Demonstrated experience working with REST APIs. Demonstrated experience working with cloud data warehouses such as Databricks and Snowflake. Demonstrated experience working with modern cloud data and DevOps techniques. Strong understanding of Agile methodologies with experience working on Agile delivery teams. Exceptional communication skills with the …
Newbury, Berkshire, United Kingdom Hybrid / WFH Options
Intuita Consulting
calculated fields when required. Hands-on experience with SQL; knowledge of data warehousing and dimensional modelling will be advantageous. Experience using large data platforms such as Snowflake, Databricks or similar. Exposure to other visualisation platforms is helpful, but not essential. Required Characteristics Cares about quality, ensuring data accuracy and consistency across all visualisations Confident in presenting dashboards and …
Manchester, North West, United Kingdom Hybrid / WFH Options
We Are Dcoded Limited
is modelled, consumed, and visualised, while championing best practices in engineering. As a Senior Data Engineer, you'll also act as a mentor, sharing your knowledge of cloud platforms, Databricks, and modern engineering techniques with junior team members, and influencing the wider data culture. Key Responsibilities: Design and deliver robust data pipelines using Databricks, Azure Data Factory, and Azure SQL. … about solving complex data challenges at scale and has a proven track record of delivering enterprise-level solutions. Essential Skills: 5 years' experience in Data Engineering. Strong expertise in Databricks, Azure Data Factory, Azure SQL, and Azure Synapse/DW. Solid understanding of dimensional modelling (Kimball, Star Schema) and EDW solutions. Experience working with structured and unstructured data. Familiarity with … shape the future of a high-growth retail & e-commerce business through technology and data. Impact at scale - work on a platform supporting millions of customers. Cutting-edge tech - Databricks, Azure, Data Factory, and modern BI tools. Career growth - opportunities to progress, upskill, and influence strategy. Collaborative culture - knowledge sharing, learning, and innovation are encouraged. Hybrid working - enjoy a balance …
leading the engineering team and also input into design and strategy decisions. Initially there will be a focus on GCP but will also involve working extensively with Snowflake and Databricks, as well as other modern cloud and data engineering tools. As the technical leader for the data engineering function, you will define architectural standards, mentor engineers, and collaborate with cross … had essential hands-on experience building pipelines in Python, analysing data requirements with SQL, and modern data engineering practices and deep technical expertise in GCP, AWS and/or Databricks, alongside experience of driving the team and contributing to the overall vision/strategy. Responsibilities Provide technical leadership and mentorship to the data engineering team, driving best practices in architecture … architectures Advanced proficiency in Python (including PySpark) and SQL, with experience building scalable data pipelines and analytics workflows Strong background in cloud-native data infrastructure (e.g., BigQuery, Redshift, Snowflake, Databricks) Demonstrated ability to lead teams, set technical direction, and collaborate effectively across business and technology functions Desirable skills Familiarity with machine learning pipelines and MLOps practices Additional experience with Databricks …
Bristol, Avon, South West, United Kingdom Hybrid / WFH Options
Datatech Analytics
Principal Data Engineer - Azure Databricks (Unity Catalog) - Contract Location: Bristol - Hybrid - 2 days a week in the office Contract Length: 12 Months Day Rate: Negotiable Job Ref: J12998 A data for good organisation that is in the early stages of building a modern Analytics and Data Engineering function is looking to bring in a Principal Data Engineer to support and … responsible for designing and implementing scalable, reusable data pipelines to support a range of analytics and AI-driven projects. The organisation is currently transitioning to Azure as well as Databricks, and the Principal Data Engineer will play a crucial role in shaping that platform, helping the data team upskill, and embedding best practices across the function. They will also lead on … organisation, support the development of conceptual and dimensional data models, and contribute to fostering a culture of innovation and evidence-led decision-making. Experience Required: Essential experience with Azure Databricks - including Unity Catalog, Python (ideally PySpark), and SQL. Practical knowledge of modern ELT workflows, with a focus on the Extract and Load stages. Experience working across both technical and non-technical …
more than 90 million passengers this year, we employ over 10,000 people. It's big-scale stuff and we're still growing. Job Purpose With a big investment into Databricks and a large amount of interesting data, this is the chance for you to come and be part of an exciting transformation in the way we store, analyse and use … engineering practices (e.g. TDD, CI/CD). Experience with Apache Spark or any other distributed data programming frameworks. Comfortable writing efficient SQL and debugging on cloud warehouses like Databricks SQL or Snowflake. Experience with cloud infrastructure like AWS or Azure. Experience with Linux and containerisation (e.g. Docker, shell scripting). Understanding of data modelling and data cataloguing principles. Understanding of … end monitoring, quality checks, lineage tracking and automated alerts to ensure reliable and trustworthy data across the platform. Experience of building a data transformation framework with dbt. Familiarity with Databricks as a data and AI platform or the Lakehouse Architecture. What you'll get in return Competitive base salary Up to 20% bonus BAYE, SAYE & Performance share schemes Flexible benefits …
Stevenage, Hertfordshire, South East, United Kingdom Hybrid / WFH Options
IET
in equal measure. You'll oversee the modernisation of our Azure-based data environment, champion scalable and resilient data pipelines, and drive adoption of tools such as Fabric, Data Factory, Databricks, Synapse, and Delta Lake. Working across all areas of the IET, you'll ensure our data is secure, accessible, and delivering maximum value for analytics, business intelligence, and operational excellence. You'll … about the role Lead and modernise the IET's data architecture, ensuring alignment with best practice and strategic goals. Drive adoption of Azure data technologies such as Fabric, Data Factory, Databricks, Synapse, and Delta Lake. Develop and maintain scalable, resilient data pipelines to support analytics and reporting. Stay hands-on in solving technical challenges, implementing solutions, and ensuring the reliability of …
Newbury, Berkshire, England, United Kingdom Hybrid / WFH Options
Intuita
reliable data movement and transformation. • Data Modelling using Kimball, 3NF or Dimensional methodologies • Utilize SQL and Python languages to extract, transform, and load data from various sources into Azure Databricks and Azure SQL/SQL Server. • Design and implement metadata driven pipelines to automate data processing tasks. • Collaborate with cross-functional teams to understand data requirements and implement appropriate solutions. … are some obvious required Skills and Experience we are going to be seeking out. For this role we'd be expecting to see: • Solid experience in Azure, specifically Azure Databricks and Azure SQL/SQL Server. • Proficiency in SQL and Python languages. • Hands-on experience in designing and building data pipelines using Azure Data Factory (ADF). • Familiarity with building …
science role(s) within healthcare and/or life sciences organisations. BSc degree in a technical discipline (computer science, mathematics, engineering). Microsoft certifications for Azure and Azure AI, and Databricks certification. Skills/Abilities Skilled in writing code in Python and/or R Advanced machine learning model building skills Advanced programming skills - including SQL Server for querying databases and … design) Strong experience of effectively liaising with stakeholders Proven ability to design and manage projects Experience of cloud data warehouse design and build. Knowledge and experience of using the Databricks platform. Knowledge and experience of Azure Databricks, Azure Data Factory, DevOps and Git. Disclosure and Barring Service Check This post is subject to the Rehabilitation of Offenders Act (Exceptions Order …
London, South East, England, United Kingdom Hybrid / WFH Options
Arc IT Recruitment
looking to blend strategic leadership with hands-on technical skills. The role and skill sets... Ownership & Operation: Lead and operate core AWS services, CI/CD DevOps services, and Databricks data platform services. Scale-Up Experience: Been there, done that in a 200-2000+ sized org during their growth journey. Technically Proficient: Comfortable with large infrastructure changes using Pulumi …
and cloud-native security practices. Key Skills & Experience: Terraform for Azure infrastructure automation GitHub Actions and CI/CD pipeline design Azure Private Link and Private Link Service configuration Databricks and Unity Catalog for data governance Azure Policy and compliance enforcement Identity and access management (OAuth, federated credentials) Azure security best practices including BCDR and high availability Cost management and …
Vault 2.0 methodologies Experience designing models across raw, business, and consumption layers Solid understanding of metadata, cataloguing, and data governance practices Working knowledge of modern data platforms such as Databricks, Azure, Delta Lake, etc. Excellent communication and stakeholder engagement skills Data Architect - Benefits: Competitive base salary with regular reviews Car allowance - circa £5k per annum Discretionary company bonus Enhanced pension …
EASA regulations, and strict confidentiality protocols. Strong IT skills, ideally with experience in easyJet systems (Safetynet/AIMS), remote working tools (MS Teams, SharePoint), and data analysis platforms (Tableau, Databricks, Python). Excellent communication, interpersonal, and presentation skills; professional and methodical approach to analysis. Ability to work closely with the Flight Data Manager to ensure the highest standards of integrity …
of creating thought leadership content and customer-facing collateral. Commercial acumen with experience in costing and pricing solutions. Technical Expertise Strong knowledge of modern data platforms (e.g., Microsoft Fabric, Databricks, Snowflake, AWS). Familiarity with data governance, data strategy, and enterprise data foundations. Technical proficiency at Solution/Architect level (Level 2–3). Soft Skills Exceptional communication skills …
Reddit, and Lyft rely on Zip to manage billions of dollars in spend. We're a fast-growing team that helped scale category-defining companies like Airbnb, Meta, Salesforce, Databricks, Ramp, Apple, and Google. With a $2.2 billion valuation and $370 million in funding from Y Combinator, BOND, DST Global, and CRV, we're focused on developing cutting-edge technology …
london (city of london), south east england, united kingdom
Capgemini
At Capgemini Financial Services, we are seeking an AWS Databricks Developer to join Capgemini's Insights and Data Practice. You will have the following experience: 8+ years of experience in data engineering or cloud development. Strong hands-on experience with AWS services. Proficiency in Databricks, Apache Spark, SQL, and Python. Experience with data modeling, data warehousing, and DevOps practices. Familiarity … with Delta Lake, Unity Catalog, and Databricks REST APIs. Excellent problem-solving and communication skills. About Capgemini Capgemini is a global business and technology transformation partner, helping organizations to accelerate their dual transition to a digital and sustainable world while creating tangible impact for enterprises and society. It is a responsible and diverse group of 350,000 team members in …