to enact step-change operational efficiency and maximize business value by confidently utilizing trustworthy data. What are we looking for? Great experience as a Data Engineer Experience with Spark, Databricks, or similar data processing tools. Proficiency in working with the cloud environment and various software including SQL Server, Hadoop, and NoSQL databases. Proficiency in Python (or similar), SQL and … Spark. Proven ability to develop data pipelines (ETL/ELT). Strong inclination to learn and adapt to new technologies and languages. Strong understanding and experience in working with Databricks Delta Lake. Proficiency in Microsoft Azure cloud technologies. What will be your key responsibilities? Collaborate in hands-on development … relevant technologies to create and maintain data assets and reports for business insights. Assist in engineering and managing data models and pipelines within a cloud environment, utilizing technologies like Databricks, Spark, Delta Lake, and SQL. Contribute to the maintenance and enhancement of our progressive tech stack, which includes Python, PySpark, Logic Apps, Azure Functions, ADLS, Django, and ReactJS. Support the More ❯
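The ETL/ELT pipeline work this and the following listings describe follows a common extract-transform-load shape. As a rough illustration only (not taken from any listing, using pandas rather than Spark for brevity; all table names and values are hypothetical):

```python
import pandas as pd

def extract(raw_rows):
    """Extract: in a real pipeline this would read from a source system
    (e.g. SQL Server or a Delta table); here we take raw rows directly."""
    return pd.DataFrame(raw_rows)

def transform(df):
    """Transform: drop incomplete records and derive a revenue column."""
    cleaned = df.dropna(subset=["units", "unit_price"]).copy()
    cleaned["revenue"] = cleaned["units"] * cleaned["unit_price"]
    return cleaned

def load(df, path):
    """Load: persist the curated output (CSV here; a lakehouse pipeline
    would typically write Delta/Parquet instead)."""
    df.to_csv(path, index=False)
    return len(df)

raw = [
    {"sku": "A1", "units": 3, "unit_price": 2.5},
    {"sku": "B2", "units": None, "unit_price": 4.0},  # incomplete: dropped
    {"sku": "C3", "units": 10, "unit_price": 1.2},
]
curated = transform(extract(raw))
print(curated["revenue"].tolist())  # → [7.5, 12.0]
```

On Databricks the same structure would typically read from and write to Delta tables via PySpark, but the staged extract/transform/load separation is identical.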
Exposure across the full SDLC process, including testing and deployment. Expertise in Microsoft Azure is mandatory, including components like Azure Data Factory, Azure Data Lake Storage, Azure SQL, Azure Databricks, HDInsight, ML Service etc. Good knowledge of Python and Spark is required. Experience in ETL & ELT. Good understanding of one scripting language. Good understanding of how to enable analytics … a diverse and challenging set of customers to success. Good understanding of the CPG (Consumer Packaged Goods) domain is preferred. Skills: DataOps, MLOps, deep expertise in Azure Databricks, ETL frameworks. Fractal provides equal More ❯
hybrid role based in the London area. As Data Analyst , you will be responsible for delivering data-driven insights across a wide range of datasets. Using tools such as Databricks, Tableau, Looker Studio, Amplitude, and DBT, you will extract, transform, and analyse data to support key business functions. You will collaborate closely with stakeholders to understand their data needs, generate … with forecasting, target setting, and promotional analysis. Data Governance & Optimisation Conduct QA checks for data accuracy. Improve data architecture and integrate new data sources. Apply advanced analytics using SQL, Databricks, and other tools. What are we looking for? 2+ years’ experience in a similar role. Strong analytical mindset with a passion for uncovering insights. Skilled in SQL and data visualisation More ❯
vision and architectural direction across entire programmes and client estates. Key Responsibilities Senior Data Architects: Lead the design and delivery of cloud-native data solutions using modern platforms (e.g. Databricks, Snowflake, Kafka, Confluent) Architect data lakes, lakehouses, streaming pipelines, and event-driven architectures Oversee engineering teams and collaborate with analysts and QA functions Translate complex requirements into scalable, robust data … Core Experience: Proven track record in data architecture, either from a delivery or enterprise strategy perspective Deep experience with cloud platforms (Azure, AWS, GCP) and modern data ecosystems (Spark, Databricks, Kafka, Snowflake) Strong understanding of Data Mesh, Data Fabric, and data product-led approaches Data modelling expertise (relational, dimensional) and familiarity with tools like Erwin, Sparx, Archi Experience with ETL More ❯
understanding of data modelling, warehousing principles, and analytics best practices Experience working cross-functionally with strong communication skills Nice to have: Python, AWS/GCP/Azure, Medallion Architecture, Databricks, and knowledge of data governance or experience in high-growth environments This is a fantastic opportunity to join a purpose-led business in a high-impact role, helping to shape More ❯
City of London, London, United Kingdom Hybrid / WFH Options
LHH
or internal clients within large organisations, through e.g. the RFI/RFP process, as preferred bidder, documented bids and face to face presentations. Experience of data science platforms (e.g. Databricks, Dataiku, AzureML, SageMaker) and machine learning frameworks (e.g. Keras, Tensorflow, PyTorch, scikit-learn) Cloud platforms – demonstrable experience of building and deploying solutions to Cloud (e.g. AWS, Azure, Google Cloud) including More ❯
City of London, London, United Kingdom Hybrid / WFH Options
un:hurd music
as PySpark. Extensive experience with PyTorch (preferred) and/or TensorFlow. Hands-on experience with deploying machine learning models in production using cloud platforms, especially Microsoft Azure ML/Databricks MLflow. Experience in integrating CI/CD pipelines for ML models. A proactive, hands-on approach to building, iterating, and owning solutions end-to-end. Strong interest in machine More ❯
Data Lead | £110,000 per annum | Databricks, Azure, Python, PowerBI | London My client is seeking a hands-on Data Manager to join a fast-moving team and architect a scalable, efficient data environment using Azure Databricks, Python, and Power BI, with the ambition to integrate Microsoft Fabric. This is a technical leadership role, ideally suited to someone who thrives … station) Salary - £110,000 Key Responsibilities: Architecting and delivering a modern, enterprise-level data platform. Leading a small but capable team of 2. Building and optimising pipelines in Azure Databricks using Python. Collaborating with Data Analysts, BI teams, and business stakeholders to ensure data is accessible and actionable. Driving the transition toward more flexible and scalable reporting (Power BI … Fabric). Reducing reliance on core data engineering for basic data access and analysis, removing bottlenecks and empowering users. Key Skills: Strong hands-on experience with Python and Azure Databricks. Proven ability to design and implement data platform architecture at an enterprise level. Proficiency in Power BI; familiarity with Microsoft Fabric is a bonus. Experience working closely with cross More ❯
Fabric PowerBI Data Analytical Specialist with MS Fabric inc Data Factory, Synapse, OneLake, SQL and familiar with Databricks is required by a leading commercial interior design company in the heart of the City, a short walk from Farringdon Station, paying up to £70k + all commuting costs paid; it is an office-based role. You'll leverage Microsoft Fabric to … for efficient data workflows. Advanced Power BI Skills: Dashboard development, data modelling, and DAX for strategic insights. Data Warehousing Knowledge: Understanding of principles and practices for effective data management. Databricks Familiarity: Experience with Databricks for data processing and analytics. Stakeholder Engagement: Ability to proactively engage with internal teams and translate business needs. Mentoring & Team Development: Skills to guide junior More ❯
City of London, London, United Kingdom Hybrid / WFH Options
developrec
Lead Data Engineer – Snowflake, Databricks, Azure Job Type: Contract (Outside IR35) Location: Hybrid - 2 days a week in London Start Date: ASAP Rate: Up to £550 per day A leading technology consultancy is looking for a highly skilled Lead Data Engineer with strong Snowflake, Databricks and Azure experience. This position offers the opportunity to work within a fast-paced, collaborative … maintain comprehensive documentation of migration strategies, standards, and processes. Provide mentorship and technical leadership to team members. Essential Skills & Experience Extensive experience as a Lead Data Engineer Strong Snowflake, Databricks, Azure experience Proven track record with data migration methodologies and execution. Skilled in SQL and PL/SQL programming. Experience with database backup, recovery, and disaster recovery strategies. Strong analytical More ❯
London/Hybrid | 💷 £750/day | 🗓 12-Month Rolling | ✅ Outside IR35 Our client, a leading Investment Management firm, is looking for a contract Data Engineer with strong Python and Databricks skills to help drive their data platform modernisation. You’ll be working on business-critical data pipelines and analytics infrastructure, supporting both investment and operations teams. This is a hands … on role in a fast-moving environment, ideal for someone who thrives in financial services. What you’ll be doing: Building scalable data pipelines in Python and Databricks Designing data models to support analytics, reporting & regulatory needs Working with cloud-based data platforms (Azure preferred) Partnering closely with data analysts, quants, and tech teams Tech you’ll be using: Python … strong hands-on coding essential) Databricks/Spark Azure Data Lake/Delta Lake SQL/ETL frameworks CI/CD tools and version control (Git, Azure DevOps) What we’re looking for: Strong commercial experience as a Data Engineer Deep Python and Spark/Databricks expertise Financial services experience — ideally investment or asset management Comfortable working in a cloud More ❯
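A core pattern behind "business-critical data pipelines" on Databricks is the idempotent incremental merge (Delta Lake's MERGE INTO). A minimal sketch of the idea in plain pandas, with hypothetical keys and values; in a real Databricks job this would be `DeltaTable.merge` in PySpark:

```python
import pandas as pd

def upsert(target, updates, key="trade_id"):
    """Emulate a Delta-style MERGE: incoming rows replace matching rows
    in the target by key; unmatched incoming rows are inserted."""
    combined = pd.concat([target, updates], ignore_index=True)
    # Keep the last occurrence per key, so the incoming update wins.
    return (combined.drop_duplicates(subset=key, keep="last")
                    .sort_values(key)
                    .reset_index(drop=True))

target = pd.DataFrame({"trade_id": [1, 2], "price": [100.0, 101.5]})
updates = pd.DataFrame({"trade_id": [2, 3], "price": [102.0, 99.9]})
result = upsert(target, updates)
print(result["price"].tolist())  # → [100.0, 102.0, 99.9]
```

Because re-running the merge with the same updates yields the same result, the load is safe to retry, which is what makes this pattern suitable for scheduled, CI/CD-deployed pipelines.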
be responsible for managing large-scale data pipelines, ensuring data quality and integrity, and supporting cross-functional teams with reliable and scalable data solutions—primarily using Microsoft Azure, Python, Databricks, as well as the integration of Microsoft Fabric. This role will require 3 days a week onsite in their state-of-the-art office and will remain a technically … hands-on role. Key Responsibilities Design, build, and manage scalable data pipelines and ETL/ELT processes in Azure Data Factory, Databricks, and other Azure services. Develop and maintain data models, data lakes, and data warehouses (e.g., Azure Synapse). Use Python for data manipulation, transformation, and automation tasks. Collaborate with data analysts and stakeholders to meet business requirements. Monitor More ❯
this role, you’ll be instrumental in supporting the Head of Data in building and deploying a fit-for-purpose data quality management capability underpinned by a modern data stack (Azure Databricks, ADF and Power BI), ensuring that data is reliable and trustworthy, then extracting insights from it to improve operations and optimise resources. Key responsibilities and primary deliverables Drive requirements for … data quality measurement. Build reports & dashboards for data quality and other business problems as per business priorities using Power BI and Databricks Dashboard Create and maintain the data quality tracker to document rule planning and implementation. Deliver continuous improvements of the data quality solution based upon feedback. Investigation and analysis of data issues related to quality, lineage, controls, and authoritative source … tasks to prepare data for analysis. Documentation of data quality findings and recommendations for improvement. Work with Data Architecture & Engineering to design and build a data quality solution utilising the Azure Databricks stack. Take ownership of design and work with data architecture and engineering to build data pipelines to automate data movement and processing. Manage and mitigate risks through assessment, in More ❯
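The rule-based data quality measurement described above (define rules, score pass rates, surface them in a Power BI or Databricks dashboard) can be sketched as follows; the rules and column names are invented for illustration:

```python
import pandas as pd

# Hypothetical data quality rules of the kind a tracker would document;
# each rule returns a boolean Series marking the rows that PASS.
RULES = {
    "account_id_not_null": lambda df: df["account_id"].notna(),
    "balance_non_negative": lambda df: df["balance"] >= 0,
}

def measure_quality(df):
    """Return the pass rate per rule: the raw numbers a Power BI or
    Databricks dashboard of data quality scores would plot."""
    return {name: float(rule(df).mean()) for name, rule in RULES.items()}

records = pd.DataFrame({
    "account_id": ["A", None, "C", "D"],
    "balance": [10.0, 5.0, -2.0, 0.0],
})
print(measure_quality(records))
# → {'account_id_not_null': 0.75, 'balance_non_negative': 0.75}
```

Keeping the rules as data (rather than hard-coded checks) is what lets a tracker document rule planning and implementation separately from the pipeline that executes them.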
Product, Analytics, Security, and Software Engineering to define, develop, and deliver impactful data products to both internal stakeholders and end customers. Responsibilities Design and implement scalable data pipelines using Databricks, Delta Lake, and Lakehouse architecture Build and maintain a customer-facing analytics layer, integrating with tools like PowerBI, Tableau, or Metabase Optimise ETL processes and data workflows for performance, reliability … Terraform) Take ownership of system observability, stability, and documentation Requirements Strong experience in Python (especially Pandas and PySpark) and SQL Proven expertise in building data pipelines and working with Databricks and Lakehouse environments Deep understanding of Azure (or similar cloud platforms), including Virtual Networks and secure data infrastructure Experience with DevOps practices and infrastructure-as-code (Terraform preferred) Familiarity with More ❯
and troubleshoot data pipelines and systems Qualifications & Skills: 5+ years' experience with Python programming for data engineering tasks Strong proficiency in SQL and database management Hands-on experience with Databricks and Apache Spark Familiarity with the Azure cloud platform and related services Knowledge of data security best practices and compliance standards Excellent problem-solving and communication skills Multi-Year Project - Flexible More ❯
City of London, London, United Kingdom Hybrid / WFH Options
Cognitive Group | Part of the Focus Cloud Group
What We’re Looking For Proven experience delivering AI and data solutions in a Microsoft Partner or specialist technology consultancy environment Strong technical proficiency across Azure Synapse , Data Lake , Databricks , Power BI , Microsoft Fabric , and relevant AI services (e.g. Azure OpenAI, ML Studio) Deep understanding of modern data architectures, data governance, and operationalisation of AI models in cloud environments Experience More ❯
City of London, England, United Kingdom Hybrid / WFH Options
Xpertise Recruitment
/statistical models, particularly in pricing (e.g., price elasticity, revenue management, dynamic pricing). Experience with data visualization tools (e.g., Tableau, PowerBI) and data management platforms (e.g., SQL, Snowflake, Databricks). A passion for building high-performing teams and creating a collaborative, data-driven culture. Excellent written and verbal communication skills, with the ability to simplify complex topics for non More ❯
City of London, London, United Kingdom Hybrid / WFH Options
twentyAI
to modern data platforms. Your Background Experience with Microsoft Azure data tools — especially Data Factory and Synapse. Familiarity with Microsoft Fabric will be beneficial. Otherwise, experience with platforms like Databricks or Snowflake is also valued. Proficiency in Infrastructure as Code, preferably with Terraform, and understanding of CI/CD pipelines in a data engineering context. Practical knowledge of distributed processing More ❯
City of London, London, United Kingdom Hybrid / WFH Options
Nationwide Building Society
role, you'll work in the Data Platforms Engineering team, utilising cutting-edge Cloud technology within Microsoft Azure to enhance our Enterprise Data Lakehouse platform. Using tools such as Databricks, Python, and Terraform , you'll design and develop key platform functionality and data analytics solutions. Expect a hands-on experience where you'll contribute significantly to our daily operations, making More ❯
City of London, England, United Kingdom Hybrid / WFH Options
We Are Dcoded Limited
Location: London or Manchester (Hybrid) Salary: £120,000 - £130,000 + Bonus & Benefits Industry: Tech Consultancy | FTSE 250 Clients We're hiring a Databricks Solution Architect for a leading consultancy working with FTSE 250 clients. You'll design and deliver scalable data platforms using Databricks and cloud technologies - we are searching for a true Consultant who can help deliver commercial … value for their clients. Responsibilities: Architect and implement Databricks solutions for enterprise clients Define best practices for data engineering, AI/ML, and analytics Work with Azure, AWS, or GCP to ensure seamless cloud integration Guide and mentor technical teams on best practices Requirements: Strong experience as a Solution Architect in Databricks and cloud platforms Expertise in Apache Spark, Delta More ❯
City Of London, England, United Kingdom Hybrid / WFH Options
Fruition Group
an experienced Azure Data Engineer to play a key role in delivering the next phase of a business-critical data programme. You'll apply your technical expertise across Azure Databricks, Data Factory, Azure SQL, and SQL Server, contributing directly to the integration of financial data into a new Azure Data Platform. Responsibilities: Design, build and optimise scalable data solutions using … Azure cloud technologies. Lead the development of data pipelines in Azure Databricks, Data Factory, and T-SQL. Provide senior-level technical direction to onshore and offshore development teams. Translate data requirements into effective data engineering solutions. Collaborate closely with business stakeholders to ensure alignment with project goals. Maintain detailed documentation and ensure robust testing practices. Plan and manage progress in … Agile sprints alongside project managers and scrum masters. Requirements: Extensive experience in Azure data engineering, with strong expertise in Databricks, Azure SQL, and Data Factory. Deep technical knowledge of SQL Server including stored procedures and complex data transformation logic. Proven experience in designing and delivering data warehousing and dimensional modelling solutions. Excellent collaboration skills with a track record of working More ❯
City of London, London, United Kingdom Hybrid / WFH Options
Mars
transformation journey where your work will unlock real impact. 🌟 What you'll do Build robust data pipelines using Python, PySpark, and cloud-native tools Engineer scalable data models with Databricks, Delta Lake, and Azure tech Collaborate with analysts, scientists, and fellow engineers to deliver insights Drive agile DevOps practices and continuous improvement Stay curious, keep learning, and help shape our … digital platforms 🧠 What we’re looking for Proven experience as a Data Engineer in cloud environments (Azure ideal) Proficiency in Python, SQL, Spark, Databricks Familiarity with Hadoop, NoSQL, Delta Lake Bonus: Azure Functions, Logic Apps, Django, CI/CD tools 💼 What you’ll get from Mars A competitive salary & bonus Hybrid working with flexibility built in Access to Mars University More ❯
Total experience in DWBI, Big Data and Cloud Technologies. Implementation experience and hands-on experience in at least two of the following cloud technologies – Azure, AWS, GCP, Snowflake, Databricks Must Have Hands-on experience on at least 2 Hyperscalers (GCP/AWS/Azure platforms) and specifically in Big Data processing services (Apache Spark, Beam or equivalent). In-depth … Big Data, Apache Spark, Beam on BigQuery/Redshift/Synapse, Pub/Sub/Kinesis/MQ/Event Hubs, Kafka, Dataflow/Airflow/ADF Desirable Skills: Designing Databricks-based solutions for Azure/AWS, Jenkins, Terraform, Stackdriver or any other DevOps tools Priyanka Sharma Senior Delivery Consultant Office: 02033759240 Email: psharma@vallumassociates.com More ❯