London, South East, England, United Kingdom Hybrid / WFH Options
Robert Half
… testing methodologies and development team collaboration
- Experience working with Power BI and DAX
- Strong documentation, communication, and stakeholder engagement skills

Preferred Qualifications:
- Experience with Lakehouse architecture, Delta Lake, or Databricks
- Exposure to Agile/Scrum working practices
- Microsoft certifications (e.g., Azure Data Engineer Associate)
- Background in consulting or professional services
- Understanding of data governance and data security principles

Nice to …
… relevant insights.
* Experience working in regulated environments such as pensions, insurance, or financial services is highly advantageous.

Desirable
* Exposure to actuarial modelling or financial risk analysis.
* Familiarity with Azure Databricks, Synapse, or distributed computing tools.
* Experience applying model explainability and responsible AI techniques.

Rewards & Benefits
This role offers a comprehensive and flexible benefits package designed to support wellbeing and personal …
… and AI/ML use cases
- Implement CI/CD workflows
- Ensure GDPR compliance and secure data handling

Requirements:
- 5+ years in data engineering or …
- Strong experience with Azure, Databricks, Microsoft Fabric, and Snowflake
- Proficiency in SQL, Python, and tools like dbt and Airflow
- Familiarity with DevOps practices in a data context

Benefits:
- Work on impactful, enterprise-wide data projects …
London, South East, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
… and internal policies.

What We're Looking For:
- Solid experience in data engineering or backend data development.
- Strong experience with Azure data services (e.g., Data Factory, Data Lake, Synapse, Databricks) and tools like dbt.
- Proficiency in SQL and Python, with a solid understanding of data modelling and transformation.
- Experience integrating data from enterprise systems (CRM, ERP, HRIS).
- Familiarity with …
… hands-on experience in enterprise data engineering and Azure cloud data technologies. You must be confident working across:
- Azure Data Services, including: Azure Data Factory, Azure Synapse Analytics, Azure Databricks, Microsoft Fabric (desirable)
- Python and PySpark for data engineering, transformation, and automation
- ETL/ELT pipelines across diverse structured and unstructured data sources
- Data lakehouse and data warehouse architecture design …
… Azure ML, Cognitive Services)
- Solid understanding of SQL, REST APIs, and cloud data integration
- Familiarity with Python, R, or similar scripting languages
- Bonus points for experience with Azure Synapse, Databricks, or Power Apps
- Microsoft certifications (PL-300, DP-203, AI-900) are a plus

Benefits:
- Salary up to around £60,000 depending on experience
- 10% annual bonus
- Pension with …
Reading, Berkshire, United Kingdom Hybrid / WFH Options
Henderson Drake
… Promote user adoption, training, and change management initiatives
- Ensure high standards of documentation and data security compliance

Technical Skills (desirable):
- Microsoft Azure Data Services (e.g., Azure Data Factory, Synapse, Databricks, Fabric)
- Data warehousing and lakehouse design
- ETL/ELT pipelines
- SQL, Python for data manipulation and machine learning
- Big Data frameworks (e.g., Hadoop, Spark)
- Data visualisation (e.g., Power BI)
- Understanding …
London, South East, England, United Kingdom Hybrid / WFH Options
Client Server Ltd
… and data modelling; as well as healthcare technology and regulatory requirements (ISO 13485, ISO 27001)
- You have experience of managing data in Azure using tools such as Azure Data Factory, Azure Databricks, Azure Synapse Analytics
- You have a good knowledge of Data Lake and Data Warehouse architectures
- You have experience implementing monitoring and logging for data pipelines
- You are collaborative, enjoy problem …
Senior Databricks Data Engineer (flexible location)

Bibby Financial Services have an exciting opportunity available for a reliable Senior Databricks Data Engineer to join our team. You will join us on a full-time, permanent basis and in return, you will receive a competitive salary of £60,000 - £70,000 per annum.

About the role:
As our Senior Databricks Data Engineer, you … coach, support and organise to ensure we sustain a predictable pipeline of delivery, whilst ensuring all appropriate governance and best practice is adhered to.

Your responsibilities as our Senior Databricks Data Engineer will include:
- Understand the business/product strategy and supporting goals with the purpose of ensuring data interpretation aligns
- Provide technical leadership on how to break down initiatives … databases and APIs
- Deliver large-scale data processing workflows (ingestion, cleansing, transformation, validation, storage) using best practice tools and techniques

What we are looking for in our ideal Senior Databricks Data Engineer:
- A Bachelor's or Master's degree in Computer Science, Engineering, or a related field. Alternatively, relevant experience in the data engineering field
- Databricks, including Unity Catalog
- Terraform …
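The large-scale processing workflow named above (ingestion, cleansing, transformation, validation, storage) can be sketched in miniature; this is a framework-free illustration with hypothetical field names and rules, not the firm's actual pipeline, which would run on Databricks rather than plain Python:

```python
# Minimal sketch of an ingestion -> cleansing -> transformation -> validation
# -> storage workflow. Field names and rules are illustrative assumptions.

def ingest(raw_rows):
    # Ingestion: accept raw records from a source system (here, a list).
    return list(raw_rows)

def cleanse(rows):
    # Cleansing: drop records missing a customer_id, strip stray whitespace.
    cleaned = []
    for row in rows:
        if row.get("customer_id") is None:
            continue
        cleaned.append({k: v.strip() if isinstance(v, str) else v
                        for k, v in row.items()})
    return cleaned

def transform(rows):
    # Transformation: normalise the country code to upper case.
    return [{**row, "country": row.get("country", "").upper()} for row in rows]

def validate(rows):
    # Validation: keep only rows whose amount is a non-negative number.
    return [r for r in rows
            if isinstance(r.get("amount"), (int, float)) and r["amount"] >= 0]

def store(rows, table):
    # Storage: append to an in-memory "table" standing in for Delta/SQL.
    table.extend(rows)
    return table

table = []
raw = [
    {"customer_id": 1, "country": "gb ", "amount": 120.0},
    {"customer_id": None, "country": "fr", "amount": 10.0},  # dropped: no id
    {"customer_id": 2, "country": "de", "amount": -5.0},    # dropped: invalid
]
store(validate(transform(cleanse(ingest(raw)))), table)
print(table)  # one surviving, cleaned record
```

In a real Databricks deployment each stage would typically be a PySpark transformation writing to Delta tables, but the stage boundaries are the same.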
… practices. Collaborate with cross-functional teams to translate business needs into technical solutions.

Core Skills
- Cloud & Platforms: Azure, AWS, SAP
- Data Engineering: ELT, Data Modeling, Integration, Processing
- Tech Stack: Databricks (PySpark, Unity Catalog, DLT, Streaming), ADF, SQL, Python, Qlik
- DevOps: GitHub Actions, Azure DevOps, CI/CD pipelines
… supporting automated workflows in Alteryx Designer.
- Experience deploying workflows to the Production Gallery.
- Knowledge of database fundamentals, data design, SQL, and data warehouse concepts is beneficial.
- Exposure to Power BI, Databricks, Azure, and Profisee is advantageous.
- Knowledge of JSON, Python, XML, and R is a plus.
- Experience with non-relational and unstructured data is beneficial.
- Familiarity with Azure DevOps or GitHub …
… Python capabilities - minimum 2-3 years' hands-on experience
- Comprehensive Data Engineering background - proven track record in enterprise data solutions
- Experience with ETL processes and data transformation, preferably using Databricks
- Strong foundation in Data Warehousing architectures and dimensional modeling
- Familiarity with batch processing from relational database sources

Communication & Collaboration Skills of the Data Engineer
- Outstanding stakeholder engagement abilities across technical …
… data engineering languages - Python, Scala, or Java.
- Deep knowledge of SQL and distributed data processing frameworks.
- Experience with cloud-based platforms such as Azure, AWS, or GCP, particularly Databricks, Informatica, or similar.
- Understanding of data modelling (analytical & operational) and non-functional performance tuning.
- Strong communicator, able to present complex technical concepts clearly to varied audiences.
- Previous delivery into public …
Reading, Berkshire, South East, United Kingdom Hybrid / WFH Options
Bowerford Associates
… experience (including QGIS)
- FME
- Advanced Database and SQL skills
- Certifications: AWS or FME certifications are a real plus.
- Experience with ETL tools such as AWS Glue, Azure Data Factory, Databricks or similar is a bonus.

The role comes with excellent benefits to support your well-being and career growth.

KEYWORDS: Principal Geospatial Data Engineer, Geospatial, GIS, QGIS, FME, AWS, On …
Haywards Heath, Sussex, United Kingdom Hybrid / WFH Options
First Central Services
… practices and tools (Azure DevOps preferred).
- Experience with microservices architecture, RESTful API development, and system integration.
- Prior experience in financial services or insurance sectors advantageous.
- Familiarity with Azure ML, Databricks, related Azure technologies, Docker, Kubernetes, and containerization is advantageous.
- Advanced proficiency in Python, and familiarity with AI frameworks such as LangChain.
- Skilled in designing and operationalising AI Ops frameworks within …
… governance, security, and performance standards.

Requirements:
- Recent experience as a Lead Data Solution Architect or equivalent.
- Skilled in streaming/event-driven architectures (Kafka, Confluent).
- Deep knowledge of Databricks, Unity Catalog, and Snowflake.
- Understanding of Data Mesh/Fabric and product-led approaches.
- Familiarity with cloud platforms (AWS, Azure, GCP).
- Leadership in Agile environments with budget and delivery …
… cloud-based data solutions.

Key Responsibilities:
- Lead cross-functional teams to deliver scalable data solutions across cloud platforms
- Architect data lakes, lakehouses, and event-driven systems using tools like Databricks, Kafka, and Confluent
- Apply Data Mesh and Data Fabric principles to modern data architectures
- Define and implement data product strategies with a product-led mindset
- Ensure compliance with data governance …
… for front office reporting.

What you'll need to succeed
- Strong business intelligence/data engineering experience with strong Power BI expertise.
- Data engineering experience building production pipelines with Databricks.
- Business experience working around the Front Office/Risk is a must!
- Understanding of regulatory reporting processes.
- Financial product knowledge, e.g. equities, fixed income, derivatives.
- Experience on Azure cloud platforms.
… full lifecycle delivery experience.
- Ability to lead architectural strategy and oversee implementation.

Architectural Expertise
- Deep knowledge of event-driven architectures (Kafka, Confluent).
- Experience with data lakes/lakehouses (Databricks, Unity Catalog).
- Familiarity with Data Mesh, Data Fabric, and product-led data strategies.
- Expertise in cloud platforms (AWS, Azure, GCP, Snowflake).

Technical Skills
- Proficiency in big data tools …
Milton Keynes, England, United Kingdom Hybrid / WFH Options
Focused Futures Consultancy LTD
… both technical depth and client-facing credibility.

Key Responsibilities:
- Lead the design and delivery of enterprise-scale data platforms using cloud technologies (Azure, AWS, GCP) and tools like Databricks, Snowflake, Synapse.
- Shape cloud migration and modernization strategies with a strong focus on DevOps practices.
- Architect scalable data models and robust ETL/ELT pipelines using industry-standard frameworks.
- Implement …
DATABRICKS ENGINEER
6-MONTH CONTRACT
£450-£550 PER DAY (OUTSIDE IR35)

This role is a great opportunity for a skilled Databricks Engineer to join a data-driven financial services firm undergoing a major transformation of their data infrastructure. You'll be a key player in modernising their data platform with Databricks and Azure, while supporting scalable data pipelines and machine …

THE COMPANY
This business is investing heavily in data as it digitises operations across its risk, product, and customer intelligence teams. They're building a modern Lakehouse architecture using Databricks, Delta Lake, and Azure Data Lake. You'll be part of a collaborative, forward-thinking team with a focus on best practices, automation, and end-to-end data enablement.

THE … Azure stack. Your responsibilities will include:
- Designing and deploying ETL pipelines using PySpark and Delta Lake on Databricks.
- Supporting the deployment and operationalisation of ML models with MLflow and Databricks Workflows.
- Building out reusable data products and feature stores for data science teams.
- Tuning performance across clusters, jobs, and workflows.
- Migrating legacy systems (SSIS/SQL) to Databricks and cloud …
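The "reusable data products and feature stores" responsibility above can be illustrated with a toy, framework-free sketch. Everything here is hypothetical (feature names, decorator, registry); a real Databricks implementation would use its managed feature store so that training and serving share one definition, which is the property this sketch demonstrates:

```python
# Toy feature store: register named feature functions once, then compute a
# consistent feature vector for any entity on demand. All names are
# illustrative, not from the job description.

FEATURES = {}

def feature(name):
    """Decorator registering a feature computation under a stable name."""
    def register(fn):
        FEATURES[name] = fn
        return fn
    return register

@feature("txn_count")
def txn_count(history):
    # Number of transactions for the entity.
    return len(history)

@feature("total_spend")
def total_spend(history):
    # Sum of transaction amounts.
    return sum(t["amount"] for t in history)

@feature("avg_spend")
def avg_spend(history):
    # Mean transaction amount; 0.0 for an empty history.
    return total_spend(history) / len(history) if history else 0.0

def feature_vector(history, names):
    # Every consumer (training or serving) resolves the same definitions.
    return {name: FEATURES[name](history) for name in names}

history = [{"amount": 10.0}, {"amount": 30.0}]
vec = feature_vector(history, ["txn_count", "total_spend", "avg_spend"])
print(vec)  # {'txn_count': 2, 'total_spend': 40.0, 'avg_spend': 20.0}
```

The registry is the key design choice: feature logic lives in one place, so offline training data and online scoring cannot silently diverge.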
… Teams across multiple locations and time zones
- Strong interpersonal and communication skills with an ability to lead a team and keep them motivated.

Mandatory Skills: Python, PostgreSQL, Azure Databricks, AWS (S3), Git, Azure DevOps CI/CD, Apache Airflow