data-driven initiatives. Job Specification (Technical Skills): Cloud Platforms: Expert-level proficiency in Azure (Data Factory, Databricks, Spark, SQL Database, DevOps/Git, Data Lake, Delta Lake, Power BI), with working knowledge of Azure WebApp and Networking. Conceptual understanding of Azure AI Services, ML, Unity Catalog, and … Advanced proficiency in SQL, Python, and at least one additional programming language (Java, C#, C++) is desired. Proven experience with data warehousing and data lake technologies. Solid understanding of database systems (SQL, NoSQL). Platform Architecture: Able to develop and implement data platform architecture (data lakes, data warehouses, data …
practices in data engineering, governance, and performance optimisation. Key Responsibilities Lead the architecture and design of a next-gen data platform using Databricks and Delta Lake Collaborate with stakeholders including data engineers, analysts, and business leads to define data requirements and architecture patterns Ensure the platform is scalable … mentorship to engineering teams Champion best practices in data quality, lineage, governance, and performance tuning Integrate with a wider Azure ecosystem (e.g. Azure Data Lake, Synapse, Power BI) Required Skills & Experience Proven experience as a Data Architect in enterprise environments Extensive hands-on experience with Databricks (including SQL, PySpark … Delta Lake) Solid background in data warehousing, data lakes, and big data frameworks Strong knowledge of Azure cloud services, especially in data integration Experience working in regulated environments (e.g. financial services, insurance, banking) Excellent communication skills, capable of engaging with technical and non-technical stakeholders alike Comfortable working …
Ensure Data Security: Apply protocols and standards to secure clinical data both in-motion and at-rest. Shape Data Workflows: Utilize Databricks components like Delta Lake, Unity Catalog, and MLflow to ensure efficient, secure, and reliable data workflows. Key Responsibilities Data Engineering with Databricks: Design and maintain … ETL/ELT processes, and data lakes to support data analytics and machine learning. Requirements Expertise in Databricks: Proficiency with Databricks components such as Delta Lake, Unity Catalog, and MLflow. Azure Data Factory Knowledge: Experience with Azure Data Factory for data orchestration. Clinical Data Security: Understanding of …
London, South East England, United Kingdom Hybrid / WFH Options
Focus on SAP
Employment Type: Contract, Full time Start: ASAP Location: London - Hybrid Languages: English Key skills: 5+ years as a Data Engineer. Proven expertise in Databricks (including Delta Lake, Workflows, Unity Catalog). Strong command of Apache Spark, SQL, and Python. Hands-on experience with cloud platforms (AWS, Azure, or GCP … Apache Spark. Collaborate with Data Scientists, Analysts, and Product teams to understand data needs and deliver clean, reliable datasets. Optimize data workflows and storage (Delta Lake, Lakehouse architecture). Manage and monitor data pipelines in cloud environments (AWS, Azure, or GCP). Work with structured and unstructured data …
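As a rough illustration of the day-to-day work this kind of listing describes, a minimal PySpark sketch that lands raw data in a partitioned Delta table might look like the one below. It is a sketch only: the paths, column names (event_id, event_ts) and landing/curated zones are assumed for illustration rather than taken from the role, and it presumes a Spark session with the Delta Lake extensions available (as on Databricks).

```python
# Minimal sketch of a Databricks/Delta Lake pipeline step: read raw events,
# apply light cleansing, and persist them as a partitioned Delta table.
# Paths and column names are hypothetical; assumes Delta Lake is configured.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("events_to_delta").getOrCreate()

raw = spark.read.json("/mnt/raw/events/")          # hypothetical landing path

clean = (
    raw.dropDuplicates(["event_id"])               # basic de-duplication
       .filter(F.col("event_id").isNotNull())      # drop rows without a key
       .withColumn("event_date", F.to_date("event_ts"))
)

(clean.write
      .format("delta")                             # Delta Lake storage format
      .mode("overwrite")
      .partitionBy("event_date")
      .save("/mnt/curated/events_delta"))          # hypothetical curated zone
```

On Databricks, logic like this would typically run inside a notebook or a Workflows job rather than as a standalone script.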
London, South East England, United Kingdom Hybrid / WFH Options
trg.recruitment
London, Hybrid 💰 Rate: Up to £600 per day 📆 Contract: 6 months (Outside IR35, potential to go perm) 🛠 Tech Stack: Azure Data Factory, Synapse, Databricks, Delta Lake, PySpark, Python, SQL, Event Hub, Azure ML, MLflow We’ve partnered with a new AI-first professional services consultancy that’s taking … supporting team capability development What You Need: ✔ 5+ years in data engineering or backend cloud development ✔ Strong Python, SQL, and Databricks skills (especially PySpark & Delta Lake) ✔ Deep experience with Azure: Data Factory, Synapse, Event Hub, Azure Functions ✔ Understanding of MLOps tooling like MLflow and integration with AI pipelines …
enriching, and aggregating data. Ensure data consistency and accuracy throughout the data lifecycle. Azure Databricks Implementation: Work extensively with Azure Databricks and Unity Catalog, including Delta Lake, Spark SQL, and other relevant services. Implement best practices for Databricks development and deployment. Optimise Databricks workloads for performance and cost. Need …
business use cases. Strong knowledge of data governance, data warehousing, and data security principles. Hands-on experience with modern data stacks and technologies (e.g., Delta Lake, SQL, Python, Azure/AWS/GCP). Experience aligning data capabilities with commercial strategy and business performance. Exceptional communication skills. …
London, South East England, United Kingdom Hybrid / WFH Options
83zero
a MUST! Key expertise and experience we’re looking for: Data Engineering in Databricks – Spark programming with Scala, Python, and SQL, and ideally experience with Delta Lake or Databricks workflows, jobs, etc. Familiarity with Azure Data Lake: experience with data ingestion and ETL/ELT frameworks. Data Governance …
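For readers less familiar with the Delta Lake workflows this listing refers to, a hedged sketch of an ELT-style upsert (merge) into a Delta table is shown below. The table path, business key (customer_id) and incremental feed are hypothetical, and the sketch assumes the delta-spark package, which ships pre-installed on Databricks.

```python
# Sketch of an ELT-style upsert into a Delta table: new and changed rows from an
# incremental feed are merged into a curated table on a business key.
# Paths and the join key are illustrative assumptions, not taken from the listing.
from pyspark.sql import SparkSession
from delta.tables import DeltaTable

spark = SparkSession.builder.appName("customer_upsert").getOrCreate()

updates = spark.read.parquet("/mnt/raw/customers_incremental/")   # assumed incremental feed

target = DeltaTable.forPath(spark, "/mnt/curated/customers_delta")  # assumed curated table

(target.alias("t")
       .merge(updates.alias("s"), "t.customer_id = s.customer_id")  # assumed business key
       .whenMatchedUpdateAll()                                      # update changed rows
       .whenNotMatchedInsertAll()                                   # insert new rows
       .execute())
```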
Stevenage, Hertfordshire, South East, United Kingdom Hybrid / WFH Options
IET
architecture, ensuring alignment with best practices and strategic goals. Drive the adoption of Azure data technologies such as Fabric, Data Factory, Databricks, Synapse, and Delta Lake. Develop and maintain scalable data pipelines, ensuring efficient data ingestion, transformation, and storage. Introduce and enhance self-service reporting capabilities for stakeholders across …
code repositories, and automation. Requirements 5+ years' experience in the Data and Analytics domain. Previous management experience is preferred. Strong expertise in Databricks (Spark, Delta Lake, Notebooks). Advanced knowledge of SQL development. Familiarity with Azure Synapse for orchestration and analytics. Working experience with Power BI for reporting …
standards and drive excellence in engineering practices. Architect and oversee the development of cloud-native data infrastructure and pipelines using Databricks, Python, PySpark, and Delta Lake. Guide the implementation of embedded analytics, headless APIs, and real-time dashboards for customer-facing platforms. Partner with Product Owners and … in data/analytics engineering, including 2+ years in a leadership or mentoring role. Strong hands-on expertise in Databricks, Spark, Python, PySpark, and Delta Live Tables. Experience designing and delivering scalable data pipelines and streaming data processing (e.g., Kafka, AWS Kinesis, or Azure Stream Analytics). Background …
London, South East England, United Kingdom Hybrid / WFH Options
Intellect Group
Data Architect or Lead Data Engineer, ideally within consulting or large enterprise environments. Proven expertise with Azure, Databricks, and associated data tooling (e.g., Synapse, Delta Lake, Azure Data Factory). Strong understanding of data architecture within the insurance and asset management industries, including regulatory and compliance requirements. Proficiency …
with 3+ years leading Databricks-based solutions. Proven experience in a consulting environment delivering large-scale data platform projects. Hands-on expertise in Spark, Delta Lake, MLflow, Unity Catalog, and DBSQL. Strong proficiency in Python, SQL, and at least one major cloud platform (AWS, Azure, or GCP). …
expert proven to deliver fast-paced releases. You will have worked with the latest Azure data platform technologies, particularly Azure Data Factory, Azure Data Lake Storage and Azure Databricks. Hands-on experience of working in Databricks, in particular design and usage of the Delta Lake storage format. …
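To illustrate what "design and usage of the Delta Lake storage format" can involve in practice, here is a small, assumption-laden sketch of two storage-format features: the table history kept in the transaction log, and time travel back to an earlier version. The table path and version number are illustrative only, and a Delta-enabled Spark session (as on Databricks) is assumed.

```python
# Sketch of Delta Lake storage-format features: table history and time travel.
# Path and version number are hypothetical; assumes Delta Lake is configured.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("delta_time_travel").getOrCreate()

path = "/mnt/curated/events_delta"                    # hypothetical Delta table path

spark.sql(f"DESCRIBE HISTORY delta.`{path}`").show()  # commits recorded in the transaction log

previous = (spark.read.format("delta")
                 .option("versionAsOf", 0)            # time travel to the first version
                 .load(path))
previous.show(5)                                      # inspect the historical snapshot
```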
London, South East England, United Kingdom Hybrid / WFH Options
Arthur Recruitment
Modeler/Architect London Market Insurance industry experience Using Data Modelling such as Erwin Experience with modelling methodologies including Kimball etc Usage of Data Lake formats such as Parquet and Delta Lake Strong SQL skills Rate: £600 - £700 P/D Outside IR35 Contract Duration: 6 months …
understanding of cloud security, networking, and DR principles Experience working on major digital transformation or IT consolidation programmes 💻 Tech Stack/Tooling: Databricks, Spark, Delta Lake; Terraform, GitHub/GitLab, Azure DevOps; cloud-native services, e.g. GCP (Dataflow, BigQuery), Azure, or AWS equivalents; Jira, Confluence; CI/CD …
and data engineers, lead small project teams, and help recruit top talent as we grow. Technical Delivery: Provide hands-on support with Databricks notebooks, Delta Lake, Spark optimization, MLflow integration, and related areas when necessary. Business Development Support: Partner with sales and leadership teams to support presales activities …