City of London, London, England, United Kingdom Hybrid / WFH Options
Lorien
such as private endpoints, firewalls, NSGs, NAT gateways and route tables. Strong understanding of security best practices within Azure and ideally AWS. Experience configuring cloud data services (preferably Databricks) in Azure and ideally AWS. Excellent communication and collaboration skills, with the ability to work across multiple … Should this position be of interest, please submit your CV and I will …
on-prem Analysis Services to the cloud, moving our WMS, ERP and bespoke apps to scalable platforms, and expanding AI/ML initiatives. We are evaluating both Fabric and Databricks as a potential strategic fit. As our hands-on Data & Analytics Engineer, you'll play a key role in shaping and influencing that decision, helping assess the best platform for our … ownership across tooling, standards, engineering and enablement rather than being confined to a narrow BI silo. Key Responsibilities Data Platform Build-out Design relational & Lakehouse schemas in Fabric or Databricks Lead the re-architecture of SSAS cubes to modern Lakehouse models Set up medallion architecture and govern data pipelines. Contribute to the evaluation and selection of data platform architecture based … dimensional modelling, 3NF) Experience designing and building modern data pipelines and Lakehouse architectures Hands-on experience with at least one enterprise-grade data platform (e.g., Microsoft Fabric, Azure Synapse, Databricks, or equivalent) Proficiency in ELT/ETL development using tools such as Data Factory, Dataflow Gen2, Databricks Workflows, or similar orchestration frameworks Experience with Python and/or PySpark for …
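The medallion architecture this role asks for can be sketched in miniature. This is a hedged illustration only: the table contents, field names (`order_id`, `region`, `amount`) and cleaning rules are invented, and a real build would use Spark/Delta tables in Fabric or Databricks rather than Python lists.

```python
# Minimal sketch of medallion layering: bronze holds raw records as ingested,
# silver holds cleaned and typed rows, gold holds business-level aggregates.
from collections import defaultdict


def to_silver(bronze_rows):
    """Clean bronze: drop rows missing an order id, normalise types."""
    silver = []
    for row in bronze_rows:
        if not row.get("order_id"):
            continue  # a real pipeline would quarantine bad records instead
        silver.append({
            "order_id": row["order_id"],
            "region": row.get("region", "UNKNOWN").upper(),
            "amount": float(row["amount"]),
        })
    return silver


def to_gold(silver_rows):
    """Aggregate silver into a gold revenue-by-region table."""
    totals = defaultdict(float)
    for row in silver_rows:
        totals[row["region"]] += row["amount"]
    return dict(totals)


bronze = [
    {"order_id": "A1", "region": "uk", "amount": "100.0"},
    {"order_id": None, "region": "uk", "amount": "50.0"},  # malformed record
    {"order_id": "A2", "region": "de", "amount": "25.5"},
]
silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'UK': 100.0, 'DE': 25.5}
```

The design point the layering buys you: bronze is an untouched audit trail, so silver and gold can be rebuilt from it when cleaning rules change.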
machines - both Windows and Linux. Familiarity with server patching and maintenance. Strong understanding of security best practices within Azure and ideally AWS. Experience configuring cloud data services (preferably Databricks) in Azure and ideally AWS. Excellent communication and collaboration skills, with the ability to work across multiple technical and non-technical teams. What happens now? After submitting your application for …
in BFSI or enterprise-scale environments is a plus. Preferred: Exposure to cloud platforms (AWS, Azure, GCP) and their data services. Knowledge of Big Data platforms (Hadoop, Spark, Snowflake, Databricks). Familiarity with data governance and data catalog tools.
London, South East, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
engineering roles, ideally within consumer tech, logistics, or e-commerce. Strong proficiency in Python, SQL, and machine learning frameworks. Experience with cloud platforms (Azure, AWS, GCP) and tools like Databricks, dbt, Airflow, or Terraform. Familiarity with AI/ML applications and modern analytics tooling. Excellent communication skills and ability to work independently in a fast-paced environment. Why Join? Be …
and Finance. Develop target-state architectures and data strategies in line with business needs. Create and manage conceptual, logical, and physical data models. Design and implement Lakehouse architectures using Databricks and other modern data platforms. Ensure robust data governance, integration, and quality across systems. Collaborate with IT and business stakeholders to deliver scalable, cloud-based solutions. What We're Looking …
contributing effectively as part of a multi-disciplinary team. Beneficial: Awareness of UX principles and best practices in dashboard design. Experience querying or connecting to cloud data platforms (e.g. Databricks, Snowflake). A degree in a numerate discipline. What You'll Get in Return A discretionary annual bonus so you can share in the company's success 25 days' paid …
London, South East, England, United Kingdom Hybrid / WFH Options
Executive Facilities
such as Gong, Outreach, Seismic – experience launching CRM/tools into sales organisations a plus Advanced Excel/Power BI/VBA skills Knowledge of SQL and experience with Databricks/GitHub preferred BI reporting tools such as Tableau a bonus Comfortable representing sales ops and collaborating on cross-functional projects (finance, marketing, customer success) Benefits Great holiday allowance Hybrid …
regulatory reporting systems in the banking sector. Strong analytical skills with the ability to communicate effectively across teams. Excellent time management and collaboration skills. Desirable Skills Exposure to Azure, Databricks, and middleware (IIB, Kafka, MQ). Experience in creating and managing data marts. Familiarity with modern data integration platforms. Why Apply? This is a fantastic opportunity to work …
backend components of a next-gen trading platform. Work with Java and/or Kotlin to deliver robust solutions. Deploy containerised applications using Kubernetes and Docker. Leverage MongoDB and Databricks for data processing and analytics. Integrate with relational databases and support legacy data migration. Collaborate with stakeholders and contribute to technical decision-making. Ensure code quality through testing, debugging, and …
the designed, so you will need to hit the ground running and make sure nothing falls behind. Skills needed. Background in data engineering Azure experience - must have worked with Databricks Data modelling, migration and transformation Desirable Skills WhereScape General information. Location - Remote – odd visit to the client's site a month Rate - £400 per day IR35 – outside Interview – 1 stage Start …
Leading strategic environment configuration (ODS) Implementing monitoring and alerting frameworks Sharing DevOps best practices and ensuring knowledge transfer across the team Preferred Experience: Strong hands-on expertise with Azure (Databricks, SQL, ADF) Familiarity with Event Hubs and real-time streaming pipelines If you're passionate about delivering in a fast-paced environment and thrive on close-knit team collaboration, this …
Stevenage, Hertfordshire, South East, United Kingdom
Queen Square Recruitment Limited
building and managing Docker containers. Strong Linux infrastructure knowledge. Solid background in Azure infrastructure engineering. Nice to Have: Hands-on experience with Domino, Azure DevOps, Python, GitHub, Databricks. Familiarity with Agile Scrum methodologies. Why Join? This is an exciting opportunity to lead technical innovation within HPC, working on impactful projects at scale. You'll have the chance …
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Datatech
Principal Data Engineer - Azure Databricks (Unity Catalog) - Contract Location: Bristol - Hybrid - 2 days a week in the office Contract Length: 12 Months Day Rate: Negotiable Job Ref: J12998 A data for good organisation that is in the early stages of building a modern Analytics and Data Engineering function is looking to bring in a Principal Data Engineer to support and … responsible for designing and implementing scalable, reusable data pipelines to support a range of analytics and AI-driven projects. The organisation is currently transitioning to Azure as well as Databricks, and the Principal Data Engineer will play a crucial role in shaping that platform, helping the data team upskill, and embed best practices across the function. They will also lead on … organisation, support the development of conceptual and dimensional data models, and contribute to fostering a culture of innovation and evidence-led decision-making. Experience Required: Essential experience with Azure Databricks - including Unity Catalog, Python (ideally PySpark), and SQL. Practical knowledge of modern ELT workflows, with a focus on the Extract and Load stages. Experience working across both technical and non-technical …
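The Extract-and-Load emphasis this role describes can be sketched as follows. This is a hedged toy example: the CSV source, field names and GBP filter are all invented, and in the real platform the landing zone would be a Databricks/Unity Catalog table rather than a Python list.

```python
# ELT sketch: land the source data untouched first (Extract, Load), then
# transform downstream from the landed copy rather than in-flight.
import csv
import io

SOURCE = """order_id,amount,currency
A1,100.0,GBP
A2,25.5,EUR
"""


def extract(raw_text):
    """Extract: read the source exactly as delivered, no reshaping."""
    return list(csv.DictReader(io.StringIO(raw_text)))


def load(rows, landing_zone):
    """Load: append raw rows, untransformed, to the landing zone."""
    landing_zone.extend(rows)
    return landing_zone


def transform(landing_zone):
    """Transform (downstream step): type-cast and filter from the landed copy."""
    return [
        {"order_id": r["order_id"], "amount": float(r["amount"])}
        for r in landing_zone
        if r["currency"] == "GBP"
    ]


landing = load(extract(SOURCE), [])
print(len(landing))        # 2 raw rows landed
print(transform(landing))  # [{'order_id': 'A1', 'amount': 100.0}]
```

Because the landed copy is raw, new transformations can be added later without re-extracting from the source system, which is the usual argument for ELT over ETL.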
and cloud-native security practices. Key Skills & Experience: Terraform for Azure infrastructure automation GitHub Actions and CI/CD pipeline design Azure Private Link and Private Link Service configuration Databricks and Unity Catalog for data governance Azure Policy and compliance enforcement Identity and access management (OAuth, federated credentials) Azure security best practices including BCDR and high availability Cost management and …
Newcastle upon Tyne, Tyne and Wear, Tyne & Wear, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
Databricks Data Engineer - Newcastle (hybrid) - £450/pd Please note - this role will require you to attend the Newcastle upon Tyne-based office 2-3 days per week. To be eligible for this role you must have the unrestricted right to work in the UK - this organisation is not able to offer sponsorship. A leading construction organisation is seeking an … experienced Azure Databricks Data Engineer to support the ongoing development and optimisation of their cloud data platform. Key Responsibilities: Enhance and scale the existing Azure Databricks environment. Collaborate with data architects and analysts to deliver robust data pipelines. Implement best practices for data engineering, performance tuning, and security. Work closely with stakeholders to understand data requirements and deliver solutions. Required … experience: Strong hands-on experience with Azure Databricks and Spark. Solid understanding of Azure Data Lake, Data Factory, and Synapse. Proven track record in building scalable data pipelines and ETL processes. Experience working in hybrid environments and cross-functional teams. Logistical Info: Fast-paced, high-impact project within a major enterprise. Flexible hybrid working (2-3 days onsite in Newcastle) …
Modeller This 12-month inside IR35 contract, based in London, will see you influencing architectural direction and embedding best practices across: data modelling at enterprise scale; Azure Lakehouse/Databricks solution design, Fabric, Purview; architectural design patterns and their data impact; data residency and governance standards; pseudonymisation requirements for new data sets. This is not a traditional data role. It's …
Analytics teams, you'll help evolve our data warehouse and implement best-in-class engineering practices across Azure. Key Responsibilities Build and enhance ETL/ELT pipelines in Azure Databricks Develop facts and dimensions for financial reporting Collaborate with cross-functional teams to deliver robust data solutions Optimise data workflows for performance and cost-efficiency Implement governance and security using … Unity Catalog Drive automation and CI/CD practices across the data platform Explore new technologies to improve data ingestion and self-service Essential Skills Azure Databricks: Expert in Spark (SQL, PySpark), Databricks Workflows Data Pipeline Design: Proven experience in scalable ETL/ELT development Azure Services: Data Lake, Blob Storage, Synapse Data Governance: Unity Catalog, access control, metadata management …
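The "facts and dimensions for financial reporting" responsibility above follows the standard star-schema pattern. A hedged miniature of it: the table names (`dim_date`, `fact_sales`), fields and data are all illustrative, not the client's schema, and a warehouse build would persist these as Databricks tables.

```python
# Star-schema sketch: a dimension assigns surrogate keys to distinct natural
# values; the fact table stores the surrogate key plus the numeric measure.
def build_dimension(values):
    """Assign a surrogate key (1..n) to each distinct natural value."""
    return {v: i + 1 for i, v in enumerate(sorted(set(values)))}


def build_fact(transactions, dim_date):
    """Replace the natural date with its surrogate key; keep the measure."""
    return [
        {"date_key": dim_date[t["date"]], "amount": t["amount"]}
        for t in transactions
    ]


transactions = [
    {"date": "2024-01-01", "amount": 100.0},
    {"date": "2024-01-02", "amount": 40.0},
    {"date": "2024-01-01", "amount": 60.0},
]
dim_date = build_dimension(t["date"] for t in transactions)
fact_sales = build_fact(transactions, dim_date)
print(dim_date)       # {'2024-01-01': 1, '2024-01-02': 2}
print(fact_sales[0])  # {'date_key': 1, 'amount': 100.0}
```

Keeping descriptive attributes in the dimension and only keys plus measures in the fact is what lets reporting tools slice large fact tables cheaply.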
of greenfield projects. Experience: Kotlin (and ideally Java) development experience within financial services (mandatory) Strong experience with containerisation tools such as Kubernetes and Docker Good experience with MongoDB and Databricks Please apply using the links or reach out directly (see below).