City of London, London, United Kingdom Hybrid / WFH Options
ECS
Solution Architect - Databricks | Initial 7-month Contract Role | Remote Working | £600 - £700 per day, Inside IR35

We're partnering with a global IT partner looking for a Databricks Solution Architect to provide guidance on data architecture and design and to optimise Databricks solutions. As a Databricks Solution Architect, you will be responsible for:
- Lead end-to-end architecture of data lakes … to Azure with a core focus on Databricks
- Provide technical leadership on Databricks
- Design and implement data pipelines in Azure Databricks, optimising for performance, cost, and scalability
- Advise on data lake migration, data modelling and pipeline optimisation
- Refactor and modernise existing data models to integrate with Power BI
- Coordinate phased migrations to ensure zero disruption to live operations
- Replace legacy … tools by consolidating workflows within Databricks and native Azure services

Requirements:
- 10+ years in cloud architecture/engineering with a strong focus on building scalable data pipelines
- Expertise in Azure Databricks (7+ years), including building and managing ETL pipelines using PySpark or Scala (essential)
- Solid understanding of Apache Spark, Delta Lake, and distributed data processing concepts
- Hands-on experience with Azure …
City of London, London, United Kingdom Hybrid / WFH Options
Focused Futures Consultancy LTD
CDO level and translating requirements into solution blueprints, proposals, and presentations. Enterprise Solution Design – Architect end-to-end data and cloud solutions across platforms such as Azure, AWS, GCP, Databricks, Snowflake, Synapse, or Azure Fabric. Cloud Strategy & Adoption – Define and lead cloud migration, modernisation, and optimisation strategies using tools such as Terraform, Azure DevOps, and CI/CD pipelines. Data …
AWS, GCP, or Azure), infrastructure-as-code practices, and ideally container orchestration with Kubernetes. Familiarity with SQL and NoSQL databases (Cassandra, Postgres), ideally combined with data collaboration platforms (Snowflake, Databricks). Strong DevOps mindset with experience in CI/CD pipelines, monitoring, and observability tools (Grafana or equivalent). Exposure to analytics, reporting, and BI tools such as Apache Superset, Lightdash …
Newbury, Berkshire, England, United Kingdom Hybrid / WFH Options
Intuita
modelling data through a medallion-based architecture, with curated dimensional models in the gold layer built for analytical use. Strong understanding and/or use of Unity Catalog alongside core Databricks functionality to drive metadata management. Strong understanding of cloud economics, including cost management strategies and optimising solutions for customer needs. Experience with infrastructure as code, proficiency using tools such as Terraform …
Edinburgh, Midlothian, United Kingdom Hybrid / WFH Options
Aberdeen
as Azure Data Factory, Synapse Pipelines, and SQL/T-SQL, ensuring data quality, performance, and reliability. Contribute to the evolution of our cloud-native data architecture, leveraging Azure Databricks, Azure Data Lake, and Snowflake where appropriate. Apply strong data modelling and transformation skills to support analytics, regulatory reporting, and operational use cases. Promote and implement engineering best practices, including …
or similar platforms. Strong understanding of data warehousing concepts, data modelling, and ETL processes. Strong understanding of SAP Datasphere and basic knowledge of SAP Business Data Cloud and Azure Databricks. Excellent analytical and problem-solving skills, with the ability to work with complex data sets. Strong communication and interpersonal skills, with the ability to collaborate effectively with technical and non-technical …
London, South East, England, United Kingdom Hybrid / WFH Options
Asset Resourcing Limited
engineering teams to resolve them. Strong analytical mindset and attention to detail. Clear, concise communicator able to present technical findings simply. Desirable: Experience testing in big data environments using Databricks, Snowflake, or Redshift. Knowledge of data governance and lineage tracking tools. Exposure to data performance and load testing. Experience in an Agile delivery environment.
data quality, governance, and compliance processes. Skills & experience required: • Proven background leading data engineering teams or projects in a technology-driven business • Expert knowledge of modern cloud data platforms (Databricks, Snowflake, ideally AWS) • Advanced Python programming skills and fluency with the wider Python data toolkit • Strong capability with SQL, Spark, Airflow, Terraform, and workflow orchestration tools • Solid understanding of CI/CD …
basis your varied role will include, but will not be limited to: Design, build, and optimize high-performance data pipelines and ETL workflows using tools like Azure Synapse, Azure Databricks or Microsoft Fabric. Implement scalable solutions to ingest, store, and transform vast datasets, ensuring data availability and quality across the organization. Write clean, efficient, and reusable Python code tailored to …
DevOps pipelines, and Infrastructure as Code. Professional-level skills in Python/PySpark development, database management, and data warehousing. Professional experience with Snowflake, Azure Data Factory for pipelining, Azure Databricks for data processing, and Power BI for reporting and analytics. Good knowledge of streaming and interfacing technologies including Kafka, Terraform, and Azure Microservices. Collaborative mindset with the ability to communicate effectively …
Warrington, Cheshire, England, United Kingdom Hybrid / WFH Options
Brookson
example - Computer Science, Mathematics, Engineering or other STEM. A strong team player with empathy, humility and dedication to joint success and shared development. Desirable Experience and Qualifications: Experience with Databricks or Delta Lake architecture. Experience building architecture and Data Warehousing within the Microsoft Stack. Experience in development source control (e.g. Bitbucket, GitHub). Experience in Low Code Analytical Tools (e.g. …
Reigate, England, United Kingdom Hybrid / WFH Options
esure Group
Skilled in team collaboration and exhibits good interpersonal skills. Able to prioritise, multi-task, and deliver at pace with an iterative mindset. Experience with modern data platforms such as Databricks or Snowflake is advantageous; LLM API experience is a bonus. What’s in it for you?: Competitive salary that reflects your skills, experience and potential. Discretionary bonus scheme that recognises …
London, South East, England, United Kingdom Hybrid / WFH Options
Robert Half
technologies in data engineering, and continuously improve your skills and knowledge. Profile: The Data Engineer will have mastery of data management and processing tools, including Power BI, Data Factory, Databricks, SQL Server, and Oracle. Proficient in SQL and experienced in database administration. Familiarity with cloud platforms such as Azure and AWS. Excellent problem-solving and analytical skills, with strong attention to detail.
Leicester, Leicestershire, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
solutions: acquisition, engineering, modelling, analysis, and visualisation. Leading client workshops and translating business needs into technical solutions. Designing and implementing scalable ETL/ELT pipelines using Azure tools (Fabric, Databricks, Synapse, Data Factory). Building data lakes with medallion architecture. Migrating legacy on-prem data systems to the cloud. Creating impactful dashboards and reports using Power BI. Supporting and evolving data …