Data Architect/Senior Data Architect · £600 - £650 p/d · Outside IR35 · 6-month contract · 1 day a week onsite in central London. Overview: We're seeking a Data Architect to provide expert-level consulting services over a 6-month …
London, South East, England, United Kingdom Hybrid / WFH Options
McGregor Boyall
next-generation data platform. Working as part of a growing data team, you will play a critical role in designing and deploying scalable data pipelines and solutions using Azure Databricks and related technologies. This is an opportunity to contribute to a cloud-first, modern data strategy within a collaborative and forward-thinking environment.
Key Responsibilities:
Design and develop end-to-end data pipelines (batch and streaming) using Azure Databricks, Spark, and Delta Lake.
Implement the Medallion Architecture and ensure consistency across raw, enriched, and curated data layers.
Build and optimise ETL/ELT processes using Azure Data Factory and PySpark.
Enforce data governance through Azure Purview and Unity Catalog.
Apply DevOps and CI/CD practices using Git and Azure …
… analysts and business stakeholders to ensure data quality and usability.
Contribute to performance optimisation and cost efficiency across data solutions.
Required Skills & Experience:
Proven hands-on experience with Azure Databricks, Data Factory, Delta Lake, and Synapse.
Strong proficiency in Python, PySpark, and advanced SQL.
Understanding of Lakehouse architecture and medallion data patterns.
Familiarity with data governance, lineage, and access control …
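The Medallion Architecture named in this listing layers data as bronze (raw), silver (enriched/cleaned) and gold (curated aggregates). In Databricks this is built with PySpark and Delta tables, which need a Spark cluster; as a hedged illustration of the layering idea only, here is a minimal pure-Python sketch (the record fields, cleaning rules, and region values are invented for the example, not taken from the advert):

```python
# Illustrative medallion-style flow: bronze (raw) -> silver (cleaned) -> gold (aggregated).
# Field names and cleaning rules are hypothetical examples.

def to_silver(bronze_rows):
    """Enriched layer: drop records missing an id, normalise region and amount."""
    silver = []
    for row in bronze_rows:
        if row.get("id") is None:
            continue  # malformed records are dropped before the curated layer
        silver.append({"id": row["id"],
                       "region": row.get("region", "unknown").lower(),
                       "amount": float(row["amount"])})
    return silver

def to_gold(silver_rows):
    """Curated layer: total amount per region, ready for reporting."""
    totals = {}
    for row in silver_rows:
        totals[row["region"]] = totals.get(row["region"], 0.0) + row["amount"]
    return totals

bronze = [{"id": 1, "region": "UK", "amount": "10.5"},
          {"id": None, "region": "UK", "amount": "3.0"},   # malformed: dropped
          {"id": 2, "region": "uk", "amount": "4.5"},
          {"id": 3, "region": "FR", "amount": "2.0"}]

gold = to_gold(to_silver(bronze))
print(gold)  # {'uk': 15.0, 'fr': 2.0}
```

In a real Databricks pipeline each layer would be a Delta table rather than a Python list, but the raw-to-enriched-to-curated flow is the same.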
… reference data). Understanding of regulatory reporting processes. Proven ability to work directly with demanding front office stakeholders. Experience with real-time data feeds and low-latency requirements. Preferred Skills: Databricks experience; capital markets knowledge (equities, fixed income, derivatives); experience with financial data vendors (Bloomberg, Reuters, Markit); cloud platforms (Azure preferred) and orchestration tools; understanding of risk metrics and P&L …
… Oversee the development of dashboards, KPIs, and operational reports that support real-time and batch analytics across front-to-back processes. Ensure integration with enterprise data platforms (e.g. Databricks) and data warehouses. Monitor system performance, identify bottlenecks, and optimise data processing workflows for improved efficiency and scalability. Team Leadership & Governance: Manage and mentor a team of BI developers, delivery …
DATA ENGINEER - DBT/AIRFLOW/DATABRICKS
4-MONTH CONTRACT
£450-550 PER DAY
OUTSIDE IR35
This is an exciting opportunity for a Data Engineer to join a leading media organisation working at the forefront of data innovation. You'll play a key role in designing and building the data infrastructure that supports cutting-edge machine learning and LLM initiatives.
… workflows with DBT and Airflow
Collaborating with AI/ML teams to support data readiness for experimentation and inference
Writing clean, modular SQL and Python code for use in Databricks
Contributing to architectural decisions around pipeline scalability and performance
Supporting the integration of diverse data sources into the platform
Ensuring data quality, observability, and cost-efficiency
KEY SKILLS AND REQUIREMENTS
Strong experience with DBT, Airflow, and Databricks
Advanced SQL and solid Python scripting skills
Solid understanding of modern data engineering best practices
Ability to work independently and communicate with technical and non-technical stakeholders
Experience in fast-paced, data-driven environments
DESIRABLE SKILLS
Exposure to LLM workflows or vector databases
Experience in the media, content, or publishing industries
Familiarity with …
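Orchestrators such as Airflow run pipeline tasks in dependency order over a DAG. Airflow itself needs a scheduler deployment, so as a hedged, stdlib-only illustration of the core idea, here is dependency-ordered execution using Python's `graphlib` (the task names are invented for the example and are not from this role):

```python
from graphlib import TopologicalSorter

# Hypothetical DAG, written as "task: set of tasks it depends on" —
# the same dependency wiring an Airflow DAG or a dbt model graph expresses.
dag = {
    "extract_sources": set(),
    "dbt_staging": {"extract_sources"},
    "dbt_marts": {"dbt_staging"},
    "publish_to_databricks": {"dbt_marts"},
}

def run(task):
    # Stand-in for an Airflow operator (or dbt model) doing real work.
    return f"ran {task}"

# static_order() yields tasks so every dependency runs before its dependents.
order = list(TopologicalSorter(dag).static_order())
results = [run(t) for t in order]
print(order)
# ['extract_sources', 'dbt_staging', 'dbt_marts', 'publish_to_databricks']
```

Airflow adds scheduling, retries and observability on top of this ordering; the dependency resolution itself is exactly a topological sort.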
London, South East, England, United Kingdom Hybrid / WFH Options
Robert Half
Proven experience in a Data Engineer role with a successful track record. Proficiency in SQL and experience with database administration. Mastery of tools such as Power BI, Data Factory, Databricks, SQL Server, and Oracle. Familiarity with working as a data engineer in cloud environments such as Azure. Strong analytical, problem-solving, and attention-to-detail skills. Effective communication and teamwork skills …
Data Pipeline Development: Design and implement end-to-end data pipelines in Azure Databricks, handling ingestion from various data sources, performing complex transformations, and publishing data to Azure Data Lake or other storage services. Write efficient and standardized Spark SQL and PySpark code for data transformations, ensuring data integrity and accuracy across the pipeline. Automate pipeline orchestration using Databricks Workflows …
… and unstructured data from various sources (APIs, databases, file systems). Implement data transformation logic using Spark, ensuring data is cleaned, transformed, and enriched according to business requirements. Leverage Databricks features such as Delta Lake to manage and track changes to data, enabling better versioning and performance for incremental data loads.
Data Publishing & Integration: Publish clean, transformed data to Azure …
… access control, and metadata management across data assets. Ensure data security best practices, such as encryption at rest and in transit, and role-based access control (RBAC) within Azure Databricks and Azure services.
Performance Tuning & Optimization: Optimize Spark jobs for performance by tuning configurations, partitioning data, and caching intermediate results to minimize processing time and resource consumption. Continuously monitor and …
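The incremental-load pattern described above is what Delta Lake's `MERGE INTO` statement implements: changed records update existing rows by key and new records are inserted. A Spark cluster can't run here, so this is a hedged pure-Python sketch of that upsert logic only (the key, fields, and values are invented):

```python
# Pure-Python sketch of a Delta-style MERGE (upsert) for incremental loads.
# Rows are keyed by "id": matched rows are updated, unmatched rows inserted.

def merge_upsert(target, updates, key="id"):
    """Return a new table with `updates` merged into `target` by `key`."""
    merged = {row[key]: dict(row) for row in target}
    for row in updates:
        merged.setdefault(row[key], {})
        merged[row[key]].update(row)   # WHEN MATCHED -> update; WHEN NOT MATCHED -> insert
    return sorted(merged.values(), key=lambda r: r[key])

target = [{"id": 1, "status": "open"}, {"id": 2, "status": "open"}]
increment = [{"id": 2, "status": "closed"},   # matched  -> updated
             {"id": 3, "status": "open"}]     # no match -> inserted

print(merge_upsert(target, increment))
# [{'id': 1, 'status': 'open'}, {'id': 2, 'status': 'closed'}, {'id': 3, 'status': 'open'}]
```

Delta Lake adds the parts this sketch omits: ACID transaction guarantees, time-travel versioning of the table, and distributed execution of the merge.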
join a skilled and collaborative analytics team of eight, working closely with senior leadership. The team has already laid the foundations for a modern data platform using Azure and Databricks and is now focused on building out scalable ETL processes, integrating AI tools, and delivering bespoke analytics solutions across the organisation.
THE ROLE
As a Data Engineer, you'll play a pivotal role in designing and implementing robust data pipelines, supporting the migration from legacy Azure systems to Databricks, and working closely with stakeholders to deliver tailored data solutions. This role combines hands-on development with collaborative architecture design, and offers the opportunity to contribute to AI readiness within a fast-paced business.
KEY RESPONSIBILITIES
Develop and maintain ETL pipelines … loads
Connect and integrate diverse data sources across cloud platforms
Collaborate with analytics and design teams to create bespoke, scalable data solutions
Support data migration efforts from Azure to Databricks
Use Terraform to manage and deploy cloud infrastructure
Build robust data workflows in Python (e.g., pandas, PySpark)
Ensure the platform is scalable, efficient, and ready for future AI use cases …
Role with 2 days per week onsite in Central London. Skillset required:
* Data Pipeline Expertise: Extensive experience in designing and implementing scalable ETL/ELT data pipelines in Azure Databricks, transforming raw data into usable datasets for analysis.
* Azure Databricks Proficiency: Strong knowledge of Spark (SQL, PySpark) for data transformation and processing within Databricks, along with experience building workflows and automation using Databricks Workflows.
* Azure Data Services: Hands-on experience with Azure services like Azure Data Lake, Azure Blob Storage, and Azure Synapse for data storage, processing, and publication.
* Data Governance & Security: Familiarity with managing data governance and security using Databricks Unity Catalog, ensuring data is appropriately organized, secured, and accessible to authorized users.
* Optimization & Performance Tuning: Proven experience in …
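Unity Catalog governance, as referenced above, comes down to checking a principal's grants on a securable (catalog, schema, or table) before allowing access. As a hedged, stdlib-only illustration of that access-control idea — not Unity Catalog's actual API — here is a minimal grant check (the group names, privileges, and three-level table names are invented):

```python
# Minimal role-based access check, loosely modelled on how a catalog maps
# (principal, privilege) grants onto securables. All names are hypothetical.

GRANTS = {
    ("analysts", "SELECT"):  {"main.sales.orders", "main.sales.customers"},
    ("engineers", "MODIFY"): {"main.sales.orders"},
}

def is_allowed(group, privilege, table):
    """True if `group` has been granted `privilege` on `table`."""
    return table in GRANTS.get((group, privilege), set())

print(is_allowed("analysts", "SELECT", "main.sales.orders"))   # True
print(is_allowed("analysts", "MODIFY", "main.sales.orders"))   # False
```

In Unity Catalog the equivalent is declared with SQL `GRANT` statements and enforced by the platform, with privileges also inheriting down from catalog to schema to table — inheritance this sketch deliberately omits.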
London, South East, England, United Kingdom Hybrid / WFH Options
Hays Specialist Recruitment Limited
both remotely and onsite. The contract has potential for extension or transition onto other exciting projects. Key Responsibilities:
* Design, build, and maintain scalable data pipelines using Azure Data Factory, Databricks, and Azure Synapse.
* Collaborate with cross-functional teams to ensure data quality and integration.
* Support data ingestion, transformation, and storage in cloud environments.
* Troubleshoot and optimise data workflows for performance …
* Maintain compliance with security protocols and clearance requirements.
Essential Skills & Experience:
* Must hold NPPV3 + SC clearance (this is a mandatory requirement).
* Proven expertise in Azure Data Factory, Databricks, and Azure Synapse Analytics.
* Strong experience in building and managing cloud-based data solutions.
* Solid understanding of data modelling, ETL/ELT processes, and data warehousing.
* Excellent communication skills and …
Tech stack: Python (pandas, NumPy, scikit-learn, PyTorch/TensorFlow); SQL (Redshift, Snowflake or similar); AWS SageMaker to Azure ML migration, with Docker, Git, Terraform, Airflow/ADF. Optional extras: Spark, Databricks, Kubernetes. What you'll bring: 3-5+ years building optimisation or recommendation systems at scale. Strong grasp of mathematical optimisation (e.g., linear/integer programming, meta-heuristics) as well …
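The integer programming mentioned above chooses 0/1 variable values that maximise an objective subject to constraints. Production systems use a solver (e.g. PuLP or OR-Tools); as a hedged, stdlib-only sketch of the problem shape, here is a tiny 0/1 knapsack solved by exhaustive search over all integer assignments (the item values, weights, and budget are made up):

```python
from itertools import product

# Toy 0/1 integer programme: maximise total value subject to a weight budget.
# Item values/weights are invented for illustration.
values  = [10, 7, 4, 3]
weights = [5, 4, 2, 1]
budget  = 7

best_value, best_pick = 0, (0, 0, 0, 0)
for pick in product((0, 1), repeat=len(values)):      # every 0/1 assignment
    weight = sum(w * x for w, x in zip(weights, pick))
    value = sum(v * x for v, x in zip(values, pick))
    if weight <= budget and value > best_value:       # feasibility, then objective
        best_value, best_pick = value, pick

print(best_value, best_pick)  # 14 (0, 1, 1, 1)
```

Exhaustive search is exponential in the number of variables, which is exactly why real workloads reach for branch-and-bound solvers or the meta-heuristics the role also lists.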
for every Technology and Data initiative in the backlog, working with the stakeholders as appropriate. Has worked with Agile and Waterfall methodologies. Experience of a Data Platform built with ADF, Databricks, Power BI, Azure D, Waterfall, Agile, RAID, Azure DevOps, Data Factory. Details: Inside IR35; 2-3 days in the London office; 3 months with potential extension. Please send me your …
wide data strategy aligned with business objectives, focusing on governance, scalability, and innovation. Lead the evolution of a modern data platform using cloud-native technologies (e.g. Snowflake, Azure, GCP, Databricks). Manage and mentor cross-functional teams across data engineering, analytics, governance, and data science. Promote a data-as-a-product culture, enabling self-serve capabilities and empowering business users. Introduce …
new platform team! What you'll need to succeed: Great experience as a Data Engineer in Financial Services or Insurance. Strong experience working in regulated environments. Vast expertise with Databricks - Databricks certification desirable. Experience setting up Databricks solutions from scratch. Strong experience with Python. Expertise with modern data pipelines and running them in production. Strong stakeholder management experience …
new platform team! What you'll need to succeed: Great experience as a Data Engineer in Financial Services or Insurance. Strong experience working in regulated environments. Vast expertise with Databricks - Databricks certifications desirable. Strong experience with Python. Expertise with modern data pipelines and running them in production. Strong stakeholder management experience! What you'll get in return: Flexible working …
new platform team! What you'll need to succeed: Great experience as a Data Engineer in Financial Services or Insurance. Strong experience working in regulated environments. Vast expertise with Databricks - Databricks certification required. Strong experience with Python. Expertise with modern data pipelines and running them in production. Strong stakeholder management experience - you will help stakeholders use the platform. What you'll …
Employment Type: Contract
Rate: £600.0 - £650.0 per day + £600-650 Per Day (Inside IR35)
new platform team! What you'll need to succeed: Great experience as a Data Engineer in Financial Services or Insurance. Strong experience working in regulated environments. Vast expertise with Databricks - Databricks certification highly desirable for the position. Strong experience with Python. Expertise with modern data pipelines and running them in production. Strong stakeholder management experience - you will help stakeholders …
Databricks Trainer – Contract Vacancy. Rate: £300 - £350 p/d. IR35 Status: Outside. Working Location: Remote. My client has a requirement for a Databricks Trainer on a contract basis. The client is a training provider and consultancy. You will be supporting a 4-week Databricks programme which is due to start either at the end of August or in the middle of September. All the training materials are already created. Trainers must have experience working with Databricks commercially or delivering Databricks training. If you're suitably skilled and available to deliver, please apply now. As an industry-leading, nationwide Marketing, Digital, Analytics, IT and Design recruitment agency, we are continually receiving new assignments to work on, so keep a close eye on our …
We are looking for candidates who have experience in delivering training and documentation on the following technologies: data storage and serialisation; Databricks development; Apache Spark; Airflow; DBT basics; Databricks. You must have experience delivering training on these technologies. The work is four weeks of delivered training: some is hands-on, some is self-led, and some is instructor-led. Apply if this is …
Azure Architect - Databricks, Cloud, Design, Hybrid. A logistics business that has recently undertaken a significant transformation programme is looking to invest in its growing technology function and bring in, on a contract basis, a solid and reliable Azure Architect. Our client's working arrangements require three to four days per week in the office. This is aligned with their emphasis …
… complex data concepts to non-technical stakeholders. Expertise in designing and documenting data architectures (e.g., data warehouses, lakehouses, master/reference data models). Hands-on experience with Azure Databricks, including: workspace and cluster configuration; Delta Lake table design and optimization; integration with Unity Catalog for metadata management. Proficiency with Unity Catalog, including: setting up data lineage and governance policies; …
… tools (e.g., Erwin, PowerDesigner, SQL DBML). Familiarity with data governance frameworks and tools. Understanding of cloud security and compliance (e.g., GDPR, ISO 27001) in Azure environments. Azure Architect - Databricks, Cloud, Design, Hybrid. McGregor Boyall is an equal opportunity employer and does not discriminate on any grounds.
Key Requirements: Commercial experience as a Senior Data Engineer, with active SC + NPPV3 clearance. Experience with Azure Data Factory for building and orchestrating data pipelines. Experience with Databricks for data transformation. Experience with Azure Synapse. Familiarity with Azure Data Lake. Active SC + NPPV3 clearance. Nice to have: Immediate availability. Hays Specialist Recruitment Limited acts as an employment …