…Reddit, and Lyft rely on Zip to manage billions of dollars in spend. We're a fast-growing team that helped scale category-defining companies like Airbnb, Meta, Salesforce, Databricks, Ramp, Apple, and Google. With a $2.2 billion valuation and $370 million in funding from Y Combinator, BOND, DST Global, and CRV, we're focused on developing cutting-edge technology…
Hiring for: Technical Data Architect. Location: Central London. Type: Permanent, hybrid role (2-3 days from client location). We are seeking a highly skilled Technical Data Architect with expertise in Databricks, PySpark, and modern data engineering practices. The ideal candidate will lead the design, development, and optimization of scalable data pipelines, while ensuring data accuracy, consistency, and performance across the enterprise … consistency, row counts, and KPIs during migration and transformation. Collaborate with Data Engineers, BI Engineers, and Security teams to define data standards, governance, and compliance. Optimize Spark jobs and Databricks clusters for performance and cost efficiency. Support real-time and batch data processing for downstream systems (e.g., BI tools, APIs, reporting consumers). Mentor junior engineers, conduct code reviews, and … engineering, cloud platforms, and analytics. Required Skills & Qualifications: 10-12 years of experience in data engineering, with at least 3 years in a technical lead role. Strong expertise in Databricks, PySpark, Delta Lake, and dbt. Advanced proficiency in SQL, ETL/ELT pipelines, and data modelling. Experience with Azure Data Services (ADLS, ADF, Synapse) or other major cloud platforms…
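The posting above asks for validation of data accuracy, row counts, and KPIs during migration. As a loose illustration only (not taken from the posting), that kind of reconciliation check can be sketched in plain Python; the record contents and the "total_amount" KPI are invented, and in practice the two extracts would come from Spark DataFrames on Databricks rather than lists of dicts:

```python
# Hypothetical sketch: reconciling a legacy source against its migrated
# Databricks target. Field names here are illustrative assumptions.

def reconcile(source_rows, target_rows, kpi_field):
    """Compare row counts and the sum of one KPI column across two extracts."""
    return {
        "row_count_match": len(source_rows) == len(target_rows),
        "kpi_match": abs(
            sum(r[kpi_field] for r in source_rows)
            - sum(r[kpi_field] for r in target_rows)
        ) < 1e-9,  # tolerate float rounding when summing the KPI
    }

# Stand-in extracts; a real check would pull these from the two platforms.
legacy = [{"id": 1, "total_amount": 10.0}, {"id": 2, "total_amount": 5.5}]
migrated = [{"id": 1, "total_amount": 10.0}, {"id": 2, "total_amount": 5.5}]
result = reconcile(legacy, migrated, "total_amount")
```

In a real migration the same pattern would typically be extended with per-partition counts and checksums rather than a single global comparison.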
…you can work under our company INFINITE'S W2. USC and GC preferred. Job Summary: We are looking for an experienced Data Engineer with strong hands-on expertise in Databricks, Apache Spark, and BigQuery (BQ) to build and optimize scalable data pipelines across multi-cloud environments (GCP and AWS). The ideal candidate will work closely with data scientists, analysts … architects to design, develop, and maintain robust data infrastructure supporting analytics and business intelligence initiatives. Key Responsibilities: Design, develop, and maintain data pipelines and ETL/ELT workflows using Databricks and Apache Spark. Build scalable and efficient data solutions integrating both AWS and GCP platforms. Develop and optimize data processing frameworks for structured and unstructured data. Work extensively with BigQuery … and DevOps/data operations processes. Required Skills & Qualifications: Minimum 5-8 years of experience as a Data Engineer or in a similar data-centric role. Strong proficiency in Databricks and Apache Spark (PySpark or Scala). Hands-on experience in GCP (Google Cloud Platform) and AWS (Amazon Web Services). Expertise in BigQuery (BQ) for data warehouse design, query…
…and ETL processes using Python and SQL. Contribute to the rollout and optimisation of Microsoft Fabric, shaping the next-generation data platform. Work across Azure services (Data Factory, Synapse, Databricks, Data Lake, Functions) to deliver secure and scalable data solutions. Optimise data workflows for performance, reliability, and cost efficiency. Collaborate with analysts, data scientists, and business stakeholders to ensure high … role. Must have financial services experience. Strong Python coding skills and advanced SQL expertise (query optimisation, data modelling). Hands-on experience with the Azure data ecosystem (Data Factory, Synapse, Databricks, etc.). Exposure to or keen interest in Microsoft Fabric and its integration into enterprise data strategies. Solid understanding of data architecture, pipelines, and cloud-based ETL/ELT frameworks.…
…best practices, and drive continuous improvement to enhance data quality and accessibility across the business. Key Responsibilities: Data Engineering: Design, build, and maintain Azure data pipelines using Data Factory, Databricks, and related services. Data Architecture: Develop and optimise scalable data models, warehouses, and lakes (Azure Synapse, Data Lake Storage). Governance & Security: Enforce compliance and data protection standards (GDPR, DPA … promoting best practices. Innovation: Explore new Azure technologies to enhance platform capabilities and analytics. Documentation: Maintain clear technical documentation and share knowledge across teams. Skills & Experience: Expert in Azure Databricks (Unity Catalog, DLT, cluster management). Strong experience with Azure Data Factory, Synapse Analytics, Data Lake Storage, Stream Analytics, Event Hubs. Proficient in Python, Scala, C#, .NET, and SQL (T…
Burton-on-Trent, Staffordshire, England, United Kingdom
Crimson
…years of experience developing data pipelines and data warehousing solutions using Python and libraries such as Pandas, NumPy, PySpark, etc. 3+ years hands-on experience with cloud services, especially Databricks, for building and managing scalable data pipelines. 3+ years of proficiency in working with Snowflake or similar cloud-based data warehousing solutions. 3+ years of experience in data development and … Knowledge of regulatory requirements in the financial industry. Tasks: Collaborating with cross-functional teams to understand data requirements, and design efficient, scalable, and reliable ETL processes using Python and Databricks. Developing and deploying ETL jobs that extract data from various sources, transforming it to meet business needs. Taking ownership of the end-to-end engineering lifecycle, including data extraction, cleansing … to meet customer needs and use cases, spanning from streaming to data lakes, analytics, and beyond within a dynamically evolving technical stack. Collaborate seamlessly across diverse technical stacks, including Databricks, Snowflake, etc. Developing various components in Python as part of a unified data pipeline framework. Contributing towards the establishment of best practices for the optimal and efficient usage of data…
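The extraction-cleansing-transformation lifecycle described above can be sketched as a toy Python flow. This is illustrative only: the source records, field names, and drop rules are invented, and in the role itself these stages would run as Databricks/Spark jobs against real systems rather than in-memory lists:

```python
# Toy extract -> cleanse -> transform -> load pipeline (illustrative names).

def extract():
    # Pretend source containing a duplicate and a malformed record.
    return [
        {"account": "A1", "balance": "100.50"},
        {"account": "A1", "balance": "100.50"},  # duplicate key
        {"account": "A2", "balance": None},      # malformed value
        {"account": "A3", "balance": "250.00"},
    ]

def cleanse(rows):
    seen, clean = set(), []
    for row in rows:
        if row["balance"] is None or row["account"] in seen:
            continue  # drop malformed and duplicate records
        seen.add(row["account"])
        clean.append(row)
    return clean

def transform(rows):
    # Cast balances to float so downstream consumers receive typed data.
    return [{"account": r["account"], "balance": float(r["balance"])} for r in rows]

def load(rows, warehouse):
    warehouse.extend(rows)
    return len(rows)  # report how many rows landed

warehouse = []
loaded = load(transform(cleanse(extract())), warehouse)
```

The same four-stage shape carries over directly to PySpark, where each stage becomes a DataFrame transformation instead of a list comprehension.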
…infrastructure, improve data quality, and enable data-driven decision-making across the organization. Core Duties and Responsibilities: Design, build, and maintain large-scale data pipelines using Microsoft Fabric and Databricks. Develop and implement data architectures that meet business requirements and ensure data quality, security, and compliance. Collaborate with wider Product & Engineering teams to integrate data pipelines with machine learning models … and cloud computing. Skills, Capabilities and Attributes. Essential: Good experience in data engineering, with a focus on cloud-based data pipelines and architectures. Strong expertise in Microsoft Fabric and Databricks, including data pipeline development, data warehousing, and data lake management. Proficiency in Python, SQL, Scala, or Java. Experience with data processing frameworks such as Apache Spark, Apache Beam, or Azure … with Azure Synapse Analytics, Azure Data Lake Storage, or other Azure data services. Experience with agile development methodologies and version control systems such as Git. Certification in Microsoft Azure, Databricks, or other relevant technologies. What We Offer: Save For Your Future - Equiniti Pension Plan; Equiniti matches your pension contributions up to 10%. All Employee Long Term Incentive Plan (LTIP) - Gives…
Kent, England, United Kingdom Hybrid/Remote Options
Harnham
…delivery best practices. Designing, developing, and optimising data pipelines, integration frameworks, and ETL processes in Azure. Building and maintaining scalable solutions using Azure Data Lake, Synapse, Data Factory, and Databricks. Establishing robust data governance, quality, and lineage frameworks across all environments. Collaborating closely with data architecture, analytics, and IT to ensure a seamless transition and platform stability. Managing delivery … we're looking for: Proven track record leading data engineering teams through large-scale transformations. Strong, hands-on understanding of Azure data services, e.g. Synapse, Data Factory, Data Lake, Databricks, Purview. Direct experience migrating from on-premise environments to Azure, ideally within a regulated or financial services context. Deep technical knowledge of data architecture, ETL/ELT, and data…
…data from multiple internal systems (trading, CRM, RUM, finance) into a central Data Lakehouse. Implement best practices for data ingestion, validation, transformation, and storage using modern cloud tools (e.g., Databricks, S3 Data Lakes, Spark, Redshift, AWS Glue). Working closely with data analysts to enable operational/regulatory reporting and data insights/visualisation. Data Governance & Quality: Define and enforce … is highly desirable. Required Skills: Hands-on experience designing and implementing ETL pipelines and data lakes/warehouses in cloud environments. Advanced knowledge of AWS or Azure data services (Databricks, Glue, Redshift, S3 Data Lakes, Spark, or equivalent). Strong programming and data manipulation skills (Python, SQL). Solid background in financial reporting, trading data, or analytics within financial markets.…
Databricks & Microsoft Fabric Consultant. We're seeking an experienced data consultant for a high-impact, short-term engagement with a leading organisation in a regulated industry. This strategic project involves conducting a comprehensive skills assessment for a large data team. We're looking for a proven professional who can dive deep into a bespoke, in-house data platform built on Databricks … and Microsoft Fabric, evaluate team capabilities, and craft a development strategy that aligns with business goals. Key Responsibilities: Assess the skills of a diverse data team working with Databricks and Microsoft Fabric. Conduct discovery conversations and surveys to uncover learning needs and aspirations. Analyse team interaction with a bespoke data platform to inform recommendations. Develop user personas to map current … a strategic, real-world training and development plan beyond generic certifications. Provide a Rough Order of Magnitude (ROM) cost for implementing the proposed roadmap. Essential: Deep expertise in the Databricks Lakehouse Platform, including Python, PySpark, and advanced SQL. Strong practical knowledge of Microsoft Fabric. Proven experience in senior, client-facing roles with a consultancy mindset. Background in technical coaching, mentorship…
…high-impact AI capabilities. Responsibilities: • Design & Development: Architect and develop end-to-end AI/ML solutions using Azure AI services (e.g., Azure Machine Learning, Azure Cognitive Services, Azure Databricks, Azure Synapse Analytics). • Model Deployment & Management: Implement robust MLOps practices for continuous integration, continuous delivery (CI/CD), monitoring, and retraining of machine learning models in Azure. • Data Pipelining … Machine Learning, including MLOps concepts, model registration, deployment (AKS, ACI), and monitoring. • Hands-on experience with Azure data services such as Azure Data Lake Storage, Azure Data Factory, Azure Databricks, or Azure Synapse Analytics. • Solid understanding of software engineering principles, including version control (Git), testing, and code review. • Experience with containerization technologies (Docker, Kubernetes). • Strong problem-solving skills and…
…Optimize applications for performance and responsiveness. Stay Up to Date with Technology: Keep yourself and the team updated on the latest Python technologies, frameworks, and tools like Apache Spark, Databricks, Apache Pulsar, Apache Airflow, Temporal, and Apache Flink, sharing knowledge and suggesting improvements. Documentation: Contribute to clear and concise documentation for software, processes, and systems to ensure team alignment and … or Ansible. Real-time Data Streaming: Experience with Apache Pulsar or similar systems for real-time messaging and stream processing is a plus. Data Engineering: Experience with Apache Spark, Databricks, or similar big data platforms for processing large datasets, building data pipelines, and machine learning workflows. Workflow Orchestration: Familiarity with tools like Apache Airflow or Temporal for managing workflows and…
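The workflow orchestration mentioned above (Apache Airflow, Temporal) is, at its core, about running tasks in dependency order. The sketch below is not Airflow code; it is a minimal pure-Python illustration of that idea, with task names invented for the example. Real orchestrators add scheduling, retries, and distributed execution on top of this:

```python
# Minimal dependency-ordered task runner (illustrative; no cycle detection).

def run_workflow(tasks, deps):
    """Run each task after all of its declared upstream dependencies."""
    done, order = set(), []

    def run(name):
        if name in done:
            return
        for upstream in deps.get(name, []):
            run(upstream)  # recurse into dependencies first
        tasks[name]()
        done.add(name)
        order.append(name)

    for name in tasks:
        run(name)
    return order

log = []
tasks = {
    "load": lambda: log.append("load"),
    "extract": lambda: log.append("extract"),
    "transform": lambda: log.append("transform"),
}
deps = {"transform": ["extract"], "load": ["transform"]}
order = run_workflow(tasks, deps)
```

Even though "load" is registered first, it only runs after "extract" and "transform", which is exactly the guarantee a DAG-based orchestrator provides.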
Data Engineer/Data Engineering/Data Consultant/Lakehouse/Delta Lake/Data Warehousing/ETL/Azure/Azure Databricks/Python/SQL. Based in the West Midlands (Solihull/Birmingham area). Permanent role, £ + car/allowance (£5,000) + 15% bonus. One of our leading clients is looking to recruit a Data … week, so locally based. Permanent role. Salary £ + car/allowance + bonus. Experience: Experience in a Data Engineer/Data Engineering role. Large and complex datasets. Azure, Azure Databricks. Microsoft SQL Server. Lakehouse, Delta Lake. Data Warehousing. ETL. Database Design. Python/PySpark. Azure Blob Storage. Azure Data Factory. Desirable: Exposure to ML/Machine Learning/AI/Artificial…
Senior Data Engineer in Databricks - Lead the Future of Data in Financial Services. Location: Hybrid, 2-3 days/week on site. Offices: South East and North West. Salary: Up to £80,000 + Car Allowance + Bonus + Benefits. Note: Visa sponsorship is not available for this role. Be the Technical Leader in a Bold Data Transformation. Our client … agile environments and wants to make a real impact. Key Responsibilities: Lead and mentor two Data Engineers, providing technical guidance and coaching. Design and deploy modern data pipelines using Databricks and Azure. Transform unstructured external data into clean, business-ready models. Build semantic layers and embed Unity Catalog for robust data governance. Guide the team in taking Databricks from development … production. Support Terraform adoption and infrastructure-as-code practices. Why This Role Stands Out: Leadership and impact: Influence technical direction, team culture, and delivery standards. Modern tech stack: Azure, Databricks, Unity Catalog, SQL, Python, Terraform. Attractive package: Up to £80k base + car allowance + bonus + benefits. Hybrid working: Offices across the South East and North West, with…
Senior Data Engineer in Databricks - Lead the Future of Data in Financial Services Location: Hybrid | 2-3 days/week on site Offices: Banbury, Milton Keynes, Bedford, Northampton, Manchester Salary: Up to £80,000 + Car Allowance + Bonus + Benefits Note: Visa sponsorship is not available for this role. Be the Technical Leader in a Bold Data Transformation Our … agile environments and wants to make a real impact. Key Responsibilities Lead and mentor two Data Engineers, providing technical guidance and coaching. Design and deploy modern data pipelines using Databricks and Azure. Transform unstructured external data into clean, business-ready models. Build semantic layers and embed Unity Catalog for robust data governance. Guide the team in taking Databricks from development … production. Support Terraform adoption and infrastructure-as-code practices. Why This Role Stands Out Leadership and impact: Influence technical direction, team culture, and delivery standards. Modern tech stack: Azure, Databricks, Unity Catalog, SQL, Python, Terraform. Attractive package: Up to £80k base + car allowance + bonus + benefits. Hybrid working: Offices across the South East and North West, with…
Employment Type: Permanent
Salary: £70000 - £80000/annum Car Allowance + Bonus
Manchester, Lancashire, England, United Kingdom Hybrid/Remote Options
Akkodis
…specifications, and process descriptions. What We're Looking For: Proven proficiency in Python programming with an emphasis on writing clean, efficient, and maintainable code. Hands-on experience working with Databricks and cloud services for building scalable data pipelines. Strong knowledge of cloud data warehousing solutions such as Snowflake or similar. Solid understanding of ETL workflows, data modeling, data warehousing, and…
…understanding of data warehousing, data lakes, and ETL processes. Programming: Experience with programming languages like Python, Java, or Scala. Cloud Data Platform: Strong experience with cloud data platforms like Databricks (preferred), Snowflake, Redshift, etc. Databricks Expertise: Deep understanding of Databricks architecture, including clusters, jobs, notebooks, and workspaces. Database Technologies: Knowledge of relational and NoSQL databases. Infrastructure as Code: Experience with…
Nottingham, Nottinghamshire, East Midlands, United Kingdom Hybrid/Remote Options
Client Server
Senior Data Engineer (Databricks, SQL, Azure) Nottingham/WFH to £65k. Opportunity to progress your career in a senior, hands-on Data Engineer role at a SaaS tech company. As a Senior Data Engineer you'll join a newly formed team that deals with customer-facing reporting on big data sets; they process 120 billion lines of data per day. … You'll be primarily working with advanced SQL and Databricks in Azure, including data modelling and low-level data design work. As a senior member of the team you'll also contribute to technical discussions and strategic decision-making, and help to mentor more mid-level data engineers. Location/WFH: There's a remote interview and onboarding process and you … from most of the time, meeting up with the team for constructive meetings once a month/quarter in the Nottingham office. About you: You have advanced SQL and Databricks experience. You have experience in cloud-based environments, Azure preferred. You have strong analysis and problem-solving skills. You have experience of working in Agile development environments. You're collaborative…
London, South East, England, United Kingdom Hybrid/Remote Options
Opus Recruitment Solutions Ltd
…Date: ASAP. Interview Process: One stage. Core Skills Required: Strong expertise in Power BI – dashboarding, reporting, and data visualisation. Advanced SQL skills for querying and data manipulation. Experience with Databricks for scalable data processing. Desirable Skills: Familiarity with PySpark for distributed data processing.…