Woking, South East England, United Kingdom Hybrid / WFH Options
McLaren Racing
entire Microsoft Office 365 suite. In-depth understanding of and experience with multiple OS platforms such as macOS and Windows 10. Experience with enterprise-level infrastructure such as Azure AD, Windows Server, Hyper-V, SSO, SCCM, Group Policy and security. Good understanding of networking: VLANs, VPN, DHCP, DNS, LAN, WAN, Cisco Meraki. Some previous knowledge of hardware procurement and asset …
Guildford, South East England, United Kingdom Hybrid / WFH Options
McLaren Racing
entire Microsoft Office 365 suite. In-depth understanding of and experience with multiple OS platforms such as macOS and Windows 10. Experience with enterprise-level infrastructure such as Azure AD, Windows Server, Hyper-V, SSO, SCCM, Group Policy and security. Good understanding of networking: VLANs, VPN, DHCP, DNS, LAN, WAN, Cisco Meraki. Some previous knowledge of hardware procurement and asset …
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
Rsgroup
process and the value of organisational efficiency. Desirable skills and experience: Certifications in relevant data and analytics technologies (e.g. AWS Certified Big Data Specialty, Snowflake, Microsoft Certified: Azure Data Engineer Associate) are desirable. Bachelor's or master's degree in computer science, information systems, or a related field. Strong influencing and negotiation skills. Who we are: At …
Corby, Northamptonshire, United Kingdom Hybrid / WFH Options
Rsgroup
process and the value of organisational efficiency. Desirable skills and experience: Certifications in relevant data and analytics technologies (e.g. AWS Certified Big Data Specialty, Snowflake, Microsoft Certified: Azure Data Engineer Associate) are desirable. Bachelor's or master's degree in computer science, information systems, or a related field. Strong influencing and negotiation skills. Who we are: At …
to help shape delivery. The Desirables: Certification or demonstrated knowledge of one of the major Cloud Service Providers - Google Cloud Platform (GCP), Amazon Web Services (AWS) or the Microsoft Azure platform, or equivalent. Proven adaptability and ability to shift focus, work with varied subject matters, and work across different projects and project types. Experience of banking products …
Warrington, Cheshire, North West England, United Kingdom
Solvex Solutions
Azure DevOps Engineer | Location: Tarporley, 2-3 days on-site | Permanent/Full-time | NO VISA, NO PSW VISA. We’re looking for a Senior DevOps Engineer to design, build, and automate secure, scalable infrastructure and CI/CD pipelines in Azure. You’ll work closely with the DevOps Manager and development teams to streamline … deployments, improve delivery speed, and maintain ISO 27001 compliance. Key Responsibilities: Build and manage infrastructure using Terraform and Azure Pipelines. Optimize Azure environments for security, performance, and cost. Maintain CI/CD pipelines, testing frameworks, and configuration management. Deploy and manage containers using Docker, Kubernetes (AKS), and Helm. Drive automation and monitoring best practices. Requirements: Proven … DevOps experience in Azure environments. Strong skills in Terraform, Kubernetes, Docker, and Azure DevOps Pipelines. Proficiency in Git and scripting (Bash, PowerShell, or Python). Experience with IaC, automation, and cloud security best practices. If you’re passionate about automation, cloud-native systems, and improving delivery pipelines, we’d love to hear from you.
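The IaC and pipeline duties above usually include automated policy gates before anything is applied. The sketch below is illustrative only: the plan structure is a simplified stand-in invented for this example, not the actual `terraform show -json` schema, and the tag names are hypothetical.

```python
# Minimal sketch of a pre-deployment policy gate, of the kind a CI/CD
# pipeline stage might run before applying infrastructure changes.
# The plan structure below is a simplified, invented stand-in for real
# Terraform plan output, not its actual schema.

REQUIRED_TAGS = {"owner", "cost-centre", "environment"}

def missing_tags(planned_resources):
    """Return {resource_name: missing_tag_set} for non-compliant resources."""
    problems = {}
    for res in planned_resources:
        tags = set(res.get("tags", {}))
        missing = REQUIRED_TAGS - tags
        if missing:
            problems[res["name"]] = missing
    return problems

plan = [
    {"name": "vm-app-01",
     "tags": {"owner": "devops", "cost-centre": "42", "environment": "prod"}},
    {"name": "storage-logs", "tags": {"owner": "devops"}},
]

# A pipeline stage would fail the build if this dict is non-empty.
for name, missing in sorted(missing_tags(plan).items()):
    print(f"{name}: missing {sorted(missing)}")
```

In a real pipeline this check would run between `terraform plan` and `terraform apply`, returning a non-zero exit code to block the deployment.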
City of London, London, United Kingdom Hybrid / WFH Options
Syntax Consultancy Limited
/day (Inside IR35). Databricks Engineer needed with active SC Security Clearance for a 6 Month Contract based in Central London (Hybrid). Developing a cutting-edge Azure Databricks platform for economic data modelling, analysis, and forecasting. Start ASAP in Nov/Dec 2025. Hybrid working: 2 days/week remote (WFH) and 3 days/week working on … Central London office. A chance to work with a leading global IT and digital transformation business specialising in Government projects. In-depth data engineering plus strong hands-on Azure Databricks expertise. Azure Data Services, Azure Data Factory, Azure Blob Storage and Azure SQL Database. Designing, developing, building and optimising data … data transformations using Spark, PySpark or Scala, and working with SQL/MySQL databases. Experience with data quality and data governance processes, Git version control, and Agile development environments. Azure Data Engineer certification preferred, e.g. Azure Data Engineer Associate. Advantageous skills: Azure Event Hubs, Kafka, data visualisation tools, Power BI, Tableau, Azure DevOps …
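The transformation and data-quality work this role describes follows a common shape: land raw data, clean it with SQL, and gate the output on quality checks. A minimal, runnable sketch using the stdlib `sqlite3` module (a real implementation would use Databricks SQL or PySpark; the table and column names here are invented for illustration):

```python
import sqlite3

# Illustrative sketch: a SQL transformation with a simple data-quality
# check, analogous in shape to Databricks/SQL pipeline work. Table and
# column names are invented for this example.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_rates (series TEXT, period TEXT, value REAL)")
conn.executemany(
    "INSERT INTO raw_rates VALUES (?, ?, ?)",
    [("cpi", "2025-01", 3.1), ("cpi", "2025-02", 3.0), ("cpi", "2025-02", None)],
)

# Transformation: drop null readings, collapse duplicates per period.
conn.execute("""
    CREATE TABLE clean_rates AS
    SELECT series, period, AVG(value) AS value
    FROM raw_rates
    WHERE value IS NOT NULL
    GROUP BY series, period
""")

# Data-quality gate: fail loudly if any nulls slipped through.
nulls = conn.execute(
    "SELECT COUNT(*) FROM clean_rates WHERE value IS NULL").fetchone()[0]
assert nulls == 0

rows = conn.execute(
    "SELECT period, value FROM clean_rates ORDER BY period").fetchall()
print(rows)  # [('2025-01', 3.1), ('2025-02', 3.0)]
```

The same pattern scales up directly: in Databricks the gate would typically be a Delta Lake constraint or an expectation check rather than a bare assert.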
implementations for mid-market and enterprise customers. 1+ years of experience in product management. Strong technical acumen and familiarity with Jira, REST API, JSON, SAML SSO, and Azure DevOps. Demonstrated skills in project management and managing customer relationships for a managed services and/or SaaS organization. Strong oral and written communication skills with the ability to …
East London, London, United Kingdom Hybrid / WFH Options
InfinityQuest Ltd,
with trading background) Location: Canary Wharf, UK (Hybrid) (3 days onsite, 2 days remote). Role Type: 6 Months Contract with possibility of extensions. Mandatory Skills: Python, PostgreSQL, Azure Databricks, AWS (S3), Git, Azure DevOps CI/CD, Apache Airflow, energy trading experience. Job Description: Data Engineer (Python enterprise developer): 6+ years of experience in Python scripting. Proficient … pandas, Beautiful Soup, Selenium, pdfplumber, Requests, etc. Proficient in SQL programming and PostgreSQL. Knowledge of DevOps tooling such as CI/CD, Jenkins and Git. Experience working with AWS (S3) and Azure Databricks. Experience delivering projects with Agile and Scrum methodologies. Able to coordinate with teams across multiple locations and time zones. Strong interpersonal and communication skills with …
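The ad names pandas, Beautiful Soup and Requests for extracting data from web pages. A dependency-free sketch of the same idea using only the stdlib `HTMLParser`, pulling tabular values out of an HTML fragment (the markup and commodity names below are invented for the example):

```python
from html.parser import HTMLParser

# Sketch of HTML data extraction using only the stdlib parser; a production
# scraper would more likely use Requests + Beautiful Soup as the ad says.
# The markup and values below are invented for illustration.
class CellCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_cell = False
        self.cells = []

    def handle_starttag(self, tag, attrs):
        if tag == "td":
            self.in_cell = True

    def handle_endtag(self, tag):
        if tag == "td":
            self.in_cell = False

    def handle_data(self, data):
        if self.in_cell and data.strip():
            self.cells.append(data.strip())

html = ("<table><tr><td>Gas</td><td>41.2</td></tr>"
        "<tr><td>Power</td><td>88.5</td></tr></table>")
parser = CellCollector()
parser.feed(html)

# Pair alternating name/value cells into a dict.
pairs = dict(zip(parser.cells[::2], parser.cells[1::2]))
print(pairs)  # {'Gas': '41.2', 'Power': '88.5'}
```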
Data Engineer - Azure Databricks, Apache Kafka | Permanent | Basingstoke (Hybrid - x2 PW) | Circa £70,000 + Excellent Package. Overview: We're looking for a skilled Data Analytics Engineer to help drive the evolution of our client's data platform. This role is ideal for someone who thrives on building scalable data solutions and is confident working with modern tools such … as Azure Databricks, Apache Kafka, and Spark. In this role, you'll play a key part in designing, delivering, and optimising data pipelines and architectures. Your focus will be on enabling robust data ingestion and transformation to support both operational and analytical use cases. If you're passionate about data engineering and want to make a meaningful … impact in a collaborative, fast-paced environment, we want to hear from you! Role and Responsibilities: Designing and building scalable data pipelines using Apache Spark in Azure Databricks. Developing real-time and batch data ingestion workflows, ideally using Apache Kafka. Collaborating with data scientists, analysts, and business stakeholders to build high-quality data products. Supporting the deployment and …
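The "real-time and batch ingestion" responsibility above usually means micro-batching: grouping an unbounded event stream into bounded chunks for processing. A hedged, pure-Python sketch of the pattern (a real pipeline would read from Kafka via Spark Structured Streaming; a generator stands in for the topic here):

```python
# Micro-batch ingestion sketch: group an unbounded event stream into
# fixed-size batches. A generator stands in for a Kafka topic; the
# event fields are invented for illustration.

def micro_batches(stream, batch_size):
    """Yield lists of up to batch_size events from an iterable stream."""
    batch = []
    for event in stream:
        batch.append(event)
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch:  # flush the final partial batch
        yield batch

events = ({"offset": i, "temp": 20 + i % 3} for i in range(7))
batches = list(micro_batches(events, batch_size=3))
print([len(b) for b in batches])  # [3, 3, 1]
```

The final partial batch is flushed rather than dropped, which matters for correctness when the source stream pauses.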
offices + flexible hybrid working options. A Software Engineer opportunity with a business developing innovative clean energy products. The business has several projects running utilising the .Net/Azure tech stack, with a preference for candidates who have exposure to building scalable software in a fast-paced, data-heavy environment. An excellent opportunity for candidates who want to work … in a modern .Net stack and join a collaborative team. Required skills: 5+ years' commercial experience developing applications/APIs with .Net (8+), C#, SQL and Azure. Commercial exposure to DevOps, CI/CD pipelines and Azure DevOps. Full software life cycle and unit testing experience. Agile processes and software engineering best practices, TDD …
Employment Type: Permanent
Salary: £55000 - £70000/annum Benefits + hybrid working
London (City of London), South East England, United Kingdom
Insight Global
Insight Global’s client is looking for a Senior Data Engineer to join their Finance and Operations team, responsible for designing and maintaining Azure-based data pipelines and APIs, building and optimizing ETL processes, managing large datasets, troubleshooting data issues, and documenting technical solutions. The ideal candidate will have strong coding skills in Python and SQL, experience with … dbt, Azure DevOps, and CI/CD best practices, and a solid understanding of data warehousing principles. Success in this role requires excellent communication, a collaborative mindset, and proactive problem-solving to mitigate blockers and deliver scalable solutions. Candidates with experience in tools like Snowflake, Airflow, or Terraform, familiarity with infrastructure as code, and exposure to financial and … month contract-to-hire position and would require you to be on-site 5 days a week out of the London office. Day to day: Develop and maintain Azure-based data pipelines for Finance and Operations. Build and optimize ETL workflows using SQL and dbt. Write Python scripts for data transformation and automation. Deploy infrastructure as code and …
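The "build and optimize ETL workflows" line typically implies incremental, idempotent loads rather than full reloads. A minimal sketch of the watermark pattern, with plain lists standing in for real warehouse tables and invented column names:

```python
# Incremental ETL load sketch using a high-water mark. Source and target
# are plain lists standing in for real warehouse tables; column names
# are invented for illustration.

def incremental_load(source_rows, target_rows, watermark_key="updated_at"):
    """Append only rows newer than the target's high-water mark (idempotent)."""
    high_water = max((r[watermark_key] for r in target_rows), default="")
    new_rows = [r for r in source_rows if r[watermark_key] > high_water]
    return target_rows + new_rows

target = [{"id": 1, "updated_at": "2025-01-01"}]
source = [
    {"id": 1, "updated_at": "2025-01-01"},  # already loaded, skipped
    {"id": 2, "updated_at": "2025-01-02"},  # newer than watermark, loaded
]
loaded = incremental_load(source, target)
print([r["id"] for r in loaded])  # [1, 2]
```

Because only rows strictly newer than the watermark are appended, re-running the load against unchanged source data is a no-op, which is what makes retry-heavy pipelines safe.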
design and implementation of a next-generation data platform within a secure government environment. The role focuses on architecting, migrating, and optimising data platforms using Databricks on AWS (Azure experience also valued). You'll work closely with enterprise architecture, data engineering, and cloud infrastructure teams to design and deliver a scalable, compliant, and high-performing data ecosystem … that supports advanced analytics and AI workloads. Key Responsibilities: Lead the architecture, design and implementation of Databricks-based data platforms in cloud environments (AWS or Azure). Define migration strategies from legacy data environments to Databricks. Design data ingestion, transformation, and orchestration workflows to meet performance and security standards. Provide technical leadership on data lake, data warehouse, and … to architecture documentation, patterns, and reference models for reuse across the programme. Essential Skills & Experience: Proven experience as a Data/Solution Architect specialising in Databricks (AWS preferred; Azure considered). Demonstrable experience in end-to-end implementation or migration to Databricks. Deep understanding of cloud-native data architectures, particularly in AWS (S3, Glue, EMR, Lambda) or Azure …
global logistics business, based in central London, is undergoing an exciting data transformation programme as it invests in a new team charged with building and implementing a new Azure Databricks platform. Working five days a week in the central London office, you'll join this growing team and play a key role in building and deploying modern Azure … have the opportunity to build your skills in analytics engineering, responding to business and project needs rather than operating in a narrow silo. You'll work hands-on with Azure Databricks, Azure Data Factory, Delta Lake, and Power BI to design scalable data pipelines, implement efficient data models, and ensure high-quality data delivery. This is a … focused on building solutions for the business. In addition, you'll be responsible for the following: Designing, developing, and optimizing end-to-end data pipelines (batch & streaming) using Azure Databricks, Spark, and Delta Lake. Implementing Medallion Architecture and building scalable ETL/ELT processes with Azure Data Factory and PySpark. Partnering with the data architecture function …
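The Medallion Architecture mentioned above layers data as bronze (raw), silver (validated), and gold (business aggregates). A runnable, hedged sketch of the flow: real implementations use Databricks Delta tables and PySpark, while plain Python dicts with invented field names stand in here:

```python
# Illustrative bronze -> silver -> gold flow for the Medallion Architecture.
# Plain Python stands in for Delta tables / PySpark; field names invented.

bronze = [  # raw landed events: duplicates and bad rows included
    {"shipment": "A1", "kg": "120"},
    {"shipment": "A1", "kg": "120"},   # duplicate
    {"shipment": "B2", "kg": "oops"},  # unparseable reading
    {"shipment": "B2", "kg": "45"},
]

# Silver: validated, de-duplicated, typed records.
silver, seen = [], set()
for row in bronze:
    try:
        kg = float(row["kg"])
    except ValueError:
        continue  # a real pipeline would quarantine bad rows, not drop them
    key = (row["shipment"], kg)
    if key not in seen:
        seen.add(key)
        silver.append({"shipment": row["shipment"], "kg": kg})

# Gold: business-level aggregate, total weight per shipment.
gold = {}
for row in silver:
    gold[row["shipment"]] = gold.get(row["shipment"], 0.0) + row["kg"]

print(gold)  # {'A1': 120.0, 'B2': 45.0}
```

Each layer is materialised separately so downstream consumers can pick the refinement level they need, and the bronze layer preserves the raw feed for reprocessing.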