Areti is currently seeking five Senior Data Scientists/Analysts to join one of the UK's fastest-growing Series A-funded tech startups, based in London. This is an exciting opportunity to be part of a dynamic, mission-driven company working on high-impact Defence and Government projects. Extensive training is provided, including Palantir, making this a career-defining role.

The Role: You will work within a tight-knit, agile team tackling real-world problems using cutting-edge data science and engineering techniques.

Key Responsibilities:
- Develop and deploy machine learning models
- Solve complex data problems in a big data environment
- Build scalable data pipelines and APIs

The Ideal Candidate Will Have:
- Previous experience as a Data Scientist or Data Engineer
- Strong command of Python (including libraries such as scikit-learn, NumPy, and matplotlib)
- Experience with deep learning frameworks such as TensorFlow or PyTorch
- Knowledge of SQL and relational databases; experience with big data environments
- Familiarity with API development and NoSQL databases
- Understanding of CI/CD practices, AWS, and Git
London (City of London), South East England, United Kingdom
Areti Group | B Corp™
Birmingham, West Midlands, England, United Kingdom
TXP
Data Engineer | Hybrid (West Midlands) | Permanent | Competitive Salary + Benefits

Are you passionate about building scalable data solutions and working with cutting-edge technologies like Microsoft Fabric, Azure, and Power BI? TXP is looking for a talented Data Engineer to join our growing team and help shape the future of data-driven decision-making. We are … power of technology and people, and we help everyone here to succeed. At TXP, you can multiply your potential.

What You'll Be Doing:
- Designing, developing, and maintaining robust data pipelines.
- Building and optimising infrastructure for scalable data solutions.
- Integrating data from diverse sources across the business.
- Ensuring data integrity, security, and compliance with best practices.
- Collaborating with cross-functional teams, including analysts, developers, and business stakeholders.
- Leveraging Microsoft Fabric (Lakehouse, Data Pipelines) to modernise our data architecture.
- Supporting CI/CD and DevOps practices for data workflows.

What We're Looking For:
- Proven experience as a Data Engineer or in a similar role.
- Strong coding skills in SQL, Python, and PySpark.
We are seeking a Data Engineer (Databricks) to support the growth of a global technology provider within the Insurtech space. The role focuses on designing and delivering ETL pipelines and scalable solutions using the Azure ecosystem, with an emphasis on enabling advanced analytics and data-driven decision-making. As a key player in a high-performing data engineering team, you will contribute to large-scale transformation initiatives within the P&C Insurance space by developing robust data models, optimising data flows, and ensuring the accuracy and accessibility of critical information. The position requires close collaboration with both technical and business stakeholders, and is an excellent opportunity to join a team that invests in your growth, with comprehensive training and certification programs and a real opportunity to showcase your talents.

Key Responsibilities:
- Design and develop ETL pipelines using Azure Data Factory for data ingestion and transformation
- Work with Azure Data Lake, Synapse, and SQL DW to manage large volumes of data
- Develop data transformation logic using SQL, Python, and PySpark
London (City of London), South East England, United Kingdom
Calibre Candidates
City of London, London, United Kingdom | Hybrid / WFH Options
Peaple Talent
GCP Data Engineer | London | £50,000 - £70,000

Peaple Talent have partnered with a specialist data consultancy delivering services across data engineering, data strategy, data migration, and BI & analytics. With a diverse portfolio of clients and cutting-edge projects, our client is a trusted name in the data consulting space. Due to exciting growth plans, we are now looking for a Data Engineer specialising in Google Cloud Platform (GCP) and BigQuery.

We are looking for:
- Demonstrable data engineering/BI skills, with a focus on having delivered solutions in Google Cloud Platform (GCP)
- Strong experience designing and delivering data solutions using BigQuery
- Proficiency in SQL and Python
- Experience working with Big Data technologies such as Apache Spark or PySpark
- Excellent communication skills, with the ability to engage effectively with senior stakeholders

Nice to haves:
- GCP Data Engineering certifications
- BigQuery or other GCP tool certifications

What's in it for you:
📍 Location: London (Hybrid)
💻 Remote working: Occasional office visits each month
💰 Salary: £50,000 - £70,000 DOE
🤝 Collaborative culture
City of London, London, United Kingdom | Hybrid / WFH Options
Peaple Talent
Databricks Data Engineer | London | £50,000 - £70,000 🧱

Peaple Talent have partnered with a specialist data consultancy delivering services across data engineering, data strategy, data migration, and BI & analytics. As a trusted Microsoft Partner, our client is a leader in the industry, with a diverse portfolio of clients and projects. Due to exciting growth plans, we are now looking for a Data Engineer specialising in Databricks.

We are looking for:
- Demonstrable data engineering/BI skills, with a focus on having delivered in Microsoft Azure
- Strong experience designing and delivering data solutions in Databricks
- Proficiency with SQL and Python
- Experience using Big Data technologies such as Apache Spark or PySpark
- Great communication skills, engaging effectively with senior stakeholders

Nice to haves:
- Azure Data Engineering certifications
- Databricks certifications

What's in it for you:
📍 Location: London (Hybrid)
💻 Remote working: Occasional office visits each month
💰 Salary: £50,000 - £70,000 DOE
🤝 Collaborative culture and great team support
📈 Vast L&D opportunities, both internally and externally
Job Title: Azure Databricks Data Engineer
Primary skills: Advanced SQL, Azure Databricks, Azure Data Factory, Azure Data Lake
Secondary skills: Azure SQL, PySpark, Azure Synapse
Experience: 10+ years
Hybrid: 3 days/week onsite is a must

About the job:
We are looking for an experienced Databricks Data Engineer to design, develop, and manage data pipelines using Azure services such as Databricks, Data Factory, and Data Lake. The role involves building scalable ETL solutions, collaborating with cross-functional teams, and processing large volumes of data. You will work closely with business and technical teams to deliver robust data models and transformations in support of analytics and reporting needs.

Responsibilities:
• Design and develop ETL pipelines using ADF for data ingestion and transformation.
• Collaborate with Azure stack modules like Data Lakes and SQL DW to handle large volumes of data.
• Write SQL, Python, and PySpark code to meet data processing and transformation needs.
• Understand business requirements and create data flow processes that meet them.
• Develop mapping documents and transformation business rules.
• Ensure continuous …
London (City of London), South East England, United Kingdom
ValueMomentum