employ over 10,000 people. It’s big-scale stuff and we’re still growing. Job Purpose: With a big investment into Databricks, and with a large amount of interesting data, this is the chance for you to come and be part of an exciting transformation in the way we store, analyse and use data in a fast-paced … organisation. You will join as a Senior Platform Data Engineer providing technical leadership to the Data Engineering team. You will work closely with our Data Scientists and business stakeholders to ensure value is delivered through our solutions. Job Accountabilities: Develop robust, scalable data pipelines to serve the easyJet analyst and data science community. Highly … competent hands-on experience with relevant Data Engineering technologies such as Databricks, Spark, the Spark API, Python, SQL Server and Scala. Work with data scientists, machine learning engineers and DevOps engineers to develop and deploy machine learning models and algorithms aimed at addressing specific business challenges and opportunities. Coach and mentor the team (including contractors) to improve development standards.
EU work permit required: Yes. Posted: 10.06.2025. Expiry Date: 25.07.2025. Job Description: Key Responsibilities: Lead the technical delivery of complex data engineering projects, ensuring solutions are scalable, secure, and aligned with our delivery framework and client goals. Design and build high-quality data pipelines and integration workflows, setting the … followed throughout the development lifecycle. Collaborate with multidisciplinary teams, including a wide range of other roles, to shape solutions that meet both technical and business requirements. Mentor and support data engineering teams, fostering a culture of continuous improvement, knowledge sharing, and technical excellence. Support testing activities by ensuring pipelines are testable, observable, and reliable; work with QA and analysts … to define test strategies, implement automated tests, and validate data quality and integrity. Contribute to technical planning, including estimation, risk assessment, and defining delivery approaches for client engagements and new opportunities. Engage with clients and stakeholders, translating data requirements into technical solutions and communicating complex ideas clearly and effectively. Champion engineering standards, contributing to the development and adoption …
Luton, Bedfordshire, South East, United Kingdom Hybrid / WFH Options
Anson McCade
GCP Data Engineer, £100,000 - £115,000, Hybrid Working. Location: United Kingdom (Greater London). Type: Permanent. GCP Data Engineer, London (Hybrid). Salary: Up to £115,000 + benefits. A global consultancy is looking for a Google Cloud Platform (GCP) Data Engineer to join a cross-functional data team delivering high-impact projects … across multiple industries. This role focuses on building and managing end-to-end data solutions within the GCP ecosystem, working with or building Agentic AI systems, supporting clients with scalable, cloud-native platforms and tools. Key Responsibilities: Design and implement data platforms using GCP. Manage data workflows from ingestion to reporting. Collaborate across teams to ensure seamless data integration. Evaluate and implement the right technologies for each use case. Support knowledge sharing and technical best practice. Essential Experience: Proven expertise in building data warehouses and ensuring data quality on GCP. Strong hands-on experience with BigQuery, Dataproc, Dataform, Composer and Pub/Sub. Skilled in PySpark, Python and SQL. Solid understanding of ETL/ELT …
Job Description: Location: Remote (United Kingdom). About The Company: We have partnered with a company that empowers underwriters to serve their insureds more effectively. They are using advanced data intelligence tools to rebuild the way that underwriters share and exchange risk. With a current focus on the small and medium-sized businesses that power our global economy and … each policy to deliver unprecedented insight into insurance pools, and their speciality portfolio is fully diversified with very low catastrophe, aggregation or systemic risk. The Role: Designing and implementing data pipelines and models, ensuring data quality and integrity. Solving challenging data integration problems, utilising optimal patterns, frameworks and query techniques, sourcing from vast and varying data sources. Building, maintaining, and optimising our Data Warehouse to support reporting and analytics needs. Collaborating with product managers, business stakeholders and engineers to understand data needs, representing key data insights in a meaningful way. Staying up-to-date with industry trends and best practices in data modelling, database development, and analytics. Optimising pipelines, frameworks, and systems …
we employ over 10,000 people. It’s big-scale stuff and we’re still growing. Job Purpose: With a big investment into Databricks and a large amount of interesting data, this is the chance for you to come and be part of an exciting transformation in the way we store, analyse and use data in a fast-paced … organisation. You will join as a Data Platform Engineer, part of a team of committed data specialists. As our data strategy begins to take shape, we aim to become the most data-driven airline in the world. To achieve this goal, we want to use our data to improve all our business processes, and that means allowing our teams to experiment and innovate as they deem fit. Within data platform engineering, this translates to building the frameworks and tooling that data engineers, analysts and scientists utilise, enabling platform users to focus on value delivery. Job Accountabilities: Support the business in harnessing the power of data within easyJet. Work in a fast-paced agile …
The ideal candidate will have a minimum of 5 years of experience with strong expertise in Snowflake, DBT, Python, and AWS to deliver ETL/ELT pipelines. Proficiency in Snowflake data warehouse architecture and the ability to design, build, and optimize ETL/ELT pipelines using DBT (Data Build Tool) and Snowflake. Experience with DBT for data transformation … and modeling, implementing data transformation workflows using DBT (core/cloud). Strong Python programming skills for automation and data processing, leveraging Python to create automation scripts and optimize data processing tasks. Proficiency in SQL performance tuning and query optimization techniques using Snowflake. Troubleshooting and optimizing DBT models and Snowflake performance. Knowledge of CI/CD and … version control (Git) tools. Experience with orchestration tools such as Airflow. Strong analytical and problem-solving skills, with the ability to work independently in an agile development environment. Ensure data quality, reliability, and consistency across different environments. Collaborate with other data engineers, data analysts, and business stakeholders to understand data needs and translate them into engineering …
The Green Recruitment Company is working with an Environmental and Sustainability Business that supports and empowers their customers’ journey to net zero. To join their Data-Technology team, we have an opportunity for a Data Engineer to help support the wider business (cross-functional) to meet their reporting and data requirements. About the role: The Data Engineer will work with the latest innovative technologies to design, build and maintain data solutions, constructing processes to surface data both internally for reporting purposes and externally through the customer portal. The Data Engineer will be responsible for developing scalable data pipelines that integrate diverse data sources whilst ensuring data quality, under the framework of a new Data Platform for real-time application integration and reporting. They will work closely with business stakeholders to support data-driven decision making by delivering clean, well-structured datasets that can be utilised for reporting purposes in a performant, secure way. Key responsibilities: Taking full ownership of assigned projects and BAU …
collaboration, recognition and inclusivity, and offer a wide range of benefits to support staff wellbeing. Your Future Starts Here. PURPOSE OF JOB: We’re looking for an experienced Azure Data & AI Engineer with a strong focus on advanced analytics, machine learning, and applied AI - particularly Generative AI. This role will suit a technically capable professional who combines real … Key Vault), and telemetry via Application Insights or Log Analytics. KEY RESPONSIBILITIES: Have 5+ years of technical consulting experience or a similar track record of demonstrable success in coding and deploying Data and AI models onto Azure environments, with at least one year using Azure Fabric. Create scalable and reliable data pipelines for data processing, transformation, and storage. Establish and enforce data governance policies, and be hands-on in exploratory data analysis to identify and measure quality gaps that block Data and AI implementation. Implement robust security measures to protect sensitive data within Azure environments. Integrate AI and machine learning models into data pipelines and applications. Develop and deploy AI solutions using …
Luton, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
South Manchester – Remote (on-site just once a month max). Up to £60,000 + Bonus + Benefits. Are you a skilled Data Engineer ready to shape the future of data in a rapidly scaling tech business? This is a great opportunity to join a leading SaaS provider generating close … whose platform supports millions of users across sectors including the NHS, logistics, public safety, and retail. With their tech function now fully in-house, they’re investing heavily in data, and you’ll be at the forefront of that transformation. You’ll work directly with the CTO and Head of Engineering, helping define the company’s data architecture … pipelines, and reporting solutions from the ground up. What You’ll Be Doing: Designing, building, and optimising scalable data pipelines, data warehouse structures, and integration processes. Implementing robust ETL solutions to consolidate data from multiple sources into the central data warehouse. Working with a fully cloud-based Azure ecosystem (including Azure Data Factory and Databricks …