London, England, United Kingdom Hybrid / WFH Options
Cynergy Bank
data engineering best practices & guardrails. Data Pipeline Development: Design, develop, and maintain data pipelines to efficiently extract, transform, and load data from disparate sources into our data warehouse or data lakes. Ensure data integrity, quality … datasets and expand the scope of insights. Performance Optimization: Continuously monitor and optimize the performance of data pipelines and queries to reduce latency and improve overall data processing efficiency. Automation: Identify opportunities to automate data workflows, data validation, and monitoring processes to … their data needs and support their data-related requirements. Documentation: Maintain comprehensive documentation for data pipelines, ETL processes, and data architecture to ensure knowledge transfer and maintain system stability. Stay Updated: Keep abreast of industry trends, best practices, and emerging technologies in more »
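The extract-transform-load loop this listing describes can be sketched in miniature. Everything below (function names, the sample CSV, the list standing in for a warehouse table) is illustrative, not taken from the posting:

```python
import csv
import io

def extract(raw_csv: str) -> list[dict]:
    """Extract: parse rows from a raw CSV source."""
    return list(csv.DictReader(io.StringIO(raw_csv)))

def transform(rows: list[dict]) -> list[dict]:
    """Transform: normalise types and drop rows failing a basic quality check."""
    clean = []
    for row in rows:
        try:
            clean.append({"id": int(row["id"]), "amount": float(row["amount"])})
        except (KeyError, ValueError):
            continue  # a real pipeline would route malformed rows to quarantine
    return clean

def load(rows: list[dict], warehouse: list) -> None:
    """Load: append validated rows to the target store."""
    warehouse.extend(rows)

raw = "id,amount\n1,9.99\n2,not-a-number\n3,4.50\n"
warehouse: list[dict] = []
load(transform(extract(raw)), warehouse)
```

The malformed second row is silently dropped here; the "data integrity, quality" duties in the ad are about deciding what happens to such rows instead.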
for a Data Engineer, who will be responsible for expanding and optimizing the company's data and data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams. The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up. The Data Engineer will support the software developers and head up data initiatives and will ensure optimal data delivery architecture … primarily utilizing Data Factory, Databricks, and Azure SQL. Experience in building and optimizing 'big data' data pipelines, architectures, and data sets. Experience in performing root cause analysis on internal and external data and processes to answer specific business questions and more »
Director, Enterprise Data Architect The Director, Enterprise Data Architect will set the organisation's data vision, strategy and roadmap to enable us as a data-driven organisation where data is a strategic enterprise asset that is used by everyone in the organisation … to derive insights specific to their own context. You will develop complex conceptual and logical data definitions and specifications for the organisation, ensuring the data architecture is optimised to meet the requirements and needs of the business. Working closely with executive-level leaders across Digital … Azure DW/Synapse, Spark, ADL, Power BI. Talend/Streamsets is a must, along with Machine Learning. Familiarity with ETL, data pipeline development and data cleansing with large-scale, complex data sets. Ability to work in close partnership with other IT functions more »
South West London, London, United Kingdom Hybrid / WFH Options
Triad Group Plc
Data Engineer, AWS, Python. LOCATION: Based at client locations or working remotely JOB TYPE: PERMANENT SALARY: £50,000 - £65,000 Frustrated that your bright ideas aren't being listened to? Feeling like you aren't being given the platform to make the difference you crave? A career at Triad could … in the UK public sector? Description Data Pipeline Development: Design, develop, and maintain data pipelines to efficiently extract, transform, and load (ETL) data from various sources into our data lake or data warehouse on AWS. Data Modeling … monitoring and performance optimisation strategies to ensure data pipelines run smoothly and efficiently. Documentation: Maintain detailed documentation of data pipelines, data models, and procedures for data engineering best practices. Key Requirements: Experience … Collaborate with cross-functional teams, including scientists, developers and business more »
and quality control. CORE WORK AREAS Lead the design and implementation of scalable, efficient, and robust data pipelines to extract, transform, and load (ETL) data from various sources into data storage and platform solutions. Engage and collaborate with cross-functional teams to understand … Experience demonstrating proficiency and able to give examples of successful, hands-on implementation of ETL processes and delivery of data pipelines into various data management architectures Proficient working with data classification and standards Demonstrating detailed skills in cleaning, integrating and scaling data sets and pipelines, fixing problems in data sets, from low performance to bugs and outages Proficiency in programming languages such as Python, Java, or Scala, used for scripting and automation Proficiency in using query languages such as SQL, Hive, R, with an ability to work with more »
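"Cleaning, integrating and scaling data sets" as described above usually starts with deduplication and dropping incomplete records. A purely illustrative sketch (field names and sample data are hypothetical, not from the posting):

```python
def clean_records(records: list[dict]) -> list[dict]:
    """Drop records missing required fields and deduplicate on 'id',
    keeping the first occurrence of each id."""
    seen_ids = set()
    cleaned = []
    for rec in records:
        if rec.get("id") is None or rec.get("name") is None:
            continue  # incomplete record: a real pipeline would quarantine it
        if rec["id"] in seen_ids:
            continue  # duplicate: keep only the first occurrence
        seen_ids.add(rec["id"])
        cleaned.append(rec)
    return cleaned

raw = [
    {"id": 1, "name": "Alpha"},
    {"id": 1, "name": "Alpha (duplicate)"},
    {"id": 2, "name": None},
    {"id": 3, "name": "Gamma"},
]
cleaned = clean_records(raw)
```

In production the same logic typically runs in SQL or Spark, but the rules themselves look the same.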
a Data Engineer, you will play a crucial role in designing, implementing, and maintaining data pipelines and systems that enable efficient and accurate data processing, storage, and analysis. Key Responsibilities: 1. AWS Expertise: • Design, deploy, and maintain data solutions on the AWS … to handle large-scale data processing from various sources to target systems. • Ensure data pipelines are reliable, scalable, and adhere to best practices. 3. Data Storage and Management: • Implement and manage data storage solutions on AWS, including data warehousing, data … indexing strategies and optimize search performance. 7. Performance Optimization: • Monitor and troubleshoot data pipelines, identifying bottlenecks and areas for improvement. • Optimize database queries and infrastructure for maximum efficiency. 8. Data Quality and Validation: • Implement data validation and data quality checks to ensure more »
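The "data validation and data quality checks" in item 8 are often expressed as named rules applied per row, with violations reported rather than silently dropped. A minimal sketch; the rule names and sample rows are hypothetical:

```python
def validate_rows(rows, rules):
    """Apply each named rule to every row; return the rows that pass all
    rules plus a violation report for rows that fail any."""
    valid, violations = [], []
    for i, row in enumerate(rows):
        failed = [name for name, rule in rules.items() if not rule(row)]
        if failed:
            violations.append({"row": i, "failed": failed})
        else:
            valid.append(row)
    return valid, violations

rules = {
    "amount_positive": lambda r: r["amount"] > 0,
    "currency_known": lambda r: r["currency"] in {"GBP", "USD", "EUR"},
}
rows = [
    {"amount": 120.0, "currency": "GBP"},
    {"amount": -5.0, "currency": "ZZZ"},
]
valid, violations = validate_rows(rows, rules)
```

Frameworks such as Great Expectations productise this pattern; the core idea is the same rule-per-check structure.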
both internal and external to the Force. CORE WORK AREAS Design and implement scalable, efficient, and robust data pipelines to extract, transform, and load (ETL) data from various sources into data storage and platform solutions. Engage and collaborate with cross-functional teams to understand … optimally, leading on extracting data, joining and helping analysts to derive insights Build and optimise the performance of pipelines and APIs that connect operational systems, data for analytics and business intelligence (BI) systems, troubleshooting and resolving issues as they arise, to ensure the timely delivery of … Demonstrable experience showing competency working with data classification and standards Competent ability to clean, integrate and scale data sets and pipelines, fixing problems in data sets, from low performance to bugs and outages Proficiency in programming languages such as Python, Java, or Scala more »
My client is a dynamic and innovative consulting firm at the forefront of leveraging data to drive business success. They are seeking a highly skilled and experienced Senior Data Architect to join their growing team and play a pivotal role in shaping and optimizing the enterprise … data architecture. Key Responsibilities: As a Senior Data Architect you will be responsible for: • Designing and Implementing Enterprise Data Architectures: • Utilize your extensive experience to design, implement, and optimize enterprise data architectures. • Lead efforts in data modeling, database design, ETL … Integration Strategies: • Create strategies for integrating data from diverse sources, including designing data pipelines and ETL processes. • Cloud Expertise: • Demonstrate in-depth knowledge of cloud-based data platforms, with a primary focus on AWS and Azure. • Leverage cloud services for efficient data more »
Wigston, Leicestershire, East Midlands, United Kingdom
CROMWELL GROUP HOLDINGS LTD
thing in common: delivering exceptional service for our customers. And we do this through our purpose of Keeping Industry Working. Spearheading our data science initiatives to develop and implement business data-driven solutions and improve processes. Your responsibilities will encompass predictive and prescriptive analytics, developing … data-driven strategies, and fostering a culture of data curiosity and innovation within the business. What's in it for you? Competitive Salary Company Bonus Competitive annual leave allowance with annual purchase scheme Group Personal Pension Company Funded Healthcare Cash Plan Commitment to employee development plans … to empower them to leverage data effectively. Collaborate with Data Engineers to design and maintain data pipelines, ensuring data availability and reliability. Provide guidance on data storage, data warehousing, and data infrastructure decisions to support analytics more »
Employment Type: Permanent
Salary: Competitive Salary+Bonus+30 Days Leave+Pension
London, England, United Kingdom Hybrid / WFH Options
Amber Labs
analysts, and support staff. Our clients have the opportunity to earn R&D credits that can be used towards our areas of expertise: Data, Governance, and Cloud Engineering, allowing us to drive customer-focused innovation. Our work extends across both the public and private sectors, providing our colleagues … clients. 3. Consistent investment in our ADM (Amber Labs Delivery Methodology, underpinned by Agile Methodology) to ensure maximum velocity, quality, and value. ROLE: Data Engineer (DV Cleared) LOCATION: London (Hybrid) Data Architecture: Collaborate with stakeholders to define data requirements … and create data architecture solutions that align with business objectives. Design and implement data models, data pipelines, and data warehouses that are optimized for performance and scalability. Data Integration: Integrate data from various sources, including structured and unstructured more »
South East London, England, United Kingdom Hybrid / WFH Options
Amber Labs
analysts, and support staff. Our clients have the opportunity to earn R&D credits that can be used towards our areas of expertise: Data, Governance, and Cloud Engineering, allowing us to drive customer-focused innovation. Our work extends across both the public and private sectors, providing our colleagues … clients. 3. Consistent investment in our ADM (Amber Labs Delivery Methodology, underpinned by Agile Methodology) to ensure maximum velocity, quality, and value. ROLE: Data Engineer (DV Cleared) LOCATION: London (Hybrid) Data Architecture: Collaborate with stakeholders to define data requirements … and create data architecture solutions that align with business objectives. Design and implement data models, data pipelines, and data warehouses that are optimized for performance and scalability. Data Integration: Integrate data from various sources, including structured and unstructured more »
Aldershot, Hampshire, South East, United Kingdom Hybrid / WFH Options
Searchability
from various sources. Your responsibilities include building and delivering Azure Data Engineering solutions, creating modern data pipelines, and developing processes for Data Modelling, Data Mining, and Data Warehousing solutions. You'll collaborate with cross-functional teams, providing innovative solutions … Engineering solutions. Assemble large, complex data sets to meet business requirements. Develop and maintain data pipelines using Azure/AWS tools. Enhance data set processes for Data Modelling, Data Mining, and Data Warehousing. Build infrastructure for optimal … Degree or equivalent in a relevant subject (e.g., Computer Science, Information Systems). Deep knowledge of SQL, Cloud-based data pipelines, architectures, and data sets. Experience with big data tools such as Hadoop, Spark. Proficiency in working with large data sets, data more »
Manchester, England, United Kingdom Hybrid / WFH Options
Starling Bank
mandate how much you visit the office and work from home, that's to be agreed upon between you and your manager. Our Data Environment Our Data teams are excited about the value of data within the business, which powers our product decisions to improve … may be. Hear from the team in our latest blogs or our case studies with Women in Tech. We are looking for talented data professionals at all levels to join the team. We value people being engaged and caring about customers, caring about the code they write and … BI, data visualisation Third-party data and relationship management Data management for ML models Data pipelines, data integration, ETL Data modelling Risk management Interview process Interviewing is a two-way process and we want you to have the more »
Manchester, North West, United Kingdom Hybrid / WFH Options
DRAGOONIS TECHNOLOGIES LIMITED
Reference: DT-259p Title : Lead Data Engineer Job Type: Permanent Salary: £80,000 - £100,000 Location: Fully Remote, UK The Client Industry : Global Aviation and Fuel Management technology business. Purpose : Rationalise and support the use of fuel management for sustainability. This will add environmental impacts and financial impacts … requirements. Ensure data accuracy and integrity throughout all data warehouses. Data Pipeline Design and Operation: Develop, construct, test, and maintain architectures, such as Data Warehouse, Pipeline Orchestration, Data Transformation tooling systems. Collaborate with data analytics engineers to … of CI/CD processes for data pipelines and infrastructure. Guide the team on best practices related to version control, testing, and deployment. Collaborate with cross-functional teams to ensure smooth deployment and scaling of data solutions. BI Reporting System Implementation and Maintenance: Partner with business more »
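The "best practices related to version control, testing, and deployment" this listing asks for usually boil down to keeping pipeline logic in pure, unit-testable functions. A minimal sketch, loosely themed on the client's fuel-data domain; the function, constant, and test are invented for illustration:

```python
US_GALLON_LITRES = 3.78541  # standard US gallon -> litre conversion factor

def gallons_to_litres(gallons: float) -> float:
    """Pure transformation function: no I/O, so it is trivial to unit-test
    in CI and safe to refactor under version control."""
    if gallons < 0:
        raise ValueError("fuel volume cannot be negative")
    return round(gallons * US_GALLON_LITRES, 3)

def test_gallons_to_litres():
    # The kind of test a CI/CD pipeline would run on every commit.
    assert gallons_to_litres(0.0) == 0.0
    assert gallons_to_litres(1.0) == 3.785
    assert gallons_to_litres(100.0) == 378.541

test_gallons_to_litres()
```

Keeping transformations pure like this is what makes "testing and deployment" of a pipeline mechanical rather than risky.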
Data Architect Salary Range: Negotiable Dependent on Experience Location: Eastleigh - Hybrid Working Hours: 37 ½ per week, Monday - Friday Contract: Full Time, Permanent Do you have a deep passion for data and analytics? Have you honed your skills in crafting and executing data solutions for … If your response is a resounding "yes" to these questions, then an exciting opportunity awaits you. Our client is in search of a Data Architect to join their dedicated Enterprise Data Services team based in Eastleigh. As part of your regular workday, you will engage in … Collaborate with the rest of the Springboard Architecture squad on overall solution design. Design and govern Central Data Platform data pipeline patterns. High-level ETL design/review. Collaborate on conceptual data modelling (dimensional). Review logical data models. Present designs to more »
in a relevant subject e.g. Computer Science, Information Systems, or related Technical Discipline. Deep knowledge of SQL, Cloud-based data pipelines, architectures, and data sets. Experience working with big data tools such as Hadoop, Spark is required. Experience working with large data sets, data pipeline and workflow management tools, Azure/AWS cloud services, and stream-processing systems. Experience with Agile methodology. Background in programming in Python, Scala, C, C++, or Java would be beneficial. Good written and verbal communication skills along with a strong desire to work … data sets that meet functional and non-functional business requirements. Create and maintain modern data pipelines using Azure/AWS tools. Develop and improve data set processes for data modelling, mining, and production. Build the infrastructure required for optimal extraction, transformation, and more »
Data Engineering Lead London 3 days per week Up to £110,000 Salary + Bonus OVERVIEW Multi-billion-pound property company are hiring for a Data Engineering Lead/… Data Lead to spearhead their Data team in London! Interesting and varied role where you can work across the whole pipeline on a breadth of impactful projects! Great scope for career progression & you'll be leading a team of Data Engineers/Analysts … Data Engineering work to contribute towards the Data Science/Analytics and modelling. Work across the whole Data pipeline (Data Engineering, Data Warehousing, Data Science etc) Help shape the strategy and roadmap for whole Data more »
Responsibilities: As an integral member of their team, you will: Design, develop, and maintain end-to-end data pipelines: Leverage your expertise in Azure Data Factory to create efficient and scalable pipelines for ingesting, transforming, and loading data from diverse sources. Collaborate with cross … and validation processes. Monitor and optimise pipeline performance: Proactively manage and optimise data pipelines to ensure optimal throughput and minimal latency, contributing to the overall efficiency of their data ecosystem. Troubleshoot and resolve issues: Address data pipeline failures and performance bottlenecks with … in Python, C#, or PowerShell to enhance and optimize data processes. Excellent problem-solving skills: Ability to troubleshoot complex data pipeline issues with precision. Strong communication skills: Effectively collaborate with both technical and non-technical stakeholders. Optional but a plus: Azure certifications, such as Microsoft more »
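A common first response to the "pipeline failures" this listing mentions is retrying transient errors with exponential backoff before paging anyone. A self-contained sketch; the step name and failure simulation are hypothetical:

```python
import time

def run_with_retries(step, retries=3, base_delay=0.01):
    """Re-run a pipeline step that may fail transiently, backing off
    exponentially, and surface the error only after the final attempt."""
    for attempt in range(retries):
        try:
            return step()
        except Exception:
            if attempt == retries - 1:
                raise
            time.sleep(base_delay * 2 ** attempt)

calls = {"count": 0}

def flaky_load():
    # Simulated load step that fails twice, then succeeds.
    calls["count"] += 1
    if calls["count"] < 3:
        raise RuntimeError("transient connection error")
    return "loaded"

result = run_with_retries(flaky_load)
```

Orchestrators such as Azure Data Factory expose the same idea as built-in retry policies on activities; this shows the mechanism, not the product feature.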
quality, and driving data-driven decisions across customer enterprises. The Opportunity : · Architect and implement scalable data pipelines and data warehouses to efficiently collect, process, and store large volumes of data from various customer-provided data sources · Establish and … the accuracy, consistency, and security of our data. · Identify performance bottlenecks and optimize data pipelines and systems to ensure maximum efficiency and responsiveness. · Lead and manage a team of data engineers, provide mentorship, guidance, and support to foster a collaborative and high-performing work environment. · Oversee … architecture, with a track record of leading successful data engineering projects · Strong expertise in designing and implementing data pipelines, data warehouses, and ETL processes. · Experience with cloud platforms like AWS, Azure, or Google Cloud and familiarity with related services such as S3, Redshift more »
Lead Data Engineer London x3 days per week Up to £110,000 Salary + Bonus OVERVIEW Multi-billion-pound property company are hiring for a Data Engineering Lead/… Data Lead to spearhead their Data team in London! Interesting and varied role where you can work across the whole pipeline on a breadth of impactful projects! Great scope for career progression & you'll be leading a team of Data Engineers/Analysts … Data Engineering work to contribute towards the Data Science/Analytics and modelling. Work across the whole Data pipeline (Data Engineering, Data Warehousing, Data Science etc) Help shape the strategy and roadmap for whole Data more »
Lincoln, Lincolnshire, United Kingdom Hybrid / WFH Options
Adecco
We are currently recruiting for an experienced Data Architect to work for Lincolnshire Police at their headquarters in Nettleham. This role will be hybrid, working Monday to Friday, 37 hours a week. PLEASE NOTE DUE TO POLICE VETTING CRITERIA YOU MUST HAVE RESIDED WITHIN THE UK CONTINUOUSLY FOR … YEARS AT THE TIME OF APPLICATION. UNFORTUNATELY ANYTHING LESS THAN THIS WILL NOT BE CONSIDERED. JOB PURPOSE AND SCOPE The Force's Data Architect is accountable for the design and build of the Force's overall data architecture, taking full responsibility for its implementation and its … and lead on its implementation. Leads the design and build of scalable and efficient data pipelines and platforms (be these fabric or mesh, whichever is strategically appropriate) that enable seamless data flow and more effective combination, analysis and sharing of multiple data sources, across the more »
Rothersthorpe, Northampton, East Midlands, NN4 9BS, UK
Avery Healthcare Ltd
the MS Azure platform. You will collaborate closely with the Technology and Healthcare teams to build and maintain robust data pipelines, optimize data flows, and assist in data analysis and reporting processes. The role will be Hybrid, with 1-2 days negotiable, at our office based at Group Support … where people feel valued. Responsibilities : Collaborate with the Technology, Operations, Finance and Project teams to develop and maintain data pipelines, ensuring smooth and efficient data flow from multiple sources to various systems. Design, develop, and implement data integration solutions using Microsoft products such as … with Microsoft products, including MySQL, Power BI, and the MS Azure platform. Experience with data integration, ETL processes, and data pipeline development. Proficiency in SQL and database query optimization. Knowledge of data warehousing concepts and dimensional data modeling. Understanding of GDPR more »
Northampton, Northamptonshire, East Midlands, United Kingdom Hybrid / WFH Options
The White Company
on our Data platform architecture. Working with Azure-based technologies, you will be building and maintaining data pipelines, querying and analysing data and optimizing data flows. You will support broader development teams to deliver data solutions for key business requirements. … What you'll be doing Create and maintain optimal data pipelines/ETL processes using Azure Data Factory Assemble large, complex data sets using Azure Data Lake that meet functional/non-functional business requirements. Architect and extend data models for … data visualisation, analytics and data transfer Analyse and tune performance of data delivery and ensure scalability of data processes Work with data and analytics experts to strive for greater functionality in our data systems. Support in data more »
London, England, United Kingdom Hybrid / WFH Options
Cera
do Data Ingestion Management: You'll collaborate with cross-functional teams to design, implement, and maintain data ingestion pipelines for various data sources, including External APIs, Internal Databases, Event Stores, Web Apps, Google Sheets, and more. Utilise tools such as Airflow, Datastream, Pub/… Airflow, Terraform, and UI Path. Collaborate with the team to automate deployment pipelines using Cloud Build and ensure smooth integration with Composer. VPC and Security Management: You'll ensure the secure operation of our data platform by managing resources inside the Cera Private VPC, by implementing and maintaining … high standard/best practice for our data warehouse and data pipelines. You're an evangelist. You will build support for the importance of collecting and leveraging data, be open and willing to engage with everyone to discuss, to share, to inquire, and to more »
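The Airflow/Composer ingestion work this listing describes is, at its core, dependency-ordered task execution. The scheduling itself can be sketched with nothing but the standard library; the task names below are hypothetical, and a real Airflow DAG would express the same edges with operators:

```python
from graphlib import TopologicalSorter

# Each task maps to the set of upstream tasks it depends on --
# the same shape an Airflow DAG expresses with `upstream >> downstream`.
dependencies = {
    "extract_api": set(),
    "extract_db": set(),
    "transform": {"extract_api", "extract_db"},
    "load_warehouse": {"transform"},
}

# A scheduler must run tasks in an order that respects every edge.
run_order = list(TopologicalSorter(dependencies).static_order())
```

`static_order` yields both extracts before `transform` and `transform` before `load_warehouse`; Airflow adds scheduling, retries, and parallelism on top of exactly this ordering guarantee.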
London, England, United Kingdom Hybrid / WFH Options
Cera
/testing practices, and ensure optimal and standardized approaches are followed by the engineering team. You'll engineer data pipelines to collect and protect data from multiple sources (e.g. Relational and non-relational data stores, APIs, Google Drive, event delivery pipelines), into a single … in an autonomous environment. 5+ years SQL experience developing data pipelines in a cloud-based database environment (e.g. BigQuery), including scheduling your own queries (e.g. using Airflow or Stitch) 3+ years experience delivering data products in a modern BI technology (ideally Looker) or open source data … continuous delivery pipelines to automate builds, tests, and deployments in one release workflow, using modern tools and systems (e.g. Git, Jenkins) Strong technical calibre and team lead skills to ensure the best quality of work is delivered, consistently Strong executional capacity in dev/data/security operations more »