Bristol, England, United Kingdom Hybrid / WFH Options
Leonardo
exciting and critical challenges to the UK’s digital landscape. This role requires strong expertise in building and managing data pipelines using the Elastic Stack (Elasticsearch, Logstash, Kibana) and Apache NiFi. The successful candidate will design, implement, and maintain scalable, secure data solutions, ensuring compliance with strict security standards and regulations. This is a UK-based onsite role with … the option of compressed hours. The role will include: Design, develop, and maintain secure and scalable data pipelines using the Elastic Stack (Elasticsearch, Logstash, Kibana) and Apache NiFi. Implement data ingestion, transformation, and integration processes, ensuring data quality and security. Collaborate with data architects and security teams to ensure compliance with security policies and data governance standards. Manage and … experience working as a Data Engineer in secure or regulated environments. Expertise in the Elastic Stack (Elasticsearch, Logstash, Kibana) for data ingestion, transformation, indexing, and visualization. Strong experience with Apache NiFi for building and managing complex data flows and integration processes. Knowledge of security practices for handling sensitive data, including encryption, anonymization, and access control. Familiarity with data governance …
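The anonymization requirement mentioned above (protecting sensitive values before they are indexed and visualised) can be sketched as a small pre-processing step. This is a minimal illustration only, not the employer's actual pipeline: the field names, key, and record shape are all assumptions.

```python
import hashlib
import hmac

# Hypothetical pre-indexing step: pseudonymise sensitive fields before a
# record is shipped to a search/analytics store. In practice the key would
# come from a secrets manager, never from source code.
SECRET_KEY = b"rotate-me-outside-source-control"
SENSITIVE_FIELDS = {"email", "national_id"}

def pseudonymise(record: dict) -> dict:
    """Replace sensitive values with keyed HMAC-SHA256 digests.

    A keyed hash (rather than a plain hash) resists dictionary attacks,
    while identical inputs still map to identical outputs, so joins and
    aggregations over the pseudonymised field keep working.
    """
    out = {}
    for field, value in record.items():
        if field in SENSITIVE_FIELDS:
            digest = hmac.new(SECRET_KEY, str(value).encode(), hashlib.sha256)
            out[field] = digest.hexdigest()
        else:
            out[field] = value
    return out

doc = {"email": "jane@example.com", "event": "login", "national_id": "AB123456C"}
safe = pseudonymise(doc)
```

Deterministic keyed hashing is one common trade-off between anonymisation strength and analytical usefulness; stronger schemes (tokenisation, format-preserving encryption) would be chosen based on the governance standards the role references.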
Bristol, England, United Kingdom Hybrid / WFH Options
Lloyds Bank plc
working with relational and non-relational databases to build data solutions, such as SQL Server/Oracle, experience with relational and dimensional data structures. Experience in using distributed frameworks (Spark, Flink, Beam, Hadoop). Proficiency in infrastructure as code (IaC) using Terraform. Experience with CI/CD pipelines and related tools/frameworks. Containerisation: Good knowledge of containers … AWS, or Azure. Good understanding of cloud storage, networking, and resource provisioning. It would be great if you had... Certification in GCP “Professional Data Engineer”. Certification in Apache Kafka (CCDAK). Proficiency across the data lifecycle. Working for us: Our focus is to ensure we are inclusive every day, building an organisation that reflects modern society and …
City Of Bristol, England, United Kingdom Hybrid / WFH Options
ADLIB Recruitment | B Corp™
systems Clear communicator, able to translate complex data concepts to cross-functional teams Bonus points for experience with: DevOps tools like Docker, Kubernetes, CI/CD Big data tools (Spark, Hadoop), ETL workflows, or high-throughput data streams Genomic data formats and tools Cold and hot storage management, ZFS/RAID systems, or tape storage AI/LLM tools …
Bristol, England, United Kingdom Hybrid / WFH Options
Leonardo SpA
researching new technologies and software versions Working with cloud technologies and different operating systems Working closely alongside Data Engineers and DevOps engineers Working with big data technologies such as Spark Demonstrating stakeholder engagement by communicating with the wider team to understand the functional and non-functional requirements of the data and the product in development and its relationship to … networks into production Experience with Docker Experience with NLP and/or computer vision Exposure to cloud technologies (e.g. AWS and Azure) Exposure to big data technologies Exposure to Apache products, e.g. Hive, Spark, Hadoop, NiFi Programming experience in other languages This is not an exhaustive list, and we are keen to hear from you even if you …
Greater Bristol Area, United Kingdom Hybrid / WFH Options
LHH
large-scale data pipelines in secure or regulated environments Ingest, process, index, and visualise data using the Elastic Stack (Elasticsearch, Logstash, Kibana) Build and maintain robust data flows with Apache NiFi Implement best practices for handling sensitive data, including encryption, anonymisation, and access control Monitor and troubleshoot real-time data pipelines to ensure high performance and reliability Write efficient … Skills and Experience: 3+ years’ experience as a Data Engineer in secure, regulated, or mission-critical environments Proven expertise with the Elastic Stack (Elasticsearch, Logstash, Kibana) Solid experience with Apache NiFi Strong understanding of data security, governance, and compliance requirements Experience building real-time, large-scale data pipelines Working knowledge of cloud platforms (AWS, Azure, or GCP), particularly in … stakeholder management skills Detail-oriented with a strong focus on data accuracy, quality, and reliability Desirable (Nice to Have): Background in defence, government, or highly regulated sectors Familiarity with Apache Kafka, Spark, or Hadoop Experience with Docker and Kubernetes Use of monitoring/alerting tools such as Prometheus, Grafana, or ELK Understanding of machine learning algorithms and data …
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Actica Consulting Limited
scalable data infrastructure, develop machine learning models, and create robust solutions that enhance public service delivery. Working in classified environments, you'll tackle complex challenges using tools like Hadoop, Spark, and modern visualisation frameworks while implementing automation that drives government efficiency. You'll collaborate with stakeholders to transform legacy systems, implement data governance frameworks, and ensure solutions meet the … Collaborative, team-based development; Cloud analytics platforms e.g. relevant AWS and Azure platform services; Data tools: hands-on experience with Palantir (ESSENTIAL); Data science approaches and tooling e.g. Hadoop, Spark; Data engineering approaches; Database management, e.g. MySQL, Postgres; Software development methods and techniques e.g. Agile methods such as SCRUM; Software change management, notably familiarity with git; Public sector best …
Bristol, England, United Kingdom Hybrid / WFH Options
JR United Kingdom
Engineering (open to professionals from various data eng. backgrounds — data pipelines, ML Eng, data warehousing, analytics engineering, big data, cloud etc.) Technical Exposure: Experience with tools like SQL, Python, Apache Spark, Kafka, Cloud platforms (AWS/GCP/Azure), and modern data stack technologies Formal or Informal Coaching Experience: Any previous coaching, mentoring, or training experience — formal or …
Bristol, England, United Kingdom Hybrid / WFH Options
hays-gcj-v4-pd-online
engineers within the team. You need to be a hands-on technical data engineer with experience in modern technologies. Experience with Microsoft Fabric is preferable, and good knowledge of Apache Spark (PySpark, Spark SQL) and T-SQL is essential. You should have experience designing scalable cloud data solutions in Azure Data Lake, Synapse, and Delta Lake. Knowledge of data …
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Curo Resourcing Ltd
etc. Infrastructure as Code and CI/CD paradigms and systems such as: Ansible, Terraform, Jenkins, Bamboo, Concourse etc. Observability - SRE Big Data solutions (ecosystems) and technologies such as: Apache Spark and the Hadoop ecosystem Excellent knowledge of YAML or similar languages The following Technical Skills & Experience would be desirable: JupyterHub awareness RabbitMQ or other common queue …
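The Infrastructure as Code and YAML knowledge asked for above typically looks like declarative, idempotent configuration. The fragment below is purely illustrative of that style — the host group, package name, and file paths are assumptions, not details from the listing:

```yaml
# Illustrative Ansible play (hypothetical names throughout): install a data
# ingest service and template its config, restarting only when the file changes.
- name: Configure ingest nodes
  hosts: ingest_nodes
  become: true
  tasks:
    - name: Ensure NiFi package is present
      ansible.builtin.package:
        name: nifi
        state: present

    - name: Template the NiFi properties file
      ansible.builtin.template:
        src: templates/nifi.properties.j2
        dest: /opt/nifi/conf/nifi.properties
      notify: Restart nifi

  handlers:
    - name: Restart nifi
      ansible.builtin.service:
        name: nifi
        state: restarted
```

The handler/notify pattern shown here is what makes the play idempotent: repeated runs converge on the same state rather than restarting the service every time.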
Bristol, Avon, England, United Kingdom Hybrid / WFH Options
Hays Specialist Recruitment Limited
ll need to be a hands-on technical data engineer with experience across a range of modern technologies. Experience with Microsoft Fabric is preferable and good working knowledge of Apache Spark is essential: PySpark, Spark SQL and T-SQL. You'll need good working knowledge of cloud data, specifically designing scalable solutions in Azure Data Lake, Synapse & Delta Lake.
Bristol, England, United Kingdom Hybrid / WFH Options
VC Evidensia UK
should expect to provide technical leadership to teams working with neighboring technologies, most notably Microsoft’s Power Platform and Azure infrastructure. Experience across the Microsoft Azure cloud platform: Databricks/Spark The administration of database systems, primarily Microsoft SQL Server. Data Warehousing/Data Lakes/Master Data Management/Artificial Intelligence (AI)/Data Governance. The use of REST …
Bristol, England, United Kingdom Hybrid / WFH Options
hays-gcj-v4-pd-online
are valued at Young Lives vs Cancer. To succeed, you should demonstrate: Hands-on experience with data engineering tools and technologies (databases, data warehouses, data integration solutions, SQL, Python, Spark, Hadoop, etc.) Knowledge of data architecture and best practices in data integration Experience with data platforms such as Microsoft Fabric, Snowflake, and Redshift Stakeholder engagement skills, supporting understanding of data technologies …