City Of Bristol, England, United Kingdom Hybrid / WFH Options
ADLIB Recruitment | B Corp™
… systems. Clear communicator, able to translate complex data concepts to cross-functional teams. Bonus points for experience with: DevOps tools like Docker, Kubernetes, CI/CD; big data tools (Spark, Hadoop), ETL workflows, or high-throughput data streams; genomic data formats and tools; cold and hot storage management, ZFS/RAID systems, or tape storage; AI/LLM tools. …
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Leonardo UK Ltd
… researching new technologies and software versions; working with cloud technologies and different operating systems; working closely alongside Data Engineers and DevOps engineers; working with big data technologies such as Spark; demonstrating stakeholder engagement by communicating with the wider team to understand the functional and non-functional requirements of the data and the product in development and its relationship to … networks into production. Experience with Docker; experience with NLP and/or computer vision; exposure to cloud technologies (e.g. AWS and Azure); exposure to big data technologies; exposure to Apache products, e.g. Hive, Spark, Hadoop, NiFi; programming experience in other languages. This is not an exhaustive list, and we are keen to hear from you even if you …
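The big data and Apache-stack experience described in the Leonardo listing above usually centres on Spark jobs along these lines. This is a minimal PySpark sketch, not anything taken from the advert itself; the input path and column names are illustrative assumptions.

```python
# Minimal PySpark sketch: read a CSV of readings, aggregate per device per day,
# and write the result as Parquet. Path and column names ("device_id", "timestamp",
# "reading") are illustrative assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("example-aggregation").getOrCreate()

readings = (
    spark.read
    .option("header", True)
    .option("inferSchema", True)
    .csv("/data/raw/readings.csv")
)

daily_avg = (
    readings
    .groupBy("device_id", F.to_date("timestamp").alias("day"))
    .agg(F.avg("reading").alias("avg_reading"))
)

daily_avg.write.mode("overwrite").parquet("/data/curated/daily_avg")

spark.stop()
```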
Greater Bristol Area, United Kingdom Hybrid / WFH Options
LHH
… large-scale data pipelines in secure or regulated environments; ingest, process, index, and visualise data using the Elastic Stack (Elasticsearch, Logstash, Kibana); build and maintain robust data flows with Apache NiFi; implement best practices for handling sensitive data, including encryption, anonymisation, and access control; monitor and troubleshoot real-time data pipelines to ensure high performance and reliability; write efficient … Skills and Experience: 3+ years’ experience as a Data Engineer in secure, regulated, or mission-critical environments; proven expertise with the Elastic Stack (Elasticsearch, Logstash, Kibana); solid experience with Apache NiFi; strong understanding of data security, governance, and compliance requirements; experience building real-time, large-scale data pipelines; working knowledge of cloud platforms (AWS, Azure, or GCP), particularly in … stakeholder management skills; detail-oriented with a strong focus on data accuracy, quality, and reliability. Desirable (Nice to Have): background in defence, government, or highly regulated sectors; familiarity with Apache Kafka, Spark, or Hadoop; experience with Docker and Kubernetes; use of monitoring/alerting tools such as Prometheus, Grafana, or ELK; understanding of machine learning algorithms and data …
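As a rough illustration of the ingest/index and anonymisation duties in the listing above, a minimal sketch with the official Python Elasticsearch client might look like the following. The cluster URL, credentials, index name, and field names are all assumptions for the example, not details from the role.

```python
# Minimal sketch: one-way hash a sensitive field, then bulk-index records into
# Elasticsearch. Cluster URL, credentials, index and field names are illustrative.
import hashlib

from elasticsearch import Elasticsearch
from elasticsearch.helpers import bulk

es = Elasticsearch("https://localhost:9200", api_key="...")  # placeholder credentials

def anonymise(value: str) -> str:
    """Hash the identifier so the raw value never reaches the index."""
    return hashlib.sha256(value.encode("utf-8")).hexdigest()

records = [
    {"user_id": "alice", "event": "login", "ts": "2024-01-01T10:00:00Z"},
    {"user_id": "bob", "event": "logout", "ts": "2024-01-01T10:05:00Z"},
]

actions = (
    {
        "_index": "events-2024.01.01",
        "_source": {**r, "user_id": anonymise(r["user_id"])},
    }
    for r in records
)

success, errors = bulk(es, actions)
print(f"indexed={success} errors={errors}")
```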
DV (MOD) Cleared Data Engineer - Elastic Stack & Apache NiFi. Location: Bristol | Contract Type: £430.00 pd (Outside IR35) | Working Pattern: Hybrid (3-4 days on-site). Are you a contract Data Engineer with a knack for designing secure, high-performance data solutions? We're on the lookout for a technical expert in the Elastic Stack and Apache NiFi to … impact, ideal for professionals with a strong track record in regulated sectors. What You'll Be Doing: designing and deploying scalable, secure data pipelines using Elasticsearch, Logstash, Kibana, and Apache NiFi; handling real-time data ingestion and transformation with an emphasis on integrity and availability; collaborating with architects and cybersecurity stakeholders to align with governance and compliance needs; monitoring … Minimum 3 years' experience as a Data Engineer in sensitive or regulated industries; proficiency in the full Elastic Stack for data processing, analytics, and visualisation; hands-on expertise with Apache NiFi in designing sophisticated data workflows; solid scripting capabilities using Python, Bash, or similar; familiarity with best practices in data protection (encryption, anonymisation, access control); experience managing large-scale …
… systems such as Ansible, Terraform, Jenkins, Bamboo, Concourse, etc.; monitoring utilising products such as Prometheus, Grafana, ELK, Filebeat, etc.; observability (SRE); big data solutions (ecosystems) and technologies such as Apache Spark and the Hadoop ecosystem; edge technologies, e.g. NGINX, HAProxy, etc.; excellent knowledge of YAML or similar languages. The following technical skills and experience would be desirable for Data …
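Since the listing above asks for strong YAML knowledge alongside pipeline and monitoring tooling, here is a minimal, hypothetical Python sketch of loading and sanity-checking a YAML pipeline config with PyYAML; the config keys shown are invented purely for illustration.

```python
# Minimal sketch: load and sanity-check a hypothetical YAML pipeline/monitoring config.
# The keys ("pipelines", "alert_thresholds") are illustrative assumptions.
import yaml

RAW = """
pipelines:
  - name: ingest-events
    schedule: "*/5 * * * *"
alert_thresholds:
  lag_seconds: 300
"""

config = yaml.safe_load(RAW)

for pipeline in config.get("pipelines", []):
    if "name" not in pipeline or "schedule" not in pipeline:
        raise ValueError(f"incomplete pipeline definition: {pipeline}")

print(config["alert_thresholds"]["lag_seconds"])  # -> 300
```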
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Actica Consulting Limited
… scalable data infrastructure, develop machine learning models, and create robust solutions that enhance public service delivery. Working in classified environments, you'll tackle complex challenges using tools like Hadoop, Spark, and modern visualisation frameworks while implementing automation that drives government efficiency. You'll collaborate with stakeholders to transform legacy systems, implement data governance frameworks, and ensure solutions meet the … Collaborative, team-based development; cloud analytics platforms, e.g. relevant AWS and Azure platform services; data tools: hands-on experience with Palantir ESSENTIAL; data science approaches and tooling, e.g. Hadoop, Spark; data engineering approaches; database management, e.g. MySQL, Postgres; software development methods and techniques, e.g. Agile methods such as SCRUM; software change management, notably familiarity with Git; public sector best …
Stroud, England, United Kingdom Hybrid / WFH Options
Data Engineer
… for best practice and technical excellence and be a person that actively looks for continual improvement opportunities. Knowledge and skills: experience as a Data Engineer or Analyst; Databricks/Apache Spark; SQL/Python; Bitbucket/GitHub. Advantageous: dbt; AWS; Azure DevOps; Terraform; Atlassian (Jira, Confluence). About Us: What's in it for you... Healthcare plan, life assurance …
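The Databricks/Apache Spark plus SQL/Python combination in the listing above typically means mixing the DataFrame API with Spark SQL, roughly as in this minimal sketch; the table and column names are made up, and on Databricks the `spark` session is normally provided rather than built explicitly as it is here.

```python
# Minimal sketch: combine the DataFrame API with Spark SQL, as is common on Databricks.
# Table and column names are illustrative assumptions.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("spark-sql-example").getOrCreate()

orders = spark.createDataFrame(
    [(1, "2024-01-01", 120.0), (2, "2024-01-01", 80.0), (3, "2024-01-02", 200.0)],
    ["order_id", "order_date", "amount"],
)
orders.createOrReplaceTempView("orders")

daily_revenue = spark.sql(
    """
    SELECT order_date, SUM(amount) AS revenue
    FROM orders
    GROUP BY order_date
    ORDER BY order_date
    """
)
daily_revenue.show()
```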
Cheltenham, Gloucestershire, United Kingdom Hybrid / WFH Options
Gemba Advantage
… Java, TypeScript, Python, and Go; web libraries and frameworks such as React and Angular; designing, building, and maintaining CI/CD pipelines; big data technologies such as NiFi, Hadoop, Spark; cloud and containerization technologies such as AWS, OpenShift, Kubernetes, Docker; DevOps methodologies, such as infrastructure as code and GitOps; database technologies, e.g. relational databases, Elasticsearch, Mongo. Why join Gemba …
Salisbury, Wiltshire, South West, United Kingdom Hybrid / WFH Options
Anson McCade
… modern data lake/lakehouse architectures; strong grasp of cloud data platforms (AWS, Azure, GCP, Snowflake); understanding of Data Mesh, Data Fabric, and data product-centric approaches; familiarity with Apache Spark, Python, and ETL/ELT pipelines; strong knowledge of data governance, lifecycle management, and compliance (e.g. GDPR); consulting experience delivering custom data solutions across sectors; excellent leadership …
… to ensure code is fit for purpose. Experience that will put you ahead of the curve: experience using Python on Google Cloud Platform for big data projects; BigQuery, Dataflow (Apache Beam), Cloud Run Functions, Cloud Run, Cloud Workflows, Cloud Composer; SQL development skills; experience using Dataform or dbt; demonstrated strength in data modelling, ETL development, and data warehousing; knowledge … to be a part of it! Our Future, Our Responsibility - Inclusion and Diversity at Future: we embrace and celebrate diversity, making it part of who we are. Different perspectives spark ideas, fuel creativity, and push us to innovate. That's why we're building a workplace where everyone feels valued, respected, and empowered to thrive. When it comes to …
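For context on the Dataflow (Apache Beam) experience mentioned in the listing above, a minimal Beam pipeline in Python looks roughly like the sketch below; it runs locally on the DirectRunner, and the input values are invented for illustration. On Google Cloud the same pipeline would be submitted with the DataflowRunner and appropriate pipeline options.

```python
# Minimal Apache Beam sketch: a local word count on the DirectRunner.
# Input strings are illustrative; real pipelines would read from GCS, Pub/Sub, etc.
import apache_beam as beam

with beam.Pipeline() as pipeline:
    (
        pipeline
        | "Create" >> beam.Create(["big data", "big query", "data flow"])
        | "Split" >> beam.FlatMap(str.split)
        | "PairWithOne" >> beam.Map(lambda word: (word, 1))
        | "Count" >> beam.CombinePerKey(sum)
        | "Print" >> beam.Map(print)
    )
```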
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Curo Resourcing Ltd
… etc.; Infrastructure as Code and CI/CD paradigms and systems such as Ansible, Terraform, Jenkins, Bamboo, Concourse, etc.; observability (SRE); big data solutions (ecosystems) and technologies such as Apache Spark and the Hadoop ecosystem; excellent knowledge of YAML or similar languages. The following technical skills and experience would be desirable: JupyterHub awareness; RabbitMQ or other common queue …
Bristol, Avon, England, United Kingdom Hybrid / WFH Options
Hays Specialist Recruitment Limited
… You'll need to be a hands-on technical data engineer with experience across a range of modern technologies. Experience with Microsoft Fabric is preferable, and good working knowledge of Apache Spark is essential: PySpark, Spark SQL and T-SQL. You'll need good working knowledge of cloud data, specifically designing scalable solutions in Azure Data Lake, Synapse & Delta Lake. …
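As a sketch of the PySpark/Spark SQL and Delta Lake work the Hays listing above describes, the example below assumes an environment with Delta Lake support available: locally that means the delta-spark package, while in a Microsoft Fabric or Databricks notebook the configured `spark` session is already provided and the setup block is unnecessary. The path and column names are illustrative.

```python
# Minimal sketch: write a DataFrame in Delta format and query it back with Spark SQL.
# Requires pyspark and the delta-spark package; path and column names are illustrative.
from delta import configure_spark_with_delta_pip
from pyspark.sql import SparkSession

builder = (
    SparkSession.builder.appName("delta-example")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog", "org.apache.spark.sql.delta.catalog.DeltaCatalog")
)
spark = configure_spark_with_delta_pip(builder).getOrCreate()

df = spark.createDataFrame(
    [("2024-01-01", 10), ("2024-01-02", 12)],
    ["event_date", "event_count"],
)

df.write.format("delta").mode("overwrite").save("/tmp/delta/daily_events")

spark.read.format("delta").load("/tmp/delta/daily_events").createOrReplaceTempView("daily_events")
spark.sql("SELECT * FROM daily_events ORDER BY event_date").show()
```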
Principal Data Engineer, Consulting - Bristol Based. You must be eligible for SC Clearance. Role Overview: The Principal Data Engineer will be responsible for designing and implementing cloud-based data solutions using a range of AWS services. This role involves working …