… Azure technologies. Proficiency in Azure data services (Azure SQL Database, Azure Synapse Analytics, Azure Data Factory, Azure Databricks). Experience with data modeling, data warehousing, and big data processing (Hadoop, Spark, Kafka). Strong understanding of SQL and NoSQL databases and of ETL/ELT processes. Proficiency in at least one programming language (Python, C#, Java). Experience …
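Roles like this one lean heavily on ETL/ELT. As a rough illustration only, here is a minimal extract-transform-load sketch in plain Python; the staging_orders/orders_clean tables and the pence-to-pounds rule are invented for the example, not taken from the listing.

```python
# A minimal ETL sketch: extract rows from a staging table, transform
# them, and load them into a target table. All table and column names
# here are hypothetical.
import sqlite3

def etl(conn: sqlite3.Connection) -> None:
    # Extract: read raw order rows from a staging table.
    rows = conn.execute("SELECT id, amount_pence FROM staging_orders").fetchall()
    # Transform: convert pence to pounds and drop non-positive amounts.
    cleaned = [(order_id, amount / 100) for order_id, amount in rows if amount > 0]
    # Load: write the cleaned rows into the warehouse table.
    conn.executemany("INSERT INTO orders_clean (id, amount_gbp) VALUES (?, ?)", cleaned)
    conn.commit()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE staging_orders (id INTEGER, amount_pence INTEGER)")
    conn.execute("CREATE TABLE orders_clean (id INTEGER, amount_gbp REAL)")
    conn.executemany("INSERT INTO staging_orders VALUES (?, ?)", [(1, 1250), (2, -30)])
    etl(conn)
    print(conn.execute("SELECT * FROM orders_clean").fetchall())  # [(1, 12.5)]
```

In a production pipeline the same extract/transform/load shape would typically run against a warehouse such as Azure Synapse rather than SQLite.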
… for the last 10 years, and the ability to obtain security clearance. Preferred Skills: experience with cloud platforms (IBM Cloud, AWS, Azure); knowledge of big data frameworks (Apache Spark, Hadoop); experience with business intelligence and reporting tools such as IBM Cognos or Tableau; certifications in relevant technologies are a plus. Additional Details: Seniority level: Mid-Senior level. Employment type: Full-time …
Bristol, England, United Kingdom Hybrid / WFH Options
Leonardo SpA
Job Description: The Opportunity: The Leonardo Cyber & Security Division, one of the three divisions in Leonardo UK, is a pivotal innovator, helping customers deliver and secure their digital transformation. The Cyber & Security Division is at the forefront of supplying technology …
City Of Bristol, England, United Kingdom Hybrid / WFH Options
ADLIB Recruitment | B Corp™
Clear communicator, able to translate complex data concepts to cross-functional teams. Bonus points for experience with: DevOps tools like Docker, Kubernetes, CI/CD; big data tools (Spark, Hadoop), ETL workflows, or high-throughput data streams; genomic data formats and tools; cold and hot storage management, ZFS/RAID systems, or tape storage; AI/LLM tools to …
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Leonardo UK Ltd
… Docker; experience with NLP and/or computer vision; exposure to cloud technologies (e.g. AWS and Azure); exposure to big data technologies; exposure to Apache products, e.g. Hive, Spark, Hadoop, NiFi; programming experience in other languages. This is not an exhaustive list, and we are keen to hear from you even if you don't tick every box. The …
Bristol, England, United Kingdom Hybrid / WFH Options
Lloyds Bank plc
… exciting new technologies to design and build scalable real-time data applications. Spanning the full data lifecycle and experience using a mix of modern and traditional data platforms (e.g. Hadoop, Kafka, GCP, Azure, Teradata, SQL Server), you'll get to work building capabilities with horizon-expanding exposure to a host of wider technologies and careers in data. Helping in … and non-relational databases to build data solutions, such as SQL Server/Oracle; experience with relational and dimensional data structures; experience in using distributed frameworks (Spark, Flink, Beam, Hadoop); proficiency in infrastructure as code (IaC) using Terraform; experience with CI/CD pipelines and related tools/frameworks; containerisation: good knowledge of containers (Docker, Kubernetes etc. …
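To make "scalable real-time data applications" concrete, here is a minimal sketch of a Spark Structured Streaming job that reads a Kafka topic and lands it as Parquet. It assumes pyspark plus the Kafka connector package (spark-sql-kafka-0-10) on the classpath; the broker address, topic name, and sink paths are hypothetical, not from the listing.

```python
# Sketch: consume a Kafka topic with Spark Structured Streaming and
# write the records to Parquet. Broker, topic, and paths are invented.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("kafka-stream-example").getOrCreate()

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
    .option("subscribe", "events")                     # hypothetical topic
    .load()
    # Kafka delivers keys/values as bytes; cast to strings for downstream use.
    .select(col("key").cast("string"), col("value").cast("string"))
)

query = (
    events.writeStream.format("parquet")
    .option("path", "/tmp/events")                     # hypothetical sink path
    .option("checkpointLocation", "/tmp/events_ckpt")  # required for recovery
    .start()
)
query.awaitTermination()
```

The checkpoint directory is what lets the query recover its progress after a restart, which is the usual baseline for reliability in pipelines like the one described above.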
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Actica Consulting Limited
… build scalable data infrastructure, develop machine learning models, and create robust solutions that enhance public service delivery. Working in classified environments, you'll tackle complex challenges using tools like Hadoop, Spark, and modern visualisation frameworks while implementing automation that drives government efficiency. You'll collaborate with stakeholders to transform legacy systems, implement data governance frameworks, and ensure solutions meet … R; collaborative, team-based development; cloud analytics platforms, e.g. relevant AWS and Azure platform services; data tools: hands-on experience with Palantir (essential); data science approaches and tooling, e.g. Hadoop, Spark; data engineering approaches; database management, e.g. MySQL, Postgres; software development methods and techniques, e.g. Agile methods such as SCRUM; software change management, notably familiarity with Git; public sector …
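As one small instance of the Hadoop/Spark machine-learning tooling named above, here is a sketch of training a classifier with Spark MLlib; the three toy rows and their labels are invented purely for illustration.

```python
# Sketch: train a logistic regression model with Spark MLlib.
# The feature vectors and labels below are toy values, not real data.
from pyspark.ml.classification import LogisticRegression
from pyspark.ml.linalg import Vectors
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("mllib-example").getOrCreate()

# MLlib expects a DataFrame with a vector "features" column and a "label" column.
train = spark.createDataFrame(
    [(Vectors.dense([0.0, 1.1]), 0.0),
     (Vectors.dense([2.0, 1.0]), 1.0),
     (Vectors.dense([2.2, -1.5]), 1.0)],
    ["features", "label"],
)
model = LogisticRegression(maxIter=10).fit(train)
print(model.coefficients)
spark.stop()
```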
Greater Bristol Area, United Kingdom Hybrid / WFH Options
LHH
… oriented, with a strong focus on data accuracy, quality, and reliability. Desirable (nice to have): background in defence, government, or highly regulated sectors; familiarity with Apache Kafka, Spark, or Hadoop; experience with Docker and Kubernetes; use of monitoring/alerting tools such as Prometheus, Grafana, or ELK; understanding of machine learning algorithms and data science workflows; proven ability to …
Bristol, England, United Kingdom Hybrid / WFH Options
Ripjar
in production and have a curiosity and interest in learning more. In this role, you will be using Python (specifically PySpark) and Node.js for processing data, backed by various Hadoop stack technologies such as HDFS and HBase. MongoDB and Elasticsearch are used for indexing smaller datasets. Airflow & NiFi are used to co-ordinate the processing of data, while Jenkins … software systems in production and have a curiosity and interest in learning more. You will be using Python (specifically PySpark) and Node.js for processing data; you will be using Hadoop stack technologies such as HDFS and HBase; experience using MongoDB and Elasticsearch for indexing smaller datasets would be beneficial; experience using Airflow & NiFi to co-ordinate the processing of …
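Since this listing spells out a PySpark-over-HDFS pipeline, here is a minimal sketch of that pattern: read a dataset from HDFS, aggregate it, and write the result back for a downstream indexer. The namenode address, paths, and column names are hypothetical; a job like this would typically run as one task in an Airflow DAG.

```python
# Sketch: a PySpark batch job over HDFS. The namenode address, input and
# output paths, and the entity_id column are all invented for illustration.
from pyspark.sql import SparkSession
from pyspark.sql.functions import count

spark = SparkSession.builder.appName("hdfs-batch-example").getOrCreate()

# Read a Parquet dataset stored on HDFS.
records = spark.read.parquet("hdfs://namenode:8020/data/records")

# Aggregate: count records per entity.
summary = records.groupBy("entity_id").agg(count("*").alias("n_records"))

# Write the summary back to HDFS for a downstream consumer (e.g. an indexer).
summary.write.mode("overwrite").parquet("hdfs://namenode:8020/data/record_counts")
spark.stop()
```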
… visualization tools such as QuickSight, Tableau, Looker, or QlikSense. Ability to create well-documented, scalable, and reusable data solutions. Desirable Skills: experience with big data technologies such as Hadoop, MapReduce, or Spark; exposure to microservice-based data APIs; familiarity with data solutions in other public cloud platforms; AWS certifications (e.g., Solutions Architect Associate, Big Data Specialty …
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Curo Resourcing Ltd
… and CI/CD paradigms and systems such as Ansible, Terraform, Jenkins, Bamboo, Concourse etc.; observability/SRE; big data solutions (ecosystems) and technologies such as Apache Spark and the Hadoop ecosystem; excellent knowledge of YAML or similar languages. The following technical skills and experience would be desirable: JupyterHub awareness; RabbitMQ or other common queue technology, e.g. ActiveMQ; NiFi; Rego …
Bristol, England, United Kingdom Hybrid / WFH Options
hays-gcj-v4-pd-online
… valued at Young Lives vs Cancer. To succeed, you should demonstrate: hands-on experience with data engineering tools and technologies (databases, data warehouses, data integration solutions, SQL, Python, Spark, Hadoop, etc.); knowledge of data architecture and best practices in data integration; experience with data platform solutions such as Microsoft Fabric, Snowflake, or Redshift; stakeholder engagement skills, supporting understanding of data technologies, opportunities …