a Data Engineer:
- Design and architect modern data solutions that align with business objectives and technical requirements
- Design and implement advanced ETL/ELT pipelines using Python, SQL, and Apache Airflow
- Build highly scalable and performant data solutions leveraging cloud platforms and technologies
- Develop complex data models to handle enterprise-level analytical needs
- Make critical technical decisions on tools … Services (SSIS)
- Good experience with ETL - SSIS, SSRS, T-SQL (on-prem/cloud)
- Strong proficiency in SQL and Python for handling complex data problems
- Hands-on experience with Apache Spark (PySpark or Spark SQL)
- Experience with the Azure data stack
- Knowledge of workflow orchestration tools like Apache Airflow
- Experience with containerisation technologies like Docker
- Proficiency … platforms
- Experience with data quality frameworks and implementation
- Understanding of data lineage and metadata management
- Experience with technical project management
- Experience with data visualisation tools like Power BI or Apache Superset
- Experience with other cloud data platforms like AWS, GCP or Oracle
- Experience with modern unified data platforms like Databricks or Microsoft Fabric
- Experience with Kubernetes for container orchestration
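The core ask above — ETL/ELT pipelines built with Python and SQL — follows a simple extract-transform-load shape. A minimal standard-library sketch (the CSV payload, table schema, and £10 filter threshold are all illustrative; in the stack this listing describes, each step would typically run as an Airflow task):

```python
import csv
import io
import sqlite3

# Extract: parse CSV rows (here from an in-memory string; normally a file or API response).
raw = io.StringIO("order_id,amount\n1,19.99\n2,5.00\n3,42.50\n")
rows = list(csv.DictReader(raw))

# Transform: cast types and filter out orders below an illustrative threshold.
cleaned = [(int(r["order_id"]), float(r["amount"]))
           for r in rows if float(r["amount"]) >= 10]

# Load: write into a SQL table, then aggregate with SQL.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER PRIMARY KEY, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)", cleaned)
total = conn.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
print(round(total, 2))
```

The same extract/transform/load split is what an orchestrator like Airflow schedules and retries; only the scale and the storage targets change.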
Greater Bristol Area, United Kingdom Hybrid / WFH Options
ADLIB Recruitment | B Corp™
systems; clear communicator, able to translate complex data concepts to cross-functional teams. Bonus points for experience with:
- DevOps tools like Docker, Kubernetes, CI/CD
- Big data tools (Spark, Hadoop), ETL workflows, or high-throughput data streams
- Genomic data formats and tools
- Cold and hot storage management, ZFS/RAID systems, or tape storage
- AI/LLM tools
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Made Tech Limited
- Strong experience in Infrastructure as Code (IaC) and deploying infrastructure across environments
- Managing cloud infrastructure with a DevOps approach
- Handling and transforming various data types (JSON, CSV, etc.) using Apache Spark, Databricks, or Hadoop
- Understanding modern data system architectures (Data Warehouses, Data Lakes, Data Meshes) and their use cases
- Creating data pipelines on cloud platforms with error handling …
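The "transforming various data types (JSON, CSV, etc.)" requirement is, at its smallest, a reshape between formats. A standard-library sketch (field names and values are made up; at scale the same step would be a Spark read of line-delimited JSON followed by a column selection and write):

```python
import csv
import io
import json

# Line-delimited JSON input — the same record-per-line layout Spark's JSON reader consumes.
payload = '{"user": "alice", "clicks": 3}\n{"user": "bob", "clicks": 7}\n'
records = [json.loads(line) for line in payload.splitlines()]

# Flatten to CSV with an explicit column order.
out = io.StringIO()
writer = csv.DictWriter(out, fieldnames=["user", "clicks"])
writer.writeheader()
writer.writerows(records)
print(out.getvalue())
```

Fixing the column order explicitly (rather than inferring it per record) is the same schema-on-write discipline the warehouse architectures listed above rely on.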
systems such as: Ansible, Terraform, Jenkins, Bamboo, Concourse etc.
- Monitoring utilising products such as: Prometheus, Grafana, ELK, Filebeat etc.
- Observability - SRE
- Big Data solutions (ecosystems) and technologies such as: Apache Spark and the Hadoop Ecosystem
- Edge technologies e.g. NGINX, HAProxy etc.
- Excellent knowledge of YAML or similar languages
The following Technical Skills & Experience would be desirable for Data …
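Prometheus, named in the monitoring list above, scrapes targets over HTTP in a plain-text exposition format, so emitting metrics needs no client library at all. A minimal sketch of that format (the metric name and value are made up for illustration):

```python
def render_metrics(counters):
    """Render a dict of counter values in Prometheus' text exposition format:
    a '# TYPE <name> counter' line followed by '<name> <value>'."""
    lines = []
    for name, value in sorted(counters.items()):
        lines.append(f"# TYPE {name} counter")
        lines.append(f"{name} {value}")
    return "\n".join(lines) + "\n"

print(render_metrics({"http_requests_total": 1027}))
```

In practice a service exposes this text on a `/metrics` endpoint and Prometheus pulls it on a schedule; Grafana (also listed above) then charts the stored series.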
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Curo Resourcing Ltd
etc.
- Infrastructure as Code and CI/CD paradigms and systems such as: Ansible, Terraform, Jenkins, Bamboo, Concourse etc.
- Observability - SRE
- Big Data solutions (ecosystems) and technologies such as: Apache Spark and the Hadoop Ecosystem
- Excellent knowledge of YAML or similar languages
The following Technical Skills & Experience would be desirable:
- Jupyter Hub awareness
- RabbitMQ or other common queue …
Bristol, Gloucestershire, United Kingdom Hybrid / WFH Options
Actica Consulting Limited
scalable data infrastructure, develop machine learning models, and create robust solutions that enhance public service delivery. Working in classified environments, you'll tackle complex challenges using tools like Hadoop, Spark, and modern visualisation frameworks while implementing automation that drives government efficiency. You'll collaborate with stakeholders to transform legacy systems, implement data governance frameworks, and ensure solutions meet the …
- Collaborative, team-based development
- Cloud analytics platforms e.g. relevant AWS and Azure platform services
- Data tools: hands-on experience with Palantir ESSENTIAL
- Data science approaches and tooling e.g. Hadoop, Spark
- Data engineering approaches
- Database management, e.g. MySQL, Postgres
- Software development methods and techniques e.g. Agile methods such as SCRUM
- Software change management, notably familiarity with git
- Public sector best …