3 of 3 Data Ingestion Jobs in Bristol

Senior Data Engineer

Hiring Organisation
Advanced Resource Managers
Location
Greater Bristol Area, United Kingdom
Senior Data Engineer | Bristol | 12-Month Contract | Paying up to £79p/h (Outside IR35)
Role Overview: Our client, a large Aerospace company, is looking for an experienced Senior Data Engineer to assist with building and managing data pipelines using the Elastic Stack (Elasticsearch, Logstash, Kibana … Apache NiFi
Key Responsibilities:
- Design, develop, and maintain secure and scalable data pipelines using the Elastic Stack (Elasticsearch, Logstash, Kibana) and Apache NiFi.
- Implement data ingestion, transformation, and integration processes, ensuring data quality and security.
- Collaborate with data architects and security teams to ensure compliance ...
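The responsibilities above centre on an ingest, transform, and index pattern. A minimal sketch of that pattern in plain Python follows; the index step is stubbed in memory (in the advertised role Elasticsearch and NiFi would fill these stages), and all record and field names here are invented for illustration:

```python
from datetime import datetime, timezone

def validate(record):
    """Quality gate: drop records missing required fields."""
    return all(record.get(k) for k in ("id", "message"))

def transform(record):
    """Enrichment step: normalise the payload and stamp ingest time."""
    return {
        "id": record["id"],
        "message": record["message"].strip().lower(),
        "ingested_at": datetime.now(timezone.utc).isoformat(),
    }

def ingest(records, index):
    """Load step: write transformed records into an in-memory 'index',
    a stand-in for an Elasticsearch bulk request."""
    loaded = 0
    for rec in records:
        if not validate(rec):
            continue  # quarantine/drop records that fail the quality gate
        doc = transform(rec)
        index[doc["id"]] = doc
        loaded += 1
    return loaded

index = {}
raw = [
    {"id": "1", "message": "  Pipeline started  "},
    {"id": "", "message": "missing id"},  # fails validation
    {"id": "2", "message": "OK"},
]
print(ingest(raw, index))  # 2 of the 3 records pass the quality gate
```

The same three stages map onto the stack named in the listing: NiFi or Logstash handles collection and routing, the transform step becomes a Logstash filter or NiFi processor, and the load step becomes a bulk index into Elasticsearch.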

Systems Engineer

Hiring Organisation
Synergize Consulting
Location
Greater Bristol Area, United Kingdom
DESCRIPTION
This role requires strong expertise in building and managing data pipelines using the Elastic Stack (Elasticsearch, Logstash, Kibana) and Apache NiFi. The successful candidate will design, implement, and maintain scalable, secure data solutions, ensuring compliance with strict security standards and regulations. This is a UK-based onsite … role with the option of compressed hours.
The role will include:
- Design, develop, and maintain secure and scalable data pipelines using the Elastic Stack (Elasticsearch, Logstash, Kibana) and Apache NiFi.
- Implement data ingestion, transformation, and integration processes, ensuring data quality and security.
- Collaborate with data ...

Data Engineer

Hiring Organisation
Venn Group
Location
Bristol, Avon, England, United Kingdom
Employment Type
Contractor
Contract Rate
£225 - £249 per day
West of England Combined Authority are seeking a highly skilled Data Engineer to support the optimisation, documentation, and enhancement of our existing data pipelines within the Databricks Lakehouse environment. This role will involve hands-on development, reverse-engineering of current solutions, and close collaboration with internal teams …
- debug Databricks notebooks, Delta Lake workflows, and Genie-bot automations
- Design, maintain, and enhance ETL pipelines, ensuring reliability and performance
- Validate, transform, and orchestrate data using Python and SQL within the Lakehouse environment
- Manage data ingestion and scheduling processes via Azure Data Factory
- Reverse-engineer, document ...
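The "validate, transform ... using Python and SQL" responsibility above boils down to running quality checks and aggregations against warehouse tables. A small sketch of that idea, using the standard-library sqlite3 module purely as a stand-in for the Databricks SQL engine (table and column names are invented for the example):

```python
import sqlite3

# In-memory table standing in for a raw Lakehouse (bronze) table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_readings (sensor TEXT, value REAL)")
conn.executemany(
    "INSERT INTO raw_readings VALUES (?, ?)",
    [("s1", 10.0), ("s1", None), ("s2", 4.5), ("s2", 5.5)],
)

# Validate: count rows failing a not-null rule before transforming.
bad = conn.execute(
    "SELECT COUNT(*) FROM raw_readings WHERE value IS NULL"
).fetchone()[0]

# Transform: aggregate the clean rows into a curated (silver) result set.
curated = conn.execute(
    "SELECT sensor, AVG(value) FROM raw_readings "
    "WHERE value IS NOT NULL GROUP BY sensor ORDER BY sensor"
).fetchall()

print(bad)      # 1 row failed validation
print(curated)  # [('s1', 10.0), ('s2', 5.0)]
```

In the role as described, the same check-then-aggregate steps would run as Spark SQL in Databricks notebooks against Delta tables, with Azure Data Factory triggering the schedule.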