Bristol, England, United Kingdom Hybrid / WFH Options
Leonardo
the UK’s digital landscape. This role requires strong expertise in building and managing data pipelines using the Elastic Stack (Elasticsearch, Logstash, Kibana) and Apache NiFi. The successful candidate will design, implement, and maintain scalable, secure data solutions, ensuring compliance with strict security standards and regulations. This is a … of compressed hours. The role will include: Design, develop, and maintain secure and scalable data pipelines using the Elastic Stack (Elasticsearch, Logstash, Kibana) and Apache NiFi. Implement data ingestion, transformation, and integration processes, ensuring data quality and security. Collaborate with data architects and security teams to ensure compliance with … Engineer in secure or regulated environments Expertise in the Elastic Stack (Elasticsearch, Logstash, Kibana) for data ingestion, transformation, indexing, and visualization Strong experience with Apache NiFi for building and managing complex data flows and integration processes Knowledge of security practices for handling sensitive data, including encryption, anonymization, and …
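By way of illustration of the ingestion-and-transformation work this role describes, here is a minimal sketch using the official Elasticsearch Python client; the index name, field names, and cluster address are assumptions, not details from the posting.

```python
# Minimal ingest sketch: transform records, then bulk-index them into
# Elasticsearch. Index name, fields, and cluster URL are hypothetical.
from elasticsearch import Elasticsearch
from elasticsearch.helpers import bulk

es = Elasticsearch("http://localhost:9200")

def transform(record: dict) -> dict:
    # Example transformation step: drop empty fields before indexing.
    return {k: v for k, v in record.items() if v is not None}

def ingest(records: list[dict]) -> None:
    actions = ({"_index": "events", "_source": transform(r)} for r in records)
    bulk(es, actions)

ingest([{"host": "app01", "status": 200, "note": None}])
```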
learning and data visualization as applicable. The ability to use/code in a language applicable to the project or task order such as Apache Hadoop, Python, and advanced knowledge of machine learning. Responsibilities: Work with stakeholders to understand their data needs - research and provide solutions to meet future … Experience in building and maintaining an enterprise data model Experience in implementing data pipelines using … ETL and ELT technologies such as Apache Kafka and Apache NiFi Experience in data architecture and management tools such as ER/Studio, Alation, and DataHub Experience with data modeling, data warehousing, and data analytics Experience with cloud technologies and cloud computing platforms Experience with security and …
Experience with data access control, specifically Role-Based Access Control (RBAC) and Attribute-Based Access Control (ABAC) Familiarity with data science platforms (Anaconda, Jupyter, NiFi) Experience with Python, C#, or similar languages will be beneficial Knowledge of modern web development frameworks (Node.js, React, Angular) Experience developing and maintaining complex …
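To illustrate the distinction between the two access-control models named above, a minimal Python sketch; the roles, attributes, and policy rules are all hypothetical.

```python
# RBAC: permission follows the role. ABAC: permission is computed from
# user and resource attributes at request time. All values illustrative.
ROLE_PERMS = {"analyst": {"read"}, "admin": {"read", "write"}}

def rbac_allows(role: str, action: str) -> bool:
    return action in ROLE_PERMS.get(role, set())

def abac_allows(user: dict, resource: dict, action: str) -> bool:
    # Example policy: reads allowed when clearance covers the resource's
    # classification and both sides belong to the same project.
    return (
        action == "read"
        and user["clearance"] >= resource["classification"]
        and user["project"] == resource["project"]
    )

print(rbac_allows("analyst", "write"))  # False
print(abac_allows({"clearance": 3, "project": "x"},
                  {"classification": 2, "project": "x"}, "read"))  # True
```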
Manchester, England, United Kingdom Hybrid / WFH Options
Gaming Innovation Group
Object-oriented programming (Java) Data modeling using various database technologies ETL processes (transferring data in-memory, moving away from traditional ETLs) and experience with Apache Spark or Apache NiFi Applied understanding of CI/CD in change management Dockerized applications Using distributed version control systems Being an …
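The "transferring data in-memory" style mentioned above might look like the following PySpark sketch; the storage paths and column names are placeholders, not details from the posting.

```python
# Extract, transform in memory with Spark DataFrames, then load.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

df = spark.read.json("s3a://bucket/raw/events/")     # extract
cleaned = (
    df.filter(F.col("user_id").isNotNull())          # transform in memory
      .withColumn("day", F.to_date("timestamp"))
)
cleaned.write.mode("overwrite").parquet("s3a://bucket/curated/events/")  # load
```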
Newcastle upon Tyne, England, United Kingdom Hybrid / WFH Options
Gaming Innovation Group
at: Object-oriented programming (Java) Data modelling using any database technologies ETL processes (ETLs are old school, we transfer in memory now) and experience with Apache Spark or Apache NiFi Applied understanding of CI/CD in change management Dockerised applications Used distributed version control systems Excellent team player …
processes. Technical Skills Programming: Proficiency in Python, Java, Scala, or similar languages. Big Data Technologies: Hands-on experience with big data tools (e.g. Databricks, Apache Spark, Hadoop). Cloud Platforms: Familiarity with AWS, Azure, GCP, or other cloud ecosystems for data engineering tasks. Expertise in relational databases (e.g. Postgres … SQL Server) Data Integration Tools: Knowledge of platforms like Airflow, Apache NiFi, or Talend. Data Storage and Modelling: Experience with data warehousing tools (e.g. Snowflake, Redshift, BigQuery) and schema design. Version Control and CI/CD: Familiarity with Git, Docker, and CI/CD pipelines for deployment. Experience …
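As a concrete illustration of the orchestration tools listed, a minimal Airflow DAG sketch, assuming Airflow 2.4 or newer (where the schedule argument replaced schedule_interval); the dag_id and task bodies are placeholders.

```python
# Skeleton of a daily extract-transform-load DAG; task bodies elided.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract(): ...
def transform(): ...
def load(): ...

with DAG(
    dag_id="daily_etl_sketch",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t3 = PythonOperator(task_id="load", python_callable=load)
    t1 >> t2 >> t3  # linear dependency: extract, then transform, then load
```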
cloud application deployment across AWS, Google Cloud, and Microsoft Azure - Design, build, and maintain extract, transform and load (ETL) workflows using tools such as Apache NiFi - Develop visualization suite using open source and licensed metrics and monitoring tools - Integrate complex applications with external systems, services, and data providers … in a team environment. - Excellent verbal and written communication skills. Preferred Qualifications: - TS/SCI w/Poly - Experience with relational databases - Experience with Apache Spark - Experience in an Agile development environment - AWS Cloud Certifications (Associate or Professional …
Python, and Go Web libraries and frameworks such as React and Angular Designing, building, and maintaining CI/CD pipelines Big data technologies like NiFi, Hadoop, Spark Cloud and containerization technologies such as AWS, OpenShift, Kubernetes, Docker DevOps methodologies, including infrastructure as code and GitOps Database technologies, e.g., relational …
London, England, United Kingdom Hybrid / WFH Options
Gemba Advantage
and Go Web libraries and frameworks such as React and Angular Designing, building, and maintaining CI/CD pipelines Big data technologies, such as NiFi, Hadoop, Spark Cloud and containerization technologies such as AWS, OpenShift, Kubernetes, Docker DevOps methodologies, such as infrastructure as code and GitOps Database technologies, e.g. …
and other Kafka ecosystem tools Kafka knowledge of/with: Kafka Security Manager, Kafka Schema Registry, and Kafka architecture and internals Experience with Apache NiFi Experience with AWS Experience with MongoDB, Redis Experience with agile development methodologies Data Pipeline Experience: You will be involved with acquiring and …
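As an illustration of the consume-transform-produce pattern behind such pipelines, a minimal sketch using the kafka-python client; the topic names and broker address are assumptions.

```python
# Read from one topic, apply a transformation, forward to another.
import json
from kafka import KafkaConsumer, KafkaProducer

consumer = KafkaConsumer(
    "raw-events",
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

# Runs until interrupted; each message is transformed and re-published.
for msg in consumer:
    event = msg.value
    event["processed"] = True              # transformation step
    producer.send("clean-events", event)   # forward to downstream topic
```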
Cheltenham, Gloucestershire, United Kingdom Hybrid / WFH Options
Gemba Advantage
and Go Web libraries and frameworks such as React and Angular Designing, building, and maintaining CI/CD pipelines Big data technologies, such as NiFi, Hadoop, Spark Cloud and containerization technologies such as AWS, OpenShift, Kubernetes, Docker DevOps methodologies, such as infrastructure as code and GitOps Database technologies, e.g. …
week. Flexibility is key to accommodate any schedule changes per the customer. Preferred Requirements Experience with big data technologies like: Hadoop, Accumulo, Ceph, Spark, NiFi, Kafka, PostgreSQL, Elasticsearch, Hive, Drill, Impala, Trino, Presto, etc. Experience with containers, EKS, Diode, CI/CD, and Terraform is a plus. Work could …
is required. Application Development and Deployment: Utilize NodeJS, ReactJS, and RESTful APIs to develop and deploy scalable applications. Ensure seamless integration with Elasticsearch and Apache NiFi. CI/CD Pipelines: Design and implement CI/CD pipelines using Jenkins, Kubernetes, and OpenShift. Automate testing, building, and deployment processes to … for orchestration. Solid understanding of CI/CD principles and experience with Jenkins. Strong scripting skills (Bash, Python, Ruby). Familiarity with Elasticsearch and Apache NiFi is desirable. Desirable Certifications AWS Certifications (preferably AWS Certified DevOps Engineer - Professional or higher). Red Hat Certified Engineer (RHCE) or equivalent …
Demonstrated experience working with big data processing and NoSQL databases such as MongoDB, ElasticSearch, MapReduce, and HBase. Demonstrated experience with Apache NiFi. Demonstrated experience with Extract, Transform, and Load (ETL) processes. Demonstrated experience managing and mitigating IT security vulnerabilities using Plans of Action and …
BE NICE FOR THE BIG DATA ENGINEER TO HAVE.... Cloud-based architectures Microservice architecture or serverless architecture Messaging/routing technologies such as Apache NiFi/RabbitMQ Experience of DevSecOps automated deployment tools such as Jenkins, Ansible, Docker TO BE CONSIDERED.... Please either apply by clicking online …
with Maven, Rancher, SonarQube Experience with Git, Ruby, ArgoCD Experience with Terraform AWS Cloud Computing Skills, AWS CloudWatch, Elasticsearch, Prometheus, Grafana Familiarity with NiFi, Apache Kafka Kalman Filtering Education/Experience Requirements SME Level: Expert consultant to top management, typically with an advanced degree and 13+ years …
CD: Jenkins, GitLab CI/CD, Terraform Cloud & Infrastructure: AWS Testing & Quality: Cucumber, SonarQube Monitoring & Logging: ELK Stack (Elasticsearch, Logstash, Kibana), Grafana Dataflow & Integration: Apache NiFi Experience across multiple areas is desirable; we don't expect you to know everything but a willingness to learn and contribute across …
Spark, Kafka). • Familiarity with cloud-based data platforms and services (e.g., AWS, Azure, Google Cloud). • Expertise in ETL tools and processes (e.g., Apache NiFi, Talend, Informatica). • Proficiency in data integration tools and technologies. • Familiarity with data visualization and reporting tools (e.g., Tableau, Power BI) is …
tools/data Desired Skills: Agile experience delivering on agile teams (participates in Scrum and PI Planning) Docker, Jenkins, Hadoop/Spark, Kibana, Kafka, NiFi, Elasticsearch …
Jira and Confluence Preferred Qualifications: - Working knowledge of software platforms and services, such as Docker, Kubernetes, JMS/SQS/SNS, Kafka, AWS Lambda, NiFi - Working knowledge of public keys and digital certificates - Experience with automated testing patterns and tools, such as Mocha/Chai, JUnit, NUnit, TestNG - Experience …
Jira and Confluence Preferred Qualifications: Working knowledge of software platforms and services, such as Docker, Kubernetes, JMS/SQS/SNS, Kafka, AWS Lambda, NiFi Working knowledge of public keys and digital certificates Experience with automated testing patterns and tools, such as Mocha/Chai, JUnit, NUnit, TestNG Experience …
Jira and Confluence Desired Qualifications: Working knowledge of software platforms and services, such as Docker, Kubernetes, JMS/SQS/SNS, Kafka, AWS Lambda, NiFi Working knowledge of public keys and digital certificates Experience with automated testing patterns and tools, such as Mocha/Chai, JUnit, NUnit, TestNG Experience …