As a leading global animal health company, Elanco delivers innovative products and services to improve the health of pets and farm animals around the world, because we believe making animals' lives better makes life better. Since 1954, we have provided more »
a similar role. > Proficiency in programming languages such as Python, Java, or Scala. > Strong experience with data processing frameworks such as Apache Spark, Apache Flink, or Hadoop. > Hands-on experience with cloud platforms such as AWS, Google Cloud, or Azure. > Experience with data warehousing technologies such as Redshift, BigQuery more »
AWS Certified Data Analytics Specialty, or AWS Certified Big Data Specialty. Experience with other big data and streaming technologies such as Apache Spark, Apache Flink, or Apache Beam. Knowledge of containerization and orchestration technologies such as Docker and Kubernetes. Experience with data lakes, NoSQL databases, and other data management more »
Kubernetes and Cloud services. Experience with Azure stack will be an asset. Experience designing and implementing event-driven/microservices applications using Apache Kafka, Flink, etc. Exposure to model deployment and serving tools like Seldon Core, KServe, etc. Experience with drift detection and adaptation techniques as well as evaluating more »
including re-architecting. Prior experience working on data-focused projects, e.g. data warehousing, big data, data streaming. Proficiency with Apache Kafka, Apache Spark, Apache Flink, etc. We are an equal opportunities employer and welcome applications from all suitably qualified persons regardless of their race, sex, disability, religion/belief more »
and transitioning towards event-driven architecture. Utilizing Java 17+ with Spring/Spring Boot 3+, SQL/Oracle. Effective experience with Redis, Apache Kafka, Flink or similar. Striving for serverless solutions utilizing Linux, virtualization, containers, Docker, Kubernetes, potentially in production environments, but at least for testing purposes. Production monitoring more »
Manchester, North West, United Kingdom Hybrid / WFH Options
Searchability NS&D Ltd
components including Data Ingest, Data Stores and REST APIs. THE DEVOPS ENGINEER SHOULD HAVE… You must have an active eDV Clearance. Apache NiFi Flink Java Ansible Docker Kubernetes ELK stack Linux Sys Admin for deployed Clusters (10s of servers) Jenkins Pipeline development Integration/debugging Understanding complex … KEY SKILLS: SOFTWARE DEVELOPER/SOFTWARE ENGINEER/SENIOR SOFTWARE DEVELOPER/SENIOR SOFTWARE ENGINEER/DEVOPS ENGINEER/DEVOPS/APACHE NIFI/FLINK/JAVA/ANSIBLE/DOCKER/KUBERNETES/ELK STACK/TERRAFORM/LINUX/GIT more »
services on Kubernetes with Helm/Terraform Good to have prior experience dealing with streaming and batch compute frameworks like Spring Kafka, Kafka Streams, Flink, Spark Streaming, Spark Experience with large-scale computing platforms, such as Hadoop, Hive, Spark, NoSQL stores Experience with developing large-scale data pipelines is more »
production, providing subject matter expertise on the .NET stack and contributing to technical design discussions. You'll use a range of technology including Apache Flink with Java for large-scale data processing and will be able to assess and recommend new and emerging technologies, using the best tool for more »
experience in setting up data platforms and setting standards, not just pipelines. Preferably experience in a distributed data processing environment/framework (e.g. Spark or Flink). Technologies: Java, Kotlin, Python (the candidate is not expected to be proficient in each, but should be open to learning the others) Kubernetes Apache Pulsar more »