development and release engineering practices (e.g. TDD, CI/CD). · Significant experience with Apache Spark or any other distributed data processing framework (e.g. Flink, Arrow, MapR). · Significant experience with SQL – comfortable writing efficient SQL. · Experience using enterprise scheduling tools (e.g. Apache Airflow, Spring DataFlow, Control-M). · Experience …
certification. Very desirable to have hands-on experience with ETL tools, Hadoop-based technologies (e.g. Spark), and batch/streaming data pipelines (e.g. Beam, Flink). Proven expertise in designing and constructing data lake and data warehouse solutions utilising technologies such as BigQuery, Azure Synapse, Redshift, Oracle, Teradata, and …
of AWS services: EC2, Lambda, Aurora, S3. Competency in containerization technologies (e.g. AWS ECS, Kubernetes). Understanding of data paradigms like stream processing (Kafka/Kinesis, Apache Flink). Familiarity with AWS security practices, IAM, encryption, and network security configurations. Experience managing data engineering pipelines using Apache Airflow. Proficiency in CI/CD pipelines and …
AWS Certified Data Analytics Specialty, or AWS Certified Big Data Specialty. Experience with other big data and streaming technologies such as Apache Spark, Apache Flink, or Apache Beam. Knowledge of containerization and orchestration technologies such as Docker and Kubernetes. Experience with data lakes, NoSQL databases, and other data management …
experience in setting up data platforms and setting standards – not just pipelines. Preferably experience in a distributed data processing environment/framework (e.g. Spark or Flink). Technologies: Java, Kotlin, Python (candidates need only be proficient in one, and should be open to learning the others), Kubernetes, Apache Pulsar …
container technologies. Strong communication and interpersonal skills. Experience managing projects and working with external third-party teams. Ideally experience with Apache Spark or Apache Flink (but not essential). Please note, this role is unable to provide sponsorship. If this role sounds of interest and you think your skills match …
production, providing subject matter expertise on the .NET stack and contributing to technical design discussions. You'll use a range of technology including Apache Flink with Java for large-scale data processing and will be able to assess and recommend new and emerging technologies, using the best tool for …
Kubernetes and Cloud services. Experience with the Azure stack will be an asset. Experience designing and implementing event-driven/microservices applications using Apache Kafka, Flink, etc. Exposure to model deployment and serving tools like Seldon Core, KServe, etc. Experience with drift detection and adaptation techniques, as well as evaluating …
London, England, United Kingdom Hybrid / WFH Options
McGregor Boyall
the trading floor. Cutting-Edge Technology Stack: Work with a very modern Java 17-based tech stack, incorporating state-of-the-art Apache Kafka, Flink, Ignite, and Angular 16 technologies. Strategic Advantage: Benefit from the organization's late adoption of prime services business, allowing the team to develop the …
As a leading global animal health company, Elanco delivers innovative products and services to improve the health of pets and farm animals around the world because we believe making animals' lives better, makes life better. Since 1954, we have provided …