Kafka (required) Experience with high-speed distributed computing frameworks such as AWS EMR, Hadoop, HDFS, S3, MapReduce, Apache Spark, Apache Hive, Kafka Streams, Apache Flink, etc. (required) Working knowledge of DevOps tools, e.g. Terraform, Ansible, Jenkins, Kubernetes, Helm, and CI/CD pipelines (required) …
Knowledge and experience with Snowflake and other databases (PostgreSQL, MS SQL Server, MySQL). Experience with big data batch and streaming technologies like Spark, Kafka, Flink, Beam, Kinesis. SnowPro certification or equivalent from AWS. Comfort working within an agile development cycle and exposure to: Linux development, Git and versioning software …
AWS Certified Data Analytics Specialty, or AWS Certified Big Data Specialty. Experience with other big data and streaming technologies such as Apache Spark, Apache Flink, or Apache Beam. Knowledge of containerization and orchestration technologies such as Docker and Kubernetes. Experience with data lakes, NoSQL databases, and other data management …
certification. Very desirable to have hands-on experience with ETL tools, Hadoop-based technologies (e.g., Spark), and batch/streaming data pipelines (e.g., Beam, Flink, etc.). Proven expertise in designing and constructing data lakes and data warehouse solutions utilising technologies such as BigQuery, Azure Synapse, Redshift, Oracle, Teradata, and …
Kubernetes and cloud services. Experience with the Azure stack will be an asset. Experience designing and implementing event-driven/microservices applications using Apache Kafka, Flink, etc. Exposure to model deployment and serving tools like Seldon Core, KServe, etc. Experience with drift detection and adaptation techniques, as well as evaluating …
strong experience in Java. Strong experience with data quality standards and contributions to defining and monitoring data quality metrics and KPIs. Key skills needed: Flink, Beam, Kafka Connect, Java …
real-time analytics solutions, preferably with experience in time-series databases. • Proficiency in technologies relevant to real-time analytics, such as KX, Kafka, Spark, Flink, and real-time visualization tools. • Demonstrated ability to lead and mentor software engineering teams. • Excellent problem-solving skills and the ability to work collaboratively …
At Bazaarvoice, we create smart shopping experiences. Through our expansive global network, product-passionate community & enterprise technology, we connect thousands of brands and retailers with billions of consumers. Our solutions enable brands to connect with consumers and collect valuable user …
of AWS services: EC2, Lambda, Aurora, S3. Competency in containerization technologies (e.g., AWS ECS, Kubernetes). Understanding of data paradigms like stream processing (Kafka/Kinesis, Apache Flink). Familiarity with AWS security practices, IAM, encryption, and network security configurations. Experience managing data engineering pipelines using Apache Airflow. Proficiency in CI/CD pipelines and …
Stack: We are cloud-native, born in AWS, and embrace their services across our platform. Our technology stack includes Python, Django, ECS, Kinesis, CloudFront, Flink, Elasticsearch, Lambda, Amazon MQ, Terraform, and Postgres …
As a leading global animal health company, Elanco delivers innovative products and services to improve the health of pets and farm animals around the world, because we believe making animals' lives better makes life better. Since 1954, we have provided …
Chicago, Illinois, United States Hybrid / WFH Options
Request Technology - Robyn Honquest
Kafka (required) Experience with high-speed distributed computing frameworks such as AWS EMR, Hadoop, HDFS, S3, MapReduce, Apache Spark, Apache Hive, Kafka Streams, Apache Flink, etc. (required) Experience working with various types of databases: relational, NoSQL, object-based, graph (required) Working knowledge of DevOps tools, e.g. Terraform, Ansible …
impressive visualization (Power BI) · Experience in building large-scale DW/BI systems for B2B SaaS companies · Experience with open-source tools like Apache Flink and AWS tools like S3, Redshift, EMR, and RDS · Experience with AI/machine learning and predictive analytics · Experience in developing global products will …
development and release engineering practices (e.g. TDD, CI/CD). · Significant experience with Apache Spark or any other distributed data programming framework (e.g. Flink, Arrow, MapR). · Significant experience with SQL – comfortable writing efficient SQL. · Experience using enterprise scheduling tools (e.g. Apache Airflow, Spring DataFlow, Control-M) · Experience …
We are cloud-native, born in AWS, and embrace their services across our platform. Our technology stack includes Python, Django, ECS, Kinesis, CloudFront, Flink, Elasticsearch, Lambda, Amazon MQ, Terraform, and Postgres. Tooling includes Datadog, Linear, Slack, Notion, and CircleCI. If you think you tick the boxes, then apply …
container technologies. Strong communication and interpersonal skills. Experience managing projects and working with external third-party teams. Ideally experience with Apache Spark or Apache Flink (but not essential). Please note, this role is unable to provide sponsorship. If this role sounds of interest and you think your skills match …
London, England, United Kingdom Hybrid / WFH Options
McGregor Boyall
the trading floor. Cutting-Edge Technology Stack: Work with a very modern Java 17-based tech stack, incorporating state-of-the-art Apache Kafka, Flink, Ignite, and Angular 16 technologies. Strategic Advantage: Benefit from the organization's late adoption of the prime services business, allowing the team to develop the …
experience in setting up data platforms and setting standards, not just pipelines. Preferably experience in a distributed data processing environment/framework (e.g. Spark or Flink). Technologies: Java, Kotlin, Python (the candidate is not expected to be proficient in all of these, and should be open to learning the others), Kubernetes, Apache Pulsar …
production, providing subject matter expertise on the .NET stack and contributing to technical design discussions. You'll use a range of technology including Apache Flink with Java for large-scale data processing, and will be able to assess and recommend new and emerging technologies, using the best tool for …