impressive visualization (Power BI) · Experience in building large-scale DW/BI systems for B2B SaaS companies · Experience with open-source tools like Apache Flink and AWS tools like S3, Redshift, EMR and RDS · Experience with AI/Machine Learning and Predictive Analytics · Experience in developing global products will …
Knowledge and experience with Snowflake and other databases (PostgreSQL, MS SQL Server, MySQL) · Experience with Big Data batch and streaming technologies like Spark, Kafka, Flink, Beam, Kinesis · SnowPro Certification or equivalent from AWS · Comfort working within an agile development cycle and exposure to: Linux development, Git and versioning software …
AWS Certified Data Analytics Specialty, or AWS Certified Big Data Specialty. Experience with other big data and streaming technologies such as Apache Spark, Apache Flink, or Apache Beam. Knowledge of containerization and orchestration technologies such as Docker and Kubernetes. Experience with data lakes, NoSQL databases, and other data management …
development and release engineering practices (e.g. TDD, CI/CD). · Significant experience with Apache Spark or any other distributed data programming framework (e.g. Flink, Arrow, MapR). · Significant experience with SQL – comfortable writing efficient SQL. · Experience using enterprise scheduling tools (e.g. Apache Airflow, Spring DataFlow, Control-M) · Experience …
certification. Very desirable to have hands-on experience with ETL tools, Hadoop-based technologies (e.g., Spark), and batch/streaming data pipelines (e.g., Beam, Flink). Proven expertise in designing and constructing data lakes and data warehouse solutions utilising technologies such as BigQuery, Azure Synapse, Redshift, Oracle, Teradata, and …
Stack We are cloud-native, born in AWS, and embrace their services across our platform. Our technology stack includes Python, Django, ECS, Kinesis, CloudFront, Flink, Elasticsearch, Lambda, Amazon MQ, Terraform, and Postgres …
experience in setting up data platforms and setting standards, not just pipelines. Preferably experience in a distributed data processing environment/framework (e.g. Spark or Flink). Technologies: Java, Kotlin, Python (the candidate is not expected to be proficient in all of them, and should be open to learning the others), Kubernetes, Apache Pulsar …
production, providing subject matter expertise on the .NET stack and contributing to technical design discussions. You'll use a range of technology including Apache Flink with Java for large-scale data processing and will be able to assess and recommend new and emerging technologies, using the best tool for …
container technologies. Strong communication and interpersonal skills. Experience managing projects and working with external third-party teams. Ideally experience with Apache Spark or Apache Flink (but not essential). Please note, this role is unable to provide sponsorship. If this role sounds of interest and you think your skills match …
Kubernetes and Cloud services. Experience with the Azure stack will be an asset. Experience designing and implementing event-driven/microservices applications using Apache Kafka, Flink, etc. Exposure to model deployment and serving tools like Seldon Core, KServe, etc. Experience with drift detection and adaptation techniques as well as evaluating …
services on Kubernetes with Helm/Terraform. Good to have prior experience with streaming and batch compute frameworks like Spring Kafka, Kafka Streams, Flink, Spark Streaming, and Spark. Experience with large-scale computing platforms, such as Hadoop, Hive, Spark, and NoSQL stores. Experience with developing large-scale data pipelines is …
real-time analytics solutions, preferably with experience in time-series databases. • Proficiency in technologies relevant to real-time analytics, such as KX, Kafka, Spark, Flink, and real-time visualization tools. • Demonstrated ability to lead and mentor software engineering teams. • Excellent problem-solving skills and the ability to work collaboratively …
We are cloud-native, born in AWS, and embrace their services across our platform. Our technology stack includes Python, Django, ECS, Postgres, Kinesis, CloudFront, Flink, Elasticsearch, Lambda, Amazon MQ, and Terraform. Tooling includes Datadog, Linear, Slack, Notion and CircleCI. If you think you tick the boxes then apply …
London, England, United Kingdom Hybrid / WFH Options
McGregor Boyall
the trading floor. Cutting-Edge Technology Stack: Work with a very modern Java 17-based tech stack, incorporating state-of-the-art Apache Kafka, Flink, Ignite, and Angular 16 technologies. Strategic Advantage: Benefit from the organization's late adoption of the prime services business, allowing the team to develop the …
At Bazaarvoice, we create smart shopping experiences. Through our expansive global network, product-passionate community & enterprise technology, we connect thousands of brands and retailers with billions of consumers. Our solutions enable brands to connect with consumers and collect valuable user …
in Java, Kafka Connect & Flink. Key skills: Expertise with Kafka and Kafka Connect, and knowledge of RabbitMQ. Good experience with data processing technologies like Apache (Flink/Beam/Spark), Oracle ODI, and Confluent Platform. Good knowledge of Hadoop cluster architecture and hands-on experience within Cloudera Hadoop ecosystems …
As a leading global animal health company, Elanco delivers innovative products and services to improve the health of pets and farm animals around the world because we believe making animals' lives better makes life better. Since 1954, we have provided …
This is a Java Developer role sitting within the QA team, hybrid in Manchester 2 days per week. Responsibilities will revolve around creating frameworks for testing in Java. Insight Global are looking to hire a Lead …
We are hiring for a Data Analyst in the UK. It is a hybrid role with one day of travel to Southampton per week, or fewer trips per month. Job location: Southampton, UK (1 day of travel in a …