and unstructured data. Develop, optimize and automate ETL workflows to extract data from diverse sources. Utilize big data technologies such as Spark, Kafka, and Flink for distributed data processing and analytics. Deploy and manage data solutions on cloud platforms such as AWS, Azure, or GCP. Implement and oversee data more »
advanced analytics infrastructure. Familiarity with infrastructure-as-code (IaC) tools such as Terraform or CloudFormation. Experience with modern data engineering technologies (e.g., Kafka, Spark, Flink, etc.). Why join YouLend? Award-Winning Workplace: YouLend has been recognised as one of the "Best Places to Work 2024" by the Sunday more »
oriented programming (OOP) principles & concepts Familiarity with advanced SQL techniques Familiarity with data visualization tools such as Tableau or Power BI Familiarity with Apache Flink or Apache Storm Understanding of DevOps practices and tools for CI/CD pipelines. Awareness of data security best practices and compliance requirements (e.g. more »
development experience using SQL. Hands-on experience with MPP databases such as Redshift, BigQuery, or Snowflake, and modern transformation/query engines like Spark, Flink, Trino. Familiarity with workflow management tools (e.g., Airflow) and/or dbt for transformations. Comprehensive understanding of modern data platforms, including data governance and more »
technical leadership and people management responsibilities. Proven expertise with distributed real-time processing systems (e.g., Google Pub/Sub, Dataflow, Apache Beam, Apache Kafka, Flink, Spark Streaming). Proficiency in programming languages such as Java, Python, Rust or Go. Strong foundation in real-time data processing and optimization of more »
with dbt, Fivetran, Apache Airflow, Data Mesh, Data Vault 2.0, Fabric, and Apache Spark Experience working with streaming technologies such as Apache Kafka, Apache Flink, or Google Cloud Dataflow Hands-on experience with modern data orchestration tools like Dagster or Prefect Knowledge of data governance and cataloging tools like more »
derivatives. Knowledge of additional data processing libraries and tools to enhance data engineering workflows. Expertise in real-time data processing frameworks such as Apache Flink or Kafka Streams. Experience building event-driven and/or streaming data services. Soft Skills: Strong analytical and problem-solving skills, with the ability more »
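Several of these listings ask for real-time processing in the style of Apache Flink or Kafka Streams. As a rough illustration of the underlying idea (this is not from any posting, and it uses only the Python standard library rather than a real streaming framework), the core pattern is a keyed, tumbling-window aggregation over an event stream:

```python
from collections import defaultdict

# Hypothetical events: (timestamp_seconds, key) pairs, e.g. page views per user.
# A real Flink or Kafka Streams job would consume these from a Kafka topic.
events = [(1, "a"), (2, "b"), (4, "a"), (6, "a"), (7, "b"), (11, "b")]

def tumbling_window_counts(events, window):
    """Group events into fixed, non-overlapping time windows and count per key."""
    counts = defaultdict(lambda: defaultdict(int))
    for ts, key in events:
        window_start = (ts // window) * window  # window the event falls into
        counts[window_start][key] += 1
    return {w: dict(c) for w, c in sorted(counts.items())}

print(tumbling_window_counts(events, 5))
# {0: {'a': 2, 'b': 1}, 5: {'a': 1, 'b': 1}, 10: {'b': 1}}
```

In a production engine the windows would be closed incrementally as watermarks advance, rather than computed in a single batch pass as here.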
to some of the following technologies (or equivalent): Apache Spark, AWS Redshift, AWS S3, Cassandra (and other NoSQL systems), AWS Athena, Apache Kafka, Apache Flink, AWS, and service-oriented architecture. What you'll get: Full responsibility for projects from day one, a collaborative team, and a dynamic work environment. more »
and/or serverless environment, hosted on public cloud (AWS EKS and Kubernetes preferred). Experience working with data pipeline technologies, such as Kafka, Flink, AWS SNS/SQS/Kinesis, etc. Solid understanding of CI/CD concepts, including version control, branching models, appropriate quality control techniques, secrets more »
Experience working with event streaming platforms (Kafka/Kinesis/SQS) Experience with distributed processing systems such as Apache Spark and/or Apache Flink Ability to handle periodic on-call duty as well as out-of-band requests Ability to conduct technical deep dives into the code, cloud more »
to translate business requirements into impactful technical solutions. Develop and maintain robust, scalable machine learning pipelines using cloud services (e.g., AWS SageMaker, AWS Glue, Flink, EC2, S3, Lambda) and other relevant tools. Stay up-to-date with advancements in computer vision and machine learning, identifying opportunities to integrate innovative more »
unlock data ownership with a strong focus on decentralized technology. Are a passionate developer with extensive experience using Python, TypeScript, Node.js, Spark, Trino, dbt, Flink, AWS Data Suite tools like AWS Glue, AWS Athena, AWS QuickSight, AWS Redshift, etc. Other languages like C#, C++, Go, or Rust, for example, are more »
Requirements Strong concurrent programming skills in at least one statically typed language (e.g., Java, Kotlin, Scala, Rust). Proven Stream Processing related experience (e.g., Flink, Kafka Streams, or similar). Experience with Distributed Messaging technologies (e.g., Kafka, Kinesis, or similar). Experience building Streaming Developer Tools & Platforms. Technical expertise more »
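This listing pairs concurrent programming with stream processing (Flink, Kafka Streams). A minimal sketch of the producer/consumer pattern that underlies both, written in Python for brevity rather than the statically typed languages the ad names, with all names invented for illustration:

```python
import queue
import threading

def producer(q, n):
    """Emit n integer events onto the queue, then a sentinel to signal completion."""
    for i in range(n):
        q.put(i)
    q.put(None)  # sentinel: no more events

def consumer(q, results):
    """Drain events off the queue, applying a per-event transform (here: squaring)."""
    while True:
        item = q.get()
        if item is None:
            break
        results.append(item * item)

q = queue.Queue()
results = []
t1 = threading.Thread(target=producer, args=(q, 5))
t2 = threading.Thread(target=consumer, args=(q, results))
t1.start(); t2.start()
t1.join(); t2.join()
print(results)  # [0, 1, 4, 9, 16]
```

The output order is deterministic here because a single consumer drains a FIFO queue; distributed messaging systems such as Kafka make the analogous guarantee only within a partition.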
Grow with us. We are looking for a Machine Learning Engineer to work along the end-to-end ML lifecycle, alongside our existing Product & Engineering team. About Trudenty: The Trudenty Trust Network provides personalised consumer fraud risk intelligence for fraud more »
uv and Docker in Linux • High-performance data stores and query engines like Trino/Snowflake • Real-time streaming analytics tech like Kafka/Flink • OpenShift container technologies and Observability stack like Grafana Skills desired: • Experience with financial concepts such as Equities/Options/Futures • Full stack HTML5 more »
technical efficiency. Ideally you will have experience in Java, Kotlin, Scala, Rust, or other JVM languages. Prior experience with streaming technologies like Kafka Streams, Flink, or Spark is a plus, but not required. More important is technical curiosity: a drive to explore where data originates from and to more »
years in Pre-Sales, Solution Architecture or Technical Sales in a leadership position Strong expertise in real-time data or distributed systems (e.g., Apache Flink, Kafka, or Spark) Proven experience leading and scaling Pre-Sales teams A strategic thinker and strong communicator who has the interpersonal skills to more »
learning as core elements of scalable fraud detection platforms. Real-Time Platform Experience: Proven track record in building scalable, real-time platforms using Kafka, Flink, Python, Java, or similar technologies. Production Expertise: Hands-on experience transitioning platforms through POC, pilot, and production phases. Enterprise and Solutions Architecture: Strong enterprise more »