have experience architecting data pipelines and are self-sufficient in getting the data you need to build and evaluate models, using tools like Dataflow, Apache Beam, or Spark. You care about agile software processes, data-driven development, reliability, and disciplined experimentation. You have experience and passion for fostering … scalable machine learning frameworks. Experience with building data pipelines and getting the data you need to build and evaluate your models, using tools like Apache Beam/Spark. Where You'll Be: We offer you the flexibility to work where you work best! For this role, you can …
with TensorFlow, PyTorch, Scikit-learn, etc. is a strong plus. You have some experience with large-scale, distributed data processing frameworks/tools like Apache Beam, Apache Spark, or even our open-source API for it, Scio, and cloud platforms like GCP or AWS. You care about …
architectures for ML frameworks in complex problem spaces, in collaboration with product teams. Experience with large-scale, distributed data processing frameworks/tools like Apache Beam, Apache Spark, and cloud platforms like GCP or AWS. Where You'll Be: We offer you the flexibility to work where …
years hands-on with Google Cloud Platform. Strong experience with BigQuery, Cloud Storage, Pub/Sub, and Dataflow. Proficient in SQL, Python, and Apache Beam. Familiarity with DevOps and CI/CD pipelines in cloud environments. Experience with Terraform, Cloud Build, or similar tools for infrastructure automation. Understanding of … available) Responsibilities: Design, build, and maintain scalable and reliable data pipelines on Google Cloud Platform (GCP). Develop ETL processes using tools like Cloud Dataflow, Apache Beam, BigQuery, and Cloud Composer. Collaborate with data analysts, scientists, and business stakeholders to understand data requirements. Optimize performance and cost-efficiency of …
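The ETL work this listing describes follows the Beam/Dataflow programming model: element-wise transforms, a shuffle by key, and per-key aggregation. A plain standard-library Python sketch of those three steps (the real Apache Beam SDK expresses them as PCollections and PTransforms; the event records and field names below are made up for illustration):

```python
from collections import defaultdict

def map_transform(records, fn):
    """Element-wise transform, analogous to beam.Map."""
    return [fn(r) for r in records]

def group_by_key(pairs):
    """Shuffle (key, value) pairs by key, analogous to beam.GroupByKey."""
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return dict(grouped)

def combine_per_key(grouped, fn):
    """Per-key aggregation, analogous to beam.CombinePerKey."""
    return {key: fn(values) for key, values in grouped.items()}

# Toy "ETL": raw click events -> (country, 1) pairs -> counts per country.
events = [
    {"user": "a", "country": "UK"},
    {"user": "b", "country": "UK"},
    {"user": "c", "country": "DE"},
]
pairs = map_transform(events, lambda e: (e["country"], 1))
counts = combine_per_key(group_by_key(pairs), sum)
print(counts)  # {'UK': 2, 'DE': 1}
```

On Dataflow, the same chain would run distributed across workers; the sketch only shows the shape of the computation, not the runner.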
includes a firm-wide data catalogue detailing key data elements. Skills Required: Java and/or Scala backends. Distributed compute concepts: Spark, Databricks, Apache Beam, etc. Full-stack experience. Azure cloud technologies. Angular/similar front end. Cloud DB/relational DB: Snowflake/Sybase. Unix experience …
guidance and support to development and operations teams. Staying updated with the latest Kafka features, updates, and industry practices. Required Skills & Experience: Extensive experience with Apache Kafka and real-time architecture, including event-driven frameworks. Strong knowledge of Kafka Streams, Kafka Connect, Spark Streaming, Schema Registry, Flink, and Beam. Experience …
et al.) and a clear understanding of when not to use them. Experience with message queues (SQS, PubSub, RabbitMQ, etc.) and data pipelines (Kafka, Beam, Kinesis, etc.). You are an effective team player with strong communication, presentation, and influencing skills. You have a passion for improving coding and development …
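The point of a message queue such as SQS, Pub/Sub, or RabbitMQ is that producers and consumers are decoupled: each waits only on the queue, never on the other side. A minimal in-process sketch using the standard library's `queue.Queue` and two threads (names and message contents are illustrative):

```python
import threading
import queue

q = queue.Queue(maxsize=100)
results = []

def producer():
    for i in range(5):
        q.put(f"msg-{i}")  # enqueue; blocks only if the queue is full
    q.put(None)            # sentinel: no more messages

def consumer():
    while True:
        msg = q.get()      # blocks until a message arrives
        if msg is None:
            break
        results.append(msg)

t_prod = threading.Thread(target=producer)
t_cons = threading.Thread(target=consumer)
t_prod.start(); t_cons.start()
t_prod.join(); t_cons.join()
print(results)  # ['msg-0', 'msg-1', 'msg-2', 'msg-3', 'msg-4']
```

A hosted broker adds durability, acknowledgements, and cross-process delivery, but the backpressure-via-bounded-queue pattern is the same.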
or all of the services below would put you at the top of our list: Google Cloud Storage, Google Data Transfer Service, Google Dataflow (Apache Beam), Google PubSub, Google Cloud Run, BigQuery or any RDBMS, Python, Debezium/Kafka, dbt (Data Build Tool). Interview process: Interviewing is a two …
AWS, or Azure. Experience with CI/CD pipelines for machine learning (e.g., Vertex AI). Experience with data processing frameworks and tools, particularly Apache Beam/Dataflow, is highly desirable. Knowledge of monitoring and maintaining models in production. Proficiency in employing containerization tools, including Docker, to streamline …
and testing. Contribute to cloud migration for the Treasury platform. Requirements: 3–5 years of software development experience. Hands-on experience with Java, Scala, Spark, Beam, and SQL (Sybase). Strong understanding of multi-threading and high-volume server-side development. Familiarity with Snowflake, Power BI, and cloud platforms (a plus). Good …
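The multi-threading understanding this listing asks for usually comes down to one discipline: shared mutable state on a high-volume server path must be guarded by a lock (or avoided). A small Python sketch of the classic lost-update race and its fix (the counter and simulated requests are illustrative):

```python
from concurrent.futures import ThreadPoolExecutor
import threading

counter = 0
lock = threading.Lock()

def handle(_request):
    """Simulated request handler incrementing a shared counter."""
    global counter
    with lock:            # without the lock, += is a read-modify-write race
        counter += 1

# Simulate 1000 requests handled concurrently by a small thread pool.
with ThreadPoolExecutor(max_workers=8) as pool:
    list(pool.map(handle, range(1000)))

print(counter)  # 1000
```

Dropping the `with lock:` line can silently lose increments under contention, which is exactly the class of bug high-volume server-side code must rule out.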
Are You have proven experience in data engineering, including creating reliable, efficient, and scalable data pipelines using data processing frameworks such as Scio, Dataflow, Beam, or equivalent. You are comfortable working with large datasets using SQL and data analytics platforms such as BigQuery. You are knowledgeable in cloud-based …