management and associated tools such as Git/Bitbucket. Experience in the use of CI/CD tools such as Jenkins, or an understanding of their role. Experience with Apache Spark or Hadoop. Experience in building data pipelines. Experience designing warehouses, ETL pipelines and data modelling. Good knowledge of designing, building, using, and maintaining REST APIs. Good …
solutions. The team's expertise spans a wide range of technologies, including Java- and Python-based microservices, AWS/GCP cloud backend systems, Big Data technologies like Hive and Spark, and modern web applications. With a globally distributed presence across the US, India and Europe, the team thrives on collaboration, bringing together diverse perspectives to solve complex challenges. At … entrepreneurial spirit Excellent verbal and written communication skills BS or MS degree in Computer Science or equivalent Nice to Have Experience in distributed computing frameworks like Hive/Hadoop, Apache Spark Experience in developing Finance or HR related applications Experience with the following cloud services: AWS Elastic Beanstalk, EC2, S3, CloudFront, RDS, DynamoDB, VPC, ElastiCache, Lambda Working experience … with Terraform Experience in creating workflows for Apache Airflow Benefits Roku is committed to offering a diverse range of benefits as part of our compensation package to support our employees and their families. Our comprehensive benefits include global access to mental health and financial wellness support and resources. Local benefits include statutory and voluntary benefits, which may include healthcare …
experience working as a Software Engineer on large software applications Proficient in many of the following technologies - Python, REST, PyTorch, TensorFlow, Docker, FastAPI, Selenium, React, TypeScript, Redux, GraphQL, Kafka, Apache Spark. Experience working with one or more of the following database systems - DynamoDB, DocumentDB, MongoDB Demonstrated expertise in unit testing and tools - JUnit, Mockito, PyTest, Selenium. Strong working knowledge …
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
Northrop Grumman Corp. (JP)
developing & deploying scalable backend systems. Familiarity with CI/CD, containerisation, deployment technologies & cloud platforms (Jenkins, Kubernetes, Docker, AWS), or familiarity with Big Data and Machine Learning technologies (NumPy, PyTorch, TensorFlow, Spark). Excellent communication, collaboration & problem-solving skills, ideally with some experience in agile ways of working. Security clearance: You must be able to gain and maintain the highest level …
Research/Statistics or other quantitative fields. Experience in NLP, image processing and/or recommendation systems. Hands-on experience in data engineering, working with big data frameworks like Spark/Hadoop. Experience in data science for e-commerce and/or OTA. We welcome both local and international applications for this role. Full visa sponsorship and relocation assistance …
Manchester, Lancashire, United Kingdom Hybrid / WFH Options
WorksHub
that help us achieve our objectives, so each team leverages the technology that fits its needs best. You'll see us working with data processing/streaming technologies like Kinesis, Spark and Flink; application technologies like PostgreSQL, Redis & DynamoDB; and breaking things using in-house chaos principles and tools such as Gatling to drive load, all deployed and hosted on …
Manchester, Lancashire, England, United Kingdom Hybrid / WFH Options
Tenth Revolution Group
you prefer Exceptional Benefits: From unlimited holiday and private healthcare to stock options and paid parental leave. What You'll Be Doing: Build and maintain scalable data pipelines using Spark with Scala and Java, and support tooling in Python Design low-latency APIs and asynchronous processes for high-volume data. Collaborate with Data Science and Engineering teams to deploy … Contribute to the development of Gen AI agents in-product. Apply best practices in distributed computing, TDD, and system design. What We're Looking For: Strong experience with Python, Spark, Scala, and Java in a commercial setting. Solid understanding of distributed systems (e.g. Hadoop, AWS, Kafka). Experience with SQL/NoSQL databases (e.g. PostgreSQL, Cassandra). Familiarity with …