Deep understanding of software architecture, object-oriented design principles, and data structures Extensive experience in developing microservices using Java and Python Experience in distributed computing frameworks such as Hive/Hadoop and Apache Spark Good experience in test-driven development and automating test cases using Java/Python Experience in SQL/NoSQL (Oracle, Cassandra) database design Demonstrated ability to be proactive … HR-related applications Experience with the following cloud services: AWS Elastic Beanstalk, EC2, S3, CloudFront, RDS, DynamoDB, VPC, ElastiCache, Lambda Working experience with Terraform Experience in creating workflows for Apache Airflow About Roku Roku pioneered streaming to the TV. We connect users to the streaming content they love, enable content publishers to build and monetize large audiences, and provide More ❯
maintain the data platform. You will be working on complex data problems in a challenging and fun environment, using some of the latest big data open-source technologies such as Apache Spark, as well as Amazon Web Services technologies including Elastic MapReduce, Athena and Lambda to develop scalable data solutions. Key Responsibilities: Adhering to Company Policies and Procedures with respect … and specifications. Good interpersonal skills, positive attitude, willing to help other members of the team. Experience debugging and dealing with failures on business-critical systems. Preferred Qualifications: Exposure to Apache Spark, Apache Trino, or another big data processing system. Knowledge of streaming data principles and best practices. Understanding of database technologies and standards. Experience working on large and More ❯
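The Lambda-based data solutions mentioned above can be sketched minimally in plain Python. This is a hedged illustration only: the event shape (a `records` list of dicts with a `status` field) and the filtering rule are hypothetical, not taken from the listing.

```python
# Minimal sketch of an AWS Lambda-style handler that filters incoming
# records. The event shape and the "status" field are hypothetical
# assumptions for illustration, not part of any listing above.

def handler(event, context):
    """Return only the records marked as active, with a count."""
    records = event.get("records", [])
    active = [r for r in records if r.get("status") == "active"]
    return {"count": len(active), "records": active}


if __name__ == "__main__":
    sample_event = {"records": [
        {"id": 1, "status": "active"},
        {"id": 2, "status": "inactive"},
    ]}
    print(handler(sample_event, None))
```

In a real deployment the same function would be registered as the Lambda entry point and invoked by an event source such as Kinesis or S3.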
Glue Catalog, and AWS Glue DataBrew. They are experienced in developing batch and real-time data pipelines for Data Warehouse and Data Lake, utilizing AWS Kinesis and Managed Streaming for Apache Kafka. They are also proficient in using open-source technologies like Apache Airflow and dbt, Spark/Python or Spark/Scala on the AWS Platform. The data engineer More ❯
Manchester, England, United Kingdom Hybrid / WFH Options
QinetiQ
Are Some Things We’ve Worked On Recently That Might Give You a Better Sense Of What You’ll Be Doing Day To Day Improving systems integration performance, using Apache NiFi, by balancing scaling and flow improvements through chunking Implementing AWS Security Control Policies to manage global access privileges. Validating and converting data into a common data format More ❯
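The "chunking" idea mentioned above, splitting a large record stream into bounded batches so each downstream step handles a fixed amount of data at a time, can be sketched in plain Python. This is a stand-in for the pattern, not actual NiFi flow configuration.

```python
# Illustrative sketch of flow chunking: yield fixed-size batches from an
# arbitrary iterable, so downstream processing (as a NiFi flow might do)
# never holds more than `size` records in memory at once.

from itertools import islice


def chunked(iterable, size):
    """Yield successive lists of at most `size` items from `iterable`."""
    it = iter(iterable)
    while True:
        batch = list(islice(it, size))
        if not batch:
            return
        yield batch
```

Balancing throughput here means tuning `size`: larger batches amortize per-batch overhead, smaller ones reduce memory pressure and latency.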
experience working as a Software Engineer on large software applications Proficient in many of the following technologies – Python, REST, PyTorch, TensorFlow, Docker, FastAPI, Selenium, React, TypeScript, Redux, GraphQL, Kafka, Apache Spark. Experience working with one or more of the following database systems – DynamoDB, DocumentDB, MongoDB Demonstrated expertise in unit testing and tools – JUnit, Mockito, PyTest, Selenium. Strong working knowledge More ❯
MySQL, PostgreSQL, or Oracle Experience with big data technologies such as Hadoop, Spark, or Hive Familiarity with data warehousing and ETL tools such as Amazon Redshift, Google BigQuery, or Apache Airflow Proficiency in Python and at least one other programming language such as Java or Scala Willingness to mentor more junior members of the team Strong analytical and problem More ❯
Manchester, England, United Kingdom Hybrid / WFH Options
Made Tech
strategies. Strong experience in IaC and able to guide how one could deploy infrastructure into different environments. Knowledge of handling and transforming various data types (JSON, CSV, etc.) with Apache Spark, Databricks or Hadoop. Good understanding of possible architectures involved in modern data system design (Data Warehouses, Data Lakes, Data Meshes) Ability to create data pipelines on a cloud More ❯
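The JSON-to-CSV style of transformation described above can be sketched with the standard library. Spark or Databricks would perform this reshaping at cluster scale; the core step is the same. The field names used here are hypothetical.

```python
# Stdlib sketch of transforming a JSON array of records into CSV text.
# The "id"/"name" columns are hypothetical example fields; a Spark job
# would express the same reshaping over distributed data.

import csv
import io
import json


def json_records_to_csv(json_text, fieldnames):
    """Convert a JSON array of objects into CSV text with the given columns."""
    records = json.loads(json_text)
    buf = io.StringIO()
    # extrasaction="ignore" drops fields not listed in `fieldnames`;
    # missing fields are written as empty strings.
    writer = csv.DictWriter(buf, fieldnames=fieldnames, extrasaction="ignore")
    writer.writeheader()
    writer.writerows(records)
    return buf.getvalue()
```

A usage example: `json_records_to_csv('[{"id": 1, "name": "a"}]', ["id", "name"])` yields a two-line CSV with a header row.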
of security, scalability and high availability. Very strong technical background leading application development – with experience in some or all of the following technologies: Python, Java, Spring Boot, TensorFlow, PyTorch, Apache Spark, Kafka, Jenkins, Git/Bitbucket, Terraform, Docker, ECS/EKS, IntelliJ, JIRA, Confluence, React/TypeScript, Selenium, Redux, JUnit, Cucumber/Gherkin. About Datalex Datalex's purpose is More ❯
best practices. Automation & Monitoring: Implement and support automated monitoring systems to detect data anomalies, system failures, and performance issues, and leverage advanced scripting and orchestration tools (e.g., Python, Bash, Apache Airflow) to automate workflows and reduce operational overhead. Root Cause Analysis & Incident Management: Lead post-incident reviews, perform root cause analysis for data disruptions, and implement corrective actions, while More ❯
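The anomaly-detection monitoring described above can be sketched as a simple statistical check: flag metric values that deviate from the mean by more than a threshold number of standard deviations. The 3-sigma threshold is an assumption for illustration, not a stated requirement.

```python
# Hedged sketch of automated anomaly detection over a metric series:
# flag points more than `n_sigma` sample standard deviations from the
# mean. The 3-sigma default is an illustrative assumption.

from statistics import mean, stdev


def find_anomalies(values, n_sigma=3.0):
    """Return indices of values more than n_sigma stdevs from the mean."""
    if len(values) < 2:
        return []
    mu, sigma = mean(values), stdev(values)
    if sigma == 0:
        return []  # constant series: nothing deviates
    return [i for i, v in enumerate(values) if abs(v - mu) > n_sigma * sigma]
```

In practice such a check would run on a schedule (e.g., from an Airflow task) against recent pipeline metrics, alerting when the returned index list is non-empty.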
Salford, England, United Kingdom Hybrid / WFH Options
QinetiQ Limited
are some things we’ve worked on recently that might give you a better sense of what you’ll be doing day to day: Improving systems integration performance, using Apache NiFi, by balancing scaling and flow improvements through chunking Implementing AWS Security Control Policies to manage global access privileges. Validating and converting data into a common data format More ❯
Azure) Experience managing PKI/X.509 certificate infrastructure. Extensive experience supporting and implementing TLS/SSL certificate management systems Proficient with token-based authentication services, Perfect Forward Secrecy (PFS), Apache, Nginx, HAProxy Solid knowledge of Linux security and system operations. Benefits Roku is committed to offering a diverse range of benefits as part of our compensation package to support More ❯
tools, and statistical packages. Strong analytical, problem-solving, and critical thinking skills. 8. Experience with social media analytics and understanding of user behaviour. 9. Familiarity with big data technologies, such as Apache Hadoop, Apache Spark, or Apache Kafka. 10. Knowledge of AWS machine learning services, such as Amazon SageMaker and Amazon Comprehend. 11. Experience with data governance and security best practices More ❯
Manchester, England, United Kingdom Hybrid / WFH Options
Gaming Innovation Group
You're really awesome at: Object-oriented programming (Java) Data modeling using various database technologies ETL processes (transferring data in-memory, moving away from traditional ETLs) and experience with Apache Spark or Apache NiFi Applied understanding of CI/CD in change management Dockerized applications Using distributed version control systems Being an excellent team player Meticulous and passionate More ❯
scalable machine learning systems in production. Strong programming skills in Python and experience with ML frameworks and tools (e.g., TensorFlow, PyTorch, MLFlow, Kubeflow, Jupyter, Azure ML Studio, Amazon Sagemaker, Apache Spark, Apache Flink). Expertise in containerization (e.g., Docker, Kubernetes) and automation (e.g., Jenkins, GitLab CI). Excellent problem-solving skills and the ability to work independently or More ❯
into epics and stories. Writing clean, secure code following a test-driven approach. Monitor and maintain - Monitor data systems for performance issues and make any necessary updates. Required Skills Apache Kafka Apache NiFi SQL and NoSQL databases (e.g. MongoDB) ETL processing languages such as Groovy, Python or Java. Built on over 60 years of heritage, Roke offers specialist More ❯
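The Kafka-style produce/consume pattern named in the skills list above can be sketched with a toy in-memory broker. This is a plain-Python stand-in for the pattern only; topic names and message shapes are hypothetical, and real Kafka adds partitioning, durability, and consumer groups on top.

```python
# Toy in-memory stand-in for the produce/consume pattern that Kafka
# implements at scale: producers append to an ordered per-topic log,
# and a consumer reads messages in order, tracking its own offset.

from collections import defaultdict


class InMemoryBroker:
    """Minimal message broker keeping one append-only log per topic."""

    def __init__(self):
        self._topics = defaultdict(list)   # topic -> list of messages
        self._offsets = defaultdict(int)   # topic -> next read position

    def produce(self, topic, message):
        """Append a message to the end of the topic's log."""
        self._topics[topic].append(message)

    def consume(self, topic):
        """Return the next unread message for `topic`, or None if caught up."""
        offset = self._offsets[topic]
        if offset >= len(self._topics[topic]):
            return None
        self._offsets[topic] += 1
        return self._topics[topic][offset]
```

The key property preserved from the real system is ordered, offset-tracked delivery: consuming does not delete messages, it only advances the reader's position.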
Manchester Area, United Kingdom Hybrid / WFH Options
Opus Recruitment Solutions
multi-project environments experience. Expertise in ETL, data modelling, and Azure Data Services. Experience in designing and implementing data pipelines, data lakes, and data warehouses. Hands-on experience with Apache Spark and bonus points for Microsoft Fabric Any certifications are a bonus. Benefits: Competitive base salary Hybrid work once a week into their Central Manchester office 25 days holiday More ❯
common life sciences data acquisition software, such as Scientific Data Management Systems (SDMS) or Laboratory Information Management Systems (LIMS). Hands-on experience with data pipeline orchestration tools (e.g., Apache Airflow) and data parsing. Familiarity with cloud service models, SaaS infrastructure, and related SDLC. Familiarity with containerization and container orchestration tools (e.g., Docker, Kubernetes). ZONTAL is an Equal More ❯