… Engineer on large software applications. Proficient in many of the following technologies: Python, REST, PyTorch, TensorFlow, Docker, FastAPI, Selenium, React, TypeScript, Redux, GraphQL, Kafka, Apache Spark. Experience working with one or more of the following database systems: DynamoDB, DocumentDB, MongoDB. Demonstrated expertise in unit testing and associated tools: JUnit, Mockito. …
… and operating large-scale data processing systems. Has successfully led data platform initiatives. A good understanding of data processing technologies and tools such as Apache Spark, data lakes, data warehousing, and SQL databases. Proficiency in programming languages such as Python, and CI/CD techniques to efficiently deliver change in a …
Manchester, England, United Kingdom Hybrid / WFH Options
QinetiQ
… worked on recently that might give you a better sense of what you’ll be doing day to day: improving systems integration performance, using Apache NiFi, by balancing scaling and flow improvements through chunking; implementing AWS Service Control Policies to manage global access privileges; validating and converting data …
… with big data technologies such as Hadoop, Spark, or Hive. Familiarity with data warehousing and ETL tools such as Amazon Redshift, Google BigQuery, or Apache Airflow. Proficiency in Python and at least one other programming language such as Java or Scala. Willingness to mentor more junior members of the …
… availability. Very strong technical background leading application development, with experience in some or all of the following technologies: Python, Java, Spring Boot, TensorFlow, PyTorch, Apache Spark, Kafka, Jenkins, Git/Bitbucket, Terraform, Docker, ECS/EKS, IntelliJ, JIRA, Confluence, React/TypeScript, Selenium, Redux, JUnit, Cucumber/Gherkin. About …
… working with data formats such as JSON, CSV, and Parquet. • Strong understanding of IT concepts, including security, IAM, Key Vault, and networking. • Exposure to Apache Airflow and dbt is a bonus. • Familiarity with agile principles and practices. • Experience with Azure DevOps pipelines. The "Nice to Haves": • Certification in Azure …
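As a rough illustration of round-tripping the formats this listing names (JSON, CSV, Parquet), here is a minimal sketch using pandas; the library choice and file names are assumptions, since the listing does not name specific tooling.

```python
# A minimal sketch of converting between the data formats named above.
# pandas and the file names are illustrative assumptions.
import pandas as pd

df = pd.read_csv("input.csv")                # tabular source data
df.to_json("output.json", orient="records")  # one JSON object per row
df.to_parquet("output.parquet")              # columnar; needs pyarrow or fastparquet
```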
Manchester, England, United Kingdom Hybrid / WFH Options
Made Tech
… and able to guide how one could deploy infrastructure into different environments. Knowledge of handling and transforming various data formats (JSON, CSV, etc.) with Apache Spark, Databricks, or Hadoop. Good understanding of possible architectures involved in modern data system design (data warehouses, data lakes, data meshes). Ability to create …
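A minimal sketch of the kind of JSON-to-Parquet transformation described above, assuming PySpark; the paths, column names, and deduplication key are hypothetical.

```python
# A minimal sketch of transforming semi-structured JSON into columnar
# Parquet with Apache Spark. Paths and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("format-transform").getOrCreate()

events = spark.read.json("s3://example-bucket/raw/events/")  # assumed source
cleaned = (
    events
    .withColumn("event_date", F.to_date("timestamp"))  # assumed column
    .dropDuplicates(["event_id"])                       # assumed key
)
# Partitioning by date lets downstream queries prune unneeded files.
cleaned.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3://example-bucket/curated/events/"
)
```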
Salford, England, United Kingdom Hybrid / WFH Options
QinetiQ Limited
… worked on recently that might give you a better sense of what you’ll be doing day to day: improving systems integration performance, using Apache NiFi, by balancing scaling and flow improvements through chunking; implementing AWS Service Control Policies to manage global access privileges; validating and converting data …
Manchester Area, United Kingdom Hybrid / WFH Options
Searchability®
… data modeling. Proficient in data warehouse technologies (such as Amazon Redshift, Google BigQuery, or Snowflake). Hands-on experience with ETL tools and frameworks, including Apache Airflow, Talend, or dbt. Strong programming ability in Python or another data-focused language. Knowledgeable about data management best practices, including governance, security, and …
… and support automated monitoring systems to detect data anomalies, system failures, and performance issues, and leverage advanced scripting and orchestration tools (e.g., Python, Bash, Apache Airflow) to automate workflows and reduce operational overhead. Root Cause Analysis & Incident Management: lead post-incident reviews, perform root cause analysis for data disruptions …
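A minimal sketch of the kind of Airflow-based monitoring automation described above; the DAG id, schedule, and quality check are illustrative assumptions, not taken from the listing.

```python
# A minimal sketch of scheduling an automated data-quality check in Airflow.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def check_data_quality():
    # Stand-in anomaly check: a real task would query the warehouse and
    # raise on unexpected row counts, nulls, or freshness gaps.
    row_count = 1  # placeholder for a real metric query
    if row_count == 0:
        raise ValueError("Data anomaly: empty load detected")


with DAG(
    dag_id="data_quality_monitoring",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@hourly",  # use schedule_interval on Airflow < 2.4
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
):
    PythonOperator(
        task_id="check_data_quality",
        python_callable=check_data_quality,
    )
```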
… X.509 certificate infrastructure. Extensive experience supporting and implementing TLS/SSL certificate management systems. Proficient with token-based authentication services, Perfect Forward Secrecy (PFS), Apache, Nginx, HAProxy. Solid knowledge of Linux security and system operations. Benefits: Roku is committed to offering a diverse range of benefits as part of …
Liverpool, England, United Kingdom Hybrid / WFH Options
Seven20
… Tracking Systems (ATS) and recruitment platforms, such as Bullhorn and Vincere, to Seven20 (Salesforce). Design, build, and execute ETL workflows using tools such as Apache Hop, Salesforce Data Loader, and custom-developed scripts/tooling. Collaborate with stakeholders to understand business requirements, data models, and field mappings as part of …
… network design, administration, and troubleshooting. Knowledge of programming languages (e.g., JavaScript, Node.js, PHP). Experience with version control systems, ideally Git. Web server configuration (Apache, Nginx). Database management (MySQL, MongoDB), including high availability and backup solutions. Hands-on experience managing cloud providers, with significant experience in AWS and …
Manchester Area, United Kingdom Hybrid / WFH Options
Opus Recruitment Solutions
… in ETL, data modelling, and Azure Data Services. Experience in designing and implementing data pipelines, data lakes, and data warehouses. Hands-on experience with Apache Spark; bonus points for Microsoft Fabric. Any certifications are a bonus. Benefits: competitive base salary; hybrid work once a week into their Central …
… software, such as Scientific Data Management Systems (SDMS) or Laboratory Information Management Systems (LIMS). Hands-on experience with data pipeline orchestration tools (e.g., Apache Airflow) and data parsing. Familiarity with cloud service models, SaaS infrastructure, and related SDLC. Familiarity with containerization and container orchestration tools (e.g., Docker, Kubernetes) …