Newcastle upon Tyne, England, United Kingdom Hybrid / WFH Options
Partnerize
Senior Linux/SysAdmin Engineer - Remote UK / Ireland Partnerize Newcastle upon Tyne, England, United Kingdom Who We Are: The … the opportunity to be the best in the business, to exceed our clients' expectations, to innovate, to teach—and most importantly—to earn and maintain our clients' loyalty. Senior Linux/SysAdmin Engineer Who We Are: The partnership channel offers scale and automation on a pay-for-performance model that delivers the operating leverage necessary for brand survival. Partnerize … technical estate, replace existing systems with new, and scale and develop the platform. It's an exciting time to join the team in this captivating period. As a Senior Linux/SysAdmin Engineer you will be part of the Partnerize Technical Operations team, who work with the business, development and IT functions. You will be responsible for ensuring the More ❯
detection and response (EDR) technologies and network detection and response (NDR) technologies. Detailed knowledge of Information Security standards including Cyber Essentials, Cyber Essentials Plus and ISO27001. Good understanding of Linux and database technologies such as Ubuntu, Apache, PHP, MySQL, PostgreSQL, Nginx, Mercurial and Git. Good understanding of cyber security practices in relation to cloud hosting, preferably with experience of More ❯
London, England, United Kingdom Hybrid / WFH Options
OnHires
of users. This is an opportunity to contribute to a rapidly evolving platform and help shape its future growth. About the Role: We are seeking an experienced Senior Full Stack PHP Developer to join our client's development team. You will work on complex, scalable applications for a high-traffic platform that provides innovative solutions in the digital … level knowledge of the Laravel framework. Strong understanding of the full HTTP workflow in multi-server environments. Experience with RESTful API design and implementation. Proficiency in both MySQL and PostgreSQL databases. System Administration & Infrastructure Working knowledge of Linux systems (command line, file permissions, process management). Hands-on experience with AWS services (EC2, S3, RDS, Lambda, etc.). Knowledge of … Familiarity with Elasticsearch or similar search engines. Experience with performance optimization and monitoring tools. DevOps experience and infrastructure as code (Terraform, CloudFormation). System administration experience with web servers (Apache, Nginx). What We Offer: Competitive salary and benefits package. Flexible working arrangements. Professional development opportunities. Modern tech stack and tools. Collaborative and innovative work environment. Opportunity to work More ❯
London, England, United Kingdom Hybrid / WFH Options
Aker Systems Limited
exploring new technologies and methodologies to solve complex data challenges. Proven experience leading data engineering projects or teams. Expertise in designing and building data pipelines using frameworks such as Apache Spark, Kafka, Glue, or similar. Solid understanding of data modelling concepts and experience working with both structured and semi-structured data. Strong knowledge of public cloud services, especially AWS (EC2 … Athena, Glue). Proficiency in programming / scripting languages such as Python. Experience working in Agile / Scrum environments. Familiarity with configuring and tuning both relational (e.g., PostgreSQL, MySQL) and NoSQL databases (e.g., MongoDB, DynamoDB). Test-Driven Development (TDD) using appropriate frameworks. Nice to Have Experience with CI / CD tools such as Jenkins or Drone. … Familiarity with Infrastructure as Code tools like Terraform, Ansible, and Packer. Hands-on experience with both Linux and Windows environments. Knowledge of data governance, security controls, and compliance in public sector environments. Aker Systems Attributes At Aker we work as a team, we are collaborative, hardworking, open, and delivery obsessed. There is no blame culture here: try things, and take More ❯
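As an illustration of the Spark pipeline skills referenced above, here is a minimal PySpark sketch — the S3 paths, column names and aggregation are invented for the example, not taken from the role — showing a generic read, derive, aggregate and partitioned-write pattern:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("example_pipeline").getOrCreate()

# Read semi-structured input from a hypothetical bucket.
events = spark.read.json("s3://example-bucket/raw/events/")

# Derive a date column and aggregate event counts per day and type.
daily = (
    events
    .withColumn("event_date", F.to_date("event_timestamp"))
    .groupBy("event_date", "event_type")
    .count()
)

# Write partitioned Parquet back to a hypothetical curated zone.
daily.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3://example-bucket/curated/daily_counts/"
)
spark.stop()
```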
ensuring all issues / problems are addressed in a timely manner by the team. You should have a keen interest in problem-solving, accompanied by experience in networking, Linux systems and system design, analysing what is currently implemented and offering improvements as well as ensuring the supporting documentation is in place. You'll be supporting and … setting up and maintaining multi-master replication, geo-replication, GTID and disaster recovery strategies. Proficient in resolving replication lag, failover issues, and ensuring data integrity across different database platforms. PostgreSQL / NoSQL databases: Expertise in PostgreSQL replication (synchronous, asynchronous), logical replication, and managing replication lag. Ability to troubleshoot and resolve slow queries. Experience in PostgreSQL schema optimisation (indexing, partitioning … systems, and handling post-upgrade issues. Expertise in managing database migrations, including: In-place migrations (within the same database platform). Cross-database migrations (e.g., migrating from MySQL to PostgreSQL or MongoDB to CouchDB). Migrating between cloud providers or on-prem to cloud for various database platforms. Automating and scripting the migration process where possible. Experience in versioning and More ❯
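To make the PostgreSQL logical replication requirement above concrete, here is a minimal sketch of one common pattern — a publication on the primary and a subscription on the replica, issued through psycopg2. The hostnames, database names, table names and the repl_admin role are all hypothetical, and a real deployment would also need wal_level=logical and appropriate pg_hba.conf entries.

```python
import psycopg2

# Publisher side: create a publication for the tables to replicate.
pub = psycopg2.connect("host=primary.example.com dbname=source user=repl_admin")
pub.autocommit = True  # run DDL outside an explicit transaction
with pub.cursor() as cur:
    cur.execute("CREATE PUBLICATION orders_pub FOR TABLE orders, customers;")
pub.close()

# Subscriber side: CREATE SUBSCRIPTION cannot run inside a transaction block,
# so autocommit is required here as well.
sub = psycopg2.connect("host=replica.example.com dbname=target user=repl_admin")
sub.autocommit = True
with sub.cursor() as cur:
    cur.execute(
        "CREATE SUBSCRIPTION orders_sub "
        "CONNECTION 'host=primary.example.com dbname=source user=repl_admin' "
        "PUBLICATION orders_pub;"
    )
sub.close()
```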
Cambridge, England, United Kingdom Hybrid / WFH Options
ZipRecruiter
global technology company is expanding its product portfolio and looking to grow its in-house software team. We’re hiring a Senior Full Stack Software Engineer with strong Linux-based server development experience to help deliver cutting-edge solutions. This is a hybrid role, with regular access to the Cambridge office required. What you’ll be doing: Reporting to … across the full development lifecycle, translating requirements into design, code, and tested solutions. What we’re looking for: Essential skills & experience: 5+ years’ experience in server-side development on Linux Solid experience with technologies such as PHP, JavaScript, HTML / CSS, Apache Strong database knowledge – MySQL or PostgreSQL Experience with websites and web services Familiarity with Git and More ❯
touristic offers only available for travel business employees. A police clearance certificate will be required, as part of the infrastructure is considered high security. Requirements You are experienced in the Linux network / server administration field in general; good knowledge of Debian systems is a plus You possess good knowledge and experience of firewalls (iptables / pf / … with at least one configuration management system (Ansible, etc.) You can handle VMware (vCenter / ESXi) Experience in setup / maintenance of DNS (BIND), LDAP (OpenLDAP), web servers (Apache, Nginx), databases (PostgreSQL, MySQL), mail servers (Postfix), POP3 / IMAP servers (Dovecot) is a plus Knowledge of a scripting language (Python, Perl, sh; Python preferred) for automation More ❯
market 3rd party tools and cloud technologies that can help to optimise the full data pipeline from scouting to trading. Our Technology Our systems are almost all running on Linux and most of our code is in Python, with the full scientific stack: numpy, scipy, pandas to name a few of the libraries we use extensively. We implement … the systems that require the highest data throughput in Java. Within Data Engineering we use Dataiku, Snowflake, Prometheus, and ArcticDB heavily. We use Kafka for data pipelines, Apache Beam for ETL, Bitbucket for source control, Jenkins for continuous integration, Grafana + Prometheus for metrics collection, ELK for log shipping and monitoring, Docker for containerisation, Kubernetes for container orchestration, OpenStack for … Python Knowledge of the challenges of dealing with large data sets, both structured and unstructured Knowledge of modern practices for ETL, data engineering and stream processing Proficient on Linux platforms with knowledge of various scripting languages Working knowledge of one or more relevant database technologies e.g. MongoDB, PostgreSQL, Snowflake, Oracle Proficient with a range of open source frameworks and More ❯
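The listing above names Apache Beam for ETL; purely as a rough illustration (the file names, field layout and filtering logic are invented, not taken from this stack), a minimal batch pipeline on Beam's default local runner looks like this:

```python
import csv
import io

import apache_beam as beam


def parse_line(line: str) -> dict:
    # Parse one CSV line into a dict; the field names are hypothetical.
    reader = csv.reader(io.StringIO(line))
    ts, symbol, price = next(reader)
    return {"ts": ts, "symbol": symbol, "price": float(price)}


with beam.Pipeline() as pipeline:  # defaults to the local DirectRunner
    (
        pipeline
        | "Read" >> beam.io.ReadFromText("ticks.csv", skip_header_lines=1)
        | "Parse" >> beam.Map(parse_line)
        | "DropBadRows" >> beam.Filter(lambda row: row["price"] > 0)
        | "Format" >> beam.Map(lambda row: f'{row["symbol"]},{row["price"]}')
        | "Write" >> beam.io.WriteToText("clean_ticks")
    )
```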
engineers around, this role would be a good fit. You’ll work every day with an awesome technology stack consisting of Python, Django, Git, Debian, Redis, jQuery, Jenkins, PostgreSQL, Gunicorn and many other things. We place a high value on learning and personal growth so you’ll have time to learn new technologies and attend conferences at the company … at least one scripting language such as Python, Perl, PHP or Ruby and have experience with web frameworks and the MVC concept Have used MySQL or PostgreSQL extensively and you know your way around Apache, Nginx or other web servers Get excited by the idea of scaling web apps to millions of users Often find yourself as the … company one day) Working with awesome technologies (Python / Django / jQuery / Debian / Git / Redis / Jenkins / PostgreSQL / Gunicorn) As part of one of the top technical teams in the UK, alongside super smart people who have a lot of fun, devoid of any politics With More ❯
London (City of London), South East England, United Kingdom
AI71
frameworks within transformation pipelines. Data Processing & Optimization: Build efficient, high-performance systems by leveraging techniques like data denormalization, partitioning, caching, and parallel processing. Develop stream-processing applications using Apache Kafka and optimize performance for large-scale datasets. Enable data enrichment and correlation across primary, secondary, and tertiary sources. Cloud, Infrastructure, and Platform Engineering: Develop and deploy data workflows … and ensure production-grade deployment. Collaborate with platform teams to ensure scalability, resilience, and observability of data pipelines. Database Engineering: Write and optimize complex SQL queries on relational (Redshift, PostgreSQL) and NoSQL (MongoDB) databases. Work with ELK stack (Elasticsearch, Logstash, Kibana) for search, logging, and real-time analytics. Support Lakehouse architectures and hybrid data storage models for unified access … stack by adopting new technologies and automation strategies. Required Skills & Qualifications: 8+ years of experience in data engineering within a production environment. Advanced knowledge of Python and Linux shell scripting for data manipulation and automation. Strong expertise in SQL / NoSQL databases such as PostgreSQL and MongoDB. Experience building stream processing systems using Apache Kafka. Proficiency More ❯
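As a hedged illustration of the Kafka stream-processing work described above (the broker address, topic names and the enrichment step are invented for the example, and the kafka-python client is only one of several possible libraries), a consume-enrich-produce loop might look like:

```python
import json

from kafka import KafkaConsumer, KafkaProducer

# Hypothetical broker address and topic names.
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
consumer = KafkaConsumer(
    "raw-events",
    bootstrap_servers="localhost:9092",
    group_id="enrichment-service",
    auto_offset_reset="earliest",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)

for message in consumer:
    event = message.value
    # Stand-in for real enrichment/correlation against secondary sources.
    event["enriched"] = True
    producer.send("enriched-events", event)
```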
London (City of London), South East England, United Kingdom
Robert Walters
frameworks within transformation pipelines. Data Processing & Optimization: Build efficient, high-performance systems by leveraging techniques like data denormalization, partitioning, caching, and parallel processing. Develop stream-processing applications using Apache Kafka and optimize performance for large-scale datasets. Enable data enrichment and correlation across primary, secondary, and tertiary sources. Cloud, Infrastructure, and Platform Engineering: Develop and deploy data workflows … and ensure production-grade deployment. Collaborate with platform teams to ensure scalability, resilience, and observability of data pipelines. Database Engineering: Write and optimize complex SQL queries on relational (Redshift, PostgreSQL) and NoSQL (MongoDB) databases. Work with ELK stack (Elasticsearch, Logstash, Kibana) for search, logging, and real-time analytics. Support Lakehouse architectures and hybrid data storage models for unified access … stack by adopting new technologies and automation strategies. Required Skills & Qualifications: 8+ years of experience in data engineering within a production environment. Advanced knowledge of Python and Linux shell scripting for data manipulation and automation. Strong expertise in SQL / NoSQL databases such as PostgreSQL and MongoDB. Experience building stream processing systems using Apache Kafka. Proficiency More ❯
Bracknell, Berkshire, United Kingdom Hybrid / WFH Options
Perforce Software, Inc
technical support required for successful implementation and usage of the Zend line of products. We expect the TSE to be prepared on day one with senior-level experience with Linux operating systems, web servers, and general desktop software. Our focus is on the LAMP stack, but exposure to Windows Server and IIS is a plus. Excellent troubleshooting and diagnostic … of Zend products. Participate in an on-call rotation. Requirements: Experience using the AWS EC2 web console and APIs. Deep understanding of the HTTP protocol, including web security, and troubleshooting. Apache or Nginx web server administration and configuration experience. Linux system administration experience (Red Hat, Rocky, Alma, Debian, Ubuntu, et al.) Experience maintaining production RDBMS servers such as MySQL /More ❯
London, England, United Kingdom Hybrid / WFH Options
Modo Energy Limited
CD pipelines. Implement and optimize automation processes using infrastructure-as-code (Terraform). Build and maintain data pipelines using Airflow. Manage our tech stack including Python, Node.js, PostgreSQL, MongoDB, Kafka, and Apache Iceberg. Optimize infrastructure costs and develop strategies for efficient resource utilization. Provide critical support by monitoring services and resolving production issues. Contribute to the development of … we expand our architecture globally. Qualifications Bachelor’s / Master’s degree in Computer Science, Information Technology or equivalent degree subject. Strong proficiency in Python. Confident in using Linux systems and tools. Excellent problem-solving abilities and capacity to work autonomously and adapt to a flexible, evolving environment. Some experience or More ❯
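To illustrate the Airflow data pipelines mentioned above — the DAG id, schedule and task bodies are placeholders, not this company's actual pipelines — a minimal extract-transform-load DAG could be sketched as:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder: pull raw data from an upstream source.
    return "raw"


def transform():
    # Placeholder: clean and reshape the extracted data.
    return "clean"


def load():
    # Placeholder: write results to the warehouse.
    pass


with DAG(
    dag_id="example_etl",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",  # `schedule` in newer Airflow releases
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> transform_task >> load_task
```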
/ technical support role Excellent analytical and problem-solving skills Experience with customer service desk systems, such as Jira Service Management Strong experience with open-source software technologies (Apache, OpenLayers, QGIS) Experience with Kubernetes and Docker Experience with Linux system administration and Bash scripting Experience with SQL and RDBMS technology (PostgreSQL) Knowledge of programming, scripting, or rules-based languages More ❯