with a focus on delivering and operating large-scale data processing systems. Has successfully led data platform initiatives. A good understanding of data processing technologies and tools such as Apache Spark, data lakes, data warehousing, and SQL databases. Proficiency in programming languages such as Python and CI/CD techniques to deliver change efficiently in a consistent, controlled, and high-quality …
scale observability solutions using open-source tools like ELK, Grafana, Prometheus, Nagios, Telegraf, and others. Hands-on experience with middleware technologies (Kafka, API gateways, etc.) and data engineering frameworks (Apache Spark, Airflow, Flink, Hadoop ecosystems). Understanding of network fundamentals, data structures, scalable system design, and the ability to translate information into structured solutions for product and engineering teams.
tools, particularly Terraform. Experience with network design, administration, and troubleshooting. Knowledge of programming languages (e.g., JavaScript, Node.js, PHP). Experience with version control systems, ideally Git. Web server configuration (Apache, Nginx). Database management (MySQL, MongoDB), including high availability and backup solutions. Hands-on experience managing cloud providers, with significant experience in AWS and Google Cloud Platform (GCP).
Engineering (open to professionals from various data engineering backgrounds: data pipelines, ML engineering, data warehousing, analytics engineering, big data, cloud, etc.). Technical Exposure: experience with tools like SQL, Python, Apache Spark, Kafka, cloud platforms (AWS/GCP/Azure), and modern data stack technologies. Formal or Informal Coaching Experience: any previous coaching, mentoring, or training experience, formal or informal …
common life sciences data acquisition software, such as Scientific Data Management Systems (SDMS) or Laboratory Information Management Systems (LIMS). Hands-on experience with data pipeline orchestration tools (e.g., Apache Airflow) and data parsing. Familiarity with cloud service models, SaaS infrastructure, and related SDLC. Familiarity with containerization and container orchestration tools (e.g., Docker, Kubernetes). ZONTAL is an Equal …
or Informatica Cloud. Solid understanding of data extraction, transformation, and loading (ETL) processes. Proficiency in SSH and managing/troubleshooting virtual machine environments. Familiarity with orchestration tools such as Apache Airflow. Experience with AWS services (e.g., S3, EC2, RDS). Understanding of cloud-based environments (Google Cloud Platform experience a plus). Ability to collaborate across infrastructure, data engineering, and business …
detection and response (NDR) technologies. Detailed knowledge of information security standards including Cyber Essentials, Cyber Essentials Plus, and ISO 27001. Good understanding of Linux and related technologies such as Ubuntu, Apache, PHP, MySQL, PostgreSQL, Nginx, Mercurial, and Git. Good understanding of cyber security practices in relation to cloud hosting, preferably with experience of AWS. Good understanding of open-source risk …