…Bash, Python, Go, PowerShell
- Monitoring and logging tools such as Prometheus, Grafana, Dynatrace (see the sketch after this listing)
- Solid understanding of networking and security (VPC, Nginx, AWS WAF, etc.)
- Database experience with DynamoDB, Aurora, Redshift, SQL
- Comfortable with Linux/Unix OS administration
- Ideally, AWS DevOps Engineer certification
- Exposure to Ping Identity (ForgeRock) is also desirable

Business & People Skills
- Ability to work independently and …
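As a small illustration of the Prometheus skill named in the listing above, here is a minimal sketch using the official `prometheus_client` Python library; the metric name, port, and workload loop are illustrative assumptions, not part of the listing.

```python
# Minimal sketch: exposing an application metric for Prometheus to scrape.
# Metric name and port are illustrative assumptions.
import random
import time

from prometheus_client import Counter, start_http_server

REQUESTS = Counter("app_requests_total", "Total requests handled")

if __name__ == "__main__":
    # Prometheus scrapes http://localhost:8000/metrics
    start_http_server(8000)
    while True:
        REQUESTS.inc()  # count a unit of work
        time.sleep(random.uniform(0.1, 0.5))
```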
Reading, Berkshire, England, United Kingdom - Hybrid / WFH Options
Opus Recruitment Solutions Ltd
…pipelines and applications that process complex datasets from multiple operational systems.

Key Responsibilities:
- Build and maintain AWS-based ETL/ELT pipelines using S3, Glue (PySpark/Python), Lambda, Athena, Redshift, and Step Functions (a minimal sketch follows this listing)
- Develop backend applications to automate and support compliance reporting
- Process and validate complex data formats including nested JSON, XML, and CSV
- Collaborate with stakeholders to deliver …
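As a sketch of the kind of Glue (PySpark) job this listing describes, the example below flattens nested JSON from S3 and writes partitioned Parquet for Athena/Redshift Spectrum. Bucket paths, the partition key, and the job structure are assumptions for illustration only.

```python
# Hypothetical AWS Glue job: flatten nested JSON and write query-friendly Parquet.
# All S3 paths and the "ingest_date" partition column are illustrative assumptions.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read raw nested JSON landed by an upstream process (hypothetical path).
raw = glue_context.create_dynamic_frame.from_options(
    connection_type="s3",
    connection_options={"paths": ["s3://example-raw-bucket/operational-systems/"]},
    format="json",
)

# relationalize() flattens nested structures into a collection of flat frames;
# "root" selects the top-level table.
flattened = raw.relationalize("root", "s3://example-temp-bucket/tmp/").select("root")

# Write Parquet partitioned for Athena / Redshift Spectrum queries.
glue_context.write_dynamic_frame.from_options(
    frame=flattened,
    connection_type="s3",
    connection_options={
        "path": "s3://example-curated-bucket/compliance/",
        "partitionKeys": ["ingest_date"],  # assumes this column exists upstream
    },
    format="parquet",
)

job.commit()
```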
South West London, London, United Kingdom - Hybrid / WFH Options
JAM Recruitment Ltd
…processes from APIs and internal systems, leveraging tools such as Kafka, Spark, or AWS-native services.
- Cloud Data Platforms - Develop and maintain data lakes and warehouses (e.g., AWS S3, Redshift).
- Data Quality & Governance - Implement automated validation, testing, and monitoring for data integrity.
- Performance & Troubleshooting - Monitor workflows, enhance logging/alerting, and fine-tune performance.
- Data Modelling - Handle schema … and GDPR-aligned data practices.

Technical Skills & Experience
- Proficient in Python and SQL for data processing.
- Solid experience with Apache Airflow - writing and configuring DAGs (see the sketch after this listing).
- Strong AWS skills (S3, Redshift, etc.).
- Big data experience with Apache Spark.
- Knowledge of data modelling, schema design, and partitioning.
- Understanding of batch and streaming data architectures (e.g., Kafka).
- Experience with Docker …
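To illustrate the Airflow DAG-writing skill this listing asks for, here is a minimal sketch of a daily batch DAG (Airflow 2.4+ syntax). The DAG id, task bodies, and schedule are illustrative assumptions, not the employer's actual pipeline.

```python
# Minimal, hypothetical Airflow DAG: a daily batch that extracts from an API
# and loads into a warehouse. Task bodies are placeholders (assumptions).
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Placeholder: call an internal API and stage results (assumption).
    print("extracting from source API")


def load(**context):
    # Placeholder: copy staged files into Redshift (assumption).
    print("loading into Redshift")


with DAG(
    dag_id="example_batch_pipeline",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> load_task  # simple linear dependency
```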