- Bash, Python, Go, PowerShell
- Monitoring and logging tools such as Prometheus, Grafana, Dynatrace
- Solid understanding of networking and security (VPC, Nginx, AWS WAF, etc.)
- Database experience with DynamoDB, Aurora, Redshift, SQL
- Comfortable with Linux/Unix OS administration
- Ideally, AWS DevOps Engineer certification
- Exposure to Ping Identity (ForgeRock) is also desirable

Business & People Skills
- Ability to work independently and …
…the data platform, including data pipelines, orchestration and modelling.
- Lead the team in building and maintaining robust data pipelines, data models, and infrastructure using tools such as Airflow, AWS Redshift, DBT and Looker.
- Ensure the team follows agile methodologies to improve delivery cadence and responsiveness.
- Contribute to hands-on coding, particularly in areas requiring architectural input, prototyping, or critical delivery …
- Strong mentoring skills and the ability to foster team growth and development
- Strong understanding of the data engineering lifecycle, from ingestion to consumption
- Hands-on experience with our data stack (Redshift, Airflow, Python, DBT, MongoDB, AWS, Looker, Docker)
- Understanding of data modelling, transformation, and orchestration best practices
- Experience delivering both internal analytics platforms and external data-facing products
- Knowledge of …