System Validation training desirable. Knowledge, Skills, and Abilities: Proven experience working with ETQ, MasterControl, or similar eQMS platforms, including configuration, administration, and user training. Strong working knowledge of CSV and GxP system implementations in a regulated pharmaceutical or life sciences environment. Hands-on experience with CSV activities relating to Empower CDS, including validation of new Empower installations and version upgrades, data …
E-WorkBook software, IDBS has extended its flexible, scalable solutions to the IDBS Polar and PIMS cloud platforms to help scientists make smarter decisions with assured confidence in both GxP and non-GxP environments. Do you want to work in a dynamic, fast-paced, high-performing, safe-to-fail and fun environment that is founded on trust, empowerment and autonomy … the technical implementation plan and backlog refinement. Provide a technical perspective on product enhancements and new requirements activities. Optimize Spark-based workflows for performance, scalability, and data integrity, ensuring alignment with GxP and other regulatory standards. Research and promote new technologies, design patterns, approaches, tools and methodologies that could optimize and accelerate development. Apply strong software engineering practices including version control (Git … pipelines that process clinical and pharmaceutical data efficiently, reducing data latency and improving time-to-insight for research and regulatory teams. Enabled regulatory compliance by implementing secure, auditable, and GxP-aligned data workflows with robust access controls. Improved system performance and cost-efficiency by optimizing Spark jobs and Databricks clusters, leading to measurable reductions in compute costs and processing times.
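The Spark/Databricks responsibilities in this posting (optimizing jobs, keeping pipelines performant, and making workflows auditable for GxP) can be illustrated with a minimal PySpark sketch. Everything in it is an assumption rather than part of the posting: the paths, study identifier, column names, and version tag are hypothetical and serve only to show the general pattern of partition pruning, deduplication, and audit metadata.

```python
# Minimal PySpark sketch (illustrative only, not from the posting).
# Paths, columns, and identifiers are hypothetical; assumes a Spark 3.x /
# Databricks-style environment with pyspark available.
from pyspark.sql import SparkSession, functions as F

spark = (
    SparkSession.builder
    .appName("clinical-results-pipeline")
    # Adaptive query execution and explicit shuffle sizing are common
    # Spark-level performance levers for large clinical datasets.
    .config("spark.sql.adaptive.enabled", "true")
    .config("spark.sql.shuffle.partitions", "200")
    .getOrCreate()
)

# Read only the partitions needed (partition pruning) instead of a full scan.
raw = (
    spark.read.parquet("/mnt/raw/assay_results")    # hypothetical path
    .filter(F.col("study_id") == "STUDY-001")       # hypothetical study
)

# Add audit metadata so every derived row is traceable, in the spirit of
# the GxP-aligned, auditable workflows the posting describes.
curated = (
    raw.dropDuplicates(["sample_id", "assay_run_id"])
    .withColumn("processed_at", F.current_timestamp())
    .withColumn("pipeline_version", F.lit("1.4.2"))  # hypothetical version tag
)

# Write partitioned output so downstream queries can prune by study.
(
    curated.write.mode("append")
    .partitionBy("study_id")
    .parquet("/mnt/curated/assay_results")           # hypothetical path
)
```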