Geoscience Data & Automation Engineer
PURPOSE
As part of its 2026-2030 Exploration Strategy, Endeavour Mining is strengthening its enterprise management of exploration and geoscience data to accelerate discovery, improve decision quality, and enable advanced analytics and AI. To support this journey, EDV is recruiting a Geoscience Data & Automation Engineer, who will play a critical role in industrialising exploration data flows across all sites. The position is accountable for designing and maintaining automated systems, data transformation, and quality-control pipelines for drilling, logging, sampling, assay, survey, and geospatial datasets. The objective is to eliminate manual uploads, spreadsheet dependency, and fragmented QA/QC processes, while ensuring that exploration data is structured, historised, reliable, and analytics-ready.
The role contributes directly to AI-readiness by enabling clean, standardised, and scalable exploration datasets to be consumed by geological modelling, analytics, and machine learning initiatives.
KEY ACCOUNTABILITIES
Exploration Data Ingestion & Automation
- Design, build, and maintain end-to-end automated data exchanges for exploration systems, including:
- Drilling and logging systems (e.g., LogChief, IMDEX)
- Assay laboratory datasets (CSV, XML, API-based feeds)
- BoxScan imaging datasets
- Survey and geospatial datasets
- Central Geoscience Data Management Systems (GDMS)
- Ensure secure, scalable, and reliable data flows from field systems to the enterprise data platform
- Reduce manual handling and eliminate duplicate or inconsistent data uploads.
Data Transformation & Historisation
- Implement transformation logic to standardise, harmonise, and historise exploration datasets.
- Ensure data models support longitudinal analysis, geological interpretation, and AI-driven use cases.
- Maintain clear separation between raw, curated, and analytics-ready datasets.
Data Quality & QA/QC Automation
- Design and implement automated QA/QC checks across exploration datasets, including:
- Drillhole validation rules
- Assay consistency checks
- Referential integrity controls
- Schema validation and completeness checks
- Monitor data quality metrics and collaborate with site geologists and QA/QC managers to resolve data issues.
- Ensure datasets are certified before use in resource modelling or advanced analytics.
Systems Integration & Architecture
- Collaborate with Enterprise Architecture and Data Platform teams to:
- Integrate exploration systems with the central data platform
- Ensure metadata, lineage, and documentation standards are respected
- Design scalable and resilient data pipelines
- Implement monitoring, logging, and incident-resolution mechanisms
- Support hybrid architectures where required (site systems + cloud platform)
Data Capture, Validation & Quality Assurance
- Design and supervise data capture workflows from field to system, covering site geologists, samplers, and contractors.
- Implement automated QA/QC processes to ensure data accuracy, completeness, and consistency.
- Coordinate with Regional Exploration QA/QC Managers, Site Data Stewards / Field Geologists, and GIS Geologists to ensure data is validated at each stage before use in interpretation or modelling.
Collaboration & Stakeholder Enablement
- Work closely with:
- Site geologists and data stewards
- Regional Exploration QA/QC Managers
- GIS geologists
- Data Scientists and Data Engineers
SKILLS, KNOWLEDGE & EXPERIENCE
Education:
- Master’s degree in Computer Science, Data Engineering, Information Systems, or a related technical field.
- Exposure to the geoscience or mining domain is highly desirable.
Exploration Data & Domain Knowledge
- Strong understanding of exploration data workflows, including drilling, logging, assays, and survey data.
- Familiarity with drillhole data structures and QA/QC processes.
- Exposure to exploration systems such as GDMS, LogChief, IMDEX, BoxScan, or similar platforms.
Data Engineering & Automation
- Proven experience designing and maintaining ETL/ELT pipelines.
- Strong scripting capabilities (Python preferred).
- Experience handling heterogeneous data formats (CSV, XML, JSON, APIs).
- Experience implementing batch and incremental ingestion patterns.
- Understanding of data transformation, standardisation, and historisation techniques.
Data Quality & Governance
- Experience implementing validation rules and automated data quality frameworks.
- Understanding of referential integrity and structured data models.
- Familiarity with metadata management and data lineage principles.
- Ability to balance governance controls with operational usability.
Cloud & Platform Experience
- Experience with enterprise data platforms (cloud preferred).
- Understanding of orchestration tools and pipeline monitoring.
- Familiarity with security, access control, and secure data transfer protocols (SFTP, APIs, token-based authentication).
Personal Competencies
- Strong analytical and problem-solving mindset.
- Structured, detail-oriented, and quality-driven.
- Ability to work across IT and Exploration stakeholders.
- Capable of managing multiple priorities in a transformation environment.
- Proactive and solution-oriented, with a strong ownership mindset.
TYPE OF CONTRACT: LONG TERM