Drummond Basin Prospectivity Mapping
Greg A. Partington, Ross Mining NL
Level 4, 139 Coronation Drive, Milton Queensland 4064
It is important that the risks of developing mineral resources are known as accurately as possible. This process starts at the pre-discovery exploration stage and should continue through feasibility to the development stage. Until recently, this type of analysis has been carried out manually, leading to intuitive judgements. With GIS and resource estimation software now available on personal computers, probabilistic models can be generated. A program of digital data compilation has recently been undertaken to allow the use of more probabilistic data analysis techniques, moving away from the traditional expert-system methods. The first prospectivity map was produced for the Drummond Basin, where the aim was to assess the potential of the area and to test current geological models. Prospectivity mapping, using weights of evidence techniques, was carried out at approximately 1:100,000 scale. The initial work involved database compilation, which highlighted errors and gaps in the database.
The result of the Drummond Basin study, although useful, was not as important as the analysis process itself. This provided some important lessons and focused attention on geological models and exploration methodologies. The analysis allowed the comparison of disparate datasets and revealed associations between them that are not easily recognised otherwise. This work increased confidence in the exploration models and techniques currently used in the Drummond Basin. The calculation of the prior probabilities produced a correlation matrix of the variables comprising the geological model, allowing an objective assessment of the model and of the features that should be concentrated on during exploration. Finally, working with GIS datasets has highlighted the need for good-quality data and data management. This has become a problem, as databases are presently available from a diverse number of groups, resulting in variable data quality and standards. No matter how sophisticated the analytical software, if the data are poor the results will be of similar quality. This applies to all aspects of the exploration industry, from spatial mapping (GIS) to resource modelling.
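The weights of evidence calculation referred to above can be sketched from cell counts alone. The following is a minimal illustration, not the study's actual implementation; all counts are invented for the example.

```python
import math

def weights_of_evidence(n_cells, n_deposits, n_evidence, n_evidence_with_deposit):
    """Compute W+, W- and the contrast C for one binary evidence layer.

    n_cells                 -- total unit cells in the study area
    n_deposits              -- cells containing a known deposit
    n_evidence              -- cells where the evidence pattern is present
    n_evidence_with_deposit -- cells with both the evidence and a deposit
    """
    # Conditional probabilities of the evidence given deposit / no deposit
    p_b_given_d = n_evidence_with_deposit / n_deposits
    p_b_given_nd = (n_evidence - n_evidence_with_deposit) / (n_cells - n_deposits)
    w_plus = math.log(p_b_given_d / p_b_given_nd)          # weight where evidence present
    w_minus = math.log((1 - p_b_given_d) / (1 - p_b_given_nd))  # weight where absent
    return w_plus, w_minus, w_plus - w_minus               # contrast C = W+ - W-

# Invented numbers: 10,000 cells, 20 known deposits, an evidence layer
# covering 1,000 cells, with 12 of the deposits falling on that layer.
w_plus, w_minus, contrast = weights_of_evidence(10_000, 20, 1_000, 12)
prior = 20 / 10_000  # prior probability of a deposit in any one cell
```

A positive W+ and negative W- (a large contrast) indicate that the evidence layer is spatially associated with the known deposits; the weights are then summed, layer by layer, onto the prior log-odds to map posterior probability.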
GIS, Modern Mineral Potential Modelling and Quantitative Resource Assessment: Implications for the Geological Survey of Queensland
Margaretha Scott, Geological Survey Office, Queensland Department of Mines and Energy and the WH Bryan Mining Geology Research Centre, University of Queensland.
The Geological Survey Office (GSO) undertook a pilot study with the objective of assessing the ability of Survey data to support modern quantitative mineral potential modelling techniques. The Yarrol pilot project was designed to audit digital data sets routinely produced by the GSO, and to provide mineral assessment outputs relevant to both industry and Government decision-making processes. The study focused on the estimation of mineral potential using modern quantitative methods, including the USGS three-part resource assessment methodology. Mineral potential was assessed for porphyry-copper-type deposits in part of the Yarrol Province, central Queensland, establishing:
- ground permissive for the occurrence of porphyry-copper-type deposits;
- zones favourable for the occurrence of such deposits using computer-based prospectivity modelling techniques; and
- estimates of the number of potential undiscovered deposits (probability of occurrence modelling).
In this study the statistical technique 'weights of evidence' was used for prospectivity modelling in a general PC GIS software environment. The application processes used and the results of the Yarrol case study are reported in this paper, as are the implications for the operations of the Geological Survey.
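The third step above, estimating the number of undiscovered deposits, can be illustrated with a simple Monte Carlo sketch in the spirit of the USGS three-part method. The elicited probabilities and grade-tonnage parameters below are invented for illustration, not values from the Yarrol study.

```python
import random

# Hypothetical elicited distribution for the number of undiscovered
# porphyry copper deposits in a permissive tract (probabilities sum to 1).
deposit_pmf = {0: 0.2, 1: 0.4, 2: 0.25, 3: 0.1, 4: 0.05}

# Expected number of undiscovered deposits in the tract
expected_n = sum(n * p for n, p in deposit_pmf.items())

def simulate_contained_metal(trials=10_000, seed=1):
    """Monte Carlo estimate of mean contained copper (tonnes) in the tract.

    Samples a deposit count from the elicited distribution, then a tonnage
    for each deposit from an illustrative lognormal model at a fixed grade.
    """
    rng = random.Random(seed)
    counts, probs = zip(*deposit_pmf.items())
    total = 0.0
    for _ in range(trials):
        n = rng.choices(counts, probs)[0]
        # tonnage ~ lognormal (illustrative parameters), grade fixed at 0.5 % Cu
        tonnes = sum(rng.lognormvariate(17.0, 1.0) for _ in range(n))
        total += tonnes * 0.005  # contained copper, tonnes
    return total / trials
```

In a real three-part assessment the deposit-count distribution comes from expert elicitation constrained by deposit density data, and the tonnage and grade are drawn jointly from a published grade-tonnage model rather than fixed.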
Decentralised GIS and Data Management – An M.I.M. Perspective
P.M. Jayawardhana, M.I.M. Exploration Pty Ltd
Over the last five years M.I.M. Exploration (MIMEX) has focused on developing a spatial data management system that enables the visualisation and analysis of data at the area of operation. It allows up-to-date information to be in the hands of decision makers around the world. This has been a major turnaround and rethink from the days of centralised data management.
This paper highlights some of the advantages and disadvantages of decentralised GIS from an M.I.M. (Mount Isa Mines) perspective, and what the future holds. We are living in exciting times of constant change, which require our systems to be adaptable to all conditions and environments. The challenge of the 21st Century is to harness technology to work for us and not against us, enabling us to work more efficiently and effectively, without impacting on our overall lifestyle.
The Queensland Geological Survey Office’s (GSO) Approach to the Compilation, Management and Application of its Geoscience Data
John Tuttle, Paul Garrad and Linda Toich
Geological Survey Office, Queensland Department of Mines and Energy
The Geological Survey Office (GSO) through its information assets plays a strategic role in developing and promoting the prospectivity and exploration potential of Queensland. GSO has benefited from digital technology in its data management and map production for over 15 years and has developed a high level of skill in the various data capture aspects relevant to geology. GSO and its support units have developed rigorous and effective data capture and integration strategies for field data including:
- A corporate digital geoscience data resource
- A structured approach to the capture of primary geoscience information
- A structured process for digital geological map compilation and production
- Ongoing capture of regional geophysical data over prospective terrains in Queensland
This corporate data framework ensures that users have access to the most up-to-date geoscience data sets for project compilation and analysis.
The emphasis is now being placed on extracting and disseminating the information value of this data through prospectivity analysis, and new initiatives for the delivery of industry relevant digital geoscience information. The advent of desktop GIS systems has enabled the GSO to explore more accessible methods of disseminating digital geological information about Queensland including:
- A refocussing of geological projects to output data as GIS conformable structures
- Promotional GIS packages
- Mineral occurrence information packages and detailed project specific GIS packages containing seamless digital geological and related data
- Internet map server technology across the QDME intranet and eventually to the Internet
- Internet access to the open file geophysical data sets for Qld using GDADS (Geophysical Data Archiving and Distribution System)
- Digital company reporting and imaging of existing reports.
A continued focus on evolving the GIS and corporate database environments will enable QDME to take maximum advantage of the intranet/Internet as an information access and delivery strategy for local and global exploration and mining industry clients.
Standardisation and Centralisation of Data – Data Quality and Quantity
David R. Jenkins, Terra Search Pty Ltd
P.O.Box 981, Castletown Queensland 4812, email@example.com
Digital data management in the minerals industry is still in the developmental stages. Many data management procedures currently in place are more suited to hard copy reports than digital files. Data management practices in our industry need to change in response to the major changes in communications and digital technology occurring throughout the world. Standardisation of the way data is gathered, collated and stored allows rapid integration and analysis on any scale. Standardisation on a global basis is a difficult task due to the diversity of attributes that are collected. A system needs to be comprehensive, flexible and user-friendly to allow standardisation.
The attributes collected also need to be standardised to ensure completeness of datasets. The codes used should be as consistent as possible and codes should not be concatenated to reduce the number of fields in a file.
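The point about concatenated codes can be made concrete with a small sketch. The code scheme and field names below are invented for illustration; the idea is simply that one field per attribute keeps the data queryable, whereas a packed code must be unpicked before any analysis.

```python
# Hypothetical packed codes: "GN_PROP_QV" might mean granite host,
# propylitic alteration, quartz veining. Storing this as one field
# forces fragile substring parsing on every query.
concatenated = ["GN_PROP_QV", "BAS_NONE_NA", "GN_PHYL_QV"]

# Standardised alternative: one field per attribute.
FIELDS = ("lithology", "alteration", "veining")

def split_code(code):
    """Expand a packed code into one value per standard field."""
    return dict(zip(FIELDS, code.split("_")))

records = [split_code(c) for c in concatenated]

# With separate fields, filtering is a direct comparison, not string surgery:
granites = [r for r in records if r["lithology"] == "GN"]
```

The same principle applies whether the store is a flat file, a spreadsheet or a relational database: separate fields make the code list for each attribute auditable on its own, which is what keeps the dataset complete and consistent.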
Operating system standardisation, while less important, must still allow free movement of data between software packages. Storage media should be easily readable and preferably online.
Duplication of data should be avoided as much as possible. Centralising data at the office level removes the risk of having multiple copies of the data without knowing which, if any, is the most up-to-date version. Centralisation of data on a wider scale should only occur when the data is needed centrally. Maintaining two versions of a database is inefficient if the data is only used in one of the locations. A more elegant solution is to have the databases linked online to allow querying and retrieval when required. This requires consistency between the systems in each location.
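The "query on demand rather than copy" model described above can be sketched as follows. SQLite stands in here for whatever shared RDBMS the offices actually run, and the table, column and project names are invented for the example.

```python
import sqlite3

# One authoritative drillhole table lives at the owning office;
# remote users query it rather than holding their own copies.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE drillhole (hole_id TEXT PRIMARY KEY, project TEXT, depth_m REAL)"
)
conn.executemany(
    "INSERT INTO drillhole VALUES (?, ?, ?)",
    [("DD001", "Yarrol", 250.0), ("DD002", "Yarrol", 410.5), ("RC010", "Drummond", 120.0)],
)
conn.commit()

# A remote user retrieves only what is needed, when it is needed --
# there is never a second copy of the table to fall out of date.
rows = conn.execute(
    "SELECT hole_id, depth_m FROM drillhole WHERE project = ? ORDER BY hole_id",
    ("Yarrol",),
).fetchall()
```

The consistency requirement mentioned above is what makes this work: both ends must agree on the schema and code lists, otherwise remote queries return data the requesting office cannot interpret.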
A standardised data structure will solve many of the long-term data problems while also streamlining current data gathering. Centralisation on a project level is critical. Broader centralisation of a group’s data may be desirable but will only be cost effective if standardisation of data management has occurred on the same scale.