Quantitative evaluation of multiple-point simulations using image segmentation and texture descriptors

Details

ID Serval
serval:BIB_66C2F538CBCE
Type
Article: journal or magazine article.
Collection
Publications
Title
Quantitative evaluation of multiple-point simulations using image segmentation and texture descriptors
Journal
Computational Geosciences
Authors
Abdollahifard Mohammad Javad, Mariéthoz Grégoire, Ghavim Maryam
ISSN
1420-0597
1573-1499
Editorial status
Published
Publication date
12/2019
Peer-reviewed
Oui
Volume
23
Issue
6
Pages
1349-1368
Language
English
Abstract
The continuous growth of multiple-point simulation (MPS) algorithms for modeling environmental variables necessitates a straightforward, reliable, robust, and distinctive method for evaluating the quality of output images. A good simulation method should produce realizations consistent with the training image (TI). Moreover, it should be capable of producing diverse realizations to effectively model the variability of real fields. In this paper, the pattern innovation capability is evaluated by estimating the coherence map using keypoint detection and matching, without assuming any access to the simulation process. Local binary patterns, as distinctive and effective texture descriptors, are also employed to evaluate the consistency of realizations with the TI. Our proposed method provides absolute measures in the interval [0,1], allowing MPS algorithms to be evaluated on their own. Experiments show that the produced scores are consistent with human perception and robust across different realizations obtained with the same method, allowing for a reliable judgment using only a few realizations. While a human observer is highly sensitive to discontinuities and insensitive to verbatim copies, the proposed method considers both factors simultaneously.
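For readers unfamiliar with the local binary patterns mentioned in the abstract, the following is a minimal illustrative sketch of the classic 8-neighbour LBP descriptor (radius 1) and its histogram, written in Python with NumPy. It is not the paper's exact implementation; function names and the ≥-comparison convention are assumptions for illustration only.

```python
import numpy as np

def lbp_8(image):
    """Basic 8-neighbour local binary pattern (radius 1).

    Each interior pixel is encoded as an 8-bit integer whose bits
    indicate which of its neighbours are >= the centre pixel.
    """
    img = np.asarray(image, dtype=float)
    center = img[1:-1, 1:-1]
    # Neighbour offsets ordered clockwise from the top-left.
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    codes = np.zeros_like(center, dtype=np.uint8)
    for bit, (dy, dx) in enumerate(offsets):
        neigh = img[1 + dy:img.shape[0] - 1 + dy,
                    1 + dx:img.shape[1] - 1 + dx]
        codes |= (neigh >= center).astype(np.uint8) << bit
    return codes

def lbp_histogram(image):
    """Normalised 256-bin LBP histogram, usable as a texture signature."""
    codes = lbp_8(image)
    hist = np.bincount(codes.ravel(), minlength=256).astype(float)
    return hist / hist.sum()
```

Comparing such histograms between a realization and the training image (e.g. with a chi-square or intersection distance) is one common way to quantify texture consistency; the paper builds its consistency score on this family of descriptors.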
Keywords
Geostatistics, Variability, Texture analysis, Local binary patterns, Coherence map
Web of Science
Record created
15/05/2020 8:48
Record last modified
18/05/2024 5:59