Quantifying accuracy improvement in sets of pooled judgments: does dialectical bootstrapping work?

Details

Resource 1
Download: BIB_C60340BE1A69.P001.pdf (102.38 KB)
Status: Public
Version: Author's accepted manuscript
Serval ID
serval:BIB_C60340BE1A69
Type
Article: journal or magazine article.
Collection
Publications
Title
Quantifying accuracy improvement in sets of pooled judgments: does dialectical bootstrapping work?
Journal
Psychological Science
Authors
White C.M., Antonakis J.
ISSN
1467-9280 (Electronic)
ISSN-L
0956-7976
Editorial status
Published
Publication date
01/01/2013
Peer-reviewed
Yes
Volume
24
Issue
1
Pages
115-116
Language
English
Notes
Publication types: Journal Article
Publication Status: ppublish
Abstract
Galton (1907) first demonstrated the "wisdom of crowds" phenomenon by averaging independent estimates of unknown quantities given by many individuals. Herzog and Hertwig (2009, in Psychological Science; hereafter H&H) showed that individuals' own estimates can be improved by asking them to make two estimates at separate times and averaging them. H&H claimed to observe far greater improvement in accuracy when participants received "dialectical" instructions to consider why their first estimate might be wrong before making their second estimates than when they received standard instructions. We reanalyzed H&H's data using measures of accuracy that are unrelated to the frequency of identical first and second responses and found that participants in both conditions improved their accuracy to an equal degree.
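The abstract refers to two averaging procedures: pooling independent estimates across individuals and averaging a single individual's first and second estimates. The Python sketch below is a minimal, hypothetical illustration of those procedures only; the numbers are invented and absolute error is an assumed accuracy measure, so it does not reproduce the authors' reanalysis.

# Minimal sketch (invented numbers; absolute error assumed as the accuracy measure).
true_value = 100.0  # hypothetical unknown quantity

# Wisdom of crowds: average independent estimates from several individuals.
crowd_estimates = [80.0, 95.0, 120.0, 105.0, 90.0]
crowd_average = sum(crowd_estimates) / len(crowd_estimates)
print(abs(crowd_average - true_value))          # error of the pooled judgment

# Within-person averaging: combine one person's first and second estimates.
first_estimate, second_estimate = 80.0, 110.0
within_person_average = (first_estimate + second_estimate) / 2
print(abs(first_estimate - true_value))         # error of the first estimate alone
print(abs(within_person_average - true_value))  # error after averaging the two estimates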
Keywords
Feedback, Psychological, Humans, Judgment, Meta-Analysis as Topic, Problem Solving, Statistics as Topic/methods
Record created
22/04/2012 9:55
Record last modified
20/08/2019 15:41