Multisensory context portends object memory.

Details

Resource 1 Download: 25137580_AM.pdf (2145.46 KB)
State: Public
Version: Author's accepted manuscript
Serval ID
serval:BIB_FAC545C6371B
Type
Article: article from journal or magazine.
Publication sub-type
Letter: communication to the publisher.
Collection
Publications
Title
Multisensory context portends object memory.
Journal
Current Biology
Author(s)
Thelen A., Matusz P.J., Murray M.M.
ISSN
1879-0445 (Electronic)
ISSN-L
0960-9822
Publication state
Published
Issued date
2014
Volume
24
Number
16
Pages
R734-R735
Language
English
Notes
Publication type: Letter. Publication status: ppublish. Document type: Letter.
Abstract
Multisensory processes facilitate perception of currently presented stimuli and can likewise enhance later object recognition. Memories for objects originally encountered in a multisensory context can be more robust than those for objects encountered in an exclusively visual or auditory context [1], upturning the assumption that memory performance is best when encoding and recognition contexts remain constant [2]. Here, we used event-related potentials (ERPs) to provide the first evidence for direct links between multisensory brain activity at one point in time and subsequent object discrimination abilities. Across two experiments we found that individuals showing a benefit and those impaired during later object discrimination could be predicted by their brain responses to multisensory stimuli upon their initial encounter. These effects were observed despite the multisensory information being meaningless, task-irrelevant, and presented only once. We provide critical insights into the advantages associated with multisensory interactions: they are not limited to the processing of current stimuli, but likewise encompass the ability to determine the benefit of one's memories for object recognition in later, unisensory contexts.
PubMed
Web of Science
Open Access
Yes
Create date
19/09/2014 17:54
Last modification date
20/08/2019 16:26