Schyns, P.G., Petro, L. and Smith, M.L. (2007) Dynamics of visual information integration in the brain for categorizing facial expressions. Current Biology 17 (18), pp. 1580-1585. ISSN 0960-9822.
Abstract
A key to understanding visual cognition is to determine when, how, and with what information the human brain distinguishes between visual categories. So far, the dynamics of information processing for the categorization of visual stimuli have not been elucidated. By using an ecologically important categorization task (seven expressions of emotion), we demonstrate, in three human observers, that an early brain event (the N170 Event Related Potential, occurring 170 ms after stimulus onset [1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16]) integrates visual information specific to each expression, according to a pattern. Specifically, starting 50 ms prior to the ERP peak, facial information tends to be integrated from the eyes downward in the face. This integration stops, and the ERP peaks, when the information diagnostic for judging a particular expression has been integrated (e.g., the eyes in fear, the corners of the nose in disgust, or the mouth in happiness). Consequently, the duration of information integration from the eyes down determines the latency of the N170 for each expression (e.g., with “fear” being faster than “disgust,” itself faster than “happy”). For the first time in visual categorization, we relate the dynamics of an important brain event to the dynamics of a precise information-processing function.
Metadata
Item Type: Article
School: Birkbeck Faculties and Schools > Faculty of Science > School of Psychological Sciences
Depositing User: Sarah Hall
Date Deposited: 10 Mar 2020 15:26
Last Modified: 02 Aug 2023 17:58
URI: https://eprints.bbk.ac.uk/id/eprint/31257