Abstract
The human brain is an efficient, adaptive, and predictive machine, constructing a generative model of the environment that we then perceive and become conscious of. Here, we show that different types of prediction errors - the discrepancies between top-down expectations and bottom-up sensory input - are integrated across processing levels and sensory modalities of the cortical hierarchy. We designed a novel, hybrid protocol in which five prediction-establishing sounds were played in rapid succession (e.g., "meow", "meow", "meow", etc.), followed by either a standard (e.g., "meow") or a deviant (e.g., "woof") prime sound, and then a visual target word that was either congruent or incongruent with the prime sound (e.g., "cat" or "dog"). We found that deviants elicited a more negative voltage than standards at about 150 ms - the mismatch negativity (MMN), an event-related potential (ERP) sensitive to low-level perceptual violations - and that incongruent words elicited a more negative voltage than congruent words at about 350 ms - the N400, an ERP sensitive to high-level semantic violations. We also found that the N400 was context-dependent: it was larger when the target words were preceded by a standard than by a deviant. Our results suggest that perceptual prediction errors modulate subsequent semantic prediction errors. We conclude that these findings are consistent with one of the most important assumptions of predictive coding theories: hierarchical prediction-error processing.
Citation
ID: 79944
Ref Key: jack2019semanticbrain