
Reciprocal semantic predictions drive categorization of scene contexts and objects even when they are separate



Abstract

Visual categorization improves when object-context associations in scenes are semantically consistent, and thus predictable from schemas stored in long-term memory. However, it is unclear whether this advantage arises from differences in early perceptual processing, in the matching of memory representations, or in later stages of response selection. We tested these three competing explanations across five experiments. On each trial, participants categorized a scene context and an object presented briefly within the same image (Experiment 1) or separately in simultaneous images (Experiments 2-5). We examined unilateral (Experiments 1, 3) and bilateral presentations (Experiments 2, 4, 5), as well as presentations on the screen's horizontal midline (Experiments 1-2) and in the upper and lower visual fields (Experiments 3, 4). In all experiments, we found a semantic consistency advantage for both context categorization and object categorization. This shows that memory for object-context semantic associations is activated regardless of whether the two scene components are integrated in the same percept. Our study suggests that the facilitating effect of semantic consistency on categorization occurs at the stage of matching the percept with prior knowledge, supporting the object selection account and extending this framework to a reciprocal object-context influence on matching processes (the object-context selection account).

Acceptance Date Mar 20, 2020
Publication Date May 21, 2020
Publicly Available Date Mar 28, 2024
Journal Scientific Reports
Print ISSN 2045-2322
Publisher Nature Publishing Group
Article Number 8447
DOI https://doi.org/10.1038/s41598-020-65158-y
Keywords human behaviour, visual system
Publisher URL https://doi.org/10.1038/s41598-020-65158-y
