Evidence from a new study published in PLOS Computational Biology by researchers from Brown University and led by Assistant Professor Thomas Serre suggests that when we analyze scenery we simply make the easiest judgments first, rather than following a priority order of categories.
There are many ways we understand scenery. Is it navigable or obstructed? Natural or man-made? A face or not a face? In previous experiments, researchers have found that some categorization tasks seem special, in that they occur earlier than others, leading to a hypothesis that the brain has a prescribed set of priorities. One example of this, the "superordinate advantage," holds that people will first sort out global properties of a scene, or "superordinate" categorization, before analyzing more specific properties, or "basic" categorization. Judging "indoor vs. outdoor," the hypothesis goes, not only happens before "kitchen vs. bathroom" but must happen first.
To check that assumption, Serre and colleagues iterated upon a standard computational model that could reliably rate the "discriminability" of scenery, or how easily images can be categorized. Then they ran two experiments with human volunteers. The first showed that the more discriminable the model predicted a scene to be, the faster and more accurately people categorized it. The second showed that by manipulating discriminability they could completely wipe out the "superordinate advantage": if a more basic categorization was easier, it happened faster than the superordinate categorization.
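The paper's actual model is not described in this release. As an illustrative sketch only, "discriminability" can be operationalized as an image's distance from a classifier's decision boundary: scenes far from the boundary are easy to categorize, scenes near it are ambiguous. The features, categories, and classifier below are hypothetical stand-ins, not the authors' method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for image feature vectors from two scene categories
# (e.g. "indoor" vs. "outdoor") -- purely illustrative data, not the study's.
n = 200
X = np.vstack([rng.normal(-1.0, 1.0, (n, 2)),
               rng.normal(+1.0, 1.0, (n, 2))])
y = np.concatenate([np.zeros(n), np.ones(n)])

# Fit a linear classifier with plain logistic-regression gradient descent.
w, b = np.zeros(2), 0.0
for _ in range(300):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # predicted P(category 1)
    w -= 0.3 * (X.T @ (p - y)) / len(y)
    b -= 0.3 * np.mean(p - y)

def discriminability(x):
    """Distance from the decision boundary: larger = easier to categorize."""
    return abs(x @ w + b) / np.linalg.norm(w)

# A prototypical category member sits far from the boundary;
# an ambiguous scene sits near it and should score lower.
easy = np.array([2.0, 2.0])
hard = np.array([0.0, 0.0])
print(discriminability(easy) > discriminability(hard))
```

Under this toy operationalization, the study's manipulation corresponds to choosing stimuli whose boundary distance is larger for the "basic" task than for the "superordinate" one.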
"The mere fact that it is possible to reverse [the superordinate advantage] shows that it is not a sequential type of process," Serre said. "Whatever is happening in the visual system might not be as sophisticated as we thought."
It's certainly still possible that a hybrid of the two hypotheses exists, Serre said. There may be some hierarchy of priorities, but discriminability is so powerful a factor that it can overwhelm it. Further experiments are underway.
Image Credit: Flickr / Creative Commons
Image Link: https:/
All works published in PLOS Computational Biology are Open Access, which means that all content is immediately and freely available. Use this URL in your coverage to provide readers access to the paper upon publication: http://journals.
Contact: Thomas Serre
Address: Cognitive, Linguistic & Psychological Sciences Department, Brown Institute for Brain Science, Brown University, Providence, Rhode Island, United States of America
Citation: Sofer I, Crouzet SM, Serre T (2015) Explaining the Timing of Natural Scene Understanding with a Computational Model of Perceptual Categorization. PLoS Comput Biol 11(9):e1004456. doi:10.1371/journal.pcbi.1004456
Funding: This work was supported by the National Science Foundation (NSF) early career award [grant number IIS-1252951 to TS]. Additional support was provided by the Defense Advanced Research Projects Agency (DARPA) young faculty award [grant number YFA N66001-14-1-4037 to TS], the Office of Naval Research (ONR) grant [grant number N000141110743 to TS], the Brown Institute for Brain Sciences (BIBS), the Center for Vision Research (CVR), and the Center for Computation and Visualization (CCV). The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.
Competing Interests: The authors have declared that no competing interests exist.
About PLOS Computational Biology
PLOS Computational Biology features works of exceptional significance that further our understanding of living systems at all scales through the application of computational methods. All works published in PLOS Computational Biology are Open Access. All content is immediately available and subject only to the condition that the original authorship and source are properly attributed. Copyright is retained. For more information follow @PLOSCompBiol on Twitter or contact email@example.com.
PLOS is a nonprofit publisher and advocacy organization founded to accelerate progress in science and medicine by leading a transformation in research communication. For more information, visit http://www.