Our brains keep tabs on threats in our peripheral vision, researchers find
By Wade Hemsworth
September 15, 2021
While you’re concentrating on whatever’s right in front of your eyes, your brain is doing you a favour by sorting objects in your peripheral vision in time for you to be able to react if necessary.
McMaster researchers from the Department of Psychology, Neuroscience and Behaviour have shown that our brains are registering and sorting objects on the outer edges of our field of vision based on their distance from us.
Co-authors Hong-jin Sun, Allison Sekuler and Patrick Bennett are the academic supervisors of graduate student Jiali Song, the lead author of a newly published paper in the Journal of Vision.
Many experts had previously thought the eye and brain could sort peripheral objects only by how much of the field of vision they occupied. As it turns out, the brain uses a more complex analysis to determine whether an object is, say, a distant parked car or a nearby bee.
Both might be the same size to the eye, but by applying context and perspective, our brains are able to triage objects in the periphery and devote more attention to whatever appears nearer.
Such sorting permits us to make more informed decisions about possible threats to our safety, Sun says, by devoting as much as 20 per cent more attention to the peripheral objects that are closest to us and may demand urgent action.
“It’s a huge difference, and until now it has been overlooked,” says Sun, an associate professor of Psychology, Neuroscience and Behaviour. “We can’t process everything around us to the same degree, and this shows us how the brain is prioritizing the finite resources it can devote to visual processing.”
Understanding how the brain and eye co-ordinate to create layers of visual perception is important not only for its own sake, Song explains, but also as a window into how we think.
The idea for the experiment originated with other research on perceptions related to driving safety. That led to a series of driving-simulation experiments that produced new information about the previously unrecognized level of context the brain is processing, often at high speed.
The researchers showed that subjects in a driving simulation, while concentrating on following the car ahead of them, could still differentiate between objects that appeared for less than a second in their peripheral vision and could accurately recall where those objects had appeared.
When we drive, we concentrate on what we’re seeing directly, Song says, but a lot of safety-related information comes from our peripheral vision, where hazards typically appear first. Knowing that the brain and eyes are co-ordinating to sort such information into a hierarchy will fuel new research into the structures that facilitate such complex processing, she says.