Your Brain Chooses What to Let You See


humble3d



Beneath our awareness, the brain lets certain kinds of stimuli automatically capture our attention by lowering the priority of the rest.


An osprey can zero in on an unassuming trout, even amid distracting sensory information such as the motion of the river’s current. This ability is due in part to automatic filtering mechanisms hard-wired in the brain. Photo by Jeff J Mitchell / Staff / Getty Images.

 

Quanta recently reported on the filtering mechanisms that allow us to focus our attention on stimuli of interest — that let us tune out the music in a room to listen to a nearby conversation, or disregard greens, blues and yellows in a crowd when searching for a friend wearing red. That kind of processing, which involves the suppression of some sensory data to highlight signals that are more relevant, is directed by a goal.

 

But other processes operate well below this level of awareness, filtering out information that the brain deprioritizes without our willing it, or even knowing it. In these cases, our focus is directed not by a goal but rather by particular properties of the stimuli, like their brightness or motion — properties that we’re seemingly hard-wired to consider important. “It makes sense from an evolutionary perspective,” said Duje Tadin, a neuroscientist at the University of Rochester. “If something is moving, it’s often fairly important to your survival.”

 

Scientists have long known that our sensory processing must automatically screen out extraneous inputs — otherwise, we couldn’t experience the world as we do. When we look at our surroundings, for instance, our perceived field of view holds steady or moves smoothly with our gaze. But the eye is also constantly making small movements, or saccades; our visual system has to subtract that background jitter from what we see.
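One common textbook account of this subtraction (not spelled out in the article, so treat this as an illustrative assumption) is that the brain keeps a copy of its own eye-movement commands and cancels them out of the retinal signal. A toy sketch, with made-up numbers:

```python
# Toy illustration only: cancel self-generated motion by subtracting a
# copy of the eye-movement command from the total shift on the retina.
# All numeric values are invented for demonstration.

def perceived_motion(retinal_shift, eye_movement):
    """Estimated world motion = total retinal shift minus self-motion."""
    return retinal_shift - eye_movement

# A 2-degree saccade while the world is still: the retina sees a
# 2-degree shift, the subtraction leaves zero, and the scene looks stable.
stable_world = perceived_motion(retinal_shift=2.0, eye_movement=2.0)

# The same saccade while an object really moves 0.5 degrees: the
# subtraction leaves only the object's true motion.
object_motion = perceived_motion(retinal_shift=2.5, eye_movement=2.0)

print(stable_world)   # 0.0
print(object_motion)  # 0.5
```

The point is only the arithmetic of the cancellation: jitter the brain generated itself is removed before it ever reaches awareness.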

 

“Automatic suppressive types of mechanisms take place … through large swaths of the brain,” said Richard Krauzlis, a neuroscientist at the National Eye Institute at the National Institutes of Health in Maryland. “Basically all over the place.”

 

And automatic background subtraction, it turns out, can also manifest in intriguing, unexpected ways. Take a counterintuitive finding that Tadin and his colleagues made in 2003: We’re good at perceiving the movements of small objects, but if those objects are simply made bigger, we find it much more difficult to detect their motion.

 

Recently in Nature Communications, Tadin’s team offered a tantalizing explanation for why this happens: The brain prioritizes the detection of objects that are more important for us to see, and those tend to be smaller. To a hawk hunting for its next meal, a mouse suddenly darting through a field matters more than the swaying motion of the grass and trees around it. As a result, Tadin and his team discovered, the brain suppresses information about the movement of the background — and as a side effect, it has more difficulty perceiving the movements of larger objects, because it treats them as a kind of background, too.
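The size effect can be caricatured with a simple center-surround arithmetic: excitation from a moving stimulus grows with its size, but suppression from the surround grows faster, so the net motion signal is weaker for large stimuli. This is a toy sketch of our own, not Tadin’s actual model, and the gains are invented:

```python
# Toy sketch (illustrative only): net motion signal = excitation minus a
# surround suppression term that grows faster with stimulus size, so
# larger moving stimuli can produce a *weaker* net signal.
# Gains and units are arbitrary, chosen only to show the crossover.

def motion_signal(size, excitation_gain=1.0, suppression_gain=0.15):
    """Net motion signal (arbitrary units) for a stimulus of a given size."""
    excitation = excitation_gain * size
    suppression = suppression_gain * size ** 2  # surround grows faster
    return excitation - suppression

small = motion_signal(2.0)  # small object: suppression is negligible
large = motion_signal(6.0)  # large object: suppression dominates

print(small > large)  # True: the bigger stimulus yields a weaker signal
```

Under this caricature, making a stimulus bigger eventually hurts detectability — the qualitative pattern the 2003 experiments reported.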

 

The team further confirmed this idea with a training experiment conducted in older adults. Other researchers had previously reported that there’s not much difference between how well seniors observe the motion of a small object and the motion of a larger one. Because of this, Tadin and his colleagues predicted that older people would have problems spotting small moving objects against a moving backdrop — and that’s exactly what they found. Still, with a few weeks’ training, the test subjects got much better at recognizing that motion.

 

Yet, as the researchers discovered, the training didn’t actually improve the subjects’ ability to detect small moving objects; when measured alone, that skill hadn’t changed. Instead, their performance bump occurred because they were less distracted: They had gotten worse at detecting the movements of the larger background objects. “In some sense, their brain discarded information it was able to process only five weeks ago,” Tadin said.

 

What these results highlighted, he added, is that our sensitivity to larger moving objects is lower “because that’s the strategy our brain uses to make smaller moving objects against those backgrounds stand out more.”

 

It’s the same strategy (executed by a different mechanism) that the brain uses in goal-directed attentional processes: It gets rid of information that’s distracting or less useful in order to make the more relevant inputs stand out.

 

“Before attention gets to do its job,” Tadin said, “there’s already a lot of pruning of information.” For motion perception, that pruning has to happen automatically because it needs to be done very quickly. “Attention can do the same thing in much smarter and more flexible ways, but not so effortlessly.”

 

Together, these processes — both the automatic bottom-up ones and the more conscious top-down ones — generate the brain’s internal representation of its environment. It is what Ian Fiebelkorn, a cognitive neuroscientist at Princeton University, refers to as a “priority map,” with peaks and valleys that dictate where attentional resources should be aimed. Through learning and training, he said, top-down goals continue to “manipulate that map, amplifying or suppressing the peaks” that represent salient properties of a stimulus.
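“Priority map” is Fiebelkorn’s term, but the following minimal sketch of how such a map could combine bottom-up salience with top-down gain is our own simplification, with made-up numbers:

```python
# Toy "priority map" (our simplification, not Fiebelkorn's model):
# bottom-up salience is modulated by top-down gain, and attention goes
# to the peak of the combined map. All values are invented.

# A strip of visual locations with bottom-up salience values,
# e.g. driven by brightness or motion.
salience = [0.2, 0.9, 0.4, 0.6, 0.1]

# Top-down gain from the current goal: e.g. "look for red" amplifies
# location 3 and suppresses the merely bright location 1.
gain = [1.0, 0.5, 1.0, 2.0, 1.0]

# Combined priority map: salience amplified or suppressed per location.
priority = [s * g for s, g in zip(salience, gain)]

# Attentional resources are aimed at the map's highest peak.
focus = priority.index(max(priority))
print(focus)  # location 3 wins: 0.6 * 2.0 = 1.2 beats 0.9 * 0.5 = 0.45
```

The design point is that neither signal decides alone: a goal can make a modestly salient location outcompete the raw bottom-up winner.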

 

When it comes to how and what we perceive, Tadin said, “there’s a lot going on behind the scenes that we just take for granted.”

 


https://getpocket.com/explore/item/your-brain-chooses-what-to-let-you-see?utm_source=pocket-newtab

 

 
