London, Nov 21 (ANI): Have you ever wondered how we form an accurate perception of our environment out of smell, taste, hearing, sight and touch?
Now, a team of scientists at the University of Rochester, Washington University in St. Louis, and Baylor College of Medicine has solved the mystery.
The human brain is bombarded with a cacophony of information from the eyes, ears, nose, mouth and skin.
But the team has discovered how the brain manages to process those complex, rapidly changing, and often conflicting sensory signals to make sense of our world.
They say a relatively simple computation performed by single nerve cells, an operation that can be described mathematically as a straightforward weighted average, makes it possible.
The key is that the neurons have to apply the correct weights to each sensory cue, and the authors reveal how this is done.
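The weighted-average idea can be illustrated with a small sketch. The following is not the study's actual model, just a standard textbook formulation of reliability-weighted cue combination, in which each cue is weighted in proportion to its inverse variance; the function name, example values, and the visual/vestibular scenario are illustrative assumptions.

```python
# Illustrative sketch (not the study's actual model): reliability-weighted
# averaging of sensory cues, with each cue weighted in proportion to its
# inverse variance, as in standard models of optimal cue integration.

def combine_cues(estimates, variances):
    """Return the weighted-average estimate and its variance.

    estimates: per-cue estimates of the same quantity (e.g. heading angle)
    variances: per-cue noise variances (lower variance = more reliable cue)
    """
    weights = [1.0 / v for v in variances]          # reliability = 1 / variance
    total = sum(weights)
    weights = [w / total for w in weights]          # normalize to sum to 1
    combined = sum(w * e for w, e in zip(weights, estimates))
    combined_variance = 1.0 / total                 # never larger than the best single cue
    return combined, combined_variance

# Hypothetical example: a reliable visual cue and a noisier vestibular cue
# giving different estimates of heading direction (in degrees).
heading, var = combine_cues([10.0, 14.0], [1.0, 4.0])
# The combined estimate (10.8) lies closer to the more reliable visual cue.
```

In this scheme a neuron (or a population of neurons) never has to decide outright which cue to trust: the weighting falls out of the relative reliabilities themselves.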
The study represented the first direct evidence of how the brain combines multiple sources of sensory information to form as accurate a perception as possible of its environment, the researchers said.
It showed that the brain does not have to first "decide" which sensory cue is more reliable. It also demonstrated that the low-level computations performed by single neurons, when repeated by millions of neurons performing similar computations, account for the brain's complex ability to know which sensory signals to weight as more important.
"Thus, the brain essentially can break down a seemingly high-level behavioural task into a set of much simpler operations performed simultaneously by many neurons," explained study coauthor Greg DeAngelis, professor and chair of brain and cognitive sciences at the University of Rochester.
The discovery may eventually lead to new therapies for people with Alzheimer's disease and other disorders that impair a person's sense of self-motion, noted DeAngelis.
The study will be published online Nov. 20 in Nature Neuroscience. (ANI)