Virtual Reality vs. Real Life: How Brain Neurons Light Up
Mon, 01/12/2015 - 8:56am
by Cynthia Fox,
Science Writer
Space-mapping brain neurons do not “light up” in recordings when exposed to the virtual reality (VR) at work in kids’ video games the way they do in the “real world.” The neurons—found in the hippocampus—mirror the “reality” state only about 50 percent of the time.
This was according to a paper published in a recent issue of Nature Neuroscience by the team of neurobiologist Mayank Mehta, a University of California, Los Angeles (UCLA) professor of physics and neurology, in which rats were exposed to VR while their bodies were partially immobilized.
“I think this paper makes it clear that spatial
perception depends on a synthesis of information from multiple sources,
including all the senses,” Loren Frank told Bioscience Technology. Frank, a
UCSF neuroscientist, was not involved in the study. “In a way, that is what
makes the hippocampus remarkable: it is able to create a coherent
representation of space from many different sorts of information.”
Several neuroscientists contacted by Bioscience Technology agreed the work solidifies the notion that the less mobile the head is during games and experiments, the less life-like the brain finds the experience.
“This paper did not surprise me based on the views we
expressed in our commentary paper that was published in the Journal of
Cognitive Neuroscience last year,” said Dartmouth College neuroscientist
Jeffrey Taube. “Our paper discusses the importance of movement cues being
available (proprioceptive, motor, vestibular) for normal spatial
cognition/perception. Without these cues available, it is not surprising that
the place cell signal in the hippocampus may not be as robust or as
well-defined as usual. It's like playing football with half a team. You can do
it, but you won't do it as well as when you have a full team (analogous to a
full constellation of available cues).”
He concluded, “This paper shows the importance of these
cues for normal spatial processing.”
Mirroring the world of teen games
From the start, Mehta set out to replicate the experience
of teens playing video games. “Without doubt, it is the system we have used
that best approximates the conditions used in humans,” Mehta told Bioscience
Technology. “In fact, we devised our VR system to mimic not only video games
used for recreation, but far more importantly, the video games used in clinics
to diagnose memory damage in patients. We emphasized this quite a bit in the
manuscript.”
Mehta’s team focused on the hippocampus, a brain region
critical for memory and making spatial “maps.” When people explore any space in
real life, their hippocampal neurons generate “cognitive maps.” Neuroscientists
believe that the hippocampus does this by analyzing distances between objects
and landmarks. Other senses—like smell and hearing—play a role as well.
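For a more concrete picture of what such a map looks like in the data, a place cell is often approximated as a neuron whose firing rate peaks near one preferred location. The short Python sketch below is purely illustrative (the field centers, widths and peak rates are invented numbers, not values from Mehta’s study), but it shows how a handful of such cells can, together, tile an environment.

```python
import numpy as np

def place_cell_rate(position, field_center, field_width=0.1, peak_rate=20.0):
    """Toy place-cell model: firing rate (spikes/s) falls off as a Gaussian
    with distance from the cell's preferred location (units are arbitrary)."""
    return peak_rate * np.exp(-((position - field_center) ** 2) / (2 * field_width ** 2))

# A small population of hypothetical place cells tiling a 1-meter track.
track = np.linspace(0.0, 1.0, 101)          # positions along the track (m)
field_centers = np.linspace(0.1, 0.9, 5)    # one invented field center per cell
rate_maps = np.array([place_cell_rate(track, c) for c in field_centers])

# At each position a different cell is the most active one, which is one
# simple sense in which the ensemble forms a "map" of the environment.
most_active_cell = rate_maps.argmax(axis=0)
print(most_active_cell)
```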
To test whether the hippocampus can create spatial maps
using only “seen” landmarks—sans all other sensory triggers—Mehta’s team
created a noninvasive VR environment and watched the response of hundreds of
hippocampal neurons in rats. Rats were harnessed and placed on a treadmill surrounded by a “virtual world” on video screens—an environment more immersive for a rat than IMAX is for humans—in a dark, isolated room. Mehta’s team analyzed the behavior of the rats and their neurons both there and in a “real” room designed to look like the virtual one.
Results were dramatically different. In the virtual
world, hippocampal neurons fired randomly, as if the neurons didn’t know the
rats’ locations at all. This was despite the fact that the rats behaved
normally in both worlds.
Effectively, the cognitive map vanished in the VR
environment, and more than half of the neurons that were highly active in the
“real” environment turned off.
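One way to picture that result is as a simple counting exercise over the recorded cells. The Python sketch below is not the paper’s analysis; it is a hypothetical illustration, with made-up spike counts and an arbitrary 1 Hz activity threshold, of how one might tally which cells remain spatially active in each condition.

```python
import numpy as np

def spatial_rate_map(spike_counts, occupancy_s):
    """Firing rate per spatial bin: spikes divided by time spent in that bin."""
    return spike_counts / np.maximum(occupancy_s, 1e-9)

def is_spatially_active(rate_map, peak_threshold_hz=1.0):
    """Crude criterion: the cell's peak in-field rate exceeds a threshold."""
    return rate_map.max() > peak_threshold_hz

rng = np.random.default_rng(0)
n_cells, n_bins = 100, 50
occupancy = np.full(n_bins, 2.0)  # seconds spent in each spatial bin (made up)

# Hypothetical data: in the "real" room most cells fire in a place field;
# in the VR room a majority of cells are nearly silent.
real_spikes = rng.poisson(lam=4.0, size=(n_cells, n_bins))
silent_in_vr = rng.random(n_cells) < 0.6
vr_spikes = rng.poisson(lam=np.where(silent_in_vr, 0.05, 2.0)[:, None],
                        size=(n_cells, n_bins))

real_active = sum(is_spatially_active(spatial_rate_map(s, occupancy)) for s in real_spikes)
vr_active = sum(is_spatially_active(spatial_rate_map(s, occupancy)) for s in vr_spikes)
print(f"active in real room: {real_active}/{n_cells}; active in VR: {vr_active}/{n_cells}")
```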
As Mehta noted, the virtual world in the experiment was
very similar to virtual reality used by humans.
Neural music interrupted
Mehta’s team also looked at groups of neurons. Research
has shown that neuron groups generate electrical activity in complicated
patterns key for learning. Such neurons communicate using two different
languages—at once. One language is based on rhythm; the other, intensity.
Astonishingly, all hippocampal neurons speak both languages simultaneously.
Mehta likens this to complementary melodies in a Bach fugue. In the virtual and real worlds, the rhythm-based language retains a similar structure, even though it conveys different messages in each; it is the intensity-based language that is utterly disrupted.
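The two languages can be made concrete. The intensity-based language is simply how many spikes a cell fires per second, while the rhythm-based language is where each spike falls within the ongoing theta oscillation. The Python sketch below illustrates both readouts on one invented spike train; it is a generic, textbook-style calculation under assumed numbers, not the analysis used in the paper.

```python
import numpy as np

theta_freq_hz = 8.0                      # hippocampal theta is roughly 6-10 Hz
duration_s = 2.0
spike_times_s = np.array([0.12, 0.25, 0.38, 0.52, 0.63, 0.77, 0.91, 1.05])  # invented

# Language 1, intensity (rate code): how vigorously the cell fires.
firing_rate_hz = len(spike_times_s) / duration_s

# Language 2, rhythm (phase code): where each spike lands within the
# ongoing theta cycle, expressed in radians from 0 to 2*pi.
theta_phase_rad = 2 * np.pi * ((spike_times_s * theta_freq_hz) % 1.0)

# A simple measure of how consistently the spikes lock to the rhythm:
# the length of the mean resultant vector (1 means perfectly phase-locked).
phase_locking = np.abs(np.mean(np.exp(1j * theta_phase_rad)))

print(f"rate: {firing_rate_hz:.1f} Hz, phase locking: {phase_locking:.2f}")
```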
When people engage in memory recall—or when they simply
move—hippocampal activity gets rhythmic. The rhythms enable memory formation
and recollection. Such neurons interact with other neurons like musicians in an
orchestra, Mehta likes to say. Perfect synchronization is hugely important.
Learning and memory disorders may be the result when those rhythms break down.
Mehta has shown before that brain rhythms are critical
for hippocampal connections to grow and strengthen with learning.
Two fundamental messages
Gyorgy Buzsaki, a New York University Neuroscience Institute professor and an expert in self-organizing brain rhythms, told Bioscience Technology that Mehta’s study offered two “fundamental” take-home messages.
“First is the reliance on internally generated patterns
when the environmental and body cues are diminished (i.e., VR). It is as if the
brain makes up for the missing ingredients and constructs a plan by some
expectation. But its performance is inferior compared to times when the
expectations are constantly verified by feedback.”
Mehta’s team, he said, emphasized this with one- to two-second-long “motives” that neuroscientists call the ‘lifetime’ of cell assemblies. “This is a general rule, present not only in the hippocampus under most circumstances, such as navigation, memory and even REM sleep, but also in the neocortex, where it has been shown to be under the control of short-term plasticity of the pyramidal cell-interneuron synapse. This is a fundamental principle of self-organized or internally generated brain dynamics.”
Buzsaki said the second important take-home message from
the Mehta paper is the idea that “constraining the animal’s behavior introduces
hard-to-interpret problems in brain activity. In the Mehta study, the body was
fixed, but the head could still be moved and exploit feedback from visual
parallax, vestibular, neck muscle and other information. In another, almost identical task by Dmitriy Aronov and David Tank in Neuron, somewhat more freedom of movement was provided to the rat, and the similarities to the real world were
much higher. By extrapolation, when the head is fixed, as is the routine in
many recent VR tasks, the discrepancy between real world and VR is expected to
be higher than shown here in the Mehta study. Head-fixed preparation is
convenient for the experimenter but should not be a goal (unless specific
questions are to be addressed). Efforts should be made to build technologies that
allow sophisticated measurements in unrestrained, freely behaving animals.”
Tank, head of the Princeton University Neuroscience
Institute, told Bioscience Technology that he agreed. “Yes, the method in our
recent Neuron paper allows the subject’s head (and body) to turn naturally in
order to change directions in the virtual world; that is different than the
method used in [Mehta’s] Nature Neuroscience paper. Rotation is sensed by the vestibular system,
so our method retains more natural vestibular system signals during the
navigation. One hypothesis is that it is this difference that results in the firing of neurons recorded in our virtual navigation being more similar to what is observed during completely unconstrained ‘real world’ conditions.”
Tank, also uninvolved with Mehta’s work, said a “direct” comparison cannot be made between teens playing video games and the rats in Mehta’s virtual world. “But the addition of appropriate vestibular input may help explain why people find the new Oculus VR systems—which change the virtual reality display as they monitor your head rotation—very realistic.”
Facebook acquired the Oculus VR company in 2014 for $2
billion.