Leaving the house in the morning may seem simple, but with every move we make, our brains are working feverishly to create maps of the outside world that allow us to navigate and to remember where we are.
Take one step out the front door, and an individual brain cell fires. Pass by your rose bush on the way to the car, another specific neuron fires. And so it goes. Ultimately, the brain constructs its own pinpoint geographical chart that is far more precise than anything you’d find on Google Maps.
But just how neurons make these maps of space has fascinated scientists for decades. It is known that several types of stimuli influence the creation of neuronal maps, including visual cues in the physical environment — that rose bush, for instance — the body’s innate knowledge of how fast it is moving, and other inputs, like smell. Yet the mechanisms by which groups of neurons combine these various stimuli to make precise maps are unknown.
To solve this puzzle, UCLA neurophysicists built a virtual-reality environment that allowed them to manipulate these cues while measuring the activity of map-making neurons in rats. Surprisingly, they found that when certain cues were removed, the neurons that typically fire each time a rat passes a fixed point or landmark in the real world instead began to compute the rat’s relative position, firing, for example, each time the rodent walked five paces forward, then five paces back, regardless of landmarks. And many other mapping cells shut down altogether, suggesting that different sensory cues strongly influence these neurons.
Finally, the researchers found that in this virtual world, the rhythmic firing of neurons, which normally speeds up or slows down depending on how fast an animal moves, was profoundly altered: the rats’ brains maintained a single, steady rhythmic pattern.
The findings, reported in the May 2 online edition of the journal Science, provide further clues to how the brain learns and makes memories.
The mystery of how cells determine place
“Place cells” are individual neurons located in the brain’s hippocampus that create maps by registering specific places in the outside environment. These cells are crucial for learning and memory, and when damaged they are known to play a role in conditions such as post-traumatic stress disorder and Alzheimer’s disease.
For some 40 years, the thinking had been that the maps made by place cells were based primarily on visual landmarks in the environment, known as distal cues — a tall tree, a building — as well as on motion, or gait, cues. But, as UCLA neurophysicist and senior study author Mayank Mehta points out, other cues are present in the real world: the smell of the local pizzeria, the sound of a nearby subway tunnel, the tactile feel of one’s feet on a surface. These other cues, which Mehta likes to refer to as “stuff,” were believed to have only a small influence on place cells.
Could it be, wondered Mehta, a professor with joint appointments in the departments of neurology, physics and astronomy, that these different sensory modalities lead place cells to create individual maps? And if so, do these individual maps cooperate with each other, or do they compete? No one really knew for sure.
Virtual reality reveals new clues
To investigate, Mehta and his colleagues needed to separate the distal and gait cues from all the other “stuff.” They did this by crafting a virtual-reality maze for rats in which all stimuli other than distal and gait cues, including odors and sounds, were removed. The rats, held by a harness, were placed atop a ball that rotated as they moved, while video of a physical environment was projected around them. When they ran, the video moved along with them, giving the animals the illusion that they were navigating an actual physical environment.
As a comparison, the researchers had the rats — six altogether — run a real-world maze that was visually identical to the virtual-reality version but that included the additional “stuff” cues. Using micro-electrodes 10 times thinner than a human hair, the team measured the activity of some 3,000 space-mapping neurons in the rats’ brains as they completed both mazes.
What they found intrigued them. The elimination of the “stuff” cues in the virtual-reality maze had a huge effect: Fully half of the neurons being recorded became inactive, even though the distal and gait cues were similar in the virtual and real worlds. The results, Mehta said, show that these other sensory cues, once thought to play only a minor role in activating the brain, actually have a major influence on place cells.
In the real world, place cells responded to fixed, absolute positions, spiking at those same positions each time the rats passed them regardless of the direction they were moving, a finding consistent with previous experiments. This was not the case in the virtual-reality maze.
“In the virtual world,” Mehta said, “we found that the neurons almost never did that. Instead, the neurons spiked at the same relative distance in the two directions as the rat moved back and forth. In other words, going back to the front door-to-car analogy, in a virtual world, the cell that fires five steps away from the door when leaving your home would not fire five steps away from the door upon your return. Instead, it would fire five steps away from the car when leaving the car. Thus, these cells are keeping track of the relative distance traveled rather than absolute position. This gives us evidence for the individual place cell’s ability to represent relative distances.”
Mehta thinks this is because neuronal maps are generated by three different categories of stimuli — distal cues, gait and “stuff” — and that all are competing for control of neural activity. This competition is what ultimately generates the “full” map of space.
“All the external stuff is fixed at the same absolute position and hence generates a representation of absolute space,” he said. “But when all the stuff is removed, the profound contribution of gait is revealed, which enables neurons to compute relative distances traveled.”
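To make the distinction concrete, here is a minimal toy sketch, not taken from the study: a “place cell” is modeled simply as a threshold on the animal’s absolute position (real-world-like behavior) or on the number of paces taken since the last turnaround (virtual-world-like behavior). All positions, thresholds and the five-pace figure are illustrative assumptions.

# Toy illustration only; not code from the study. All numbers are invented.

def absolute_cell_fires(position, preferred_position=5, width=1.0):
    """Fires near a fixed landmark position, whichever direction the animal moves."""
    return abs(position - preferred_position) < width

def relative_cell_fires(steps_since_turn, preferred_steps=5, width=1.0):
    """Fires a fixed number of paces into each leg of a back-and-forth run."""
    return abs(steps_since_turn - preferred_steps) < width

# One out-and-back run on a 12-step track: out from position 0 to 12, then back to 0.
outbound = list(range(0, 13))
inbound = list(range(12, -1, -1))

for leg, positions in (("outbound", outbound), ("inbound", inbound)):
    for steps, pos in enumerate(positions):
        if absolute_cell_fires(pos) or relative_cell_fires(steps):
            print(f"{leg:8s} position={pos:2d} steps_into_leg={steps:2d} "
                  f"absolute={absolute_cell_fires(pos)} relative={relative_cell_fires(steps)}")

# On the outbound leg both cells fire at position 5. On the inbound leg the absolute
# cell fires again at position 5, but the relative cell fires at position 7, i.e. five
# paces after the turnaround, mirroring the door-to-car analogy above.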
The researchers also made a new discovery about the brain’s theta rhythm. It is known that place cells use the rhythmic firing of neurons to keep track of “brain time,” the brain’s internal clock. Normally, Mehta said, the theta rhythm becomes faster as subjects run faster and slower as running speed decreases. This speed-dependent change in brain rhythm was thought to be crucial for generating the “brain time” for place cells. But the team found that in the virtual world, the theta rhythm was uninfluenced by running speed.
“That was a surprising and fascinating discovery, because the ‘brain time’ of place cells was as precise in the virtual world as in the real world, even though the speed-dependence of the theta rhythm was abolished,” Mehta said. “This gives us a new insight about how the brain keeps track of space-time.”
The researchers found that the firing of place cells was very precise, down to one-hundredth of a second, “so fast that we humans cannot perceive it but neurons can,” Mehta said. “We have found that this very precise spiking of neurons with respect to ‘brain-time’ is crucial for learning and making new memories.”
Mehta said the results, taken together, provide insight into how distinct sensory cues both cooperate and compete to influence the intricate network of neuronal activity. Understanding how these cells function is key to understanding how the brain makes and retains memories, which are vulnerable to such disorders as Alzheimer’s and PTSD.
“Ultimately, understanding how these intricate neuronal networks function is a key to developing therapies to prevent such disorders,” he said.
In May, Mehta joined 100 other scientists in Washington, D.C., to help shape President Obama’s BRAIN Initiative (Brain Research through Advancing Innovative Neurotechnologies), with the goal of trying to tease out how this most complicated of organs works.