How blurring the border between virtual reality and the real world is helping us understand how neurons create the perception of space.
Acharya, Aghajan, Vuong, and Moore, in the Mehta laboratory at UCLA, have just published a paper in the journal Cell in which they use rats running both in the real world and in an immersive virtual reality environment to explore how rodents process spatial information. Studies in monkeys and humans have shown that hippocampal neurons respond specifically to visual cues, but this had not been seen in rodents. Mehta and colleagues used clever manipulations of the conditions in which rats navigate space, including carefully crafted virtual environments, and developed sophisticated mathematical techniques to resolve this discrepancy between rodent and primate studies. By manipulating visual and vestibular cues with the help of virtual environments, the authors found clear evidence for directional modulation despite the loss of vestibular information, a result that contradicts current theories of directionality. Further manipulations of visual cues, in the absence of vestibular signals, showed that these cues play a direct role in directionally selective hippocampal responses. By demonstrating that hippocampal cells in rodents respond to visual cues, these results close a long-standing gap between primate and rodent studies of spatial navigation.
For a Terry Sejnowski News and Views article on this paper, click here