Fusing Visual and Auditory Space


Friday, February 7, 2014
217 Perkins Library - 11:00 a.m.

Jennifer M. Groh, Ph.D.

Professor, Department of Psychology & Neuroscience, Duke University 


Abstract:

The brain uses different mechanisms to detect the positions of visual versus auditory stimuli. I will discuss how visual and auditory representations differ in the brain, and present research on the neural mechanisms for reconciling these discrepancies in visual and auditory coding.

Biographical Sketch:

Jennifer M. Groh is a professor of Psychology, Neurobiology, and Cognitive Neuroscience at Duke University.

Dr. Groh received her undergraduate degree in biology summa cum laude from Princeton University, where she studied the behavior and ecology of wild horses. She studied neuroscience at the University of Michigan (Master’s, 1990) and the University of Pennsylvania, where she received her Ph.D. in 1993 for a dissertation on how the brain translates auditory and somatosensory information into a frame of reference more akin to that used by the visual system. She then moved to Stanford University for postdoctoral research on how the brain’s sensory representations are interpreted by motor areas to produce behavior. In 1997, she joined the faculty at Dartmouth. Since 2006, she has been a member of the faculty in the Center for Cognitive Neuroscience and the Duke Institute for Brain Sciences at Duke University. She also holds appointments in the Departments of Neurobiology and Psychology & Neuroscience at Duke. In 2009, she was named a John Simon Guggenheim Memorial Fellow in Neuroscience.

Dr. Groh directs a research laboratory that investigates the neural mechanisms of hearing, vision, and eye movements. She is interested in how the brain processes spatial information from different sensory systems. Much of her research concerns the reference frame used by the brain for visual, auditory, and tactile information. She has also investigated whether the brain uses an “analog” or “digital” code for sound location.

Dr. Groh’s work also concerns how what we see influences what we hear. Her laboratory has demonstrated that neurons in auditory brain regions are sometimes responsive not only to what we hear but also to where we are looking and to what visual stimuli are present. These surprising findings challenge the prevailing assumption that the brain’s sensory pathways remain separate and distinct at early processing stages, and they suggest a mechanism for multisensory interactions such as lip-reading and ventriloquism (the capture of perceived sound location by a plausible nearby visual stimulus).

Her research program has been supported by a variety of sources, including the National Institutes of Health, the National Science Foundation, the Office of Naval Research Young Investigator Program, the McKnight Endowment Fund for Neuroscience, the John Merck Scholars Program, the EJLB Foundation, and the Alfred P. Sloan Foundation.

Related Readings:

http://dx.doi.org/10.1371/journal.pone.0072562

http://dx.doi.org/10.1371/journal.pone.0085017