We are interested in understanding how the nervous system integrates sensory information from multiple modalities. We want to address how sensory inputs become linked to motor function in learned behavior, since sensory and motor codes are a priori different and are not related by any simple mapping. Human speech is an excellent model system for studying sensorimotor integration. In our lab we employ a variety of psychophysical and imaging techniques to study these behaviors and their neural correlates.
Our main behavioral paradigm involves altering sensory feedback. We use computer-controlled robotic devices to perturb speech movements, which in turn alters somatosensory feedback. We also perturb auditory feedback online during speech by altering vowel sounds. For example, we can shift the first formant frequency downward for the vowel in the word “head” while leaving the other formants and the fundamental frequency unchanged. This change makes the word sound more like “hid.” In addition, we can perturb somatosensory and auditory feedback simultaneously to further examine their interactions.
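The formant-shift manipulation can be sketched numerically. The snippet below uses approximate average (F1, F2) values for three English vowels; these numbers are illustrative assumptions drawn from classic phonetic measurements, not measurements from our lab, and the nearest-prototype classifier is only a toy stand-in for perception. It shows how shifting only the first formant moves the vowel of “head” toward a neighboring vowel category:

```python
import math

# Approximate average (F1, F2) values in Hz for three English vowels.
# These are illustrative assumptions, not data from our experiments.
VOWELS = {
    "hid (/I/)":  (390, 1990),
    "head (/E/)": (530, 1840),
    "had (/ae/)": (660, 1720),
}

def shift_f1(formants, delta_hz):
    """Shift only the first formant, leaving F2 unchanged."""
    f1, f2 = formants
    return (f1 + delta_hz, f2)

def nearest_vowel(formants):
    """Classify an (F1, F2) point by Euclidean distance to the vowel prototypes."""
    return min(VOWELS, key=lambda v: math.dist(VOWELS[v], formants))

head = VOWELS["head (/E/)"]
print(nearest_vowel(shift_f1(head, -200)))  # F1 lowered: closest to "hid"
print(nearest_vowel(shift_f1(head, +200)))  # F1 raised: closest to "had"
```

In the real paradigm the shift is applied to the acoustic signal in near real time and fed back to the speaker; this sketch only illustrates why a pure F1 shift is enough to move a vowel across a category boundary.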
We use EEG techniques to trace the temporal dynamics of the neural activity underlying the behaviors studied in our lab. We are currently exploring fMRI and TMS techniques to learn more about the core processes of sensorimotor integration and their age-related changes in normal and disordered speech. Overall, our research goal is to provide an integrated approach to understanding sensorimotor learning, using human speech as a model.