Decoding and interpreting incoming sensory information are among the brain's most important tasks. These are ongoing processes that make it possible for us not only to know the world in which we live, but to plan and initiate behaviors that are appropriate for a particular circumstance.
Because survival depends on the speed and accuracy of such processes, it is not surprising to find that encoding, decoding, and evaluating sensory information have been powerful driving forces in evolution. Consequently, extant organisms have an impressive array of specialized sensory systems.
Having multiple sensory systems provides significant benefits: it allows an organism to simultaneously monitor a host of environmental cues, and it provides a means of substituting one sensory system for another when necessary (e.g., hearing and/or touch can substitute for vision in the dark).
The ability to monitor and process multiple sensory cues in “parallel” not only increases the likelihood that a given stimulus will be detected; because the information carried along each sensory channel reflects a different feature of that stimulus, it also increases the chances that the stimulus will be properly identified. Stimuli that may be difficult to distinguish by means of a single sensory modality (e.g., how they look) can become quite distinct via information from another modality (how they sound or feel).
Of particular benefit is the brain’s ability to use the information carried by these different sensory channels synergistically. Thus, different sensory inputs can enhance one another’s effectiveness, increasing the likelihood that an event will be detected and properly identified, and that an appropriate response will be initiated as quickly as possible.
Our research goal is to determine how information from different senses is pooled in making decisions, and how the brain develops this remarkable capacity. We use two interrelated neural models to do this: one in the midbrain superior colliculus (SC) and one in association cortex.
The midbrain SC has proved to be a most effective model, and its principal features have also been shown to be operative in cortex. Its advantages are not only its host of sensory inputs (visual, auditory, and somatosensory), but also its many multisensory neurons and its distinct behavioral function: facilitating the detection and localization of, and orientation to, external events.
Current Projects
Contrasts in How the Brain Integrates Within-Modal and Cross-Modal Information
Despite the substantial literature describing the impact of multisensory integration in multiple brain areas, and the importance of this process for perception and behavior, a fundamental question has been largely ignored: does the fusion of information from different sensory sources yield a product that is different from the fusion of information from the same sensory source? We are investigating this question at both the physiological and behavioral levels.
Anatomical Bases of Multisensory Integration
Physiological and behavioral studies have revealed that SC multisensory integration depends on the functional integrity of converging projections that descend from different regions of association cortex. We are currently investigating the nature of these projections and the development of this cortico-SC circuit using modern neuroanatomical tracing techniques and electron microscopy.
Computational Bases of Multisensory Integration
Recent evidence suggests that multisensory integration at the physiological level involves both linear and nonlinear computations. We are currently developing and testing a neural network model that seeks to explain a multitude of empirical findings within a single framework.
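As a purely illustrative sketch of that distinction (not the lab's model; the functions, weights, and thresholds below are hypothetical), the following Python snippet contrasts a linear summation of two unisensory inputs with a simple nonlinear (sigmoidal) readout, under which weak paired inputs can evoke a response larger than the sum of the responses to each input alone:

    import math

    def linear_combination(v, a, w_v=1.0, w_a=1.0):
        # Linear stage: weighted sum of visual (v) and auditory (a) drive.
        return w_v * v + w_a * a

    def nonlinear_readout(net_input, gain=1.0, threshold=5.0):
        # Nonlinear stage: sigmoidal transform of the summed drive.
        return 1.0 / (1.0 + math.exp(-gain * (net_input - threshold)))

    v_alone = nonlinear_readout(linear_combination(2.0, 0.0))
    a_alone = nonlinear_readout(linear_combination(0.0, 2.0))
    paired = nonlinear_readout(linear_combination(2.0, 2.0))
    print(v_alone + a_alone, paired)  # ~0.09 vs. ~0.27 for these weak inputs

With these weak inputs the paired response exceeds the summed unisensory responses, a superadditive pattern that a purely linear model cannot produce; stronger inputs drive the sigmoid toward saturation and the enhancement shrinks.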
Effects of Multisensory Integration on Response Timing and Information
The magnitude of the physiological impact of integration at the single-neuron level has traditionally been measured as a change (usually expressed as a % change) in the total number of stimulus-elicited impulses. We have recently found that concordant cross-modal stimuli not only produce more robust responses, but also significantly speed these responses. In addition, we have found that integration produces substantial enhancements in information transmission, conveying not only more information, but information at a faster rate.
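One common formulation of this percentage measure compares the multisensory response to the most effective unisensory response; a minimal sketch with hypothetical impulse counts:

    def multisensory_enhancement(multisensory_count, unisensory_counts):
        # Percent change in stimulus-elicited impulses relative to the
        # most effective unisensory response.
        best_unisensory = max(unisensory_counts)
        return 100.0 * (multisensory_count - best_unisensory) / best_unisensory

    # Hypothetical mean impulse counts: visual alone, auditory alone, then combined.
    print(multisensory_enhancement(18.0, [6.0, 9.0]))  # 100.0 (% enhancement)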
Similarities and Differences in Integration Across Species
The traditional animal model of SC multisensory integration is the cat, and the anatomical circuit underlying integration has been well-described in this model. We have recently begun evaluating the generality of this anatomical circuit across mammalian species.
Experiential and Anatomical Dependencies of the Development of Integration
Our recent evidence suggests that the capacity to engage in multisensory integration is not an innate capability, but rather one that the brain develops after birth in an experience-dependent fashion. We are currently investigating the constraints guiding its appearance and maturation: in particular, the anatomical circuits involved, the essential properties of the cues that are critical for linking them to one another, and whether the strategies of integration change during maturation.
Multisensory Integration in Motion Perception
Natural environments are dynamic, and salient targets are frequently in motion. Animals interact appropriately with these targets by estimating their velocity and direction of movement. We are currently investigating how the brain combines information from different senses to affect and improve these decisions.
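One standard framework for this kind of cue combination, offered here only as an illustrative sketch under assumed independent noise (not necessarily the scheme under study), is reliability-weighted averaging, in which each modality's velocity estimate is weighted by the inverse of its variance:

    def combine_velocity_estimates(est_v, var_v, est_a, var_a):
        # Inverse-variance (reliability) weighting of, e.g., visual and auditory
        # velocity estimates; the combined variance is lower than either alone.
        w_v = (1.0 / var_v) / (1.0 / var_v + 1.0 / var_a)
        w_a = 1.0 - w_v
        combined_estimate = w_v * est_v + w_a * est_a
        combined_variance = 1.0 / (1.0 / var_v + 1.0 / var_a)
        return combined_estimate, combined_variance

    # Hypothetical estimates (deg/s) and variances for a moving target.
    print(combine_velocity_estimates(10.0, 4.0, 14.0, 8.0))  # ~(11.33, 2.67)

In this scheme the combined estimate is pulled toward the more reliable cue and has a lower variance than either cue alone, which is one way multisensory input can improve such judgments.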