Introduction
Scientists have researched the integration of visual and auditory spatial information for many decades. Through this research, the scientific community has learned how the brain benefits from combining visual and auditory information. The benefit arises because audition and vision each provide distinctive and complementary information for spatial localization. Scientific evidence confirms that vision, through direct projection, provides reliable and accurate information for spatial localization; audition is nevertheless also very important, because it can provide information about the location of a desired signal in any direction.
Audition provides invaluable information when a visual stimulus is unavailable, hidden, or camouflaged. Spatial localization that results from the combination of both modalities is more reliable than localization using either modality in isolation. The following paper will discuss research that verified visual dominance in spatial localization, along with additional evidence for the importance of audition in spatial localization.
Background
Visual Dominance
Those who have been exposed to a ventriloquist act have observed the effects of visual dominance in spatial localization. This phenomenon occurs when a listener perceives an auditory stimulus as originating from near a visual stimulus, even though the two stimuli are spatially discrepant (Knudsen & Brainard, 1995). Visual localization can even influence the localization of an auditory stimulus when visual and auditory stimuli are es...
Meredith, M. A., & Stein, B. E. (1986). Visual, auditory, and somatosensory convergence on cells in superior colliculus results in multisensory integration. Journal of Neurophysiology, 56(3), 640-662.
Rauschecker, J., & Harris, L. (1983). Auditory compensation of the effects of visual deprivation in the cat's superior colliculus. Experimental Brain Research, 50(1), 69-83.
Rauschecker, J. P. (1993). Auditory compensation for early blindness in cat cerebral cortex. The Journal of Neuroscience, 13(10), 4538.
Thurlow, W. R. (1976). Further study of existence regions for the "ventriloquism effect". Journal of the American Audiology Society, 1(6), 280.
Wallace, M. T., Meredith, M. A., & Stein, B. E. (1993). Converging influences from visual, auditory, and somatosensory cortices onto output neurons of the superior colliculus. Journal of Neurophysiology, 69(6), 1797-1809.
The ultimate goal of a system of visual perception is to represent visual scenes. It is generally assumed that this requires an initial ‘break-down’ of complex visual stimuli into some kind of “discrete subunits” (De Valois & De Valois, 1980, p.316), which can then be passed on and further processed by the brain. The task thus arises of identifying these subunits, as well as the means by which the visual system interprets and processes sensory input. An approach to visual scene analysis that prevailed for many years held that individual cortical cells are ‘feature detectors’ with particular response criteria. Though they did not describe it in these terms, Hubel and Wiesel’s theory of a hierarchical visual system employs a form of such feature detectors. I will here discuss: the origins of the feature detection theory; Hubel and Wiesel’s hierarchical theory of visual perception; criticism of the hierarchical nature of the theory; an alternative theory of receptive-field cells as spatial frequency detectors; and the possibility of reconciling these two theories with reference to parallel processing.
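The contrast between the two theories above can be sketched in a toy model. The code below is an illustration only, with assumed parameters (window size, preferred frequency, Gaussian width): a single cell's receptive field is modelled as a Gabor filter, a Gaussian-windowed sinusoid, and its linear response to sinusoidal gratings of different spatial frequencies shows the band-pass tuning that motivates the spatial-frequency view.

```python
import math

N = 128              # samples across the receptive field (assumed)
PREFERRED_FREQ = 8   # preferred frequency, cycles per window (assumed)

def gabor(n, freq, sigma=N / 6):
    """Gaussian-windowed cosine: a simple 1-D receptive-field model."""
    centre = n / 2
    return [math.exp(-((i - centre) ** 2) / (2 * sigma ** 2))
            * math.cos(2 * math.pi * freq * i / n)
            for i in range(n)]

def grating(n, freq):
    """A sinusoidal grating stimulus of the given spatial frequency."""
    return [math.cos(2 * math.pi * freq * i / n) for i in range(n)]

rf = gabor(N, PREFERRED_FREQ)

def response(stimulus):
    # Linear response: inner product of the stimulus and the field.
    return abs(sum(s * w for s, w in zip(stimulus, rf)))

# Responses to a low, the preferred, and a high spatial frequency.
responses = {f: response(grating(N, f)) for f in (2, 8, 32)}
```

With these assumed parameters, the response at the preferred frequency dominates the responses at the low and high frequencies, which is the sense in which such a cell acts as a spatial-frequency detector rather than a detector of one isolated feature.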
The next speaker, Dr. Gottlieb, investigated the hearing aspect of our senses. He investigated the interaction between our heari...
Sensory signals relating information about our physical movements, as well as information regarding external object motion, are required in order to preserve a stable and accurate view of the world and to estimate external motion. Space constancy is the visual system’s ability to maintain a view of the outside world that does not jump about and move with an eye movement (Deubel, Bridgeman, & Schneider, 1998; Stark & Bridgeman, 1983). A simple way of achieving this is to add the velocity estimates that are derived from afferent and efferent motion signals. The sum of these estimates would give motion in head-centred coordinates. For instance, the retinal image of stationary objects in the world would acquire a motion equal and opposite to any eye movement. As suggested above, reafferent retinal motion should provide a velocity estimate of similar magnitude to the efferent estimate of eye movement. If these two estimates are equal to one another but have opposite sign, then their sum would correctly suggest null motion.
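The summation described above can be written out as a minimal arithmetic sketch (assumed units: degrees per second, rightward positive; the function name is illustrative, not from the literature):

```python
def head_centred_motion(retinal_velocity, eye_velocity):
    """Sum the afferent (retinal) and efferent (eye-movement) estimates."""
    return retinal_velocity + eye_velocity

# A stationary object during a 10 deg/s rightward eye movement slips
# across the retina at -10 deg/s; the estimates cancel, so the object
# is correctly judged stationary (space constancy).
assert head_centred_motion(-10.0, 10.0) == 0.0

# An object actually moving at 5 deg/s during the same eye movement
# produces -5 deg/s of retinal motion; the sum recovers the true motion.
assert head_centred_motion(-5.0, 10.0) == 5.0
```

The design point is that neither signal alone suffices: retinal motion confounds self-movement with object movement, and the efferent signal carries no information about external objects; only their sum yields head-centred motion.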
Lu, Z.-L., Williamson, S. J., & Kaufman, L. (1992, Dec 4). Behavioral lifetime of human auditory sensory memory predicted by physiological measures. Science, 258(5088), 1668-1670.
Kanske, P., Heissler, J., Schönfelder, S., Forneck, J., & Wessa, M. (2013). Neural correlates of
McLachlan, N. M., Phillips, D. S., Rossell, S. L., & Wilson, S. J. (2013). Auditory processing
McDonald, J., Teder-Salejarvi, W., & Hillyard, S. (2000). Involuntary orienting to sound improves visual perception. Nature, 407, 906-907.
Massaro, D. W. & Warner, D. S. (1977). Dividing attention between auditory and visual perception. Attention, Perception & Psychophysics, 21(6): 569-574.
Merzenich, M. M., & Kaas, J. K. (1983). Topographical reorganization of somatosensory cortical areas 3b and 1 in adult monkeys following restrictive deafferentation. Neuroscience, 33-55.
...I) to show activation in the dorsal cortex during unconscious perception. Therefore, if neuroimaging evidence demonstrates that the dorsal stream is activated during unconscious processing, then this can strengthen the conclusions drawn from their experiment.
In auditory-visual synesthesia, sounds automatically produce conscious visual as well as auditory experiences. Direct auditory-visual percepts may play a functional role in multisensory processing, which may give rise to a synesthesia-like illusion, the illusory flash. The illusion occurs predominantly in peripheral vision and is accompanied by electrical activity over occipital sites (Oz, O1, and O2) (Shams et al., 2001). The cross-modal transfer hypothesis assumes that connections between auditory and visual regions are indirect and are mediated by multisensory audiovisual brain regions (Goller et al., 2009). Multisensory processes may be activated when two senses are stimulated or, as in synesthesia, by a unimodal stimulus (Goller et al., 2009). This
Rauscher, F. H., Shaw, G. L., & Ky, K. N. (1993). Music and spatial task performance. Nature, 365(6447), 611. doi:10.1038/365611a0
We use our ears for hearing and our eyes for vision.
Cherry, E. C. (1953). Some Experiments on recognition of speech, with one and with two ears. Journal of the Acoustical Society of America, 25, 975-979.
As Table 1 shows, the mean reaction time to the visual stimulus is greater than the mean reaction time to the auditory stimulus. The chi-squared value of 9.600 in Table 2 allows us to reject our null hypothesis that there is no difference between auditory and visual reaction times. This result is consistent with our predicted outcome, and it also supports Brebner and Welford (1980). Reaction time to a stimulus depends on many factors, including the reception of the stimulus by the sense organ, the transmission of a neural signal to the brain, muscular activation, and finally, the physical reaction to the stimulus (Pain and Hibbs, 2007). The reaction times to the auditory stimulus were shorter than those to the visual stimulus, implying that the auditory stimulus reaches the motor cortex
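The form of the test described above can be illustrated with made-up counts (not the study's data; the statistic of 9.600 reported above came from the study's own table): a chi-squared goodness-of-fit test of whether more participants react faster to the auditory stimulus than a 50/50 null hypothesis would predict.

```python
def chi_squared(observed, expected):
    """Pearson's chi-squared statistic: sum of (O - E)^2 / E."""
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

n = 20               # hypothetical number of participants
faster_auditory = 17 # hypothetical count with shorter auditory RT
observed = [faster_auditory, n - faster_auditory]
expected = [n / 2, n / 2]   # null: no modality difference

# (17 - 10)^2 / 10 + (3 - 10)^2 / 10 = 9.8
statistic = chi_squared(observed, expected)

# Standard critical value for df = 1 at alpha = .05.
CRITICAL_DF1_P05 = 3.841

# With these hypothetical counts, 9.8 > 3.841, so the null is rejected.
assert statistic > CRITICAL_DF1_P05
```

A statistic above the critical value, as with the 9.600 reported in Table 2, means the observed split is unlikely under the null hypothesis of equal auditory and visual reaction times.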