Title :
Quantitative models of visual-auditory interactions
Author :
Hong, J. ; Papathomas, T.V. ; Vidnyanszky, Z.
Author_Institution :
Dept. of Biomed. Eng., Rutgers Univ., New Brunswick, NJ, USA
Abstract :
We developed a neurocomputational model for the integration of visual and auditory stimuli. The model comprises three main stages. The first stage extracts motion from the visual input, with a corresponding processing stage for the auditory input. The second stage models visual-auditory neural interactions, simulating a neural network involving the superior colliculus. The third stage is a global integration stage, simulating higher cortical areas. Simulation results from this model agree closely with experimental data.
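The abstract does not give the model's equations, but the three-stage architecture it describes can be illustrated with a minimal sketch. The code below is an assumption-laden toy, not the authors' implementation: stage 1 estimates motion direction from two visual frames by cross-correlating shifted copies and estimates auditory direction from an interaural level difference; stage 2 fuses the two cues by inverse-variance weighting (a standard cue-combination rule, used here only as a stand-in for the superior-colliculus network); stage 3 simply reads out the fused estimate. All function names and noise parameters are hypothetical.

```python
import numpy as np

def visual_motion(frame1, frame2):
    """Stage 1 (visual): estimate 1-D shift (pixels) between two frames
    by testing integer displacements and picking the best match.
    Toy stand-in for the paper's motion-extraction stage."""
    best_shift, best_err = 0, np.inf
    for s in range(-3, 4):
        err = np.sum((np.roll(frame1, s) - frame2) ** 2)
        if err < best_err:
            best_shift, best_err = s, err
    return float(best_shift)

def auditory_direction(left_level, right_level):
    """Stage 1 (auditory): crude azimuth cue from the interaural
    level difference (positive = rightward)."""
    return float(right_level - left_level)

def fuse_cues(v_est, v_var, a_est, a_var):
    """Stage 2: combine visual and auditory estimates by
    inverse-variance weighting (maximum-likelihood cue combination;
    illustrative only -- not the paper's specific network)."""
    wv, wa = 1.0 / v_var, 1.0 / a_var
    fused = (wv * v_est + wa * a_est) / (wv + wa)
    return fused, 1.0 / (wv + wa)

def global_integration(fused, fused_var):
    """Stage 3: read out the fused percept (here, just the estimate
    and its reliability)."""
    return {"direction": fused, "reliability": 1.0 / fused_var}

# Example: a bar pattern shifted 2 pixels rightward, sound louder on the right.
f1 = np.array([0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0, 0.0])
f2 = np.roll(f1, 2)
v = visual_motion(f1, f2)            # 2.0
a = auditory_direction(1.0, 3.0)     # 2.0
fused, var = fuse_cues(v, 1.0, a, 1.0)
percept = global_integration(fused, var)
```

When both cues agree and have equal variance, the fused estimate equals either cue and its variance halves, which is the usual signature of optimal integration.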
Keywords :
brain; ear; eye; hearing; neural nets; neurophysiology; physiological models; visual perception; auditory processing; auditory stimuli; cortical simulation; global integration stage; motion extraction; neural network; neurocomputational model; superior colliculus; visual stage; visual stimuli; visual-auditory interactions; visual-auditory neural interaction; Biological neural networks; Biological system modeling; Biology computing; Biomedical engineering; Brain modeling; Computational modeling; Data mining; Gabor filters; Motion detection; Predictive models;
Conference_Titel :
Proceedings of the IEEE 31st Annual Northeast Bioengineering Conference, 2005
Print_ISBN :
0-7803-9105-5
Electronic_ISBN :
0-7803-9106-3
DOI :
10.1109/NEBC.2005.1431902