TY - GEN
T1 - Mechanisms of visual-auditory temporal processing for artificial intelligence
AU - Yang, Jingjing
AU - Li, Qi
AU - Li, Xiujun
AU - Wu, Jinglong
N1 - Publisher Copyright:
© 2014 IEEE.
PY - 2014
Y1 - 2014
N2 - In everyday life, our brains integrate various kinds of information from different modalities to perceive our complex environment. Temporal synchrony of audiovisual stimuli is required for audiovisual integration. Many studies have shown that temporal asynchrony of visual-auditory stimuli can influence the interaction between visual and auditory stimuli; however, the multisensory mechanisms of asynchronous inputs are not well understood. In the present study, visual and auditory stimuli were presented with varying onset asynchronies (SOA = ±250 ms, ±200 ms, ±150 ms, ±100 ms, ±50 ms, 0 ms), and only the auditory stimulus was attended. The behavioral results showed that responses to temporally asynchronous audiovisual stimuli were more accurate than responses to unimodal auditory stimuli. The most significant enhancement occurred in the SOA = -100 ms condition (visual preceding auditory), in which the reaction time was fastest. These results reveal the basis of audiovisual interaction when audiovisual stimuli are presented with different SOAs. The temporal alignment of visual-auditory stimuli can enhance auditory detection. This study can offer a basic theory for artificial intelligence.
AB - In everyday life, our brains integrate various kinds of information from different modalities to perceive our complex environment. Temporal synchrony of audiovisual stimuli is required for audiovisual integration. Many studies have shown that temporal asynchrony of visual-auditory stimuli can influence the interaction between visual and auditory stimuli; however, the multisensory mechanisms of asynchronous inputs are not well understood. In the present study, visual and auditory stimuli were presented with varying onset asynchronies (SOA = ±250 ms, ±200 ms, ±150 ms, ±100 ms, ±50 ms, 0 ms), and only the auditory stimulus was attended. The behavioral results showed that responses to temporally asynchronous audiovisual stimuli were more accurate than responses to unimodal auditory stimuli. The most significant enhancement occurred in the SOA = -100 ms condition (visual preceding auditory), in which the reaction time was fastest. These results reveal the basis of audiovisual interaction when audiovisual stimuli are presented with different SOAs. The temporal alignment of visual-auditory stimuli can enhance auditory detection. This study can offer a basic theory for artificial intelligence.
KW - Audiovisual integration
KW - Multimodal
KW - Temporal alignment
UR - http://www.scopus.com/inward/record.url?scp=84988268372&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84988268372&partnerID=8YFLogxK
U2 - 10.1109/BMEI.2014.7002868
DO - 10.1109/BMEI.2014.7002868
M3 - Conference contribution
AN - SCOPUS:84988268372
T3 - Proceedings - 2014 7th International Conference on BioMedical Engineering and Informatics, BMEI 2014
SP - 724
EP - 728
BT - Proceedings - 2014 7th International Conference on BioMedical Engineering and Informatics, BMEI 2014
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2014 7th International Conference on BioMedical Engineering and Informatics, BMEI 2014
Y2 - 14 October 2014 through 16 October 2014
ER -