Sound Effect on Visual Gaze When Looking at Videos
This paper analyzes the effect of sound on visual gaze while viewing videos, with the aim of improving eye-position prediction. First, an audio-visual experiment was designed with two groups of participants, one under an Audio-Visual (AV) condition and one under a Visual-only (V) condition, to test the effect of sound. The authors classify the soundtracks into three classes: on-screen speech, non-speech, and no sound. Using statistical methods, they observe that the effect of sound differs depending on the sound class. They then compare the experimental gaze data with the predictions of a visual saliency model, showing that adding sound to video decreases the prediction accuracy of a visual saliency model that lacks an auditory pathway.
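Comparisons between recorded eye positions and a saliency model's predictions are commonly quantified with a metric such as the Normalized Scanpath Saliency (NSS), which averages the normalized saliency values at the observers' fixation points. The paper's exact evaluation metric is not stated here, so the following is only an illustrative sketch of how such a comparison could be computed (the map and fixation coordinates are toy data):

```python
import numpy as np

def normalized_scanpath_saliency(saliency_map, fixations):
    """NSS: mean of the zero-mean, unit-variance normalized saliency
    values sampled at fixation locations (row, col)."""
    s = (saliency_map - saliency_map.mean()) / (saliency_map.std() + 1e-8)
    return float(np.mean([s[r, c] for r, c in fixations]))

# Toy saliency map: a Gaussian blob peaked at the frame center.
h, w = 64, 64
ys, xs = np.mgrid[0:h, 0:w]
smap = np.exp(-((ys - 32) ** 2 + (xs - 32) ** 2) / (2 * 8.0 ** 2))

# Fixations near the peak score high; fixations far from it score low.
center_fix = [(32, 32), (30, 34), (33, 31)]
corner_fix = [(2, 2), (60, 60), (5, 58)]
print(normalized_scanpath_saliency(smap, center_fix))  # large positive
print(normalized_scanpath_saliency(smap, corner_fix))  # near or below zero
```

Under this kind of metric, the paper's finding would appear as lower NSS scores for the AV condition than for the V condition when the model has no auditory pathway.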