Multimedia content analysis for emotional characterization of music video clips

Authors
  • Yazdani, Ashkan1
  • Skodras, Evangelos2
  • Fakotakis, Nikolaos2
  • Ebrahimi, Touradj1
  • 1 Ecole Polytechnique Fédérale de Lausanne (EPFL), Multimedia Signal Processing Group (MMSPG), Institute of Electrical Engineering (IEL), Lausanne 1015, Switzerland
  • 2 University of Patras, Artificial Intelligence Group, Wire Communications Laboratory, Department of Electrical and Computer Engineering, Patras 265 04, Greece
Type
Published Article
Journal
EURASIP Journal on Image and Video Processing
Publisher
Springer International Publishing
Publication Date
Apr 30, 2013
Volume
2013
Issue
1
Identifiers
DOI: 10.1186/1687-5281-2013-26
Source
Springer Nature
License
Green

Abstract

Nowadays, tags play an important role in the search and retrieval of content on multimedia sharing social networks. As the amount of multimedia content grows explosively, finding content that will appeal to users becomes a challenging problem. Furthermore, retrieving multimedia content that matches a user's current mood or affective state can be of great interest. One approach to indexing multimedia content is to determine the affective state it can potentially induce in users. In this paper, multimedia content analysis is performed to extract affective audio and visual cues from different music video clips. Furthermore, several fusion techniques are used to combine the information extracted from the audio and video streams of the music video clips. We show that the proposed methodology achieves relatively high affect recognition performance (up to 90%).
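
The fusion step mentioned in the abstract can be illustrated with a minimal decision-level (late) fusion sketch: one classifier is trained per modality on pre-extracted audio and visual feature vectors, and their class probabilities are combined with a weighted sum. This is only an illustration under assumed inputs (per-clip feature arrays, binary affect labels, scikit-learn SVMs and a hand-set weight w_audio); it is not the authors' exact pipeline, which evaluates several fusion techniques.

    import numpy as np
    from sklearn.svm import SVC

    def late_fusion_predict(audio_train, video_train, labels,
                            audio_test, video_test, w_audio=0.5):
        """Train one classifier per modality and fuse their posterior probabilities."""
        # Separate classifiers for the audio and visual feature spaces
        # (SVMs are an assumption here, not necessarily the paper's choice).
        audio_clf = SVC(probability=True).fit(audio_train, labels)
        video_clf = SVC(probability=True).fit(video_train, labels)

        # Decision-level fusion: weighted sum of per-modality class probabilities.
        p_audio = audio_clf.predict_proba(audio_test)
        p_video = video_clf.predict_proba(video_test)
        p_fused = w_audio * p_audio + (1.0 - w_audio) * p_video
        return audio_clf.classes_[np.argmax(p_fused, axis=1)]

By contrast, a feature-level fusion strategy would concatenate the audio and visual feature vectors (e.g., with np.hstack) and train a single classifier on the joint representation; comparing such strategies is the kind of analysis the paper performs.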
