Multi-Face Tracking by Extended Bag-of-Tracklets in Egocentric Videos

Authors
  • Aghaei, Maedeh
  • Dimiccoli, Mariella
  • Radeva, Petia
Type
Preprint
Publication Date
Jan 13, 2016
Submission Date
Jul 16, 2015
Identifiers
arXiv ID: 1507.04576
Source
arXiv
Abstract

Wearable cameras offer a hands-free way to record egocentric images of daily experiences, in which social events are of special interest. The first step towards the detection of social events is to track the appearance of the multiple persons involved in them. In this paper, we propose a novel method to find correspondences of multiple faces in low-temporal-resolution egocentric videos acquired through a wearable camera. This kind of photo-stream imposes additional challenges on the multi-face tracking problem with respect to conventional videos: due to the free motion of the camera and its low temporal resolution, abrupt changes in the field of view, in illumination conditions, and in target location are highly frequent. To overcome these difficulties, we propose a multi-face tracking method that generates a set of tracklets by finding correspondences for each detected face along the whole sequence, and exploits tracklet redundancy to deal with unreliable tracklets. Similar tracklets are grouped into so-called extended bags-of-tracklets (eBoTs), each of which is intended to correspond to a specific person. Finally, a prototype tracklet is extracted for each eBoT, and the occlusions that occur are estimated by relying on a new measure of confidence. We validated our approach on an extensive dataset of egocentric photo-streams and compared it to state-of-the-art methods, demonstrating its effectiveness and robustness.
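
To make the pipeline described in the abstract more concrete, the sketch below illustrates the tracklet-grouping and prototype-selection steps in Python. It is only a minimal illustration under stated assumptions, not the authors' implementation: it assumes each tracklet carries one face descriptor per frame (None where no face was detected), groups tracklets by average cosine similarity into eBoTs, and picks as prototype the tracklet that agrees most with the rest of its bag as a simple stand-in for the paper's confidence measure. The Tracklet class, the greedy grouping, and the 0.7 threshold are hypothetical choices for the example.

```python
from dataclasses import dataclass
import numpy as np


@dataclass
class Tracklet:
    """Hypothetical container: one face descriptor per frame (None = no detection/occlusion)."""
    descriptors: list
    confidence: float = 0.0


def tracklet_similarity(a: Tracklet, b: Tracklet) -> float:
    """Mean cosine similarity over frames where both tracklets have a detection."""
    sims = []
    for da, db in zip(a.descriptors, b.descriptors):
        if da is not None and db is not None:
            sims.append(float(da @ db / (np.linalg.norm(da) * np.linalg.norm(db))))
    return float(np.mean(sims)) if sims else 0.0


def build_ebots(tracklets, sim_threshold=0.7):
    """Greedily group mutually similar tracklets into extended bags-of-tracklets (eBoTs)."""
    ebots = []
    for t in tracklets:
        for bag in ebots:
            if tracklet_similarity(t, bag[0]) >= sim_threshold:
                bag.append(t)
                break
        else:
            ebots.append([t])  # no similar bag found: start a new eBoT (new person)
    return ebots


def extract_prototype(ebot):
    """Score each tracklet by its agreement with the rest of the bag and return the best one."""
    for t in ebot:
        others = [o for o in ebot if o is not t]
        t.confidence = float(np.mean([tracklet_similarity(t, o) for o in others])) if others else 1.0
    return max(ebot, key=lambda t: t.confidence)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Two synthetic identities, three noisy tracklets each, over a 10-frame photo-stream.
    identities = [rng.normal(size=64) for _ in range(2)]
    tracklets = [
        Tracklet([base + rng.normal(scale=0.05, size=64) for _ in range(10)])
        for base in identities for _ in range(3)
    ]
    for ebot in build_ebots(tracklets):
        proto = extract_prototype(ebot)
        print(f"eBoT with {len(ebot)} tracklets, prototype confidence {proto.confidence:.2f}")
```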
