A Locally Weighted Fixation Density-Based Metric for Assessing the Quality of Visual Saliency Predictions.

Authors
  • Gide, Milind S
  • Karam, Lina J
Type
Published Article
Journal
IEEE Transactions on Image Processing
Publisher
Institute of Electrical and Electronics Engineers
Publication Date
Aug 01, 2016
Volume
25
Issue
8
Pages
3852–3861
Identifiers
DOI: 10.1109/TIP.2016.2577498
PMID: 27295671
Source
Medline
License
Unknown

Abstract

With the increased focus on visual attention (VA) over the last decade, a large number of computational visual saliency models have been developed. These models are traditionally evaluated using performance metrics that quantify the match between predicted saliency and fixation data obtained from eye-tracking experiments on human observers. Though a considerable number of such metrics have been proposed in the literature, they have notable shortcomings. In this paper, we discuss the shortcomings of existing metrics through illustrative examples and propose a new metric that uses local weights based on fixation density, which overcomes these flaws. To compare the performance of our proposed metric at assessing the quality of saliency prediction with that of existing metrics, we construct a ground-truth subjective database in which saliency maps obtained from 17 different VA models are evaluated by 16 human observers on a five-point categorical scale in terms of their visual resemblance to corresponding ground-truth fixation density maps obtained from eye-tracking data. The metrics are evaluated by correlating metric scores with the human subjective ratings. The correlation results show that the proposed evaluation metric outperforms all other popular existing metrics. In addition, the constructed database and corresponding subjective ratings provide insight into which existing and future metrics are better at estimating the quality of saliency prediction and can serve as a benchmark.
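The evaluation protocol described in the abstract, scoring each saliency model with a candidate metric and then correlating those scores with the human mean opinion scores, can be sketched in plain Python. The per-model scores below are hypothetical placeholders, not values from the paper; the paper's actual study spans 17 VA models rated by 16 observers.

```python
def pearson(x, y):
    """Pearson linear correlation between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def spearman(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0.0] * len(v)
        i = 0
        while i < len(order):
            # group tied values and assign them their average rank
            j = i
            while j + 1 < len(order) and v[order[j + 1]] == v[order[i]]:
                j += 1
            avg = (i + j) / 2 + 1
            for k in range(i, j + 1):
                r[order[k]] = avg
            i = j + 1
        return r
    return pearson(ranks(x), ranks(y))

# Hypothetical data: one metric score and one subjective mean opinion
# score (1-5 scale) per saliency model.
metric_scores = [0.62, 0.55, 0.71, 0.40]
mean_opinion_scores = [3.8, 3.1, 4.2, 2.5]

rho = spearman(metric_scores, mean_opinion_scores)
# The two sequences are perfectly monotonic here, so rho == 1.0;
# a metric that tracks human judgment well yields rho close to 1.
```

A higher correlation with the subjective ratings indicates the candidate metric better reflects human judgments of saliency-map quality, which is how the paper ranks the proposed metric against existing ones.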
