State-of-the-Art in Visual Attention Modeling
Jan. 2013 (vol. 35 no. 1)
pp. 185-207
A. Borji, Dept. of Comput. Sci., Univ. of Southern California, Los Angeles, CA, USA
L. Itti, Dept. of Comput. Sci., Univ. of Southern California, Los Angeles, CA, USA
Modeling visual attention, particularly stimulus-driven, saliency-based attention, has been a very active research area over the past 25 years. Many different models of attention are now available which, aside from lending theoretical contributions to other fields, have demonstrated successful applications in computer vision, mobile robotics, and cognitive systems. Here we review, from a computational perspective, the basic concepts of attention implemented in these models. We present a taxonomy of nearly 65 models, which provides a critical comparison of approaches, their capabilities, and shortcomings. In particular, 13 criteria derived from behavioral and computational studies are formulated for qualitative comparison of attention models. Furthermore, we address several challenging issues with models, including biological plausibility of the computations, correlation with eye movement datasets, bottom-up and top-down dissociation, and constructing meaningful performance measures. Finally, we highlight current research trends in attention modeling and provide insights for the future.
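To make the notion of stimulus-driven, saliency-based attention concrete, the following is a minimal sketch (not the authors' implementation) of one classic bottom-up idea the survey covers: center-surround differencing, where saliency is estimated as the difference between a finely and a coarsely blurred version of an intensity image. All function names and the scale pairs are illustrative choices; real models such as Itti et al.'s combine many more feature channels, scales, and normalization steps.

```python
import numpy as np

def gaussian_blur(img, sigma):
    # Separable Gaussian filter built with numpy only (illustrative helper).
    radius = int(3 * sigma)
    x = np.arange(-radius, radius + 1)
    k = np.exp(-x**2 / (2.0 * sigma**2))
    k /= k.sum()
    pad = np.pad(img, radius, mode='edge')
    # Convolve rows, then columns; 'valid' returns the original size.
    rows = np.apply_along_axis(lambda r: np.convolve(r, k, mode='valid'), 1, pad)
    return np.apply_along_axis(lambda c: np.convolve(c, k, mode='valid'), 0, rows)

def saliency_map(intensity, scales=((1, 4), (2, 8))):
    # Center-surround: |fine blur - coarse blur|, summed over scale pairs.
    # 'scales' pairs (center_sigma, surround_sigma) are arbitrary examples.
    sal = np.zeros_like(intensity, dtype=float)
    for center, surround in scales:
        sal += np.abs(gaussian_blur(intensity, center)
                      - gaussian_blur(intensity, surround))
    rng = sal.max() - sal.min()
    return (sal - sal.min()) / rng if rng > 0 else sal
```

A locally distinctive region (e.g., a bright spot on a dark background) survives fine blurring but is washed out by coarse blurring, so its center-surround difference, and hence its saliency, is high; uniform regions cancel out.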
Index Terms:
visual attention, bottom-up attention, top-down attention, saliency, eye movements, regions of interest, gaze control, scene interpretation, visual search, gist, computational modeling, computer vision, mobile robotics, cognitive systems, eye movement datasets, feature extraction, hidden Markov models, psychology, search problems
A. Borji, L. Itti, "State-of-the-Art in Visual Attention Modeling," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 35, no. 1, pp. 185-207, Jan. 2013, doi:10.1109/TPAMI.2012.89