2016 International Conference on Frontiers of Information Technology (FIT) (2016)
Islamabad, Pakistan
Dec. 19, 2016 to Dec. 21, 2016
ISBN: 978-1-5090-5300-1
pp: 132-136
Marzuki , School of Electrical Engineering and Informatics, Institut Teknologi Bandung, Bandung, Indonesia
Egi Muhamad Hidayat , School of Electrical Engineering and Informatics, Institut Teknologi Bandung, Bandung, Indonesia
Rinaldi Munir , School of Electrical Engineering and Informatics, Institut Teknologi Bandung, Bandung, Indonesia
P Ary Setijadi , School of Electrical Engineering and Informatics, Institut Teknologi Bandung, Bandung, Indonesia
Carmadi Machbub , School of Electrical Engineering and Informatics, Institut Teknologi Bandung, Bandung, Indonesia
ABSTRACT
Automating visual perception on a machine means automating the interpretation of camera-sensor data in a human-like way. Automated systems such as robots and intelligent vehicles rely heavily on an understanding of their environment. The most fundamental element of visual understanding, a machine's ability to identify the category of a scene, still involves considerable uncertainty. This uncertainty arises because categories are difficult to distinguish in complex environments, and because object layouts and scene appearance vary within the same category. We present a model that approximates the probability distribution of appearing objects and classifies these distributions to obtain the semantic meaning of a scene. In experiments on the SUN908 dataset, the model shows high accuracy, and the approach proves effective for exploring knowledge based on scene labels over a large dataset.
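The classification idea sketched in the abstract (inferring a scene category from the probabilities of the objects that appear in it) can be illustrated with a small naive-Bayes-style sketch. The categories, object labels, and probability values below are purely illustrative assumptions, not figures from the paper or the SUN908 dataset:

```python
import math

# Hypothetical per-category object appearance probabilities,
# P(object | scene category). All values are illustrative only.
OBJECT_PROBS = {
    "kitchen": {"stove": 0.80, "sink": 0.70, "bed": 0.05, "tree": 0.01},
    "bedroom": {"stove": 0.02, "sink": 0.10, "bed": 0.90, "tree": 0.02},
    "park":    {"stove": 0.01, "sink": 0.01, "bed": 0.01, "tree": 0.85},
}

def categorize(detected_objects, priors=None, eps=1e-6):
    """Return the scene category with the highest log-posterior score,
    given a list of detected object labels (naive-Bayes assumption:
    object appearances are independent given the scene category)."""
    scores = {}
    for scene, probs in OBJECT_PROBS.items():
        # Uniform prior over categories unless one is supplied.
        prior = (priors or {}).get(scene, 1.0 / len(OBJECT_PROBS))
        score = math.log(prior)
        for obj in detected_objects:
            # eps guards against log(0) for objects unseen in a category.
            score += math.log(probs.get(obj, eps))
        scores[scene] = score
    return max(scores, key=scores.get)
```

For example, `categorize(["bed", "sink"])` returns `"bedroom"`, since the bedroom category assigns the highest joint probability to that object combination. The actual model in the paper is more elaborate; this sketch only conveys the general mechanism of mapping object-appearance probabilities to a scene label.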
INDEX TERMS
object probability, scene category, machine perception, visual perception, scene understanding
CITATION
Marzuki, Egi Muhamad Hidayat, Rinaldi Munir, P Ary Setijadi, Carmadi Machbub, "Scenes categorization based on appears objects probability", 2016 International Conference on Frontiers of Information Technology (FIT), pp. 132-136, 2016, doi:10.1109/FIT.2016.7857552