Interleaved pixel lookup for embedded computer vision
2008 IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops (CVPRW 2008)
Anchorage, AK, USA
June 23, 2008 to June 28, 2008
ISBN: 978-1-4244-2339-2
pp: 1-8
Masatoshi Ishikawa, Graduate School of Information Science and Technology, The University of Tokyo, 7-3-1 Hongo, Bunkyo-ku, Tokyo, Japan
Kota Yamaguchi, Graduate School of Information Science and Technology, The University of Tokyo, 7-3-1 Hongo, Bunkyo-ku, Tokyo, Japan
Takashi Komuro, Graduate School of Information Science and Technology, The University of Tokyo, 7-3-1 Hongo, Bunkyo-ku, Tokyo, Japan
Yoshihiro Watanabe, Graduate School of Information Science and Technology, The University of Tokyo, 7-3-1 Hongo, Bunkyo-ku, Tokyo, Japan
ABSTRACT
This paper describes an in-depth investigation and implementation of interleaved memory for pixel lookup operations in computer vision. Pixel lookup, the mapping between coordinates and pixels, is a common operation in computer vision, but it is also a potential bottleneck due to the formidable bandwidth required for real-time operation. We focus on accelerating pixel lookup by parallelizing memory banks through interleaving. The keys to applying interleaving to pixel lookup are 2D block data partitioning and support for unaligned access. With these optimizations, a pixel lookup can output a block of pixels at once without major overhead for unaligned access. An example implementation of our optimized interleaved memory for affine motion tracking shows that pixel lookup can achieve 12.8 Gbps for random lookup of 4x4 blocks of 8-bit pixels at 100 MHz operation. Interleaving can be a cost-effective solution for fast pixel lookup in embedded computer vision.
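The abstract's central idea, 2D block interleaving with unaligned access, can be illustrated with a small software model. The sketch below is a hypothetical C model, not the paper's hardware implementation; identifiers such as bank_of, addr_in_bank, and IMG_W, and the particular bank-mapping formula, are assumptions for illustration. It stores 8-bit pixels across a 4x4 grid of banks so that any 4x4 block, aligned or not, touches each bank exactly once and could therefore be read in a single parallel cycle in hardware.

#include <stdio.h>
#include <stdint.h>

#define BLOCK 4                    /* interleave factor in x and y        */
#define NBANKS (BLOCK * BLOCK)     /* 16 independent memory banks         */
#define IMG_W 640                  /* example image size (assumption)     */
#define IMG_H 480

static uint8_t bank[NBANKS][(IMG_W / BLOCK) * (IMG_H / BLOCK)];

/* Pixel (x, y) is assigned to bank (x mod 4) + 4*(y mod 4) ...           */
static int bank_of(int x, int y)
{
    return (x % BLOCK) + BLOCK * (y % BLOCK);
}

/* ... at a bank-local address given by its block coordinates.            */
static int addr_in_bank(int x, int y)
{
    return (x / BLOCK) + (IMG_W / BLOCK) * (y / BLOCK);
}

/* Read a 4x4 block whose top-left corner (x0, y0) may be unaligned.
 * The 16 pixels always fall into 16 distinct banks, so a hardware
 * implementation can service all 16 reads in parallel and reorder
 * the results with a small crossbar. */
static void read_block(int x0, int y0, uint8_t out[BLOCK][BLOCK])
{
    for (int dy = 0; dy < BLOCK; ++dy)
        for (int dx = 0; dx < BLOCK; ++dx) {
            int x = x0 + dx, y = y0 + dy;
            out[dy][dx] = bank[bank_of(x, y)][addr_in_bank(x, y)];
        }
}

int main(void)
{
    /* Fill the image with a ramp so block contents are easy to verify. */
    for (int y = 0; y < IMG_H; ++y)
        for (int x = 0; x < IMG_W; ++x)
            bank[bank_of(x, y)][addr_in_bank(x, y)] = (uint8_t)(x + y);

    uint8_t blk[BLOCK][BLOCK];
    read_block(13, 7, blk);        /* deliberately unaligned block origin */
    printf("blk[0][0] = %u (expected %u)\n",
           (unsigned)blk[0][0], (unsigned)(uint8_t)(13 + 7));
    return 0;
}

For reference, the throughput figure in the abstract is consistent with this organization: 16 pixels x 8 bits per 100 MHz cycle = 12.8 Gbit/s.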
CITATION
Masatoshi Ishikawa, Kota Yamaguchi, Takashi Komuro, Yoshihiro Watanabe, "Interleaved pixel lookup for embedded computer vision," 2008 IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops, pp. 1-8, 2008, doi:10.1109/CVPRW.2008.4563152