Tracking Human Motion in Structured Environments Using a Distributed-Camera System
November 1999 (vol. 21 no. 11)
pp. 1241-1247

Abstract—This paper presents a comprehensive framework for tracking coarse human models from sequences of synchronized monocular grayscale images in multiple camera coordinates. It demonstrates the feasibility of an end-to-end person tracking system using a unique combination of motion analysis on 3D geometry in different camera coordinates and existing techniques in motion detection, segmentation, and pattern recognition. The system starts with tracking from a single camera view. When the system predicts that the active camera will no longer have a good view of the subject of interest, tracking is switched to another camera that provides a better view and requires the fewest switches to continue tracking. The nonrigidity of the human body is addressed by matching points along the middle line of the human image, spatially and temporally, using Bayesian classification schemes. Multivariate normal distributions model the class-conditional densities of the tracking features, such as location, intensity, and geometric features. Limited degrees of occlusion are tolerated within the system. Experimental results from a prototype system are presented, and the performance of the algorithm is evaluated to demonstrate its feasibility for real-time applications.
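The abstract's Bayesian matching step can be illustrated with a minimal sketch: candidate feature vectors are scored under multivariate normal class-conditional densities and assigned to the class with the highest posterior. The class names, feature layout (x-location, y-location, mean intensity), and all numerical values below are illustrative assumptions, not taken from the paper.

```python
# Hedged sketch: MAP classification of a candidate feature vector, with
# class-conditional densities modeled as multivariate normal distributions.
import numpy as np

def gaussian_log_likelihood(x, mean, cov):
    """Log density of a multivariate normal N(mean, cov) evaluated at x."""
    d = len(mean)
    diff = x - mean
    _, logdet = np.linalg.slogdet(cov)
    maha = diff @ np.linalg.solve(cov, diff)  # Mahalanobis distance squared
    return -0.5 * (d * np.log(2.0 * np.pi) + logdet + maha)

def classify(x, classes, priors):
    """Pick the class maximizing log prior + log class-conditional likelihood."""
    scores = {
        name: np.log(priors[name]) + gaussian_log_likelihood(x, mean, cov)
        for name, (mean, cov) in classes.items()
    }
    return max(scores, key=scores.get)

# Two hypothetical classes over a 3D feature vector
# (x-location, y-location, mean intensity); values are made up.
classes = {
    "subject":    (np.array([120.0, 80.0, 0.6]), np.diag([25.0, 25.0, 0.01])),
    "background": (np.array([200.0, 40.0, 0.3]), np.diag([400.0, 400.0, 0.05])),
}
priors = {"subject": 0.5, "background": 0.5}

print(classify(np.array([118.0, 83.0, 0.58]), classes, priors))  # prints "subject"
```

In the paper's setting, the analogous decision would match middle-line points of the human image across frames (temporally) and across camera views (spatially), using location, intensity, and geometric features as the feature vector.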

[1] K. Sato, T. Maeda, H. Kato, and S. Inokuchi, “CAD-Based Object Tracking with Distributed Monocular Camera for Security Monitoring,” Proc. Second CAD-Based Vision Workshop, pp. 291-297, Champion, Pa., Feb. 1994.
[2] P.H. Kelly, A. Katkere, D.Y. Kuramura, S. Moezzi, S. Chatterjee, and R. Jain, “An Architecture for Multiple Perspective Interactive Video,” Proc. ACM Conf. Multimedia, pp. 201-212, 1995.
[3] Q. Cai and J.K. Aggarwal, “Tracking Human Motion Using Multiple Cameras,” Proc. Int'l Conf. Pattern Recognition, pp. 68-72, Vienna, Austria, Aug. 1996.
[4] Q. Cai, A. Mitiche, and J.K. Aggarwal, “Tracking Human Motion in an Indoor Environment,” Proc. Second Int'l Conf. Image Processing, pp. 215-218, Washington, D.C., Oct. 1995.
[5] Q. Cai, “Tracking Human Motion in Indoor Environments Using a Distributed-Camera System,” PhD thesis, The Univ. of Texas at Austin, 1997.
[6] P. Rodriguez and S. Sibal, “Spread: Scalable Platform for Reliable and Efficient Automated Distribution,” Computer Networks, vol. 33, nos. 1-6, pp. 33-49, June 2000.
[7] Y. Chang, X. Lebegue, and J.K. Aggarwal, “Calibrating a Mobile Camera's Parameters,” Pattern Recognition, vol. 26, no. 1, pp. 75-88, 1993.
[8] K. Kanatani, “Constraints on Length and Angle,” Computer Vision, Graphics, and Image Processing, vol. 41, pp. 28-42, 1988.
[9] Q. Cai and J.K. Aggarwal, “Automatic Tracking of Human Motion in Indoor Scenes across Multiple Synchronized Video Streams,” Proc. Int'l Conf. Computer Vision, Bombay, India, Jan. 1998.
[10] S. Pingali and J. Segen, “Performance Evaluation of People Tracking System,” Proc. IEEE CS Workshop Applications in Computer Vision, pp. 33-38, Sarasota, Fla., 1996.

Index Terms:
Tracking, human modeling, motion estimation, multiple perspectives, Bayesian classification, end-to-end vision systems.
Q. Cai, J.K. Aggarwal, "Tracking Human Motion in Structured Environments Using a Distributed-Camera System," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 21, no. 11, pp. 1241-1247, Nov. 1999, doi:10.1109/34.809119