Timeline Editing of Objects in Video
July 2013 (vol. 19 no. 7)
pp. 1218-1227
Shao-Ping Lu, Dept. of Comput. Sci. & Technol., Tsinghua Univ., Beijing, China
Song-Hai Zhang, Dept. of Comput. Sci. & Technol., Tsinghua Univ., Beijing, China
Jin Wei, Dept. of Comput. Sci. & Technol., Tsinghua Univ., Beijing, China
Shi-Min Hu, Dept. of Comput. Sci. & Technol., Tsinghua Univ., Beijing, China
R. R. Martin, Sch. of Comput. Sci. & Inf., Cardiff Univ., Cardiff, UK
Abstract:
We present a video editing technique based on changing the timelines of individual objects in video, which leaves them in their original places but puts them at different times. This allows the production of object-level slow motion effects, fast motion effects, or even time reversal. This is more flexible than simply applying such effects to whole frames, as new relationships between objects can be created. As we restrict object interactions to the same spatial locations as in the original video, our approach can produce high-quality results using only coarse matting of video objects. Coarse matting can be done efficiently using automatic video object segmentation, avoiding tedious manual matting. To design the output, the user interactively indicates the desired new life spans of objects, and may also change the overall running time of the video. Our method rearranges the timelines of objects in the video whilst applying appropriate object interaction constraints. We demonstrate that, while this editing technique is somewhat restrictive, it still allows many interesting results.
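The core idea — each object keeps its original spatial positions while its frames are drawn from a remapped timeline — can be sketched as below. This is an illustrative simplification, not the paper's actual method: the names (`remap_time`, `composite`), the simple linear time remapping, and the dictionary-based frame/mask storage are all assumptions for the sketch, whereas the paper optimizes object timelines subject to interaction constraints and uses coarse mattes from automatic segmentation.

```python
import numpy as np

def remap_time(t_out, span_out, span_src, reverse=False):
    """Map an output frame index to a source frame index by linearly
    rescaling the object's new life span onto its original one.
    Stretching the span gives slow motion, shrinking it gives fast
    motion, and reverse=True plays the object backwards."""
    a_out, b_out = span_out
    a_src, b_src = span_src
    # Normalised position of t_out within the new life span.
    u = (t_out - a_out) / max(b_out - a_out, 1)
    if reverse:
        u = 1.0 - u
    return int(round(a_src + u * (b_src - a_src)))

def composite(background, frames, masks, span_out, span_src, reverse=False):
    """Composite one object onto a reconstructed background video.
    frames/masks are indexed by source time; the object stays at its
    original spatial location, only its timeline changes, so a coarse
    mask suffices."""
    out = [bg.copy() for bg in background]
    a_out, b_out = span_out
    for t in range(a_out, b_out + 1):
        s = remap_time(t, span_out, span_src, reverse)
        m = masks[s].astype(bool)
        out[t][m] = frames[s][m]  # paste object pixels at the same place
    return out
```

Because objects only ever appear where they appeared in the source video, interaction constraints reduce to ordering conditions on when overlapping objects occupy the same pixels, which is what makes coarse matting acceptable.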
Index Terms:
Object-level motion editing, slow motion, fast motion, time reversal, foreground/background reconstruction
Citation:
Shao-Ping Lu, Song-Hai Zhang, Jin Wei, Shi-Min Hu, R. R. Martin, "Timeline Editing of Objects in Video," IEEE Transactions on Visualization and Computer Graphics, vol. 19, no. 7, pp. 1218-1227, July 2013, doi:10.1109/TVCG.2012.145