Issue No. 07 - July 2012 (vol. 24)
ISSN: 1041-4347
pp: 1328-1343
Costas Panagiotakis , Tech. Educational Inst. of Crete, Ierapetra
Nikos Pelekis , University of Piraeus, Piraeus
Ioannis Kopanakis , Tech. Educational Inst. of Crete, Ierapetra
Emmanuel Ramasso , FEMTO-ST Research Inst., Besançon
Yannis Theodoridis , University of Piraeus, Piraeus
Moving Object Databases (MOD), although ubiquitous, still call for methods able to understand, search, analyze, and browse their spatiotemporal content. In this paper, we propose a method for trajectory segmentation and sampling based on the representativeness of the (sub)trajectories in the MOD. To find the most representative subtrajectories, the following methodology is proposed. First, a novel global voting algorithm is performed, based on local density and trajectory similarity information. This algorithm is applied to each segment of the trajectory, forming a local descriptor that captures how representative the line segment is. The sequence of this descriptor over a trajectory gives the trajectory's voting signal, where high values correspond to its most representative parts. Then, a novel segmentation algorithm is applied to this signal; it automatically estimates the number of partitions and the partition borders, identifying partitions that are homogeneous with respect to their representativeness. Finally, a sampling method over the resulting segments yields the most representative subtrajectories in the MOD. Our experimental results on synthetic and real MODs verify the effectiveness of the proposed scheme, including comparisons with other sampling techniques.
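The pipeline sketched in the abstract (per-segment voting, segmentation of the voting signal, selection of high-vote runs) can be illustrated as follows. This is a minimal sketch under assumed details not given in the abstract: a Gaussian kernel on segment-midpoint distances stands in for the paper's density/similarity voting, and a simple threshold-run partitioning stands in for the automatic segmentation algorithm; none of the function names or parameters below come from the paper.

```python
# Hedged sketch of the abstract's pipeline. The distance measure, kernel
# bandwidth, and threshold rule are illustrative assumptions, not the
# authors' actual algorithm.
import math

def segment_votes(traj, others, sigma=1.0):
    """For each line segment (consecutive point pair) of `traj`, accumulate
    a density-based vote from every segment of every other trajectory,
    using a Gaussian kernel on the distance between segment midpoints."""
    def midpoints(t):
        return [((x1 + x2) / 2, (y1 + y2) / 2)
                for (x1, y1), (x2, y2) in zip(t, t[1:])]
    votes = []
    for mx, my in midpoints(traj):
        v = 0.0
        for other in others:
            for ox, oy in midpoints(other):
                d2 = (mx - ox) ** 2 + (my - oy) ** 2
                v += math.exp(-d2 / (2 * sigma ** 2))
        votes.append(v)
    return votes

def segment_signal(votes, threshold):
    """Partition the voting signal into maximal runs that lie uniformly
    above or below `threshold` (a crude stand-in for the paper's automatic
    segmentation); returns (start, end, is_representative) triples."""
    parts, start = [], 0
    for i in range(1, len(votes) + 1):
        if i == len(votes) or (votes[i] >= threshold) != (votes[start] >= threshold):
            parts.append((start, i, votes[start] >= threshold))
            start = i
    return parts

# Toy MOD: three trajectories sharing a common middle portion.
mod = [
    [(0, 0), (1, 0), (2, 0), (3, 0), (3, 3)],
    [(0, 1), (1, 0), (2, 0), (3, 0), (3, -3)],
    [(0, -1), (1, 0), (2, 0), (3, 0), (0, 3)],
]
votes = segment_votes(mod[0], mod[1:])
parts = segment_signal(votes, threshold=sum(votes) / len(votes))
# High-vote runs mark the representative subtrajectory: here, the shared
# middle portion scores high and the diverging tail scores low.
representative = [p for p in parts if p[2]]
```

On this toy input the shared middle segments collect the largest votes, so the sampled subtrajectory is the common corridor traversed by all three trajectories, while the diverging final segment falls below the mean-vote threshold.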
Keywords: Trajectory segmentation, subtrajectory sampling, data mining, moving object databases.

I. Kopanakis, N. Pelekis, E. Ramasso, Y. Theodoridis and C. Panagiotakis, "Segmentation and Sampling of Moving Object Trajectories Based on Representativeness," in IEEE Transactions on Knowledge & Data Engineering, vol. 24, no. 7, pp. 1328-1343, July 2012.