Parallel and Distributed Processing Symposium, International (2012)
Shanghai, China
May 21, 2012 to May 25, 2012
ISSN: 1530-2075
ISBN: 978-1-4673-0975-2
pp: 1068-1079
ABSTRACT
Quality Threshold Clustering (QTC) is an algorithm for partitioning data in fields such as biology, where clustering of large data sets can aid scientific discovery. Unlike many other clustering algorithms, QTC does not require knowing the number of clusters a priori; however, its perceived need for high computing power often makes it an unattractive choice. This paper presents a thorough study of QTC. We analyze the worst-case complexity of the algorithm and discuss methods to reduce it by trading memory for computation. We also demonstrate how the expected running time of QTC is affected by the structure of the input data. We describe how QTC can be parallelized, and discuss implementation details of our thread-parallel, GPU, and distributed-memory implementations of the algorithm. We demonstrate the efficiency of our implementations through experimental data. We show how data sets with tens of thousands of elements can be clustered in a matter of minutes on a modern GPU, and in seconds on a small-scale cluster of multi-core CPUs or multiple GPUs. Finally, we discuss how user-selected parameters, as well as algorithmic and implementation choices, affect performance.
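For readers unfamiliar with QTC, the following is a minimal sequential sketch of the Quality Threshold clustering procedure in Python, assuming Euclidean distance and a user-supplied cluster-diameter threshold. It illustrates only the greedy candidate-cluster construction that QTC is based on; it does not reproduce the paper's thread-parallel, GPU, or distributed-memory implementations.

import numpy as np

def qt_cluster(points, diameter_threshold):
    """Greedy QT clustering sketch: repeatedly extract the largest candidate
    cluster whose diameter stays within the threshold. Returns a list of
    clusters, each a list of indices into `points` (an (n, d) array)."""
    remaining = list(range(len(points)))
    clusters = []
    while remaining:
        best_candidate = []
        # Build one candidate cluster seeded at each remaining point.
        for seed in remaining:
            candidate = [seed]
            # For every outsider, track its maximum distance to the candidate.
            max_dists = {j: np.linalg.norm(points[seed] - points[j])
                         for j in remaining if j != seed}
            while max_dists:
                # Greedily add the point whose inclusion keeps the diameter smallest.
                j = min(max_dists, key=max_dists.get)
                if max_dists[j] > diameter_threshold:
                    break  # no further point fits within the quality threshold
                candidate.append(j)
                del max_dists[j]
                # Update each outsider's distance to the grown candidate.
                for k in max_dists:
                    d = np.linalg.norm(points[j] - points[k])
                    if d > max_dists[k]:
                        max_dists[k] = d
            if len(candidate) > len(best_candidate):
                best_candidate = candidate
        # Commit the largest candidate cluster and remove its members.
        clusters.append(best_candidate)
        removed = set(best_candidate)
        remaining = [i for i in remaining if i not in removed]
    return clusters

Called as, for example, qt_cluster(np.random.rand(100, 3), 0.25), this returns one index list per extracted cluster. The per-seed candidate construction inside the outer loop is the naturally data-parallel part of the algorithm, which is the kind of work the paper's parallel implementations target.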
INDEX TERMS
Complexity theory, Clustering algorithms, Proteins, Algorithm design and analysis, Memory management, Graphics processing unit, Upper bound, distributed, QT-clustering, complexity, GPU, multi-core
CITATION

J. S. Vetter, C. McCurdy and A. Danalis, "Efficient Quality Threshold Clustering for Parallel Architectures," Parallel and Distributed Processing Symposium, International (IPDPS), Shanghai, China, 2012, pp. 1068-1079.
doi:10.1109/IPDPS.2012.99