
Derek C. Stanford and Adrian E. Raftery, "Finding Curvilinear Features in Spatial Point Patterns: Principal Curve Clustering with Noise," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, no. 6, pp. 601-609, June 2000.
Abstract—Clustering about principal curves combines parametric modeling of noise with nonparametric modeling of feature shape. This is useful for detecting curvilinear features in spatial point patterns, with or without background noise. Applications include the detection of curvilinear minefields from reconnaissance images, in which some of the points represent false detections, and the detection of seismic faults from earthquake catalogs. Our algorithm for principal curve clustering has two steps: the first is hierarchical and agglomerative (HPCC) and the second consists of iterative relocation based on the Classification EM algorithm (CEM-PCC). HPCC is used to combine potential feature clusters, while CEM-PCC refines the results and deals with background noise. It is important to have a good starting point for the algorithm: this can be found manually or automatically using, for example, nearest neighbor clutter removal or model-based clustering. We choose the number of features and the amount of smoothing simultaneously, using approximate Bayes factors.
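The second step sketched above, iterative relocation via the Classification EM (CEM) algorithm with an explicit background-noise component, can be illustrated in miniature. This is a hedged sketch, not the paper's CEM-PCC: it substitutes spherical Gaussian cluster centers for principal curves (so no curve fitting or smoothing is performed), and the function name `cem_with_noise`, its parameters, and the uniform-noise density model are illustrative assumptions.

```python
import numpy as np

def cem_with_noise(X, k, region_area, n_iter=50, seed=0):
    """Classification EM: k spherical Gaussian clusters plus a uniform
    background-noise component over a region of the given area.
    A simplified stand-in for CEM-PCC, with centers instead of curves."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    # C-step initialization: random hard assignment; label k is "noise".
    labels = rng.integers(0, k + 1, size=n)
    for _ in range(n_iter):
        # M-step: re-estimate center, pooled variance, and weight per cluster.
        centers = np.zeros((k, d))
        sigmas = np.ones(k)
        weights = np.full(k + 1, 1.0 / (k + 1))
        for j in range(k):
            pts = X[labels == j]
            if len(pts) == 0:
                centers[j] = X[rng.integers(n)]  # restart an empty cluster
                continue
            centers[j] = pts.mean(axis=0)
            sigmas[j] = max(pts.var(), 1e-6)  # pooled per-coordinate variance
            weights[j] = len(pts) / n
        weights[k] = max((labels == k).mean(), 1e-6)
        weights /= weights.sum()
        # E-step + C-step: assign each point to its most likely component.
        logp = np.empty((n, k + 1))
        for j in range(k):
            sq = ((X - centers[j]) ** 2).sum(axis=1)
            logp[:, j] = (np.log(weights[j])
                          - 0.5 * d * np.log(2 * np.pi * sigmas[j])
                          - sq / (2 * sigmas[j]))
        logp[:, k] = np.log(weights[k]) - np.log(region_area)  # uniform noise
        new_labels = logp.argmax(axis=1)
        if np.array_equal(new_labels, labels):
            break  # hard assignments converged
        labels = new_labels
    return labels
```

Because the C-step makes hard assignments, each iteration cannot decrease the classification likelihood, so the loop terminates; as in the paper, a good initialization (e.g., from nearest neighbor clutter removal) matters, since CEM converges to a local optimum.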