Issue No. 11 - November (1999 vol. 21)
DOI Bookmark: http://doi.ieeecomputersociety.org/10.1109/34.809115
<p><b>Abstract</b>—The standard least-squares method of fitting a line to a set of data points is known to be unreliable when the random noise in the input is significant relative to the data actually belonging to the line. Here, we present a new statistical clustering method, based on Legendre moment theory and the maximum entropy principle, for line fitting in a noisy image. We propose a new approach for estimating the underlying probability density function (p.d.f.) of the data set: the p.d.f. is expanded in terms of Legendre polynomials by means of the Legendre moments, and the order of the expansion is selected according to the maximum entropy principle (M.E.P.). The points corresponding to the maxima of the p.d.f. are then taken as the true points of the line and extracted by a chaining algorithm. This approach generalizes directly to multidimensional data. The proposed algorithm was successfully applied to real and simulated noisy line images and compared with several well-known methods.</p>
Line fitting, outliers, inliers, underlying p.d.f., Legendre moments, maximum entropy principle, clustering.
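To illustrate the core idea of the abstract, the following is a minimal sketch of a Legendre-moment p.d.f. estimate in one dimension. The true method selects the expansion order via the maximum entropy principle; here the order is simply fixed for illustration, the data are assumed pre-scaled to [-1, 1], and all function names are my own, not the authors'.

```python
import numpy as np
from numpy.polynomial import legendre as L

def legendre_pdf_estimate(samples, order, grid):
    """Estimate a 1-D p.d.f. from samples via Legendre moments.

    Assumes samples and grid already lie in [-1, 1]. The estimate is
        p(x) ~ sum_{n=0}^{order} (2n + 1)/2 * lambda_n * P_n(x),
    where lambda_n is the n-th Legendre moment, estimated as the
    sample mean of P_n over the data. (Order selection by the
    maximum entropy principle, as in the paper, is omitted here.)
    """
    pdf = np.zeros_like(grid, dtype=float)
    for n in range(order + 1):
        coeffs = np.zeros(n + 1)
        coeffs[n] = 1.0                       # coefficient vector selecting P_n
        lam = L.legval(samples, coeffs).mean()  # estimated Legendre moment
        pdf += (2 * n + 1) / 2.0 * lam * L.legval(grid, coeffs)
    return np.clip(pdf, 0.0, None)            # clip small negative ripples

# Toy data: an "inlier" cluster near x = 0.3 plus uniform background noise,
# mimicking line points contaminated by outliers.
rng = np.random.default_rng(0)
samples = np.clip(np.concatenate([
    rng.normal(0.3, 0.05, 400),   # inliers
    rng.uniform(-1.0, 1.0, 100),  # outliers
]), -1.0, 1.0)
grid = np.linspace(-1.0, 1.0, 201)
pdf = legendre_pdf_estimate(samples, order=12, grid=grid)
peak = grid[np.argmax(pdf)]       # maximum of the estimated p.d.f.
```

In the paper's setting, such maxima of the estimated p.d.f. identify the true line points, which a chaining algorithm then links into the fitted line.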
H. Qjidaa and L. Radouane, "Robust Line Fitting in a Noisy Image by the Method of Moments," IEEE Transactions on Pattern Analysis & Machine Intelligence, vol. 21, no. 11, pp. 1216-1223, Nov. 1999.