Implicit Polynomials, Orthogonal Distance Regression, and the Closest Point on a Curve
February 2000 (vol. 22 no. 2)
pp. 191-199

Abstract—Implicit polynomials (i.e., multinomials) have a number of properties that make them attractive for modeling curves and surfaces in computer vision. This paper considers the problem of finding the best-fitting implicit polynomial (or algebraic curve) to a collection of points in the plane using an orthogonal distance metric. Approximate methods for orthogonal distance regression have been shown by others to be prone to the problem of cusps in the solution, and this is confirmed here. Consequently, this work focuses on exact methods for orthogonal distance regression. The most difficult and costly part of exact methods is computing the closest point on the algebraic curve to an arbitrary point in the plane. This paper considers in detail three methods for achieving this. The first is the standard Newton's method, the second is based on resultants, which have recently made a resurgence in computer graphics, and the third is a novel technique based on successive circular approximations to the curve. It is shown that Newton's method is the quickest, but that it can sometimes fail, even with a good initial guess. The successive circular approximation algorithm is not as fast, but is robust. The resultant method is the slowest of the three, but does not require an initial guess. The driving application of this work was the fitting of implicit quartics in two variables to thinned oblique ionogram traces.
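
As a rough illustration of the closest-point subproblem described in the abstract, the following Python sketch applies Newton's method to the first-order conditions for the nearest point on an implicit curve f(x, y) = 0 to a query point p, namely (x - p) = lambda * grad f(x) together with f(x) = 0. The example curve (an ellipse rather than one of the paper's quartics), the query point, and the starting guess are illustrative assumptions, not taken from the paper; as the abstract notes, this iteration can fail to converge from a poor initial guess.

import numpy as np

def f(x, y):            # example implicit curve: an ellipse (assumption, not the paper's quartic)
    return x**2 / 4.0 + y**2 - 1.0

def grad_f(x, y):       # gradient of f
    return np.array([x / 2.0, 2.0 * y])

def hess_f(x, y):       # Hessian of f (constant for this quadratic example)
    return np.array([[0.5, 0.0], [0.0, 2.0]])

def closest_point_newton(p, x0, max_iter=50, tol=1e-12):
    # Newton iteration on the stationarity system
    #   (x - p) - lambda * grad f(x) = 0,  f(x) = 0.
    # May diverge from a poor starting point x0, as the paper observes.
    x, y, lam = x0[0], x0[1], 0.0
    for _ in range(max_iter):
        g = grad_f(x, y)
        H = hess_f(x, y)
        # residual of the 3x3 nonlinear system in (x, y, lambda)
        F = np.array([x - p[0] - lam * g[0],
                      y - p[1] - lam * g[1],
                      f(x, y)])
        if np.linalg.norm(F) < tol:
            break
        # Jacobian of the system with respect to (x, y, lambda)
        J = np.array([[1.0 - lam * H[0, 0], -lam * H[0, 1], -g[0]],
                      [-lam * H[1, 0], 1.0 - lam * H[1, 1], -g[1]],
                      [g[0], g[1], 0.0]])
        dx, dy, dlam = np.linalg.solve(J, -F)
        x, y, lam = x + dx, y + dy, lam + dlam
    return np.array([x, y])

# illustrative usage: nearest point on the ellipse to (3, 1), starting near the curve
print(closest_point_newton(p=np.array([3.0, 1.0]), x0=np.array([2.0, 0.5])))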


Index Terms:
Fitting, orthogonal distance regression, implicit polynomials, algebraic curve, successive circular approximation, resultants, ionograms.
Citation:
Nicholas J. Redding, "Implicit Polynomials, Orthogonal Distance Regression, and the Closest Point on a Curve," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, no. 2, pp. 191-199, Feb. 2000, doi:10.1109/34.825757