Estimating the Intrinsic Dimension of Data with a Fractal-Based Method
October 2002 (vol. 24 no. 10)
pp. 1404-1407

Abstract—In this paper, the problem of estimating the intrinsic dimension of a data set is investigated. A fractal-based approach using the Grassberger-Procaccia algorithm is proposed. Since the Grassberger-Procaccia algorithm performs badly on sets of high dimensionality, an empirical procedure that improves the original algorithm has been developed. The procedure has been tested on data sets of known dimensionality and on the time series of the Santa Fe competition.
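The Grassberger-Procaccia algorithm referenced in the abstract estimates the correlation dimension from the correlation integral C(r), the fraction of point pairs closer than r, by measuring the slope of log C(r) versus log r in its linear region. The sketch below illustrates this basic idea only; the function names and the choice of radii are illustrative, not from the paper, and the paper's own contribution (the empirical correction for high-dimensional sets) is not reproduced here.

```python
import numpy as np

def correlation_integral(X, r):
    """C(r): fraction of distinct point pairs of X at distance < r."""
    n = len(X)
    # Pairwise Euclidean distances (upper triangle only, i < j)
    dists = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    close_pairs = np.sum(dists[np.triu_indices(n, k=1)] < r)
    return 2.0 * close_pairs / (n * (n - 1))

def gp_correlation_dimension(X, radii):
    """Estimate the correlation dimension as the slope of
    log C(r) vs. log r over the supplied radii."""
    C = np.array([correlation_integral(X, r) for r in radii])
    mask = C > 0  # log is undefined where no pairs are counted
    slope, _ = np.polyfit(np.log(radii[mask]), np.log(C[mask]), 1)
    return slope
```

For example, points sampled along a one-dimensional curve embedded in 3D should yield an estimate near 1, provided the radii fall inside the scaling region; in practice, selecting that region is the delicate step the paper's procedure addresses.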

Index Terms:
Bayesian information criterion, correlation integral, Grassberger-Procaccia's algorithm, intrinsic dimension, nonlinear principal component analysis, box-counting dimension, fractal dimension, Kolmogorov capacity.
Citation:
Francesco Camastra, Alessandro Vinciarelli, "Estimating the Intrinsic Dimension of Data with a Fractal-Based Method," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 24, no. 10, pp. 1404-1407, Oct. 2002, doi:10.1109/TPAMI.2002.1039212