
Huiwen Zeng and H. Joel Trussell, "Constrained Dimensionality Reduction Using a Mixed-Norm Penalty Function with Neural Networks," IEEE Transactions on Knowledge and Data Engineering, vol. 22, no. 3, pp. 365-380, Mar. 2010.
[1] G.E. Hinton, “Connectionist Learning Procedures,” Artificial Intelligence, vol. 40, no. 1, pp. 143-150, 1989.
[2] E.B. Baum, “What Size of Neural Net Gives Valid Generalization?” Neural Computation, vol. 1, no. 1, pp. 151-160, 1989.
[3] J.K. Kruschke and J.R. Movellan, “Benefits of Gain: Speeded Learning and Minimal Hidden Layers in Back-Propagation Networks,” IEEE Trans. Systems, Man, and Cybernetics, vol. 21, no. 1, pp. 273-280, Jan./Feb. 1991.
[4] C.M. Bishop, Neural Networks for Pattern Recognition. Oxford Univ. Press, 1995.
[5] A. Hyvärinen, J. Karhunen, and E. Oja, Independent Component Analysis. Wiley, 2001.
[6] I. Koch and K. Naito, “Dimension Selection for Feature Selection and Dimension Reduction with Principal and Independent Component Analysis,” Neural Computation, vol. 19, pp. 513-545, 2007.
[7] J.B. Tenenbaum, V. de Silva, and J.C. Langford, “A Global Geometric Framework for Nonlinear Dimensionality Reduction,” Science, vol. 290, no. 5500, pp. 2319-2323, Dec. 2000.
[8] I. Borg and J. Lingoes, Multidimensional Similarity Structure Analysis. Springer-Verlag, 1987.
[9] K. Fukunaga, Introduction to Statistical Pattern Recognition, second ed. Academic Press, 1990.
[10] Y. LeCun, J. Denker, S. Solla, R.E. Howard, and L.D. Jackel, “Optimal Brain Damage,” Advances in Neural Information Processing Systems, D.S. Touretzky, ed., vol. 2, pp. 598-605, Morgan Kaufmann, 1990.
[11] B. Hassibi and D.G. Stork, “Second Order Derivatives for Network Pruning: Optimal Brain Surgeon,” Advances in Neural Information Processing Systems, S.J. Hanson, J.D. Cowan, and C.L. Giles, eds., vol. 5, pp. 164-171, Morgan Kaufmann, 1993.
[12] A.S. Weigend, D.E. Rumelhart, and B.A. Huberman, “Generalization by Weight Elimination with Application to Forecasting,” Advances in Neural Information Processing Systems, R.P. Lippmann, J.E. Moody, and D.S. Touretzky, eds., vol. 3, pp. 875-882, Morgan Kaufmann, 1991.
[13] J. Moody and T. Rögnvaldsson, “Smoothing Regularizers for Projective Basis Function Networks,” Advances in Neural Information Processing Systems, M.C. Mozer, M.I. Jordan, and T. Petsche, eds., vol. 9, pp. 585-591, MIT Press, 1997.
[14] P.O. Hoyer, “Non-Negative Matrix Factorization with Sparseness Constraints,” J. Machine Learning Research, vol. 5, no. 9, pp. 1457-1469, 2004.
[15] J. Sietsma and R.J.F. Dow, “Neural Net Pruning—Why and How?” Proc. IEEE Int'l Conf. Neural Networks, vol. 1, pp. 325-332, 1988.
[16] G. Castellano, A.M. Fanelli, and M. Pelillo, “An Iterative Pruning Algorithm for Feedforward Neural Networks,” IEEE Trans. Neural Networks, vol. 8, no. 3, pp. 519-531, May 1997.
[17] M.Y. Chow and J. Teeter, “An Analysis of Weight Decay as a Methodology of Reducing Three-Layer Feedforward Artificial Neural Networks for Classification Problems,” Proc. IEEE Int'l Conf. Neural Networks, pp. 600-605, 1994.
[18] R. Fletcher, Practical Methods of Optimization, second ed. John Wiley & Sons, 1987.
[19] N. Alexandrov and J.E. Dennis, “Algorithms for Bilevel Optimization,” Proc. AIAA/NASA/USAF/ISSMO Symp. Multidisciplinary Analysis and Optimization, pp. 810-816, 1994.
[20] http://color.psych.upenn.edu/hyperspectral/bearfruitgraybearfruitgray.html, 2009.
[21] P.M. Williams, “Bayesian Regularisation and Pruning Using a Laplace Prior,” technical report, School of Cognitive and Computing Sciences, Univ. of Sussex, 1994.
[22] http://www.mathworks.com/access/helpdesk/help/toolboxoptim/, 2009.
[23] R. Fletcher and M.J.D. Powell, “A Rapidly Convergent Descent Method for Minimization,” Computer J., vol. 6, no. 2, pp. 163-168, 1963.
[24] D. Goldfarb, “A Family of Variable Metric Updates Derived by Variational Means,” Math. of Computing, vol. 24, no. 109, pp. 23-26, 1970.
[25] S.P. Han, “A Globally Convergent Method for Nonlinear Programming,” J. Optimization Theory and Applications, vol. 22, no. 3, pp. 297-309, 1977.
[26] M.J.D. Powell, “A Fast Algorithm for Nonlinearly Constrained Optimization Calculations,” Numerical Analysis, G.A. Watson, ed., pp. 144-157, Springer-Verlag, 1978.
[27] K. Hornik, M. Stinchcombe, and H. White, “Universal Approximation of an Unknown Mapping and Its Derivatives Using Multilayer Feedforward Networks,” Neural Networks, vol. 3, no. 5, pp. 551-560, 1990.
[28] E.D. Sontag, “Feedback Stabilization Using Two-Hidden-Layer Nets,” IEEE Trans. Neural Networks, vol. 3, no. 6, pp. 981-990, Nov. 1992.
[29] H. Zeng, “Dimensionality Reduction and Feature Selection Using a Mixed-Norm Penalty Function,” PhD thesis, Electrical Eng. Dept., North Carolina State Univ., 2005.
[30] H. Zeng and H.J. Trussell, “Feature Selection Using a Mixed-Norm Penalty Function,” Proc. IEEE Int'l Conf. Image Processing, Oct. 2006.