M. Pelillo and M. Refice, "Learning Compatibility Coefficients for Relaxation Labeling Processes," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 16, no. 9, pp. 933-945, Sept. 1994, doi: 10.1109/34.310691.
Relaxation labeling processes have been widely used in many domains, including image processing, pattern recognition, and artificial intelligence. They are iterative procedures that aim to reduce local ambiguities and achieve global consistency through the parallel exploitation of contextual information, which is expressed quantitatively as a set of "compatibility coefficients." The problem of determining the compatibility coefficients has received considerable attention in the past, and many heuristic, statistics-based methods have been suggested. In this paper, the authors take a rather different viewpoint: they derive the coefficients by attempting to optimize the performance of the relaxation algorithm over a sample of training data. No statistical interpretation is attached to the coefficients; they are treated simply as real numbers for which performance is optimal. Experimental results on a novel application of relaxation are reported, demonstrating the effectiveness of the proposed approach.
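As background for the iterative procedure the abstract describes, here is a minimal sketch of the classic nonlinear relaxation labeling update of Rosenfeld, Hummel, and Zucker (1976). The array shapes, the 1/(n-1) averaging of the support term, and the toy same-label compatibilities are illustrative assumptions only; they are not the learned coefficients of the paper.

```python
import numpy as np

def relaxation_step(p, r):
    """One step of the classic nonlinear relaxation labeling update.

    p : (n, m) array; p[i, a] is the probability that object i has label a.
    r : (n, n, m, m) array of compatibility coefficients in [-1, 1];
        r[i, j, a, b] measures how much label b on object j supports
        label a on object i.
    """
    n, _ = p.shape
    # Support: q[i, a] = (1 / (n - 1)) * sum_{j, b} r[i, j, a, b] * p[j, b]
    q = np.einsum('ijab,jb->ia', r, p) / (n - 1)
    # Multiplicative update followed by per-object renormalization.
    new_p = p * (1.0 + q)
    return new_p / new_p.sum(axis=1, keepdims=True)

# Toy example: two objects, two labels, with same-label mutual support.
p = np.array([[0.6, 0.4],
              [0.5, 0.5]])
r = np.zeros((2, 2, 2, 2))
for a in range(2):
    r[0, 1, a, a] = 1.0  # object 1's label supports the same label on object 0
    r[1, 0, a, a] = 1.0  # and vice versa

for _ in range(20):
    p = relaxation_step(p, r)

print(p)  # both objects sharpen toward label 0, the initially favored label
```

The iteration reduces the initial ambiguity of object 1 (a uniform 0.5/0.5 distribution) by propagating the contextual bias of object 0, which is exactly the role of the compatibility coefficients that the paper proposes to learn from training data rather than fix by hand.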