Issue No. 11, November 1997 (vol. 19)

ISSN: 0162-8828

pp: 1236-1250

DOI Bookmark: http://doi.ieeecomputersociety.org/10.1109/34.632983

ABSTRACT

<p><b>Abstract</b>—This article addresses two important themes in early visual computation: First, it presents a novel theory for learning the universal statistics of natural images—a prior model for typical cluttered scenes of the world—from a set of natural images; second, it proposes a general framework for designing reaction-diffusion equations for image processing. We start by studying the statistics of natural images, including their scale-invariant properties; generic prior models are then learned to reproduce the observed statistics, based on the minimax entropy theory studied in two previous papers. The resulting Gibbs distributions have potentials of the form <tmath>$U\left( {{\bf I};\,\Lambda ,\,S} \right)=\sum\nolimits_{\alpha =1}^K \sum\nolimits_{\left( {x,y} \right)} \lambda ^{\left( \alpha \right)}\!\left( {\left( {F^{\left( \alpha \right)}*{\bf I}} \right)\left( {x,y} \right)} \right)$</tmath> with <it>S</it> = {<it>F</it><super>(1)</super>, <it>F</it><super>(2)</super>, ..., <it>F</it><super>(<it>K</it>)</super>} being a set of filters and Λ = {λ<super>(1)</super>(), λ<super>(2)</super>(), ..., λ<super>(<it>K</it>)</super>()} the potential functions. The learned Gibbs distributions confirm and improve the form of existing prior models, such as the line-process model, but, in contrast to all previous models, <it>inverted</it> potentials (i.e., λ(<it>x</it>) decreasing as a function of |<it>x</it>|) are found to be necessary. We find that the partial differential equations given by gradient descent on <it>U</it>(<b>I</b>; Λ, <it>S</it>) are essentially reaction-diffusion equations, where the usual energy terms produce anisotropic diffusion, while the inverted energy terms produce reaction associated with pattern formation, enhancing preferred image features. 
We illustrate how these models can be used for texture pattern rendering, denoising, image enhancement, and clutter removal by careful choice of both prior and data models of this type, incorporating the appropriate features.</p>
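The diffusion side of the framework can be sketched concretely. The fragment below is an illustrative NumPy sketch, not the paper's implementation: it assumes only two nearest-neighbor (periodic) difference filters and a single robust potential λ(<it>x</it>) = log(1 + (<it>x</it>/<it>b</it>)<super>2</super>), whose derivative shrinks for large |<it>x</it>|, so gradient descent on <it>U</it> yields anisotropic diffusion that smooths small fluctuations while preserving strong edges. The names and parameter values (<tt>b</tt>, <tt>dt</tt>) are hypothetical; inverted (reaction) potentials, which the paper shows are also needed, would contribute terms of the opposite sign and are omitted here.

```python
import numpy as np

def lambda_prime(x, b=1.0):
    # Derivative of the robust potential lambda(x) = log(1 + (x/b)^2).
    # |lambda'| falls off for large |x|, so strong edges are smoothed
    # less: the anisotropic-diffusion regime described in the abstract.
    return (2.0 * x / b**2) / (1.0 + (x / b) ** 2)

def gibbs_diffusion_step(I, dt=0.02, b=0.5):
    # One gradient-descent step on U(I) = sum_{x,y} lambda(dx I) + lambda(dy I),
    # with periodic forward differences dx, dy. The descent direction is the
    # backward-difference divergence of lambda'(grad I).
    dx = np.roll(I, -1, axis=1) - I
    dy = np.roll(I, -1, axis=0) - I
    gx = lambda_prime(dx, b)
    gy = lambda_prime(dy, b)
    div = (gx - np.roll(gx, 1, axis=1)) + (gy - np.roll(gy, 1, axis=0))
    return I + dt * div

# Toy usage: smooth a noisy step edge.
rng = np.random.default_rng(0)
clean = np.zeros((32, 32))
clean[:, 16:] = 1.0
noisy = clean + 0.1 * rng.standard_normal(clean.shape)
I_t = noisy
for _ in range(100):
    I_t = gibbs_diffusion_step(I_t)
```

Because the differences are periodic and their adjoints are the backward differences used in <tt>div</tt>, each step is an exact gradient-descent step on the stated energy, so <it>U</it> decreases monotonically for a sufficiently small <tt>dt</tt>.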

INDEX TERMS

Visual learning, Gibbs distribution, reaction-diffusion, anisotropic diffusion, texture synthesis, clutter modeling, image restoration.

CITATION

S. C. Zhu and D. Mumford, "Prior Learning and Gibbs Reaction-Diffusion," in *IEEE Transactions on Pattern Analysis and Machine Intelligence*, vol. 19, no. 11, pp. 1236-1250, 1997.

doi:10.1109/34.632983
