ABSTRACT
We present a new method for converting a photo or image into a synthesized painting that follows the painting style of an example painting. Treating the brush-stroke styles of a painting as sample textures, we reduce the problem of learning an example painting to a texture synthesis problem. The proposed method uses a hierarchical patch-based approach to the synthesis of directional textures. The key features of our method are: 1) painting styles are represented as one or more blocks of sample textures selected by the user from the example painting; 2) image segmentation and brush-stroke directions defined by the medial axis are used to better represent and communicate the shapes and objects present in the synthesized painting; 3) image masks and a hierarchy of texture patches are used to efficiently synthesize high-quality directional textures. The synthesis process is further accelerated through texture direction quantization and the use of Gaussian pyramids. Our method has two main advantages. First, the synthesized stroke textures can follow a direction field determined by the shapes of the regions to be painted. Second, the method is very efficient: generating a synthesized painting takes from a few seconds to about one minute on a commodity PC, rather than the hours required by other existing methods. Furthermore, the technique presented here provides a new and efficient solution to the problem of synthesizing a 2D directional texture. We use a number of test examples to demonstrate the efficiency of the proposed method and the high quality of its results.
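The abstract notes that synthesis is accelerated with Gaussian pyramids, i.e., coarse-to-fine processing over progressively downsampled copies of the image. As a minimal sketch of that building block (not the authors' implementation; the binomial filter taps and level count here are illustrative assumptions), a NumPy-only pyramid construction might look like:

```python
import numpy as np

def gaussian_blur(img, kernel=(1, 4, 6, 4, 1)):
    # Separable 5-tap binomial filter, a common approximation of a Gaussian.
    k = np.array(kernel, dtype=float)
    k /= k.sum()
    out = np.apply_along_axis(lambda r: np.convolve(r, k, mode='same'), 0, img)
    out = np.apply_along_axis(lambda r: np.convolve(r, k, mode='same'), 1, out)
    return out

def gaussian_pyramid(img, levels=3):
    # Each level is a blurred, 2x-downsampled copy of the previous one;
    # coarse levels let the synthesizer match large structures cheaply
    # before refining detail at full resolution.
    pyr = [np.asarray(img, dtype=float)]
    for _ in range(levels - 1):
        blurred = gaussian_blur(pyr[-1])
        pyr.append(blurred[::2, ::2])
    return pyr

img = np.random.rand(64, 64)
pyr = gaussian_pyramid(img, levels=3)
print([p.shape for p in pyr])  # [(64, 64), (32, 32), (16, 16)]
```

In a coarse-to-fine synthesizer, patch matching would start at the smallest level and propagate matches upward, which is what makes the pyramid an acceleration rather than just a blur.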
INDEX TERMS
Digital painting, example-based painting, painting style, artistic filter, painting systems, simulation, image segmentation, Gaussian pyramid, texture synthesis, directional texture, nonphotorealistic rendering.
CITATION

B. Wang, W. Wang, J. Sun and H. Yang, "Efficient Example-Based Painting and Synthesis of 2D Directional Texture," in IEEE Transactions on Visualization & Computer Graphics, vol. 10, pp. 266-277, 2004.
doi:10.1109/TVCG.2004.1272726