<p><b>Abstract</b>—Complex repetitive scenes containing forests, foliage, grass, hair, or fur are challenging for common modeling and rendering tools. The amount of data, the tediousness of modeling and animation tasks, and the cost of realistic rendering have caused such scenes to see only limited use, even in high-end productions. We describe here how <it>volumetric textures</it> are well suited to such scenes. These primitives can greatly simplify modeling and animation tasks. More importantly, they can be rendered very efficiently using ray tracing, with few aliasing artifacts. The main idea, initially introduced by Kajiya and Kay [<ref rid="bibv00559" type="bib">9</ref>], is to represent a pattern of 3D geometry in a reference volume that is tiled over an underlying surface, much like a regular 2D texture. In our contribution, the mapping is independent of the mesh subdivision, the pattern can contain any kind of shape, and it is prefiltered at different scales, as in MIP-mapping. Although the model encoding is volumetric, the rendering method differs greatly from traditional volume rendering: a volumetric texture exists only in the neighborhood of a surface, and the repeated instances (called <it>texels</it>) of the reference volume are spatially deformed. Furthermore, each voxel of the reference volume contains a key feature which controls the <it>reflectance function</it> representing the aggregate intravoxel geometry. This allows ray tracing of highly complex scenes with very few aliasing artifacts, using a single ray per pixel (for the part of the scene using the volumetric texture representation). The major technical considerations of our method lie in the ray-path determination and in the specification of the reflectance function.</p>
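To make the tiling idea concrete, the following is a minimal sketch (not the paper's implementation) of the forward mapping from reference-volume coordinates to world space: a texel is modeled as the box swept from a bilinear base patch along interpolated corner normals, and voxels of the reference volume are looked up by nearest-neighbour indexing. The corner layout, `height` parameter, and function names are illustrative assumptions.

```python
import numpy as np

def bilerp(a, b, c, d, u, v):
    """Bilinear interpolation over four corner values (vectors or scalars)."""
    return (1 - u) * (1 - v) * a + u * (1 - v) * b + (1 - u) * v * c + u * v * d

def texel_to_world(corners, normals, height, u, v, w):
    """Map reference-volume coordinates (u, v, w) in [0,1]^3 to world space.

    The texel is the deformed box obtained by sweeping the bilinear patch
    through `corners` along the interpolated corner `normals` up to `height`.
    This is only a sketch of the spatial deformation the abstract describes;
    inverting it along a ray (the ray-path determination) is the hard part.
    """
    base = bilerp(corners[0], corners[1], corners[2], corners[3], u, v)
    n = bilerp(normals[0], normals[1], normals[2], normals[3], u, v)
    n = n / np.linalg.norm(n)
    return base + w * height * n

def sample_voxel(grid, u, v, w):
    """Nearest-neighbour lookup of a reference-volume voxel (e.g. density)."""
    res = grid.shape
    i = min(int(u * res[0]), res[0] - 1)
    j = min(int(v * res[1]), res[1] - 1)
    k = min(int(w * res[2]), res[2] - 1)
    return grid[i, j, k]
```

Because every texel shares the same reference volume, prefiltering that volume once at several resolutions (as in MIP-mapping) filters all its instances at once.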
<b>Index Terms</b>—Volumetric textures, complex geometry, levels of detail.

F. Neyret, "Modeling, Animating, and Rendering Complex Scenes Using Volumetric Textures," IEEE Transactions on Visualization and Computer Graphics, vol. 4, pp. 55-70, 1998.