Issue No. 01 - January (2012 vol. 18)
ISSN: 1077-2626
pp: 5-16
S. Guntury , Center for Visual Inf. Technol. (CVIT), Int. Inst. of Inf. Technol., Hyderabad, India
P. J. Narayanan , Center for Visual Inf. Technol. (CVIT), Int. Inst. of Inf. Technol., Hyderabad, India
Raytracing dynamic scenes at interactive rates has received much attention recently. We present several strategies for high-performance raytracing on a commodity GPU. Grid construction requires only sorting, which is fast on today's GPUs; the grid is therefore the acceleration structure of choice for dynamic scenes, where per-frame rebuilding is required. We advocate using an appropriate data structure for each stage of raytracing, resulting in multiple structure builds per frame. A perspective grid built for the camera achieves perfect coherence for primary rays. A perspective grid built with respect to each light source provides the best performance for shadow rays. Spherical grids handle light sources positioned inside the model space as well as spotlights. Uniform grids are best for reflection and refraction rays, which have little coherence. We propose an Enforced Coherence method that restores coherence among them by sorting the ray-to-voxel mapping; it gives the best performance on GPUs with only user-managed caches. We also propose a simple Independent Voxel Walk method, which performs best by exploiting the L1 and L2 caches of recent GPUs. We achieve over 10 fps of total rendering on the Conference model with one light source and one reflection bounce, while rebuilding the data structures for each stage. The ideas presented here are likely to deliver high performance on future GPUs as well as on other manycore architectures.
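The Enforced Coherence idea described in the abstract can be sketched as follows: incoherent rays (e.g. reflection rays) are sorted by the uniform-grid voxel they fall in, so that rays traversing the same voxel are processed together. This is a minimal illustrative sketch only; the function names, grid layout, and CPU-side sort are assumptions, not the authors' GPU implementation (which would use a parallel sort such as radix sort).

```python
# Sketch of Enforced Coherence: reorder rays so that rays mapped to
# the same uniform-grid voxel become adjacent in memory.
# All names are illustrative, not from the paper.

def voxel_id(point, grid_min, cell_size, resolution):
    """Map a 3D point to a linear voxel index in a uniform grid."""
    ix = [min(resolution[k] - 1,
              max(0, int((point[k] - grid_min[k]) / cell_size[k])))
          for k in range(3)]
    return (ix[2] * resolution[1] + ix[1]) * resolution[0] + ix[0]

def enforce_coherence(ray_origins, grid_min, cell_size, resolution):
    """Return ray indices reordered so rays in the same voxel are adjacent."""
    keyed = [(voxel_id(o, grid_min, cell_size, resolution), i)
             for i, o in enumerate(ray_origins)]
    keyed.sort()  # on the GPU this would be a parallel (radix) sort by voxel key
    return [i for _, i in keyed]

if __name__ == "__main__":
    # Rays 0 and 2 start in voxel (0,0,0); rays 1 and 3 in voxel (3,0,0).
    rays = [(0.1, 0.1, 0.1), (3.5, 0.2, 0.3), (0.2, 0.1, 0.4), (3.6, 0.1, 0.2)]
    order = enforce_coherence(rays, grid_min=(0, 0, 0),
                              cell_size=(1, 1, 1), resolution=(4, 4, 4))
    print(order)  # -> [0, 2, 1, 3]
```

After the sort, consecutive GPU threads handle rays in the same voxel and thus fetch the same triangles, which is why this pays off on hardware with only user-managed caches.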
Graphics processing unit, Instruction sets, Coherence, Data structures, Light sources

S. Guntury and P. J. Narayanan, "Raytracing Dynamic Scenes on the GPU Using Grids," in IEEE Transactions on Visualization and Computer Graphics, vol. 18, no. 1, pp. 5-16, Jan. 2012.