Issue No. 03 - March 2011 (vol. 17)

pp: 357-367

Jianmin Zheng , Nanyang Technological University, Singapore

Juyong Zhang , Nanyang Technological University, Singapore

DOI Bookmark: http://doi.ieeecomputersociety.org/10.1109/TVCG.2010.57

ABSTRACT

This paper considers the problem of interactively finding the cutting contour to extract components from an existing mesh. First, we propose a constrained random walks algorithm that adds constraints to the random walks procedure and thus allows a variety of intuitive user inputs. Second, we design an optimization process that uses the shortest graph path to derive a good cutting contour. A new mesh cutting algorithm is then developed based on the constrained random walks combined with the optimization process. Within the same computational framework, the new algorithm provides a novel user interface for interactive mesh cutting that supports three typical user inputs as well as their combinations: 1) foreground/background seed inputs: the user draws strokes specifying seeds for the “foreground” (i.e., the part to be cut out) and the “background” (i.e., the rest); 2) soft constraint inputs: the user draws strokes on the mesh indicating regions near which the cut should be made; and 3) hard constraint inputs: the user places marks through which the cutting contour must pass. The algorithm uses feature sensitive metrics based on surface geometric properties and cognitive theory. The integration of the constrained random walks algorithm, the optimization process, the feature sensitive metrics, and the variety of user inputs makes the algorithm intuitive, flexible, and effective. The experimental examples show that the proposed cutting method is fast, reliable, and capable of producing good results that reflect user intention and geometric attributes.
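To give a flavor of the seed-based input mode described above, the following toy sketch shows the standard random-walker idea on which such methods build: each unlabeled node is assigned the probability that a random walk starting there reaches a foreground seed before a background seed, obtained by solving a linear system in the graph Laplacian. The graph, edge weights, and 0.5 threshold here are illustrative assumptions, not the paper's actual feature sensitive metrics or constraint machinery.

```python
import numpy as np

# Toy graph of 6 mesh elements. Edge weights stand in for a
# feature sensitive metric: the weak 0.1 edge between nodes 2
# and 3 marks where a cut is cheap (all values are made up).
edges = {
    (0, 1): 1.0, (1, 2): 1.0, (0, 2): 1.0,
    (2, 3): 0.1,                       # weak edge: likely cut location
    (3, 4): 1.0, (4, 5): 1.0, (3, 5): 1.0,
}
n = 6
W = np.zeros((n, n))
for (i, j), w in edges.items():
    W[i, j] = W[j, i] = w
L = np.diag(W.sum(axis=1)) - W         # combinatorial graph Laplacian

# User strokes: node 0 is a foreground seed, node 5 a background seed.
seeds = {0: 1.0, 5: 0.0}
free = [i for i in range(n) if i not in seeds]

# Solve L_ff * p_f = -L_fs * p_s for the unseeded probabilities.
b = -L[np.ix_(free, list(seeds))] @ np.array(list(seeds.values()))
p = np.zeros(n)
for i, v in seeds.items():
    p[i] = v
p[free] = np.linalg.solve(L[np.ix_(free, free)], b)

labels = (p >= 0.5).astype(int)        # 1 = foreground, 0 = background
print(p.round(3), labels)
```

Because the Laplacian system is sparse and positive definite on the free nodes, real implementations solve it with a sparse Cholesky or conjugate-gradient solver; the cut then falls across the low-weight edge (between nodes 2 and 3 in this toy example).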

INDEX TERMS

Computational geometry and object modeling, interaction techniques, geometric algorithms.

CITATION

Jianmin Zheng, Juyong Zhang, "Interactive Mesh Cutting Using Constrained Random Walks", *IEEE Transactions on Visualization & Computer Graphics*, vol. 17, no. 3, pp. 357-367, March 2011, doi:10.1109/TVCG.2010.57