Mesh Decomposition With Cross-Boundary Brushes

Youyi Zheng and Chiew-Lan Tai

Computer Graphics Forum (Proc. of Eurographics 2010)

The Hong Kong University of Science and Technology

 

Results of applying our method to Princeton Benchmark models. Most cuts are executed in single-stroke mode. Note that the cut boundaries follow the geometric features.

Abstract
We present a new intuitive UI, which we call cross-boundary brushes, for interactive mesh decomposition. The user roughly draws one or more strokes across a desired cut, and our system automatically returns a best cut running through all the strokes. Motivated by the different natures of part components (i.e., semantic parts) and patch components (i.e., flatter surface patches) in general models, we design two corresponding brushes: the part-brush and the patch-brush. The two brushes share a common user interface, enabling easy switching between them. The part-brush executes a cut along an isoline of a harmonic field driven by the user-specified strokes. We show that the inherent smoothness of the harmonic field, together with a carefully designed isoline selection scheme, leads to segmentation results that are insensitive to noise, pose, tessellation, and variation in the user's strokes. Our patch-brush uses a novel facet-based surface metric that alleviates the sensitivity to noise and fine details common in region-growing algorithms. Extensive experimental results demonstrate that our cutting tools can produce user-desired segmentations for a wide variety of models, often with a single stroke. We also show that our tools outperform state-of-the-art interactive segmentation tools in terms of ease of use and segmentation quality.
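To give a flavor of the part-brush computation, the following is a minimal sketch of a stroke-driven harmonic field and isoline-based cut selection. It is not the paper's implementation: it uses a uniform graph Laplacian instead of the cotangent weights typically used on triangle meshes, a fixed isovalue of 0.5 instead of the paper's isoline selection scheme, and all function names (`harmonic_field`, `cut_edges`) are ours.

```python
import numpy as np
from scipy.sparse import lil_matrix, csr_matrix
from scipy.sparse.linalg import spsolve

def harmonic_field(n_vertices, edges, src, snk):
    """Solve a discrete harmonic field L f = 0 with Dirichlet
    constraints f = 0 on stroke vertices `src` and f = 1 on `snk`.
    Uses a uniform (combinatorial) graph Laplacian as a stand-in
    for the cotangent Laplacian of a real mesh."""
    L = lil_matrix((n_vertices, n_vertices))
    for i, j in edges:
        L[i, i] += 1.0; L[j, j] += 1.0
        L[i, j] -= 1.0; L[j, i] -= 1.0
    b = np.zeros(n_vertices)
    # Replace each constrained row with the identity row f[v] = b[v].
    for v in set(src) | set(snk):
        L.rows[v] = [v]
        L.data[v] = [1.0]
    for v in snk:
        b[v] = 1.0
    return spsolve(csr_matrix(L), b)

def cut_edges(edges, f, iso=0.5):
    """Edges crossed by the isoline f = iso; the cut boundary
    would be traced through these edges on an actual mesh."""
    return [(i, j) for i, j in edges
            if (f[i] - iso) * (f[j] - iso) < 0.0]
```

On a toy path graph 0–1–2–3 with the field pinned to 0 at vertex 0 and 1 at vertex 3, the harmonic field is linear (0, 1/3, 2/3, 1) and the 0.5-isoline crosses the middle edge (1, 2), which is where the cut would pass.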
Keywords
Mesh Decomposition, User Interface
Paper
Video
segmentation.wmv
Bibtex

@inproceedings{Zheng10,
author = {Youyi Zheng and Chiew-Lan Tai},
title = {Mesh Decomposition with Cross-Boundary Brushes},
booktitle = {Computer Graphics Forum (Proc. of Eurographics 2010)},
year = {2010},
volume = {29},
number = {2},
pages = {to appear}
}

Acknowledgement
We would like to thank the anonymous reviewers for their valuable comments. We also thank Hongbo Fu and Oscar Kin-Chung Au for insightful discussions, Ligang Liu for the code of Easy Mesh Cutting, and Pedro Sander for the video narration. This work was supported by the Hong Kong Research Grants Council (Project No. GRF619908).