Dot Scissor: A Single-Click Interface for Mesh Segmentation

Youyi Zheng1         Chiew-Lan Tai1         Oscar Kin-Chung Au2

IEEE Transactions on Visualization and Computer Graphics

1Hong Kong University of Science and Technology

2City University of Hong Kong


Interface of our dot scissor. The user cuts out a component by moving the dot circle and clicking at a location where a cut boundary is desired. Our system automatically returns the best cut boundary that respects the local geometric features.

Abstract
This paper presents an easy-to-use interactive tool, which we call dot scissor, for mesh segmentation. The user's effort is reduced to placing only a single click where a cut is desired. Such a simple interface is made possible by a directional search strategy supported by a concavity-aware harmonic field and a robust voting scheme that selects the best isoline as the cut. With a concavity-aware weighting scheme, the harmonic fields gather dense isolines along concave regions, which are the natural boundaries of semantic components. The voting scheme relies on an isoline-face scoring mechanism that considers both shape geometry and user intent. We show through extensive experiments and quantitative analysis that our tool advances the state of the art in both simplicity of use and segmentation quality.
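To give a concrete sense of the computation described above, below is a minimal sketch, in Python with NumPy/SciPy, of a concavity-aware harmonic field on a triangle mesh: edges across concave (reflex) dihedral angles receive small weights, so isolines of the resulting field concentrate in concave regions. The function names (edge_weights, harmonic_field), the penalty formulation of the Dirichlet constraints, and the parameter beta are illustrative assumptions, not the paper's exact formulation.

import numpy as np
from scipy.sparse import lil_matrix, csr_matrix
from scipy.sparse.linalg import spsolve

def edge_weights(vertices, faces, beta=0.1):
    """Assign each edge a concavity-aware weight: edges lying across concave
    (reflex) dihedral angles get small weights, so isolines of the harmonic
    field gather there.  This is a hypothetical weighting, not the paper's
    exact formula.  vertices: (n,3) float array, faces: (m,3) int array."""
    # map each undirected edge to the faces sharing it
    edge_faces = {}
    for fi, f in enumerate(faces):
        for a, b in ((f[0], f[1]), (f[1], f[2]), (f[2], f[0])):
            e = (min(a, b), max(a, b))
            edge_faces.setdefault(e, []).append(fi)
    # per-face unit normals and centroids
    normals = np.cross(vertices[faces[:, 1]] - vertices[faces[:, 0]],
                       vertices[faces[:, 2]] - vertices[faces[:, 0]])
    normals /= np.linalg.norm(normals, axis=1, keepdims=True)
    centroids = vertices[faces].mean(axis=1)
    weights = {}
    for e, fs in edge_faces.items():
        w = 1.0
        if len(fs) == 2:
            f0, f1 = fs
            cos_a = np.clip(np.dot(normals[f0], normals[f1]), -1.0, 1.0)
            # treat the edge as concave if the neighboring centroid lies on
            # the positive side of the first face (an approximation)
            if np.dot(centroids[f1] - centroids[f0], normals[f0]) > 0:
                w = beta * (1.0 - cos_a) + 1e-6  # small weight across concavities
        weights[e] = w
    return weights

def harmonic_field(vertices, faces, src, sink, beta=0.1):
    """Solve L x = 0 with x[src] = 1 and x[sink] = 0, where L is the
    concavity-aware weighted graph Laplacian; constraints are imposed
    with a quadratic penalty for simplicity."""
    n = len(vertices)
    L = lil_matrix((n, n))
    for (i, j), w in edge_weights(vertices, faces, beta).items():
        L[i, j] -= w
        L[j, i] -= w
        L[i, i] += w
        L[j, j] += w
    b = np.zeros(n)
    big = 1e8  # penalty weight for the Dirichlet constraints
    L[src, src] += big
    b[src] = big * 1.0
    L[sink, sink] += big
    b[sink] = 0.0
    return spsolve(csr_matrix(L), b)

In the actual system, the two constrained vertices would be determined from the clicked location and the directional search, and the best isoline of the resulting field would then be chosen by the voting scheme rather than by hand.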
Keywords
Interactive Mesh Segmentation, Dot Scissor, Concavity-aware, Harmonic Fields, Voting.
Paper
Video


video (18M)
Bibtex

@article{ZTA11,
  author  = {Youyi Zheng and Chiew-Lan Tai and Oscar Kin-Chung Au},
  title   = {Dot Scissor: A Single-Click Interface for Mesh Segmentation},
  journal = {IEEE Transactions on Visualization and Computer Graphics},
  year    = {2011},
  pages   = {To appear}
}

Acknowledgements
We thank the anonymous reviewers for their valuable comments, Pengfei Xu for his help with the benchmark evaluation, Thomas Li for the video narration, and Jackson Yuen for inspiring the single-click interface during a discussion. This work was supported in part by the Hong Kong Research Grants Council (Project Nos. GRF619908 and GRF619611) and grants from the City University of Hong Kong (Project No. 7200148).