Perceptually Guided Simplification of Lit, Textured Meshes
We present a new algorithm for best-effort simplification of polygonal meshes based on principles of visual perception. Building on previous work, we use a simple model of low-level human vision to estimate the perceptibility of local simplification operations in a view-dependent Multi-Triangulation structure. Our algorithm improves on prior perceptual simplification approaches by accounting for textured models and dynamic lighting effects. We also model more accurately the scale of visual changes resulting from simplification, using parametric texture deviation to bound the size (represented as spatial frequency) of features destroyed, created, or altered by simplifying the mesh. The resulting algorithm displays many desirable properties: it is view-dependent, sensitive to silhouettes, sensitive to underlying texture content, and sensitive to illumination (for example, preserving detail near highlight and shadow boundaries, while aggressively simplifying washed-out regions). Using a unified perceptual model to evaluate these effects automatically accounts for their relative importance and balances among them, eliminating the need for ad hoc or hand-tuned heuristics.

CR Categories: I.3.5 [Computer Graphics]: Computational Geometry and Object Modeling - Geometric algorithms, languages, and systems
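The core perceptibility test the abstract describes can be illustrated with a minimal sketch. The code below is not the authors' implementation; it assumes a standard contrast sensitivity function (the Mannos-Sakrison CSF, a common choice in perceptual graphics) and a hypothetical `is_imperceptible` helper: a simplification operation that alters a feature of a given angular size induces some contrast change, and the operation is deemed imperceptible when that contrast falls below the threshold 1/CSF at the feature's spatial frequency.

```python
import math

def csf(f):
    """Mannos-Sakrison contrast sensitivity function.

    f is spatial frequency in cycles per degree of visual angle;
    the returned sensitivity peaks (near 1.0) around 8 cycles/degree
    and falls off at lower and higher frequencies.
    """
    return 2.6 * (0.0192 + 0.114 * f) * math.exp(-((0.114 * f) ** 1.1))

def is_imperceptible(contrast, feature_size_deg):
    """Hypothetical perceptibility test for one simplification operation.

    contrast: contrast change induced by the operation (0..1 scale).
    feature_size_deg: angular size of the affected feature, in degrees.
    A feature of angular size s corresponds roughly to spatial
    frequency 1/(2*s) cycles per degree; the operation is allowed
    when its contrast is below the threshold 1/CSF at that frequency.
    """
    f = 1.0 / (2.0 * feature_size_deg)
    sensitivity = csf(f)
    threshold = 1.0 / sensitivity if sensitivity > 0 else float("inf")
    return contrast < threshold
```

In this framing, silhouette, texture, and lighting effects need no separate heuristics: each simply changes the contrast (or the feature size) fed into the same test, which is the unification the abstract claims. A washed-out region induces low contrast and simplifies aggressively; a feature crossing a highlight or shadow boundary induces high contrast and is preserved.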
All rights reserved (no additional license for public reuse)
Luebke, David, Jonathan Cohen, Nathaniel Williams, Mike Kelley, and Brenden Schubert. "Perceptually Guided Simplification of Lit, Textured Meshes." University of Virginia Dept. of Computer Science Tech Report (2002).
University of Virginia, Department of Computer Science