CS Technical Report #378, September 16, 2002
Eidgenössische Technische Hochschule Zürich / Swiss Federal Institute of Technology Zurich





Multiresolution Modeling of Point-Sampled Geometry

Mark Pauly, Leif Kobbelt*, Markus Gross

Computer Science Department, ETH Zurich, Switzerland

e-mail: {pauly, grossm}@inf.ethz.ch
http://graphics.ethz.ch/

*Computer Science Department, RWTH Aachen, Germany

e-mail: [email protected]
http://www.informatik.rwth-aachen.de/I8/



Multiresolution Modeling of Point-Sampled Geometry

Mark Pauly (ETH Zürich), Markus Gross (ETH Zürich), Leif P. Kobbelt (RWTH Aachen)

Abstract. We present a framework and algorithms for multiresolution modeling of point-sampled geometry. Most of our techniques extend the key ingredients of recent multiresolution editing concepts, including geometry smoothing, simplification, and offset computation, to point clouds. Using these building blocks, we designed a powerful and versatile hierarchical representation for point-based modeling. An intuitive editing metaphor along with dynamic resampling techniques facilitates modeling operations at different scales and enables us to effectively handle very large model deformations. In our approach all processing is performed directly on point-sampled geometry without the need for intermediate local triangulation or meshing.

Keywords: Multiresolution modeling, point-sampled geometry, deformation, dynamic resampling.

1 INTRODUCTION

In spite of a long tradition, it is only in recent years that points have received growing attention in computer graphics. Various researchers proposed the point sample as a simple and versatile graphics primitive complementing conventional geometric elements, such as triangles or splines. Most recent efforts have been devoted to the acquisition, processing [Pauly and Gross 2001], and rendering [Rusinkiewicz and Levoy 2000], [Pfister et al. 2000], [Zwicker et al. 2001] of point-sampled models. Earlier work [Szeliski and Tonnesen 1992], [Witkin and Heckbert 1994] on physics-based frameworks proposed particles to represent and edit surfaces. While particle modeling provides an expressive and intuitive approach to surface design, it is prohibitively expensive for the interactive creation and efficient modeling of large-scale point-sampled geometry. In our work, we complement prior research on point-based methods by introducing a framework for multiresolution modeling of point-sampled geometry. To this end, we transfer key components of the well-established multiresolution decomposition and editing framework from classical manifold-based geometry representations (e.g. splines or polygon meshes) to unstructured point clouds. We adapt basic techniques such as smoothing, normal displacement, or decimation to the point-based setting. Furthermore, we present a powerful and intuitive metaphor for interactive modeling operations and we introduce a resampling method to dynamically adjust the model's sampling rate.

The concept of multiresolution surface editing was successfully applied to triangle meshes in [Zorin et al. 1997]. While this approach is confined to meshes with subdivision connectivity, [Kobbelt et al. 1998] presented a more general framework for arbitrary meshes using variational fairing. More recently, [Guskov et al. 1999] proposed a scheme targeted at mesh signal processing, but also achieved remarkable results in multiresolution surface modifications. In the context of surface editing, multiresolution representations serve two main purposes: complexity reduction through the use of a model hierarchy, and powerful editing semantics at different geometric frequency bands. We found that, independent of the underlying geometry representation, each multiresolution editing framework has to provide a set of kernel functions: basic building blocks for higher-level editing operations. These core components can be summarized as follows:

• Refinement and simplification operators to construct the hierarchy levels of the model

• Lowpass filtering to compute smooth base domains

• Offset computation to encode geometric details for decomposition and reconstruction

• Editing metaphors to interactively manipulate the model surface

• Dynamic resampling to adjust the discretization of the model during editing

For polygonal meshes the up- and downsampling operations are applied to switch between different resolution levels, where "resolution" refers to both vertex density and amount of geometric detail. In the case of point clouds both concepts of resolution have to be treated separately, since a point-sampled surface representing low geometric detail still has to provide a sufficiently high sampling density. Hence, we use hierarchies of differently dense point clouds only to increase the efficiency of filtering operations. Otherwise, the sample density is kept constant on all resolution levels. For simplification we extend quadric error metrics [Garland and Heckbert 1997] to the point cloud setting.

Low-pass filtering is employed to remove high geometric frequencies from a given surface for the robust computation of the base domains. Most smoothing techniques for polygonal meshes use recursive Gaussian filtering by iteratively updating vertex positions. Discretization of the Laplacian is usually accomplished by weighted averaging of adjacent vertices [Taubin 1995], [Desbrun et al. 1999]. We extend this concept to point clouds by replacing the vertex neighborhood relation with the $k$-nearest neighbors of a sample. To accelerate the filtering, we designed a v-cycle multilevel smoother [Kobbelt et al. 1998] that alternates the relaxation with sub- and upsampling operations. Additional local volume constraints [Hubeli and Gross 2001] are used to locally bound the curvature of the base domain.

Offset computation is used to find an explicit representation of the difference between a surface $P$ and its low-pass filtered version $Q$. We recall that in conventional multiresolution settings the detail $D = P - Q$ is typically stored with respect to local frames spanned by the normals and tangents of $Q$ [Forsey and Bartels 1988]. Since $Q$ contains less geometric detail, a smaller number of basis functions, and hence samples, can be used to represent it. Thus, when editing the surface on a lower resolution level $Q'$, the final result $P'$ is reconstructed by first upsampling $Q'$ and then adding the detail $D$ back with respect to the modified local frames of $Q'$. By contrast, our framework keeps the number of samples intact through the hierarchy levels. Again, the low-frequency version $Q$ is derived by applying the smoothing filter. For offset computation, we modify the sample positions in $Q$ (not the sampling rate!) such that each sample in $P$ can be obtained by normal displacement of the corresponding sample in the newly created point set $Q'$.




The modification of the point cloud is accomplished by an interactive editing metaphor which is similar in spirit to free-form deformation. The user paints both an active region and a handle onto the point-sampled surface. Then we generate a continuous tensor field assigning transformations, such as rotations and translations, to each point in space. The samples are subsequently transformed by evaluating the tensor field at the corresponding locations in space. We use this metaphor because it does not assume the underlying geometry to be a manifold surface.

It is important to distinguish the simplification and refinement operators of the multiresolution framework from dynamic adaptations of the model's sampling rate, as they become necessary for excessive local stretches or squeezes during interactive editing. It is useful to compare such resampling with a remeshing operation on a heavily deformed polygon mesh [Kobbelt et al. 2000]. In order to guarantee sufficient local sampling density we estimate local stretches dynamically during editing. Once the sampling density becomes too high or too low, we adjust the sampling rate of the representation. Sample positions are equalized by efficient local relaxations, which guarantees sufficient sampling rates for large model deformations. In addition, we have to preserve and interpolate the detail information $D$ to eventually guarantee proper reconstruction.

The following sections elaborate on the technical details of our framework. Section 3 addresses the core multiresolution representation we use, including smoothing, simplification and detail encoding. Section 4 is devoted to the editing metaphor and Section 5 discusses the dynamic resampling. In Section 6, we present a series of modeling operations to demonstrate the key features of our method.

2 POINT-BASED MODELS

This section briefly summarizes some prerequisites which are fundamental to our framework. Our input data consists of an unstructured point cloud $P = \{p_i \mid 1 \le i \le N\}$ that describes some underlying manifold surface $S$. Each sample $p \in P$ stores a geometric position $p = (x, y, z)$ as well as a set of scalar attributes, such as color or texture coordinates $c = (u, v)$.

2.1 Neighborhood

Contrary to polygonal representations, we base all required local computations on spatial proximity between samples instead of geodesic proximity between mesh vertices. This approach is justified by the fact that both distance measures are similar if the underlying smooth surface is sampled sufficiently densely [Amenta et al. 1998]. In [Floater and Reimers 2001] Floater investigated different approaches for defining local neighborhoods for point clouds. We found that for our purposes the set of $k$-nearest neighbors of a sample point $p$, denoted by the index set $N$, is most suitable. We define the radius $r$ of the enclosing sphere of the $k$-nearest neighbors of $p$ as $r = \max_{i \in N} \| p_i - p \|$.
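For illustration, this neighborhood query can be realized with a kd-tree, as in the minimal sketch below. The use of SciPy's cKDTree and the function name are illustrative choices and not part of the original system.

```python
import numpy as np
from scipy.spatial import cKDTree

def knn_neighborhoods(points, k=10):
    """For each sample p return the index set N of its k nearest neighbors
    and the radius r = max_{i in N} ||p_i - p|| of the enclosing sphere."""
    tree = cKDTree(points)
    # query k+1 neighbors because the closest "neighbor" is the point itself
    dist, idx = tree.query(points, k=k + 1)
    neighbors = idx[:, 1:]          # drop the point itself
    radii = dist[:, -1]             # distance to the farthest of the k neighbors
    return neighbors, radii

# usage: points is an (N, 3) array of sample positions
# neighbors, radii = knn_neighborhoods(points, k=10)
```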

2.2 Normal Estimation

To evaluate the surface normal we need some method for estimating the tangent plane at a point $p$. For this purpose, we use a weighted least-squares optimization. The goal is to find the plane

$$T: \; p_i \cdot n - d = 0 \qquad (1)$$

that minimizes the sum of weighted squared distances

$$\sum_{i \in N} (p_i \cdot n - d)^2 \varphi_i \qquad (2)$$

subject to $\| n \| = 1$. The weights $\varphi_i$ are based on Euclidean distance, i.e. $\varphi_i = \varphi(\| p_i - p \|)$ with $\varphi$ being a smooth, positive and monotonically decreasing function. We use the Gaussian

$$\varphi(x) = e^{-x^2 / (\sigma^2 r^2)}, \qquad (3)$$

where $\sigma$ is an additional parameter controlling the locality of the approximation. Note that the weight function depends on the radius $r$ of the neighborhood sphere and thus dynamically adapts to the local sampling density.

Using the method of Lagrange multipliers [Sagan 1969] we solve the constrained least-squares problem (2) by reducing it to the eigenvector problem

$$C \cdot v = \lambda \cdot v, \qquad (4)$$

where $C$ is the $3 \times 3$ covariance matrix

$$C = \begin{bmatrix} p_{i_1} - \bar{p} \\ \vdots \\ p_{i_k} - \bar{p} \end{bmatrix}^{T} \cdot \begin{bmatrix} p_{i_1} - \bar{p} \\ \vdots \\ p_{i_k} - \bar{p} \end{bmatrix}, \quad i_j \in N,$$

and $\bar{p}$ the weighted average of the $p_i$. The normal vector $n$ is then equal to the eigenvector $v_i$ with smallest associated eigenvalue $\lambda_i$.
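A compact way to realize this estimate is to form the neighborhood covariance and take the eigenvector of its smallest eigenvalue, as in the sketch below. Applying the Gaussian weights of Equation (3) to both the average and the covariance terms is an assumption of this sketch; it is not the authors' code.

```python
import numpy as np

def estimate_normal(p, neighbor_points, r, sigma=1.0):
    """Estimate the surface normal at p from its k nearest neighbors.

    p:               (3,) sample position
    neighbor_points: (k, 3) positions of the k nearest neighbors
    r:               radius of the enclosing neighborhood sphere
    sigma:           locality parameter of the Gaussian weight (Equation 3)
    """
    d = np.linalg.norm(neighbor_points - p, axis=1)
    phi = np.exp(-d**2 / (sigma**2 * r**2))                     # Gaussian weights
    p_bar = (phi[:, None] * neighbor_points).sum(0) / phi.sum() # weighted average
    diff = neighbor_points - p_bar
    C = (phi[:, None] * diff).T @ diff                          # weighted 3x3 covariance
    eigval, eigvec = np.linalg.eigh(C)                          # eigenvalues ascending
    return eigvec[:, 0]                                         # smallest-eigenvalue eigenvector
```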

3 MULTIRESOLUTION REPRESENTATION

In this section we introduce multiresolution point clouds as a new geometry representation for two-manifold surfaces. Each sample $p$ is represented by a point $p^0$ plus a sequence of normal displacement offsets $d^0, \dots, d^{n-1}$. To reconstruct the position of $p$ at a certain level $k$ we recursively displace $p^0$ in normal direction, i.e.

$$p^k = p^0 + d^0 n^0 + d^1 n^1 + \dots + d^{k-1} n^{k-1}, \qquad (5)$$

where $n^j$ denotes the surface normal estimate at $p^j$ on level $j$, which is based on $p^j$ and its neighboring samples from the same resolution level. Hence, every sample point $p$ is active on all levels of resolution and the polygon $(p^0, \dots, p^n)$ defines its normal trajectory, since each polygon edge $p^j p^{j+1}$ is aligned to the normal vector $n^j$.

At each level $k$ the point set $S^k$ defines the model surface at the corresponding scale, with level $0$ being the smoothest approximation. Note again that resolution refers to the amount of geometric detail, not to the sampling rate. This also explains why we do not subsample the lower resolution levels. To unambiguously reconstruct the normal trajectory each intermediate point $p^j$ is needed. However, starting from $p^0$ all $p^j$ can be represented by one scalar coordinate $d^j$. This amounts to only moderate storage overhead for typical decomposition levels of up to 5 in practice.
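The sketch below illustrates how such a sample could be stored and how Equation (5) reconstructs its position, assuming the per-level normals $n^j$ are supplied by a normal estimation routine applied to the level-$j$ point set. The class name and the callback interface are illustrative, not the paper's data structure.

```python
import numpy as np

class MultiresSample:
    """One sample of a multiresolution point cloud: a base point p^0 on the
    smoothest level S^0 plus scalar normal displacements d^0, ..., d^{n-1}."""

    def __init__(self, p0, offsets):
        self.p0 = np.asarray(p0, dtype=float)   # position on level 0
        self.offsets = list(offsets)            # scalar coordinates d^j

    def reconstruct(self, k, normal_at_level):
        """Reconstruct p^k via Equation (5): p^k = p^0 + sum_j d^j n^j.

        normal_at_level(p, j) must return the unit normal estimated from the
        level-j point set at position p (e.g. via the covariance method above)."""
        p = self.p0.copy()
        for j in range(k):
            n = normal_at_level(p, j)
            p = p + self.offsets[j] * n
        return p
```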

To eventually create this multilevel hierarchy, we require the following kernel functions: A smoothing operator (low-pass filter) is needed to compute the different levels of geometric detail (Section 3.1). A fast point cloud simplification method (Section 3.2) reduces the number of samples for efficient v-cycle computation (Section 3.3). A decomposition algorithm encodes the detailed surface with respect to its smooth approximant.

[Inline figure: the normal-displacement hierarchy. A base point $p^0$ on level $S^0$ (with normal $n^0$ and offset $d^0$) is displaced level by level along the estimated normals, passing through level $S^{k-1}$ to reach $p^k$ on level $S^k$.]



By applying low-pass filtering and decomposition recursively, we extend this two-band decomposition to a multilevel scheme.

3.1 Smoothing Operators

One way to describe surface smoothing is by the diffusion equation [Desbrun et al. 1999]

$$\frac{\partial S}{\partial t} = \lambda \Delta S, \qquad (6)$$

where $\Delta$ denotes the Laplace-Beltrami operator on the surface. Using explicit Euler integration, this leads to the common iterative formula for Gaussian smoothing

$$p' = p + \lambda \Delta p, \qquad (7)$$

where $\Delta p$ is some discrete approximation of the Laplacian at a point $p$. A standard method for computing such a discretization is the use of umbrella operators, which can be transferred to point sets by

$$\Delta p = \frac{1}{\Omega} \sum_{i \in N} \omega_i (p_i - p), \qquad \Omega = \sum_{i \in N} \omega_i, \qquad (8)$$

where the $\omega_i$ define a set of weights. The uniform umbrella is obtained for $\omega_i = 1 \; \forall i$, while a scale-dependent umbrella operator results from $\omega_i = 1 / \| p_i - p \|$. The degree of smoothness can be controlled by the scale parameter $\lambda$, the number of iterations and the size of the local neighborhoods of the sample points. These neighborhoods are computed at the beginning of the smoothing operation and cached during the iteration. This improves the efficiency of the computations and also increases the stability of the smoothing filter, since it prevents clustering effects due to the tangential drift of sample points.
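To make Equations (7) and (8) concrete, the following sketch iterates the umbrella operator over cached k-nearest neighborhoods. It is a minimal sketch of the stated scheme, not the authors' implementation; the parameter values in the usage line are placeholders.

```python
import numpy as np

def umbrella_smooth(points, neighbors, lam=0.5, iterations=10, scale_dependent=False):
    """Iterative Gaussian smoothing of a point cloud, Equations (7) and (8).

    points:    (N, 3) sample positions
    neighbors: (N, k) cached index sets from the k-nearest-neighbor query
    """
    p = points.copy()
    for _ in range(iterations):
        nbr = p[neighbors]                                          # (N, k, 3)
        if scale_dependent:
            w = 1.0 / np.linalg.norm(nbr - p[:, None], axis=2)      # omega_i = 1/||p_i - p||
        else:
            w = np.ones(neighbors.shape)                            # uniform umbrella
        omega = w.sum(axis=1, keepdims=True)
        laplacian = (w[..., None] * (nbr - p[:, None])).sum(axis=1) / omega  # Delta p
        p = p + lam * laplacian                                     # p' = p + lambda * Delta p
    return p

# usage (with neighborhoods from knn_neighborhoods above):
# smoothed = umbrella_smooth(points, neighbors, lam=0.5, iterations=20)
```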

Volumetric constraints. A common problem of iterative smoothing operators is shrinkage. Taubin addresses this problem by alternating between smoothing steps with positive and negative scale parameters [Taubin 1995]. Desbrun et al. [Desbrun et al. 1999] measure the total object volume before and after the smoothing operation and rescale the model uniformly. Shrinkage is a non-uniform problem, however, and should thus be handled by a locally adaptive technique. Therefore we approximate the local change in volume in the neighborhood of a point $p$ caused by a smoothing step as the cylindrical volume

$$\Delta V = \pi r^2 \cdot \| p' - p \| \qquad (9)$$

and compensate the shrinkage by displacing the neighbors of $p$ by $(p - p')/k$ (see Figure 1). This approach is similar to the local volume preservation strategy introduced by Hubeli and Gross [Hubeli and Gross 2001] for triangle meshes. They observed, however, that applying the local anti-shrinkage technique decreases the impact of the smoothing operator.

3.2 Point Set Simplification

To apply efficient multilevel smoothing algorithms to point-sampled geometry, we need operators for sub- and supersampling. A subsampling scheme coarsifies the representation and we derive the corresponding operator by adapting a mesh decimation scheme. Supersampling will then be achieved by inverting this operator.

Mesh simplification algorithms are usually based on an atomic decimation operator that decreases the number of vertices in the mesh by one. The most common operator is the edge collapse or vertex contraction, where two vertices $v_1$ and $v_2$ are replaced by a new one $v$. This new vertex inherits the neighborhood relation from both its ancestors. In [Garland and Heckbert 1997] the decimation of triangle meshes is controlled by a quadric error metric. The idea is to approximate the surface locally by a set of tangent planes and to estimate the geometric deviation of a vertex $v$ from the surface by the sum of the squared distances to these planes. The error quadrics for each mesh vertex $v$ are initialized with a set of planes defined by the triangles around that vertex and can be represented by a symmetric $4 \times 4$ matrix $Q_v$. The quality of the collapse $(v_1, v_2) \to v$ is then rated according to the minimum of the error functional $Q_v = Q_{v_1} + Q_{v_2}$. In order to adapt this technique to the decimation of unstructured point clouds we have to use the $k$-nearest neighbor relation, since manifold surface connectivity is not available. To initialize the error quadrics for every point sample $p$, we estimate a tangent plane $E_i$ for every neighbor $p_i$ in the $k$-neighborhood of $p$. This tangent plane is spanned by the vectors $e_i = p_i - p$ and $b_i = e_i \times n$, where $n$ is the estimated normal vector at $p$.

After this initialization the point cloud decimation works exactly like mesh decimation, with the point $p$ inheriting the neighborhoods of its ancestors $p_1$ and $p_2$ and being assigned the error functional $Q_p = Q_{p_1} + Q_{p_2}$. Figure 2 shows the effectiveness of the algorithm.
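A possible reading of this quadric initialization for one point sample is sketched below, using the standard homogeneous plane-quadric form of [Garland and Heckbert 1997]. The helper names and the degenerate-neighbor handling are illustrative assumptions, not the authors' code.

```python
import numpy as np

def point_quadric(p, neighbor_points, n):
    """Initialize the 4x4 error quadric Q_p for a point sample p.

    For every neighbor p_i, the tangent plane E_i spanned by e_i = p_i - p and
    b_i = e_i x n (taken through p) contributes the quadric of that plane."""
    Q = np.zeros((4, 4))
    for p_i in neighbor_points:
        e_i = p_i - p
        b_i = np.cross(e_i, n)
        m = np.cross(e_i, b_i)                # normal of the plane E_i
        norm = np.linalg.norm(m)
        if norm < 1e-12:
            continue                          # degenerate neighbor, skip
        m = m / norm
        q = np.append(m, -np.dot(m, p))       # homogeneous plane (a, b, c, d)
        Q += np.outer(q, q)                   # squared distance to E_i in quadric form
    return Q

def quadric_error(Q, x):
    """Evaluate the quadric error x^T Q x for a candidate position x."""
    xh = np.append(x, 1.0)
    return xh @ Q @ xh

# contraction of two samples p1, p2 (with quadrics Q1, Q2) into p:
# Q_p = Q1 + Q2, and the collapse is rated by quadric_error(Q_p, p)
```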

3.3 Multilevel Smoothing

Since iterative Gaussian smoothing as defined by Equation (7) exhibits slow convergence on lower frequencies [Kobbelt et al. 1998], we combine the smoothing operator and the simplification method described above into an efficient multilevel smoothing algorithm. Using a complete v-cycle of alternating smoothing and down- resp. upsampling steps, we obtain a smooth approximation $S'$ of a surface $S$ as

$$S' = \Phi \cdot \Psi^{-1} \cdot \Phi \cdot \ldots \cdot \Phi \cdot \Psi \cdot \Phi(S), \qquad (10)$$

where $\Phi$ is the smoothing operator and $\Psi$ the downsampling (simplification) operator. The upsampling (refinement) operator $\Psi^{-1}$ implements the inverse of the simplification method of Section 3.2. To re-insert a point back into the model, we store the local neighborhood of the point prior to the point pair contraction. Then we can compute the position of the inserted point as a weighted average of the points of its neighborhood.
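Read operationally, Equation (10) is a recursive descent: smooth, simplify, recurse, refine, smooth again. The sketch below expresses this control flow under assumed interfaces for the three operators (the simplification returns the coarse cloud plus whatever bookkeeping its inverse needs); it is not the authors' implementation.

```python
def v_cycle(points, smooth, simplify, refine, levels):
    """One v-cycle of multilevel smoothing, Equation (10):
    S' = Phi . Psi^{-1} . Phi . ... . Phi . Psi . Phi (S).

    smooth(points)          -> smoothed points          (Phi)
    simplify(points)        -> (coarse_points, record)  (Psi)
    refine(coarse, record)  -> upsampled points         (Psi^{-1})
    """
    p = smooth(points)                     # pre-smoothing on the fine level
    if levels > 0:
        coarse, record = simplify(p)       # downsample
        coarse = v_cycle(coarse, smooth, simplify, refine, levels - 1)
        p = refine(coarse, record)         # upsample back to the fine level
        p = smooth(p)                      # post-smoothing
    return p
```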


Figure 1: Local vs. global volume preservation: The middle image shows the original noisy surface. On the left, 100 smoothing iterations with the local strategy; on the right, 100 iterations with global rescaling.




3.4 Decomposition

For a given geometric model, the decomposition operator separates global (low frequency) shape information from local (high frequency) detail. The input to our multiresolution point cloud decomposition operator usually consists of a given set of samples $P$ and a low frequency approximation $Q$. The major difficulty arises from the fact that the low-pass filter operator that generates $Q$ does not guarantee a specific sampling pattern: We cannot easily find pairs of samples $p \in P$ and $q \in Q$ such that the displacement $pq$ is in normal direction. Hence, the decomposition operator has to resample the surface defined by the sample set $Q$ to a new point cloud $R$ such that we obtain a normal base point $r \in R$ for every original sample $p$ without changing the actual geometric shape represented by $Q$. We perform this resampling by a variant of the moving least-squares surface [Alexa et al. 2001], [Levin]. The basic idea is to find weighted least-squares planes that approximate a local neighborhood of point samples. Then a local polynomial approximation is computed with respect to a parameterization over this plane. For our purposes it turns out to be sufficient to use the initial weighted least-squares plane for normal estimation. Hence we omit the second polynomial approximation step. By this we are, in fact, using moving least-squares surfaces of polynomial degree one. Finding the best fitting plane requires a non-linear optimization step since we have to determine a base point $r$ and a normal direction $n$. In earlier approaches the non-linear optimization has been implemented by Powell's method. We, however, prefer a Newton-type iteration which is based on a geometric gradient estimation.

Suppose we have the sample point $p$ and the point cloud $Q$, and we are looking for a base point $r \notin Q$ lying on the moving least-squares surface $S$ defined by $Q$ such that the displacement $rp$ is perpendicular to $S$. We start with an initial estimate $r_0$ for $r$ by choosing that point from the cloud $Q$ which lies closest to $p$. Finding the best fitting plane $E_0$ to a local neighborhood of samples that passes through $r_0$ is an easy task since it only requires the solution of a linear system (see Section 2.2). By orthogonal projection of $p$ onto $E_0$ we obtain another estimate $r_1$ for $r$. We iterate this procedure. In each step we find another estimate $r_{i+1}$ by projecting $p$ onto the plane $E_i$, which is the weighted least-squares plane fitted to a local neighborhood of samples of $Q$ around the previous estimate $r_i$. This iterative procedure corresponds to a Newton-iteration scheme to find the point $r$ on the surface $S$ for which the displacement $rp$ is in normal direction. This becomes obvious from the observation that we replace the surface $S$ by its linear approximation $E_i$ in every step and then solve the base point problem for this approximation. To avoid oscillatory behavior of the iterative scheme, we introduce an additional damping factor $\lambda$, i.e. instead of fully updating from $r_i$ to $r_{i+1}$ we use the damped update from $r_i$ to $(1 - \lambda) r_i + \lambda r_{i+1}$. Choosing $\lambda = 0.5$ works well in most cases (see Figures 3 and 4).
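This base-point search can be written as a short damped fixed-point loop, as sketched below. The weighted plane fit of Section 2.2 is assumed to be available as a helper returning a point on the plane and its unit normal; the neighborhood size and the convergence tolerance are illustrative choices, not values from the paper.

```python
import numpy as np
from scipy.spatial import cKDTree

def find_base_point(p, Q, fit_plane, k=10, lam=0.5, max_iter=50, tol=1e-7):
    """Damped Newton-type iteration for the normal base point r of sample p.

    Q:         (M, 3) low-frequency point cloud
    fit_plane: function(center, neighbor_points) -> (c, n), a point c on the
               weighted least-squares plane and its unit normal n (Section 2.2)
    Returns the base point r and the scalar normal displacement d = (p - r) . n."""
    tree = cKDTree(Q)
    _, i0 = tree.query(p, k=1)
    r = Q[i0]                                   # initial estimate: closest point in Q
    n = np.zeros(3)
    for _ in range(max_iter):
        _, idx = tree.query(r, k=k)
        c, n = fit_plane(r, Q[idx])             # weighted least-squares plane E_i
        r_next = p - np.dot(p - c, n) * n       # orthogonal projection of p onto E_i
        r_damped = (1.0 - lam) * r + lam * r_next   # damped update, lambda = 0.5
        done = np.linalg.norm(r_damped - r) < tol
        r = r_damped
        if done:
            break
    d = np.dot(p - r, n)                        # signed normal displacement offset
    return r, d
```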

In this context, two issues deserve further discussion: Since we are using a Newton-type optimization with damping, the iteration is guaranteed to find the correct base point $r$ if the initial estimate $r_0$ lies sufficiently close. In our experiments we never experienced any difficulties with the convergence of the process, since the point cloud $Q$ represents a smooth low-frequency surface. A further issue relates to the low-pass filtering we use for the computation of the base domain. It is important to note that the moving least-squares surface intrinsically includes a low-pass filtering of the such-defined surface. In principle we could apply the decomposition operation directly to the point cloud $P$ such that we find base points $r$ lying on the moving least-squares surface defined by $P$. However, there are two drawbacks. The first is that using a second, independently generated point cloud $Q$ provides more flexible control (including local volume preservation) over the amount of geometric detail that is removed when going from $P$ to $Q$. Moreover, the v-cycle smoothing algorithm (see Section 3.3) turns out to be much more efficient for aggressive smoothing.

Figure 2: Simplification with quadric error metrics. The original model has been downsampled from 2,000,606 points to 100,000 in (a) in 183 seconds, and to 10,000 in (b) in 197 seconds. We show the sampling distribution of the subsampled models as blue dots on the original geometry. In (c) the region around the eye of (a) is magnified, while (d) shows the same part of the original model. Panels: (a) David 100k, (b) David 10k, (c) Eye 100k, (d) Eye 2,000k.


Figure 3: Multiresolution decomposition. The left image illustrates one step of the Newton iteration. On the right, the (green) base points for the (red) points on the detailed surface are shown.

Figure 4: Decomposition: Original (left), smooth base domain (middle) and color-mapped detail coefficients, ranging from maximum negative (blue) to maximum positive (red) displacement.




Another aspect is that the moving least-squares surface tends to instabilities and oscillations if the point cloud $P$ is noisy. Hence a pre-smoothing step would be necessary anyway.

4 EDITING

Given a point-based multiresolution representation of a surface $S$, we now need some modeling technique to manipulate the point cloud $P^k$ that defines the surface $S^k$ of a certain level $k$. For this purpose we have implemented a point-based free-form deformation algorithm. Deformations can be controlled by a continuous tensor field, which is created using a simple and flexible user interaction metaphor.

The tensor field defines a transformation (translation and rotation) for each point in space based on a continuously varying scale parameter $t \in [0, 1]$. This scale parameter is computed using spatial distance functions that are defined by specifying two regions in space, a zero-region $\chi_0 \subset \mathbb{R}^3$ and a one-region $\chi_1 \subset \mathbb{R}^3$ subject to $\chi_0 \cap \chi_1 = \emptyset$. Points within $\chi_0$ will not be displaced at all ($t = 0$), while points within $\chi_1$ will experience the maximum displacement ($t = 1$). All other points are displaced in such a way that we create a smooth transition between zero and maximum displacement.

This is achieved as follows: For each point $p$, we compute the distance $d_0(p)$ to the zero-region and the distance $d_1(p)$ to the one-region, defined as

$$d_j(p) = \begin{cases} 0 & p \in \chi_j \\ \min_{q \in \chi_j} \| p - q \| & p \notin \chi_j \end{cases} \qquad (11)$$

for $j = 0, 1$. The new position for $p$ is then determined as $p' = F(p, t)$, where $F$ is a deformation function and

$$t = \beta\left( \frac{d_0(p)}{d_0(p) + d_1(p)} \right) \qquad (12)$$

the scale parameter for $p$ with blending function $\beta$. In our implementation we use the $C^1$ continuous blending function $\beta(x) = (x^2 - 1)^2$, satisfying $\beta(0) = 1$, $\beta(1) = 0$ and $\beta'(0) = \beta'(1) = 0$. This ensures a smooth transition of displacements at the boundaries of $\chi_0$ and $\chi_1$.

Currently we use two deformation functions (see the sketch below):

• Translation: $F_T(p, t) = p + t \cdot v$, with $v$ being the translation vector. Many choices for $v$ are conceivable, e.g. a constant vector, the surface normal at $p$, or a vector pointing to a user-specified point.

• Rotation: $F_R(p, t) = R(r, t \cdot \alpha) \cdot p$, where $R(r, \alpha)$ is the matrix that specifies a rotation around axis $r$ with angle $\alpha$.
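The sketch below realizes the translation variant of this scheme: distances to point sets standing in for the zero- and one-region are blended into a scale parameter $t$ that is 0 on the zero-region and 1 on the one-region, and each sample is moved by $F_T(p, t) = p + t \cdot v$. A simple smoothstep blend is used instead of the paper's $\beta$, and the sampled regions, the choice of $v$ and all names are illustrative assumptions.

```python
import numpy as np
from scipy.spatial import cKDTree

def smoothstep(x):
    """C^1 blend with value 0 at 0, value 1 at 1, and zero derivative at both ends."""
    x = np.clip(x, 0.0, 1.0)
    return x * x * (3.0 - 2.0 * x)

def translate_deformation(points, zero_region, one_region, v):
    """Blended free-form translation F_T(p, t) = p + t * v.

    zero_region, one_region: point sets sampling chi_0 and chi_1
    t is 0 near the zero-region and 1 near the one-region."""
    d0 = cKDTree(zero_region).query(points, k=1)[0]   # distance to chi_0
    d1 = cKDTree(one_region).query(points, k=1)[0]    # distance to chi_1
    t = smoothstep(d0 / (d0 + d1 + 1e-12))            # scale parameter in [0, 1]
    return points + t[:, None] * np.asarray(v)
```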

The user specifies $\chi_0$ and $\chi_1$ by marking points on the point cloud $P$. This is achieved with a simple paint tool that supports drawing of points, lines, circles and other primitives, as well as flood filling and erasure. This provides very flexible and powerful editing functionality as illustrated in Figure 5. Yet the user interface and handling of the deformation is simple and intuitive. In principle, there is no need to restrict $\chi_0$ and $\chi_1$ to be part of the model surface. Instead the zero- and one-regions can also be specified on a different model that can be arbitrarily aligned with $P$.

Our deformation method shows some similarities to the wires scheme introduced by Singh and Fiume [Singh and Fiume 1998]. We use the same blending function and our zero- and one-regions are similar to the wires and domain curves. However, we believe that our technique provides more flexibility, since $\chi_0$ and $\chi_1$ are defined as volumetric entities, not one-dimensional curves.

5 DYNAMIC RESAMPLING

Large deformations may cause strong distortions in the distribution of sample points on the surface that can lead to an insufficient local sampling density. To prevent the point cloud from ripping apart and to maintain a high surface quality, we have to include new samples where the sampling density becomes too low. To achieve this we first need to measure the surface stretch to detect regions of insufficient sampling density (Section 5.1). Then we have to insert new sample points and determine their position on the surface. Additionally, we have to preserve and interpolate the scalar attributes, e.g. detail coefficients or color values (Section 5.2).

5.1 Measuring surface stretch

During the deformation we have to measure the local stretching to decide when new sample points need to be inserted to preserve the overall sampling density. We use the first fundamental form known from differential geometry to measure the local distortion at each sample point [DoCarmo 1976]. Let $u$ and $v$ be two tangent vectors at a sample point $p$. We initialize $u$ and $v$ such that they are of unit length and orthogonal to each other. The first fundamental form at $p$ is now defined by the $2 \times 2$ matrix


Figure 5: Two examples of elementary free-form deformations. The upper left image shows the zero-region (blue) and the one-region (red). The scale parameter is visualized below using a rainbow color map. On the right the final textured surface. (a) 90 degree rotation around an axis through the plane center. (b) Translation in a direction that spans an angle of 45 degrees with the plane normal.




$$\begin{bmatrix} u \cdot u & u \cdot v \\ u \cdot v & v \cdot v \end{bmatrix}. \qquad (13)$$

The eigenvalues of this matrix yield the minimum and maximum stretch factors and the corresponding eigenvectors define the principal directions into which this stretching occurs. Since we initialized the two tangent vectors $u$ and $v$ to an orthonormal system, the resulting first fundamental form is the identity matrix, indicating that initially there is no tension on the surface. When we apply a deformation to the point samples, the point $p$ is shifted to a new position $p'$ and the two tangent vectors are mapped to new vectors $u'$ and $v'$. Local stretching implies that $u'$ and $v'$ might no longer be orthogonal to each other, nor do they preserve their unit length. The amount of this distortion can be measured by computing the eigenvalues of (13). We can rate the distortion by taking the ratio of the two eigenvalues (local anisotropy) or by taking their product (local change of surface area). When the local distortion becomes too strong, we have to insert new samples to re-establish the prescribed sampling density. Since (13) defines an ellipse in the tangent plane centered at $p$ with the principal axes defined by the eigenvectors and eigenvalues, we can easily replace $p$ by two new samples $p_1$ and $p_2$ that we position on the main axis of the ellipse (cf. Figure 6).
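A sketch of this stretch test for a single sample follows, assuming the deformation is available as a function of position; the finite-difference step used to push the tangent vectors through the deformation and the splitting threshold are illustrative choices.

```python
import numpy as np

def stretch_eigenvalues(p, u, v, deform, eps=1e-4):
    """First fundamental form (Equation 13) of the deformed tangent vectors.

    p: (3,) sample position; u, v: orthonormal tangent vectors at p
    deform: function mapping a position to its deformed position
    Returns the eigenvalues (min, max) of [[u'.u', u'.v'], [u'.v', v'.v']]."""
    p_def = deform(p)
    u_def = (deform(p + eps * u) - p_def) / eps   # image of u under the deformation
    v_def = (deform(p + eps * v) - p_def) / eps   # image of v under the deformation
    I = np.array([[u_def @ u_def, u_def @ v_def],
                  [u_def @ v_def, v_def @ v_def]])
    return np.linalg.eigvalsh(I)                  # ascending order

def needs_split(p, u, v, deform, max_area_stretch=2.0):
    """Split the sample when the local area change exceeds a threshold
    (the square root of the eigenvalue product is the local change of area)."""
    lam_min, lam_max = stretch_eigenvalues(p, u, v, deform)
    return np.sqrt(lam_min * lam_max) > max_area_stretch
```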

5.2 Filter operations

Whenever a splitting operation is applied, we need to determine both the geometric position and the scalar function values for the newly generated sample points. Both of these operations can be described as the application of a filtering operator: If we apply the filter operator to the sample positions we call it a relaxation filter, while we call it an interpolation filter if we apply it to the function values. This definition makes sense if we look at the way we use these filters in the context of our resampling framework:

Relaxation. Introducing new sample points through a splitting operation creates local imbalances in the sampling distribution. To obtain a more uniform sampling pattern, we apply a relaxation operator that moves the sample points within the surface. Similar to [Turk 1992] we use a simple point repulsion scheme with a repulsion force that drops linearly with distance. We can thus confine the radius of influence of each sample point to its local neighborhood, which allows very efficient computation of the relaxation forces. The resulting displacement vector is then projected into the point's tangent plane to keep the samples on the surface.
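The sketch below applies one such relaxation step; the linear falloff of the repulsion force to zero at the neighborhood radius and the step size are illustrative parameter choices, not values from the paper.

```python
import numpy as np

def relax_points(points, neighbors, radii, normals, step=0.1):
    """One point-repulsion relaxation step, with the displacement projected
    into each point's tangent plane to keep the samples on the surface."""
    new_points = points.copy()
    for i, p in enumerate(points):
        force = np.zeros(3)
        for j in neighbors[i]:
            d = p - points[j]
            dist = np.linalg.norm(d)
            if 0 < dist < radii[i]:
                # repulsion force dropping linearly with distance
                force += (d / dist) * (1.0 - dist / radii[i])
        n = normals[i]
        force -= np.dot(force, n) * n        # project into the tangent plane
        new_points[i] = p + step * force
    return new_points
```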

Interpolation. Once we have fixed the position of a new sample point using the relaxation filter, we need to determine its associated function values. This can be achieved using an interpolation filter, by computing a local average of the function values of neighboring sample points.

Figure 6: Dynamic resampling: The top row shows the sampling positions, while the bottom row illustrates the scalar function values (indicated by lines). After deforming the model, splitting creates new samples and zombies. The latter are ignored when computing sample positions during relaxation, but are used when computing function values during interpolation.


Figure 7: A very large deformation of a plane. The left column illustrates intermediate steps of the deformation. The right column shows the final deformations in two resolutions: The top image contains 69,346 points (original plane 10,000), while the bottom image contains 621,384 points (original plane 160,000). The zooms show the four stages of the dynamic resampling (cf. Figure 6). Note that we interpolate texture coordinates, not color, in which case the image would be much more blurry.




We need to be careful, however. The relaxation filter potentially moves all points in the neighborhood of $p$. This tangential drift leads to distortions in the associated scalar functions. To deal with this problem we create a copy of each point that carries scalar attributes and keep its position fixed during relaxation. In particular, we maintain for each sample that is split a copy with its original data. These points will only be used for interpolating scalar values; they are not part of the current geometry description. Since these samples are dead but their function values still live, we call them zombies. Zombies undergo the same transformation during a deformation operation as living points, but their positions are not altered during relaxation. Thus zombies accurately describe the scalar attributes without distortions. Therefore, we use zombies only for interpolation, while for relaxation we only use living points. After an editing operation is completed, all zombies are deleted from the representation.

Figure 7 illustrates our dynamic resampling method for a very large deformation that leads to a substantial increase in the number of sample points. It clearly demonstrates the robustness and scalability of our method, even in regions of extreme surface stretch.

5.3 Downsampling

Apart from lower sampling density caused by surface stretching, deformations can also lead to an increase in sampling density where the surface is squeezed. It might be desirable to eliminate samples in such regions while editing, to keep the overall sampling distribution uniform. However, dynamically removing samples also has some drawbacks. Consider a surface that is first squeezed and then stretched back to its original shape. If samples get removed during squeezing, surface information such as detail coefficients or color will be lost, which leads to blurring when the surface is re-stretched. Also, from an implementation point of view, data structures can be kept much simpler if only dynamic insertions, but no deletions, are allowed. Thus instead of dynamic sample deletion we perform an optional "garbage collection" at the end of the editing operation, using the simplification method of Section 3.2 to reduce the sampling rate.

6 RESULTS & DISCUSSION

We have implemented the algorithms described above and tested our method on a number of point-sampled models. Figure 8 shows an example of a multiresolution edit of a human body using four elementary deformations: three rotations (head, arm, leg) and one translation (belly). Note how the geometric detail is preserved in an intuitive manner, e.g. the crinkles on the shirt. The same effect is demonstrated more noticeably in Figure 9, where a laser-range scan of the ball joint of a human leg is deformed using a translation and a rotation. This model contains a lot of high-frequency detail that is accurately reconstructed on the deformed surface.

In Figure 10 we created an embossing effect by texture-mapping the zero- and one-regions onto a plane and applying a translatory deformation in a fixed direction. The zooms of the further-deformed plate illustrate the difference between editing on a single resolution (left) versus multiresolution (right). Obviously, if the displacements are not in normal direction, the result is counter-intuitive. This example also shows how complex geometry can be created easily using our deformation framework in connection with conventional texture mapping.

Another advantage of our editing metaphor is scalability. Figure 11 shows an editing operation on a scan of Michelangelo's David. This model contains more than 2 million points and is clearly beyond the capabilities of our system for interactive modeling. However, the user can interactively edit a subsampled version of the surface until she is satisfied with the result. Then all the parameters of the deformation (e.g. zero- and one-region) can easily be transferred to the high-resolution model and the deformation can be computed offline. This is possible because our system does not rely on intrinsic features of the surface, such as control points.

Implementation Details. We use a kd-tree for computing the set of $k$-nearest neighbors. We found that for all our models $8 \le k \le 12$ is a suitable choice for stable computations. The kd-tree features fast creation time and fast nearest-neighbor queries. In our current implementation, a tree of 500,000 points is built in 1.33 seconds and a query for 10 nearest neighbors in such a tree takes approx. 9 microseconds (on a Pentium 4, 1.8 GHz). However, the kd-tree is a static structure and thus not suitable for dynamically changing point positions. Therefore, we cache neighborhood information during deformations and rebuild the complete tree at the end of the editing session (where we also do garbage collection).

Performance. For the examples of Figures 8, 9 and 10, all operations complete at interactive rates of between 0.3 and 4 seconds. Since our code is not yet optimized for performance, we believe that a substantial increase in speed is still possible. In particular, we believe that the efficiency of the local computations can be improved considerably by exploiting coherence.

Stability. An important issue concerning the robustness of our multiresolution decomposition is the stability of the normal estimation. In this context it is important that the smoothing operator guarantees bounded curvature of the resulting base domain. This is the major motivation for using a local volume preservation method, as it prevents spikes in the smoothed surface that would cause the normal estimation of the decomposition operator to fail.

7 CONCLUSIONS & FUTURE WORK

In this paper we introduce a new multiresolution modeling framework for point-sampled geometry. We have implemented multilevel smoothing, fast simplification and a multiresolution decomposition operator for unstructured point clouds. In connection with a flexible and powerful editing metaphor and a dynamic resampling strategy, our system handles large deformations with intuitive detail reconstruction. Future work will be directed towards adding more semantics to the editing operations, including multiresolution feature detection and extraction. We also believe that our framework is well suited for morphing and animation.

References

ALEXA, M., BEHR, J., COHEN-OR, D., FLEISHMAN, S., LEVIN, D., AND SILVA, C. T. 2001. Point set surfaces. In Proceedings of Visualization 2001.

AMENTA, N., BERN, M., AND KAMVYSSELIS, M. 1998. A new Voronoi-based surface reconstruction algorithm. In Proceedings of SIGGRAPH 1998.

DESBRUN, M., MEYER, M., SCHRÖDER, P., AND BARR, A. H. 1999. Implicit fairing of irregular meshes using diffusion and curvature flow. In Proceedings of SIGGRAPH 1999.

DOCARMO, M. P. 1976. Differential Geometry of Curves and Surfaces. Prentice Hall.

FLOATER, M. AND REIMERS, M. 2001. Meshless parameterization and surface reconstruction. In Comp. Aided Geom. Design 18.


FORSEY, D. AND BARTELS, R. 1988. Hierarchical B-spline refinement. In Proceedings of SIGGRAPH 1988.

GARLAND, M. AND HECKBERT, P. S. 1997. Surface simplification using quadric error metrics. In Proceedings of SIGGRAPH 1997.

GUSKOV, I., SWELDENS, W., AND SCHRÖDER, P. 1999. Multiresolution signal processing for meshes. In Proceedings of SIGGRAPH 1999.

HUBELI, A. AND GROSS, M. H. 2001. Multiresolution methods for non-manifold models. In IEEE Transactions on Visualization and Computer Graphics 2001.

KOBBELT, L., CAMPAGNA, S., VORSATZ, J., AND SEIDEL, H.-P. 1998. Interactive multi-resolution modeling on arbitrary meshes. In Proceedings of SIGGRAPH 1998.

KOBBELT, L. P., BAREUTHER, T., AND SEIDEL, H.-P. 2000. Multiresolution shape deformations for meshes with dynamic vertex connectivity. Computer Graphics Forum, 19(3).

LEVIN, D. Mesh-independent surface interpolation. In Advances in Comp. Math., to appear.

Figure 8: Multiresolution edits on a human body (148,138 points). The original model is shown on the left, the smooth base domain in the middle and the deformed surface on the right.

Figure 9: Edits on a ball joint (137,062 points). The top row shows the smooth base domain with zero-region (blue) and one-region (red).

Figure 10: The Siggraph logo embossed on a plane (123,464 points). The top image shows the zero- and one-region (blue resp. red), the lower images show zooms of the deformed surface. On the left single-resolution deformation, on the right multiresolution deformation with proper detail reconstruction.

Figure 11: A gentle deformation of Michelangelo's David (2,000,606 points).


PAULY, M. AND GROSS, M. 2001. Spectral processing of point-sampled geometry. In Proceedings of SIGGRAPH 2001.

PFISTER, H., ZWICKER, M., VAN BAAR, J., AND GROSS, M. 2000. Surfels: Surface elements as rendering primitives. In Proceedings of SIGGRAPH 2000.

RUSINKIEWICZ, S. AND LEVOY, M. 2000. QSplat: A multiresolution point rendering system for large meshes. In Proceedings of SIGGRAPH 2000.

SAGAN, H. 1969. Introduction to the Calculus of Variations. Dover Publications.

SINGH, K. AND FIUME, E. 1998. Wires: A geometric deformation technique. In Proceedings of SIGGRAPH 1998.

SZELISKI, R. AND TONNESEN, D. 1992. Surface modeling with oriented particle systems. In Proceedings of SIGGRAPH 1992.

TAUBIN, G. 1995. A signal processing approach to fair surface design. In Proceedings of SIGGRAPH 1995.

TURK, G. 1992. Re-tiling polygonal surfaces. In Proceedings of SIGGRAPH 1992.

WITKIN, A. AND HECKBERT, P. 1994. Using particles to sample and control implicit surfaces. In Proceedings of SIGGRAPH 1994.

ZORIN, D., SCHRÖDER, P., AND SWELDENS, W. 1997. Interactive multiresolution mesh editing. In Proceedings of SIGGRAPH 1997.

ZWICKER, M., PFISTER, H., VAN BAAR, J., AND GROSS, M. 2001. Surface splatting. In Proceedings of SIGGRAPH 2001.