How to build your own 3D scanner
Jakob Wilm, PhD Student
Section for Image Analysis and Computer Graphics, DTU Informatics
Department of Clinical Physiology, Nuclear Medicine and PET, Rigshospitalet


Page 2

Microsoft Kinect


Asus Xtion

Panasonic d-Imager

PMDTec Camboard Pico

MESA SwissRanger

Point Grey Bumble Bee

Microsoft Kinect 2

Fujifilm FinePix Real 3D

Occipital Structure

Page 3

Microsoft Kinect


Cost: DKK 1,600
Resolution: 640 × 480
Speed: 25 fps
Accuracy: ~1 mm
Robustness: high
Hackability: ;-|

Page 4


RGB Camera

IR Camera

IR Projector

Page 5


Page 6

Mesa SwissRanger SR4000


Cost: DKK 22,000
Resolution: 176 × 144
Speed: 54 fps
Accuracy: ~5 mm
Robustness: low
Hackability: ;-(

Page 7

Microsoft Kinect 2


RGB Camera

IR Camera, IR Light

Cost: DKK 2,000
Resolution: 512 × 424
Speed: 30 fps
Accuracy: ~5 mm
Robustness: good
Hackability: ;-|

Page 8

DTU Scanner — SLStudio


Cost: DKK 10,000
Resolution: variable
Speed: variable
Accuracy: variable
Robustness: variable
Hackability: ;->

Page 9


Page 10

Triangulation

[Figure: camera geometry, with image axes (u, v) and world axes (x, y, z)]

\[
\begin{bmatrix} u \\ v \\ 1 \end{bmatrix} \simeq
\underbrace{\begin{bmatrix} f_x & 0 & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{bmatrix}}_{K}
\begin{bmatrix} x \\ y \\ z \end{bmatrix}
\]
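The pinhole relation on this slide can be sketched in a few lines of NumPy. The focal lengths and principal point below are made-up example values, not parameters from the talk:

```python
import numpy as np

# Illustrative intrinsics: fx, fy are focal lengths in pixels,
# (cx, cy) is the principal point. Values are invented for the example.
fx, fy, cx, cy = 800.0, 800.0, 320.0, 240.0
K = np.array([[fx, 0.0, cx],
              [0.0, fy, cy],
              [0.0, 0.0, 1.0]])

def project(K, X):
    """Project a 3D point X = (x, y, z) to pixel coordinates (u, v)."""
    uvw = K @ X              # homogeneous image coordinates
    return uvw[:2] / uvw[2]  # divide by depth to get (u, v)

uv = project(K, np.array([0.1, -0.05, 2.0]))
```

A point on the optical axis, such as (0, 0, 1), projects to the principal point (c_x, c_y).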

Page 11

Triangulation

[Figure: camera geometry, with image axes (u, v) and world axes (x, y, z)]

\[
\begin{bmatrix} u \\ v \\ 1 \end{bmatrix} \simeq
\underbrace{\begin{bmatrix} f_x & 0 & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{bmatrix}}_{K}
\begin{bmatrix} x \\ y \\ z \end{bmatrix}
\]

For the camera and the projector:

\[
\begin{bmatrix} u^c \\ v^c \\ 1 \end{bmatrix} \simeq K^c \begin{bmatrix} x \\ y \\ z \end{bmatrix},
\qquad
\begin{bmatrix} u^p \\ v^p \\ 1 \end{bmatrix} \simeq
\underbrace{K^p \, [R \mid T]}_{P^p}
\begin{bmatrix} x \\ y \\ z \\ 1 \end{bmatrix}
\]

Page 12

Triangulation

[Figure: camera geometry, with image axes (u, v) and world axes (x, y, z)]

\[
\begin{bmatrix} u^c \\ v^c \\ 1 \end{bmatrix} \simeq K^c \begin{bmatrix} x \\ y \\ z \end{bmatrix},
\qquad
\begin{bmatrix} u^p \\ v^p \\ 1 \end{bmatrix} \simeq
\underbrace{K^p \, [R \mid T]}_{P^p}
\begin{bmatrix} x \\ y \\ z \\ 1 \end{bmatrix}
\]

Eliminating the unknown projective scale factors gives a homogeneous linear system in the 3D point $Q = [x, y, z, 1]^\top$:

\[
\begin{bmatrix}
u^c P^c(3) - P^c(1) \\
v^c P^c(3) - P^c(2) \\
u^p P^p(3) - P^p(1)
\end{bmatrix} \cdot Q = 0
\]

where $P(i)$ denotes the $i$-th row of a projection matrix.
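The homogeneous system above can be solved for Q with an SVD, taking the right singular vector of the smallest singular value. A minimal NumPy sketch; the projection matrices and the matched camera/projector pixel pair are illustrative values, not calibration data from the talk:

```python
import numpy as np

def triangulate(Pc, Pp, uc, vc, up):
    """Solve the homogeneous system A . Q = 0 for the 3D point Q.

    Pc, Pp are 3x4 projection matrices for camera and projector;
    (uc, vc) is the camera pixel, up the matched projector coordinate.
    Each row of A is  coord * P(3) - P(i), as on the slide."""
    A = np.array([uc * Pc[2] - Pc[0],
                  vc * Pc[2] - Pc[1],
                  up * Pp[2] - Pp[0]])
    _, _, Vt = np.linalg.svd(A)
    Q = Vt[-1]           # null-space direction of A
    return Q[:3] / Q[3]  # dehomogenize to (x, y, z)

# Illustrative setup: identity camera, projector translated along x.
Pc = np.hstack([np.eye(3), np.zeros((3, 1))])
Pp = np.hstack([np.eye(3), np.array([[-0.2], [0.0], [0.0]])])
X = np.array([0.1, 0.05, 2.0])               # ground-truth point
uc, vc, _ = (Pc @ np.append(X, 1.0)) / X[2]  # its camera pixel
up = (Pp @ np.append(X, 1.0))[0] / X[2]      # its projector coordinate
X_hat = triangulate(Pc, Pp, uc, vc, up)
```

With exact correspondences the recovered point matches the ground truth; with noisy ones, the SVD gives the least-squares solution of the overdetermined system.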

Page 13


Phase Shifting Interferometry

\[
I_n^p(x^p, y^p) = \frac{1}{2} + \frac{1}{2}\cos\!\left(2\pi\left(\frac{n}{N} - y^p\right)\right), \qquad (1)
\]

where $(x^p, y^p)$ are the column and row coordinates of a pixel in the projector, $I_n^p$ is the intensity of that pixel in a projector with dynamic range from 0 to 1, and $n$ represents the phase-shift index over the $N$ total patterns. For reconstruction, a camera captures each image where the sine-wave pattern is distorted by the scanned surface topology, resulting in the patterned images expressed as

\[
I_n^c(x^c, y^c) = A^c + B^c \cos\!\left(\frac{2\pi n}{N} - \theta\right), \qquad (2)
\]

where $(x^c, y^c)$ are the coordinates of a pixel in the camera, while $I_n^c(x^c, y^c)$ is the intensity of that pixel. The term $A^c$ is the averaged pixel intensity across the pattern set, which includes the ambient light component, and can be derived according to

\[
A^c = \frac{1}{N} \sum_{n=0}^{N-1} I_n^c(x^c, y^c). \qquad (3)
\]

Correspondingly, the term $B^c$ is the intensity modulation of a given pixel and is derived from $I_n^c(x^c, y^c)$ in terms of real and imaginary components, where

\[
B_R^c = \sum_{n=0}^{N-1} I_n^c(x^c, y^c) \cos\!\left(\frac{2\pi n}{N}\right) \qquad (4)
\]

and

\[
B_I^c = \sum_{n=0}^{N-1} I_n^c(x^c, y^c) \sin\!\left(\frac{2\pi n}{N}\right), \qquad (5)
\]

such that

\[
B^c = \left\| B_R^c + j B_I^c \right\| = \left( (B_R^c)^2 + (B_I^c)^2 \right)^{1/2}, \qquad (6)
\]

which is the amplitude of the observed sinusoid. If $I_n^c(x^c, y^c)$ is constant or little affected by the projected sinusoid patterns, $B^c$ will be close to zero. Thus $B^c$ is employed as a shadow-noise detector/filter [6], such that the shadow-noised regions, with small $B^c$ values, are discarded from further processing. For the reliable pixels with sufficiently large $B^c$, $\theta$ represents the phase value of the captured sinusoid pattern, derived as

\[
\theta = \angle\!\left(B_R^c + j B_I^c\right) = \arctan\!\left(\frac{B_I^c}{B_R^c}\right), \qquad (7)
\]

which is used to derive the projector row according to $\theta = 2\pi y^p$.

If we treat the samples of a given camera pixel as a single period of a discrete-time signal, $x[n]$ for $n = 0, 1, \ldots, N-1$, then we can define the Fourier terms, $X[k]$ for $k = 0, 1, \ldots, N-1$, using a discrete-time Fourier transform. From $X[k]$, we then note that $A^c$ is related to the DC component according to

\[
A^c = \frac{1}{N} X[0], \qquad (8)
\]

while $B^c$ and $\theta$ are related to $X[1] = X^{*}[N-1]$ according to

\[
B^c = \frac{2}{N} \left\| X[1] \right\| = \frac{2}{N} \left\| X[N-1] \right\| \qquad (9)
\]

and

\[
\theta = \angle X[1] = -\angle X[N-1]. \qquad (10)
\]

Because of these relationships, we refer to the frequency terms $X[0]$, $X[1]$, and $X[N-1]$ as the principal frequency components, while the remaining terms are referred to as the nonprincipal terms and are the harmonics of $X[1]$.

Under ideal conditions, the nonprincipal frequency terms, $X[k]$ for $k = 2, 3, \ldots, N-2$, are always equal to zero, but in the presence of sensor noise the terms are, themselves, drawn from an additive white-noise process. At the same time, if a target surface is moving toward or away from the camera sensor at a slow rate, then the frequency of the sinusoid represented in the time sequence $x[n]$ is modulated to a slightly higher or lower frequency, and as such, we expect the spectral components $X[2]$ and $X[N-2]$ to increase as the energy in $X[1]$ and $X[0]$ becomes less concentrated. Combining slow motion with additive noise, there exists some threshold, $T$, below which the magnitude of these nonprincipal frequency terms can be assumed to reside.

Given the lack of an information-bearing signal on the harmonic frequencies, one might be tempted to use these nonprincipal frequency terms to carry additional phase information, for example for the purpose of phase unwrapping. But doing so involves reducing the amplitude of the component sinusoids to fit the assorted signals within the dynamic range of both the camera and projector, thereby reducing the signal-to-noise ratio on these channels. Absent this use, if a pixel falls off the edge of a foreground object, or if there is a dramatic change in surface texture, we expect to see a large-magnitude term in the harmonics, as the foreground phase term is essentially modulated by a step edge while the background phase term is modulated by the inverse of this same step.

Step edges and impulses, having large high-frequency spectral components, introduce high-frequency energy in

Fig. 1. Calibration target (left) and corresponding first-harmonic magnitude image (right).

2488 OPTICS LETTERS / Vol. 35, No. 14 / July 15, 2010

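Equations (3)–(10) amount to taking an N-point DFT of the sample sequence at each camera pixel and reading off the principal components. A single-pixel NumPy sketch on synthetic data; N, A^c, B^c, and θ are illustrative values, and note that NumPy's forward DFT uses e^{−j2πkn/N}, so the wrapped phase comes out as −∠X[1] in this convention:

```python
import numpy as np

N = 8                                         # number of phase-shifted patterns
n = np.arange(N)
Ac_true, Bc_true, theta_true = 0.5, 0.3, 1.2  # illustrative ground truth

# Synthetic intensity samples at one camera pixel, following Eq. (2)
I = Ac_true + Bc_true * np.cos(2 * np.pi * n / N - theta_true)

X = np.fft.fft(I)            # DFT over the pattern index n

Ac = X[0].real / N           # Eq. (8): ambient/average intensity
Bc = 2.0 * np.abs(X[1]) / N  # Eq. (9): modulation amplitude
theta = -np.angle(X[1])      # Eq. (10), with NumPy's DFT sign convention

# A small Bc flags a shadowed/unreliable pixel; theta is the wrapped
# phase, which gives the projector row via theta = 2*pi*y_p.
```

The harmonic magnitudes |X[2]|, ..., |X[N−2]| are the nonprincipal terms discussed above; comparing them against a threshold T detects motion and step edges.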

Page 14


Frequency Domain Interpretation

Ipnðxp; ypÞ ¼12þ 12cos

!2π

!nN

− yp""

; ð1Þ

where ðxp; ypÞ are the column and row coordinates of apixel in the projector, Ipn is the intensity of that pixel in aprojector with dynamic range from 0 to 1, and n repre-sents the phase-shift index over the N total patterns.For reconstruction, a camera captures each image

where the sine wave pattern is distorted by the scannedsurface topology, resulting in the patterned imagesexpressed as

Icnðxc; ycÞ ¼ Ac þ Bc cos!2πnN

− θ"; ð2Þ

where ðxc; ycÞ are the coordinates of a pixel in the cam-era, while Icnðxc; ycÞ is the intensity of that pixel. The termAc is the averaged pixel intensity across the pattern setthat includes the ambient light component, which can bederived according to

Ac ¼ 1N

XN−1

n¼0

Icnðxc; ycÞ: ð3Þ

Correspondingly, the term Bc is the intensity modulationof a given pixel and is derived from Icnðxc; ycÞ in terms ofreal and imaginary components where

BcR ¼

XN−1

n¼0

Icnðxc; ycÞ cos!2πnN

"ð4Þ

and

BcI ¼

XN−1

n¼0

Icnðxc; ycÞ sin!2πnN

"ð5Þ

such that

Bc ¼ ‖BcR þ jBc

I‖ ¼ fBc2R þ Bc2

I g12; ð6Þ

which is the amplitude of the observed sinusoid.If Icnðxc; ycÞ is constant or less affected by the projected

sinusoid patterns, Bc will be close to zero. Thus Bc is em-ployed as a shadow noise detector/filter [6] such that theshadow-noised regions, with small Bc values, are dis-carded from further processing. Of the reliable pixelswith sufficiently large Bc, θ represents the phase valueof the captured sinusoid pattern derived as

θ ¼ ∠ðBcR þ jBc

IÞ ¼ arctan#BcI

BcR

$; ð7Þ

which is used to derive the projector row accordingto θ ¼ 2πyp.If we treat the samples of a subject camera pixel as a

single period of a discrete-time signal, x½n& for n ¼1; 2;…; N − 1, then we can define the Fourier terms,X ½k& for k ¼ 1; 2;…; N − 1, using a discrete-time Fouriertransform. From X ½k&, we then note that Ac is related tothe DC component according to

Ac ¼ 1NX ½0&; ð8Þ

while Bc and θ are related to X ½1& ¼ X'½N − 1&according to

Bc ¼ 2N‖X ½1&‖ ¼ 2

N‖X ½N − 1&‖ ð9Þ

and

θ ¼ ∠X ½1& ¼ −∠X ½N − 1&: ð10Þ

Because of these relationships, we refer to the frequencyterms, X ½0&, X ½1&, and X ½N − 1&, as the principal frequencycomponents, while the remaining terms are referred to asthe nonprincipal terms and are the harmonics of X ½1&.

Under ideal conditions, the nonprincipal frequencyterms, X ½k& for k ¼ 2; 3;…; N − 2, are always equal tozero, but in the presence of sensor noise, the termsare, themselves, drawn from an additive, white-noise pro-cess. At the same time, if a target surface is moving to-ward or away from the camera sensor at a slow rate, thenthe frequency of the sinusoid represented in the time se-quence, x½n&, is modulated to have a slightly higher orlower frequency, and as such, we expect the spectralcomponents X ½2& and X ½N − 2& to increase as the energyin X ½1& and X ½0& becomes less concentrated. Combiningslow motion with additive noise, there exists somethreshold, T , at which the magnitude of these nonprinci-pal frequency terms can be assumed to reside.

Given the lack of an information bearing signal on theharmonic frequencies, one might be tempted to use thesenonprincipal frequency terms to carry additional phaseinformation for the purpose, for example, of phase un-wrapping. But doing so involves the reduction in ampli-tude of the component sinusoids to fit the assortedsignals within the dynamic range of both the cameraand projector, thereby reducing the signal-to-noise ratioon these channels. Absent this use, if a pixel falls off theedge of a foreground object or if there is a dramaticchange in surface texture, we expect to see a large mag-nitude term in the harmonics as the foreground phaseterm is essentially modulated by a step edge, while thebackground phase term is modulated by the inverse ofthis same step.

Step edges and impulses, having large high-frequencyspectral components, introduce high-frequency energy in

Fig. 1. Calibration target (left) and corresponding first harmo-nic magnitude image (right).

2488 OPTICS LETTERS / Vol. 35, No. 14 / July 15, 2010

Ipnðxp; ypÞ ¼12þ 12cos

!2π

!nN

− yp""

; ð1Þ

where ðxp; ypÞ are the column and row coordinates of apixel in the projector, Ipn is the intensity of that pixel in aprojector with dynamic range from 0 to 1, and n repre-sents the phase-shift index over the N total patterns.For reconstruction, a camera captures each image

where the sine wave pattern is distorted by the scannedsurface topology, resulting in the patterned imagesexpressed as

Icnðxc; ycÞ ¼ Ac þ Bc cos!2πnN

− θ"; ð2Þ

where ðxc; ycÞ are the coordinates of a pixel in the cam-era, while Icnðxc; ycÞ is the intensity of that pixel. The termAc is the averaged pixel intensity across the pattern setthat includes the ambient light component, which can bederived according to

Ac ¼ 1N

XN−1

n¼0

Icnðxc; ycÞ: ð3Þ

Correspondingly, the term Bc is the intensity modulationof a given pixel and is derived from Icnðxc; ycÞ in terms ofreal and imaginary components where

BcR ¼

XN−1

n¼0

Icnðxc; ycÞ cos!2πnN

"ð4Þ

and

BcI ¼

XN−1

n¼0

Icnðxc; ycÞ sin!2πnN

"ð5Þ

such that

Bc ¼ ‖BcR þ jBc

I‖ ¼ fBc2R þ Bc2


2488 OPTICS LETTERS / Vol. 35, No. 14 / July 15, 2010

I^p_n(x^p, y^p) = 1/2 + 1/2 cos(2π(n/N - y^p)),   (1)

where (x^p, y^p) are the column and row coordinates of a pixel in the projector, I^p_n is the intensity of that pixel in a projector with dynamic range from 0 to 1, and n represents the phase-shift index over the N total patterns. For reconstruction, a camera captures each image where the sine wave pattern is distorted by the scanned surface topology, resulting in the patterned images expressed as

I^c_n(x^c, y^c) = A^c + B^c cos(2πn/N - θ),   (2)

where (x^c, y^c) are the coordinates of a pixel in the camera, while I^c_n(x^c, y^c) is the intensity of that pixel. The term A^c is the averaged pixel intensity across the pattern set, which includes the ambient light component, and can be derived according to

A^c = (1/N) Σ_{n=0}^{N-1} I^c_n(x^c, y^c).   (3)

Correspondingly, the term B^c is the intensity modulation of a given pixel and is derived from I^c_n(x^c, y^c) in terms of real and imaginary components, where

B^c_R = Σ_{n=0}^{N-1} I^c_n(x^c, y^c) cos(2πn/N)   (4)

and

B^c_I = Σ_{n=0}^{N-1} I^c_n(x^c, y^c) sin(2πn/N),   (5)

such that

B^c = ‖B^c_R + j B^c_I‖ = (B^c_R² + B^c_I²)^{1/2},   (6)

which is the amplitude of the observed sinusoid. If I^c_n(x^c, y^c) is constant or less affected by the projected sinusoid patterns, B^c will be close to zero. Thus B^c is employed as a shadow noise detector/filter [6] such that the shadow-noised regions, with small B^c values, are discarded from further processing. Of the reliable pixels with sufficiently large B^c, θ represents the phase value of the captured sinusoid pattern, derived as

θ = ∠(B^c_R + j B^c_I) = arctan(B^c_I / B^c_R),   (7)

which is used to derive the projector row according to θ = 2π y^p.

If we treat the samples of a subject camera pixel as a single period of a discrete-time signal, x[n] for n = 0, 1, …, N - 1, then we can define the Fourier terms, X[k] for k = 0, 1, …, N - 1, using a discrete-time Fourier transform. From X[k], we then note that A^c is related to the DC component according to

A^c = (1/N) X[0],   (8)

while B^c and θ are related to X[1] = X*[N - 1] according to

B^c = (2/N) ‖X[1]‖ = (2/N) ‖X[N - 1]‖   (9)

and

θ = ∠X[1] = -∠X[N - 1].   (10)

Because of these relationships, we refer to the frequency terms X[0], X[1], and X[N - 1] as the principal frequency components, while the remaining terms are referred to as the nonprincipal terms and are the harmonics of X[1].

Under ideal conditions, the nonprincipal frequency terms, X[k] for k = 2, 3, …, N - 2, are always equal to zero, but in the presence of sensor noise the terms are, themselves, drawn from an additive, white-noise process. At the same time, if a target surface is moving toward or away from the camera sensor at a slow rate, then the frequency of the sinusoid represented in the time sequence x[n] is modulated to have a slightly higher or lower frequency, and as such, we expect the spectral components X[2] and X[N - 2] to increase as the energy in X[1] and X[0] becomes less concentrated. Combining slow motion with additive noise, there exists some threshold, T, at which the magnitude of these nonprincipal frequency terms can be assumed to reside.

Given the lack of an information-bearing signal on the harmonic frequencies, one might be tempted to use these nonprincipal frequency terms to carry additional phase information for the purpose, for example, of phase unwrapping. But doing so involves reducing the amplitude of the component sinusoids to fit the assorted signals within the dynamic range of both the camera and projector, thereby reducing the signal-to-noise ratio on these channels. Absent this use, if a pixel falls off the edge of a foreground object or if there is a dramatic change in surface texture, we expect to see a large-magnitude term in the harmonics, as the foreground phase term is essentially modulated by a step edge while the background phase term is modulated by the inverse of this same step.

Step edges and impulses, having large high-frequency spectral components, introduce high-frequency energy in

Fig. 1. Calibration target (left) and corresponding first harmonic magnitude image (right).

Lau, Liu, Hassebrook (2010). Opt letters, 35(14)

[Slide figure: for N = 3, the time samples x(n), n = 0, 1, 2, and their DFT X(n), annotated with the DC component, energy, and phase.]

Page 15

!15

SLStudio

• Modular platform
• Enables 20 fps point clouds (3-frame PSI)
• Key components:
  – Projection interface
  – Industrial camera interface
  – Coding/decoding and fast reconstruction
  – Calibration
  – In development: rigid body tracking
• ~10k LOC

Page 16

!16

Reconstruction

[u^c, v^c, 1]^T = P^c Q   and   [u^p, v^p, 1]^T = P^p Q

[ u^c P^c(3) - P^c(1) ]
[ v^c P^c(3) - P^c(2) ] Q = 0
[ u^p P^p(3) - P^p(1) ]

Q_k = C^k_{1,2,1} - u^c C^k_{3,2,1} - v^c C^k_{1,3,1} - u^p C^k_{1,2,3} + u^c u^p C^k_{3,2,3} + v^c u^p C^k_{1,3,3}

C^k_{i,j,l} = det(P^c(i), P^c(j), P^p(l), e_k)

[Slide figure: a camera pixel (u^c, v^c) and its decoded projector column u^p.]

Valkenburg, McIvor (1998). Img Vis Comp. 16(2), 99–110.

250k points = 20 ms!

Page 17

!17

Page 18

!18

Page 19

!19 Radiohead, House of Cards

Page 20

!20

Hardware Trigger

[Slide figure: timing diagram of the camera trigger (Trig) signal synchronized to the projector frame rate.]

Page 21

!21

[Slide figure: connections: projector Trig Out to camera Trig In, HDMI, FireWire.]

Page 22

!22

Page 23

!23

Parallelization

[Slide diagram: SLStudio threads: main/gui, cam/projector, decoder, reconstructor, and tracker, with the rigid transform T as output.]

for(int i = 0; i < mat.rows; i++){
    for(int j = 0; j < mat.cols; j++){
        ...
    }
}

#if CV_SSE2
if( USE_SSE2 )
{
    ...
    __m128i r0 = _mm_loadu_si128(...);
    __m128i r1 = _mm_loadu_si128(...);

Page 24

!24


Page 25

!25

Page 26

Local Homographies

H = argmin_H ‖H q^c_i - q^p_i‖   ∀ q^c_i ∈ N_{q^c}, q^p_i ∈ N_{q^p}

q^p = H q^c

[Slide figure: image I with a local neighborhood N_{q^c} around the camera point q^c, mapped to N_{q^p} in projector coordinates (u^p, v^p).]

Moreno, Taubin (2012). 3DIMPVT. 2012(77)

!26

Page 27


!27

cv::Mat H = cv::findHomography(N_qc, N_qp, CV_LMEDS);
cv::Point3d Q = cv::Point3d(cv::Mat(H*cv::Mat(cv::Point3d(qc.x, qc.y, 1.0))));
cv::Point2f qp = cv::Point2f(Q.x/Q.z, Q.y/Q.z);

...

//calibrate the camera
double cam_error = cv::calibrateCamera(Q, qc, frameSize, Kc, kc, cam_rvecs, cam_tvecs, flags,
    cv::TermCriteria(cv::TermCriteria::COUNT + cv::TermCriteria::EPS, 50, DBL_EPSILON));

//calibrate the projector
double proj_error = cv::calibrateCamera(Q, qp, screenSize, Kp, kp, proj_rvecs, proj_tvecs, flags,
    cv::TermCriteria(cv::TermCriteria::COUNT + cv::TermCriteria::EPS, 50, DBL_EPSILON));

//stereo calibration
cv::Mat Rp, Tp, E, F;
double stereo_error = cv::stereoCalibrate(Q, qc, qp, Kc, kc, Kp, kp, frameSize, Rp, Tp, E, F,
    cv::TermCriteria(cv::TermCriteria::COUNT + cv::TermCriteria::EPS, 50, DBL_EPSILON),
    cv::CALIB_FIX_INTRINSIC);

Page 28

!28

Computer Vision assisted Motion Correction in Medical Imaging

Photo courtesy of Oline V. Olesen, DTU

Page 29

!29 Courtesy of Oline V. Olesen, DTU

Page 30

Motion Tracking

!30 Courtesy of Oline V. Olesen, DTU

Page 31

Pose Estimation

!31 Courtesy of Oline V. Olesen, DTU

Page 32

Pose Estimation

!32 Wilm et al. (2011), Proc SCIA, LNCS 6688

Page 33

!33

Page 34

Fin

!34