
Page 1:

The CompCore Immersive Display

William Thibault, Professor

Math/CS, CSUEB

Multimedia Forum, May 26, 2010


Page 2:

Overview

• Why immersion?

• What makes it easier now?

• How can you do it?

• What next?


Page 3:

Why immersion?

• The world surrounds us: we're built for it.

• Body is key

• emotion

• memory

• Flat media are boring


Page 4: [full-slide image]

Page 5: [full-slide image]

Page 6:

History

• architecture

• diorama

• planetaria

• CAVE


Page 7:

History

• architecture

• diorama

• planetaria

• CAVE

[images: partial dome theater and full dome theater (digital dome theaters)]


Page 8: [full-slide image]

Page 9: [full-slide image]

Page 10:

What makes it easier now?

• projector prices plummeting

• commodity PC and graphics hardware

• PC clustering software (rocksclusters.org)

• projector-camera systems research (procams.org)

• soon: cameraphones with projectors


Page 11:

How can you do it?

if $, buy it

• planetarium vendors

• Scalable Displays (scaleabledisplay.com)

• Mersive (mersive.com)

else build it


Page 12:

Build it!

• calibrate a camera

• find projector coverage

• 2-pass rendering

• synchronized rendering

• content


Page 13:

Calibrating a lens

• math for the lens model: r = f(θ)

• calibration object: known positions

• feature detection (corners)

• numerical minimization: find the parameters that explain the observed feature positions

• result: equations to convert between:

• camera pixel locations

• directions into the scene

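As a sketch of the minimization step, assume the simple equidistant model r = f·θ with a single parameter f, and hypothetical feature observations (angles from the known calibration-object positions, radii from corner detection). This is illustrative, not the deck's actual calibration code.

import numpy as np
from scipy.optimize import least_squares

# Hypothetical observations: angle of each known calibration feature from
# the optical axis (radians) and its measured radial distance from the
# image center (pixels), e.g. from corner detection on a known object.
theta_obs = np.radians([5, 15, 30, 45, 60, 75])
r_obs = np.array([26.1, 78.9, 157.2, 235.8, 314.5, 393.0])  # pixels

def residuals(params):
    f, = params  # focal length in pixels: the only parameter in this model
    return f * theta_obs - r_obs

fit = least_squares(residuals, x0=[300.0])
f = fit.x[0]

# The result converts both ways:
#   camera pixel radius -> direction into the scene: theta = r / f
#   direction into the scene -> pixel radius:        r = f * theta
print(f"f = {f:.2f} px; 45 deg maps to r = {f * np.radians(45):.1f} px")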

Page 14:

Projector-Camera Correspondences

• temporal coding

• draw circle i in image 2j if j-th bit of i is one

• draw circle i in image 2j+1 if j-th bit of i is zero

• result = grid of points (camx, camy) -> (projx, projy)

[figure: captured coding images for bits 6, 5, 4, and 3; paired columns show the images encoding 1 and 0]

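The two drawing rules above translate directly into code. A sketch, with invented helper names, that encodes a circle ID into the set of images it appears in and decodes it back from detections:

NUM_BITS = 7  # enough for 2**7 = 128 circle IDs

def images_containing(circle_id, num_bits=NUM_BITS):
    """Indices of the coding images in which this circle is drawn."""
    out = []
    for j in range(num_bits):
        bit = (circle_id >> j) & 1
        out.append(2 * j if bit == 1 else 2 * j + 1)
    return out

def decode_circle_id(seen, num_bits=NUM_BITS):
    """Recover a circle's ID from the set of images where it was detected."""
    circle_id = 0
    for j in range(num_bits):
        if 2 * j in seen:            # lit in image 2j -> bit j is 1
            circle_id |= 1 << j
        elif 2 * j + 1 not in seen:  # lit in neither image: detection failed
            raise ValueError(f"bit {j} unresolved")
    return circle_id

# Round trip: every ID survives encode -> decode.
assert all(decode_circle_id(set(images_containing(i))) == i
           for i in range(2 ** NUM_BITS))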

Page 15:

Warping Images for Raskar's 2-pass algorithm

• build "distortion map" by interpolating correspondences

• R,G = pixel location in unwarped image

• B = invalid pixel

[diagram: projector, display surface, and viewer at the sweet spot. (pass 1) Render desired image from point-of-view at sweet spot. (pass 2) Warp and project.]

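A minimal sketch of applying such a distortion map, assuming it is stored as a float H x W x 3 array whose R and G channels hold the normalized source pixel location and whose B channel is positive for invalid pixels (the exact encoding is an assumption):

import numpy as np

def warp(unwarped, distortion_map):
    """Produce the projector frame by looking up each output pixel's source."""
    src_h, src_w = unwarped.shape[:2]
    # R,G channels -> integer source pixel coordinates (nearest neighbor).
    sx = np.clip((distortion_map[..., 0] * (src_w - 1)).round().astype(int), 0, src_w - 1)
    sy = np.clip((distortion_map[..., 1] * (src_h - 1)).round().astype(int), 0, src_h - 1)
    out = unwarped[sy, sx]
    out[distortion_map[..., 2] > 0] = 0  # invalid pixels stay black
    return out

# Identity map as a smoke test: output equals input everywhere.
img = np.random.rand(120, 160, 3).astype(np.float32)
gy, gx = np.mgrid[0:120, 0:160]
dmap = np.dstack([gx / 159.0, gy / 119.0, np.zeros_like(gx, float)]).astype(np.float32)
assert np.allclose(warp(img, dmap), img)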

Page 16:

Synchronize Rendering

• single program, same scene, different camera

• Equalizer framework (equalizergraphics.com)

Part I. User Guide

1. Introduction

Equalizer is the standard middleware for the development and deployment of parallel OpenGL applications. It enables applications to benefit from multiple graphics cards, processors and computers to scale rendering performance, visual quality and display size. An Equalizer-based application runs unmodified on any visualization system, from a simple workstation to large scale graphics clusters, multi-GPU workstations and Virtual Reality installations.

This User and Programming Guide introduces parallel rendering concepts, the configuration of Equalizer-based applications and programming using the Equalizer parallel rendering framework.

Equalizer is the most advanced middleware for scalable 3D visualization, providing the broadest set of parallel rendering features available in an open source library to any OpenGL application. Many commercial and open source applications in a variety of different markets rely on Equalizer for flexibility and scalability.

Equalizer provides the domain-specific parallel rendering know-how and abstracts configuration, threading, synchronization, windowing and event handling. It is a 'GLUT on steroids', providing parallel and distributed execution, scalable rendering features, network data distribution and fully customizable event handling.

If you have any question regarding Equalizer programming, this guide, or other specific problems you encountered, please direct them to the eq-dev mailing list [5].

1.1. Parallel Rendering

[Figure 1: Parallel Rendering. Flowchart: start -> init config -> init windows; per frame: begin frame; clear, draw, swap (in parallel on each render resource); end frame; event handling; update data; on exit -> exit config -> stop]

Figure 1 illustrates the basic principle of any parallel rendering application. The typical OpenGL application, for example GLUT, has an event loop which redraws the scene, updates data based on received events, and eventually redraws a new frame.

A parallel rendering application uses the same basic execution model and extends it by separating the rendering code from the main event loop. The rendering code is then executed in parallel on different resources, depending on the configuration chosen at runtime.

This model is naturally followed by Equalizer, thus making application development as easy as possible.

[5] http://www.equalizergraphics.com/lists.html

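The loop in Figure 1 can be made concrete with a small sketch, written here in Python with threads standing in for render resources. It illustrates only the execution model (separating rendering from the event loop); it is not Equalizer's actual C++ API, and all names are illustrative.

import threading

class RenderResource(threading.Thread):
    """One window/GPU/node: repeatedly clears, draws and swaps when told to."""
    def __init__(self, name, scene):
        super().__init__(daemon=True)
        self.name, self.scene = name, scene
        self.frame_ready = threading.Event()   # main loop -> resource
        self.frame_done = threading.Event()    # resource -> main loop
        self.running = True

    def run(self):
        print(f"{self.name}: init windows")
        while True:
            self.frame_ready.wait()
            self.frame_ready.clear()
            if not self.running:               # "exit?" in Figure 1
                break
            print(f"{self.name}: clear/draw/swap frame {self.scene['frame']}")
            self.frame_done.set()

def main_loop(scene, resources, num_frames=3):
    for r in resources:                        # init config
        r.start()
    for _ in range(num_frames):
        for r in resources:                    # begin frame
            r.frame_ready.set()
        for r in resources:                    # end frame: wait for all draws
            r.frame_done.wait()
            r.frame_done.clear()
        scene["frame"] += 1                    # event handling / update data
    for r in resources:                        # exit config, stop
        r.running = False
        r.frame_ready.set()

scene = {"frame": 0}
main_loop(scene, [RenderResource(f"pipe{i}", scene) for i in range(2)])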


Page 17:

CompCore Immersive Display Architecture

[diagram, top view: Internet and operator console connected to a front-end node; a network joins the front-end node, the audio server (to loudspeakers), the input server, and the rendering nodes; the rendering nodes drive the projectors, with a camera viewing the display]


Page 18:

panoramic photography

Gigapan Epic (gigapan.org)

Page 19:

surround video

Point Grey Ladybug2 (ptgrey.com)

dodecahedral camera (immersivemedia.com)


Page 20:

surround CGI

• cubemap rendering

• 6 cameras, 90-deg FOV

• standard cameras

• non-linear projections

• linear fisheye

• latitude-longitude

• special "cameras"

• typical distribution format: fisheye

• real-time rendering

• 2-pass algorithm
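As a concrete example of moving between these formats, here is a minimal sketch that resamples a cubemap (six 90-degree cameras) into a latitude-longitude panorama. The face orientations, data layout, and nearest-neighbor sampling are assumptions made for illustration.

import numpy as np

# Each face: (name, axis n, u basis, v basis). A direction d lands on the
# face whose axis has the largest d.n; in-face coords are d.u/d.n and
# d.v/d.n, both in [-1, 1]. These orientation conventions are assumptions.
FACES = [
    ("+x", (1, 0, 0), (0, 0, -1), (0, 1, 0)),
    ("-x", (-1, 0, 0), (0, 0, 1), (0, 1, 0)),
    ("+y", (0, 1, 0), (1, 0, 0), (0, 0, -1)),
    ("-y", (0, -1, 0), (1, 0, 0), (0, 0, 1)),
    ("+z", (0, 0, 1), (1, 0, 0), (0, 1, 0)),
    ("-z", (0, 0, -1), (-1, 0, 0), (0, 1, 0)),
]

def cube_to_latlong(faces, out_h=256):
    """Resample a cubemap dict {face_name: (S, S, 3) array} to (out_h, 2*out_h, 3)."""
    out_w = 2 * out_h
    lon, lat = np.meshgrid(
        (np.arange(out_w) + 0.5) / out_w * 2 * np.pi - np.pi,
        np.pi / 2 - (np.arange(out_h) + 0.5) / out_h * np.pi)
    d = np.stack([np.cos(lat) * np.sin(lon),        # x
                  np.sin(lat),                      # y
                  -np.cos(lat) * np.cos(lon)], -1)  # z: looking down -z at lon 0
    out = np.zeros((out_h, out_w, 3))
    axis_dots = np.stack([d @ np.array(n, float) for _, n, _, _ in FACES], -1)
    best = axis_dots.argmax(-1)  # face with the largest positive d.n
    for i, (name, n, u, v) in enumerate(FACES):
        mask = best == i
        dn = d[mask] @ np.array(n, float)
        fu = (d[mask] @ np.array(u, float) / dn + 1) / 2  # [0, 1] across face
        fv = (d[mask] @ np.array(v, float) / dn + 1) / 2
        s = faces[name].shape[0]
        px = np.clip((fu * s).astype(int), 0, s - 1)
        py = np.clip((fv * s).astype(int), 0, s - 1)
        out[mask] = faces[name][py, px]
    return out

# Smoke test: six flat-colored faces give a panorama using only those colors.
colors = np.eye(3).tolist() + [[1, 1, 0], [0, 1, 1], [1, 0, 1]]
cube = {f[0]: np.full((64, 64, 3), c) for f, c in zip(FACES, colors)}
print("panorama:", cube_to_latlong(cube).shape)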

DRAFT COPY - NOT FOR DISTRIBUTION Obscura Digital, Inc.

…a mapping between camera and immersive image pixels, are used to create the warps to be applied to the images displayed by each projector. The correspondences are also used to create edge-blending masks for each projector. Tools for editing correspondences have been created to deal with discontinuous display surfaces. The immersive images to be displayed are stored in an immersive format, such as cubic environment maps (cubemaps), fisheye (equipolar) panoramas, or latitude-longitude (equirectangular) panoramas. See Figure 1.

Figure 1. An immersive image shown in 3 different formats: (a) cubemap, (b) fisheye, and (c) equirectangular.

A warp, computed using a mapping from projector coordinates to image texture coordinates, is applied to the immersive images, producing frames for individual projectors. Edge-blending masks computed in the registration phase are also applied in this step.

The results of warping and masking the immersive content are individual projector frames used to create a compressed video file for playback by each computer. Playback uses a client-server distributed application, in which the computer connected to each projector accepts commands from a separate control computer over TCP/IP. The playback of all files is synchronized, producing a synchronized display.

This paper is organized as follows. First is a survey of past work in camera-based registration for multi-projector displays, and systems for displaying immersive video content. Subsequent sections discuss our camera-based projector registration techniques, the creation of immersive content, processing media for display, and playback of processed media. We conclude with a discussion of our experiences using our displays in a number of commercial applications.

2 Past Work

Large displays built using multiple projectors have been very expensive until recently. Falling prices of projectors and computers have made these displays feasible for organizations without a million or more dollars to spend. However, manual alignment of multiple projectors is extremely
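The client-server playback described in the excerpt can be pictured with a toy sketch: the control computer opens a TCP connection to the computer attached to each projector and sends a command naming the clip and a shared start time. The hosts, port, and command format below are invented for illustration; the actual protocol is not given here.

import socket

RENDER_NODES = [("node01", 5000), ("node02", 5000)]  # hypothetical hosts

def broadcast(command):
    # Send one text command to the computer attached to each projector.
    for host, port in RENDER_NODES:
        with socket.create_connection((host, port), timeout=2) as s:
            s.sendall(command.encode() + b"\n")

# Each node would run a small server that parses the command and starts its
# pre-warped, per-projector video at the agreed wall-clock time, e.g.:
# broadcast("PLAY clipA.mp4 1274900000.0")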



Page 21:

our approach

• assume viewer at "sweet spot"

• find portions of field-of-view lit by each projector

• for rendering, use a different camera for each projector

• place each camera at the sweet spot

• give each camera a different direction and fov

• render scene to offscreen buffer

• warp rendering to account for distortion

[diagram: projector, display surface, and viewer at the sweet spot. (pass 1) Render desired image from point-of-view at sweet spot. (pass 2) Warp and project.]

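One way to make the per-projector camera concrete: given the directions (unit vectors from the sweet spot) that a projector lights, aim the camera at their mean and open the field of view to cover the widest deviation. A sketch, with hypothetical sample directions and an assumed safety margin:

import numpy as np

def camera_for_projector(directions):
    """Return (view_dir, fov_degrees) covering a set of unit direction vectors."""
    view = directions.mean(axis=0)
    view /= np.linalg.norm(view)          # aim at the mean direction
    cos_max = (directions @ view).min()   # widest angular deviation
    half_angle = np.arccos(np.clip(cos_max, -1, 1))
    margin = 1.1                          # assumed safety margin
    return view, np.degrees(2 * half_angle) * margin

# Hypothetical sample of directions lit by one projector.
rng = np.random.default_rng(0)
d = rng.normal([0, 0, -1], 0.1, (200, 3))
d /= np.linalg.norm(d, axis=1, keepdims=True)
view, fov = camera_for_projector(d)
print(f"look-at {view.round(2)}, fov {fov:.1f} deg")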

Page 22:

cooler apps

[diagram: the software stack ('developer heaven'): Hardware; Linux, Rocks; MPI, OpenCV, OpenGL; Equalizer, OpenSceneGraph, Python, Mathematica; apps: mife, audio server, calibration, dcmapper, iipDemo]
