Projected Reality – Enhancing Projected Augmentations by Dynamically Choosing the Best Among Several Projection Systems
Jochen Ehnes∗ Michitaka Hirose†
RCAST, The University of Tokyo, 4-6-1 Komaba, Meguro-ku, Tokyo, Japan
ABSTRACT
A relatively new form of Augmented Reality (AR), projector based AR using stationary projectors, makes use of rotatable projection devices to augment objects. However, the range of such systems is limited. In order to exceed these limits, we introduced the ability to switch the projection of the augmentation between several projection systems. This way the system with the best position in relation to the object to be augmented can be selected to do the projection, and thus the quality of the augmentation can be optimized. A crucial part of this process is the estimation of the achievable projection quality of all projection systems, which may also depend on the type of application.
CR Categories: H.5.2 [INFORMATION INTERFACES AND PRESENTATION]: User Interfaces—Input devices and strategies; I.3.2 [COMPUTER GRAPHICS]: Graphics Systems—Distributed/network graphics; C.2.4 [COMPUTER-COMMUNICATION NETWORKS]: Distributed Systems—Distributed applications
Keywords: Projector based Augmented Reality, Application Roaming, Estimation of Quality of Projection, Steerable Projector Camera Systems
1 INTRODUCTION
Aiming for a technology that supports the user without being a hindrance, we developed an Augmented Reality (AR) system based on a combination of a video projector and a video camera mounted on a pan and tilt device. Using this technology, we built a system that can project augmentations on fixed as well as movable objects around the projection device. However, the nature of the projection introduces some limitations: Objects have to be close enough to the AR-projection system so that the system can detect the markers and the resolution of the projection is still high enough to be readable. Surfaces that shall be augmented have to face the projector. While certain angles between the surface normal and the direction of projection can be compensated by pre-distorting the projected augmentation, the quality of the projection decreases with increasing angles. At 90 degrees a projection becomes impossible. Finally, the augmentation may be shadowed by objects between the projector and the augmented surface. While it does not seem viable to overcome these limitations with a single projection system, it becomes possible by using several similar, networked AR-projection systems. The challenge hereby is to coordinate the different systems, to find the system with the 'best view', and to switch to a different system if the augmented object is moved in a way that gives another system the 'best view' on it.
∗e-mail: [email protected]†e-mail: [email protected]
2 PREVIOUS AND RELATED WORK
This work builds on our previous work [1, 2]. Other controllable projector camera systems have been presented in [5] and [4]. In [3] a system is presented that eliminates the shadows cast by persons in front of a screen by using several projectors. However, this system requires exact calibration of the projectors, cameras and the display screen, something that is not possible when everything is movable. Moreover, the requirement that only projectors may be occluded while the camera must be able to observe the whole screen at all times contradicts our basic idea of projector and camera forming one unit, mounted as closely together as possible.
3 PROJECTED APPLICATIONS AND APPLICATION ROAMING
While conventional applications communicate with the user via windows and widgets on a computer's screen, our Projected Applications (PA) use tracked objects for interaction. The AR-system can be regarded as an operating system that loads projected applications (identified by the marker on the tracked object) from an application server and executes them. Furthermore, it provides an abstraction layer for these applications to communicate with the user. Besides serving PA for marker IDs, the application server keeps the states of the PA while they are not executed on any system. It also has to ensure that each PA is executed on only one AR-system at a time in order to keep its state consistent. The basic way to do that is to grant the display rights for an application to the first projection system that requests them and to wait until they are returned. Only then may they be given to another system together with the updated state.
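As a sketch, the display-rights handshake described above could look like the following. The class and method names are illustrative assumptions, not the paper's actual implementation:

```python
class ApplicationServer:
    """Minimal sketch of display-rights management (names are hypothetical)."""

    def __init__(self):
        self.app_state = {}      # marker_id -> stored application state
        self.rights_holder = {}  # marker_id -> system currently holding the rights

    def request_rights(self, marker_id, system_id):
        """Grant display rights only if no other system currently holds them.

        Returns (granted, stored_state); the state is handed over so the
        application continues where it left off on the previous system.
        """
        if self.rights_holder.get(marker_id) is None:
            self.rights_holder[marker_id] = system_id
            return True, self.app_state.get(marker_id)
        return False, None

    def return_rights(self, marker_id, system_id, new_state):
        """A system returns the rights together with the updated state."""
        if self.rights_holder.get(marker_id) == system_id:
            self.app_state[marker_id] = new_state
            self.rights_holder[marker_id] = None
```

Only one projection system at a time can hold the rights for a given marker, which keeps the application state consistent.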
4 OPTIMIZING THE QUALITY OF PROJECTION
In order to maximize the visible quality of the projected augmentation, as well as to make the transition between two projection systems as smooth as possible, we extended the management of the display rights. The application server can actively withdraw display rights and give them to better suited projection systems.
In order to decide which system is best suited, we introduced a scalar quality value (section 4.1) that is provided by every projection system that detects the relevant tracked object. The system with the highest quality value is considered the best and consequently is assigned the display rights. The projected applications have to implement a function that calculates the quality value, since the criteria for quality can be very task specific.
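The server-side selection then reduces to picking the maximum of the reported quality values. A minimal sketch, with an assumed function name:

```python
def best_system(quality_reports):
    """Pick the projection system reporting the highest quality value.

    quality_reports: dict mapping system id -> scalar quality value,
    one entry per system that currently detects the tracked object.
    Returns None when no system sees the object.
    """
    if not quality_reports:
        return None
    return max(quality_reports, key=quality_reports.get)
```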
4.1 The "Quality Value"
A crucial point in finding the optimal projection system is the estimation of the quality of the projection that each system could achieve. Since we only need to consider things that change at runtime, our quality value depends only on the geometric relation between projector/camera and the tracked object. We assume that the quality of projection will be perceived to be better the higher the resolution is. We examined individual factors such as distance and orientation and how to combine the corresponding values; alternatively, we also used a more pragmatic approach.

IEEE Virtual Reality 2006, March 25-29, Alexandria, Virginia, USA. 1-4244-0223-9/06/$20.00 ©2006 IEEE
Proceedings of the IEEE Virtual Reality Conference (VR'06) 1087-8270/06 $20.00 © 2006 IEEE
Figure 1: A reduced area in the image plane results in fewer pixels projected onto the object. (Three sketches showing the image plane, the object, and the area the object occupies in the image plane.)
The first two sketches of figure 1 illustrate the reduction of the projected pixels when the object is moved further away. Since this reduction of the resolution occurs horizontally as well as vertically, the aspect ratio of the pixels stays the same. A quality value that corresponds to the number of projected pixels can be calculated using equation 1.
QV_Distance = (Const / Distance)^2    (1)
A reasonable value for Const is the minimal distance between projection system and projection surface, as that normalizes the resulting quality values. The first and the third sketch illustrate the effect of rotation on the number of pixels used for an augmentation. Accordingly, the quality value depending on the rotation can be calculated as
QV_Rotation = dot(N̂_surface, −D̂ir_projection)    (2)
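Equations 1 and 2 can be sketched in code as follows. The function names and the plain-tuple vector representation are assumptions, not from the paper:

```python
import math


def _normalize(v):
    """Return v scaled to unit length (3-component tuple assumed)."""
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)


def qv_distance(distance, const_min_distance):
    """Eq. 1: quality falls off with the square of the distance.

    const_min_distance is the minimal projector-surface distance,
    which normalizes the result to at most 1.
    """
    return (const_min_distance / distance) ** 2


def qv_rotation(surface_normal, projection_dir):
    """Eq. 2: dot product of the unit surface normal and the negated
    unit projection direction (1 = head-on, 0 = grazing incidence).
    """
    n = _normalize(surface_normal)
    d = _normalize(projection_dir)
    return -(n[0] * d[0] + n[1] * d[1] + n[2] * d[2])
```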
The shadowing of the augmentation also depends on the spatial relationship between projector and the augmented surface. However, it also depends on the position and shape of other objects, which are usually unknown to the projection system. Consequently the shadowing of the augmentation cannot properly be taken into account for the quality value. As long as the marker of an object is visible, we assume that the projection surface is unobstructed as well. For applications which do not project onto the tracked objects directly, such as our Drill Guide example which augments the walls around the tracked drill, the shadowing of augmentations is even harder to estimate. As illustrated in figure 2, we chose to rely only on QV_Rotation. The distance can be neglected when displaying lines representing cables. The direction from which the projection is done is more important in order to prevent shadowing by the user. By attaching the marker slightly sideways to the back of the drill, the user can define the preferred direction of projection.

(a) Drill Guide. (b) Right projector is less likely to be shadowed.
Figure 2: Augmenting the environment around the tracked object.
QV_Distance and QV_Rotation can be multiplied to get a combined quality value QV. One may also perform this multiplication with weighting factors as best suited for the application. In many cases, however, equal weighting is adequate. In these cases we can use the more pragmatic approach of using the area of the markers in pixels, assuming that the number of pixels occupied by a marker in the camera's image corresponds well with the number of pixels projected to augment the same surface, since the camera and the projector are mounted closely together. AR-ToolKit provides the size of a marker in pixels, which can be used as a quality value directly. For multi markers we used the sum of the areas of their visible individual markers. This way not only distance and projection angle can be taken into account, but also whether some of the sub markers are blocked by the user. By placing several markers around the display area (e.g. at every corner), a user blocks the line of sight towards at least one of the markers before blocking the display area. This results in a lower total marker area (= quality value) compared to a system that is not blocked, and thus the unobstructed system would be chosen. Example applications using this kind of quality value can be seen in figure 3.
(a) Volume Slicer. (b) Guiding Ticket. (c) Distance Arrow.
Figure 3: Example applications using the marker area in pixels as Quality Value.
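The weighted combination and the pragmatic marker-area quality value described above might be sketched as follows. Note that the paper does not specify how the weighting factors enter the product, so the exponent form used here is an assumption:

```python
def combined_qv(qv_distance, qv_rotation, w_distance=1.0, w_rotation=1.0):
    """Weighted combination of the two quality components.

    Weights enter as exponents (an assumption, not from the paper);
    with both weights at 1.0 this reduces to the plain product.
    """
    return (qv_distance ** w_distance) * (qv_rotation ** w_rotation)


def marker_area_qv(visible_marker_areas):
    """Pragmatic alternative: total pixel area of the visible sub markers
    of a multi-marker target. Occluded sub markers are simply absent,
    which lowers the value and steers selection to an unobstructed system.
    """
    return sum(visible_marker_areas)
```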
5 RESULTS
Our informal tests showed that the selection of the active projection system based on quality values works very well. The strategy to maximize the number of pixels the markers occupy in the camera image results in switching between the systems when the number of pixels, and thus the visual appearance of the augmentation, is nearly the same. As a result the switching is hardly noticeable.
6 ACKNOWLEDGMENT
This work has been supported by a research grant from the Foundation for Fusion of Science & Technology (FOST).
REFERENCES
[1] Jochen Ehnes, Koichi Hirota, and Michitaka Hirose. Projected applications - taking applications from the desktop onto real world objects. In Proceedings of HCII2005, 2005.
[2] Jochen Ehnes, Koichi Hirota, and Michitaka Hirose. Projected augmentation ii - a scalable architecture for multi projector based ar-systems based on "projected applications". In ISMAR, pages 190–191, 2005.
[3] C. Jaynes, S. Webb, R. Steele, M. Brown, and W. Seales. Dynamic shadow removal from front projection displays, 2001.
[4] Claudio S. Pinhanez. The everywhere displays projector: A device to create ubiquitous graphical interfaces. In UbiComp '01: Proceedings of the 3rd international conference on Ubiquitous Computing, pages 315–331, London, UK, 2001. Springer-Verlag.
[5] John Underkoffler and Hiroshi Ishii. Illuminating light: An optical design tool with a luminous-tangible interface. In CHI, pages 542–549, 1998.