
An Augmented Reality System for Astronomical Observations

Andrei Lintu∗
MPI Informatik, Germany

Marcus Magnor†

TU Braunschweig, Germany

ABSTRACT

Anyone who has gazed through the eyepiece of an astronomical telescope knows that, with the exception of the Moon and the planets, extra-solar astronomical objects are disappointing to observe visually. This is mainly due to their low surface brightness, but also depends on the visibility, sky brightness and telescope aperture. We propose a system which projects images of astronomical objects (with a focus on nebulae and galaxies), animations and additional information directly into the eyepiece view of an astronomical telescope. As the telescope orientation is queried continuously, the projected image is adapted in real time to the currently visible field of view. For projection, a custom-built video projection module with high contrast and low maximum luminance was developed. With this technology, visitors to public observatories have the option to experience the richness of faint astronomical objects while looking directly at them through a telescope.

CR Categories: H.5.1 [Information Interfaces And Presentation]: Multimedia Information Systems—Artificial, augmented, and virtual realities; I.4.8 [Image Processing And Computer Vision]: Scene Analysis—Tracking; J.2 [Computer Applications]: Physical Sciences and Engineering—Astronomy

Keywords: augmented reality, projection, tracking, astronomy

1 INTRODUCTION

Mankind has always been fascinated by the starry sky; it is one of the first natural phenomena investigated by humans. An observer starts by learning the constellations in the sky and their relative positions by stargazing with the "naked eye". More advanced amateur astronomers use binoculars and telescopes, with which many more objects can be observed.

Goto (motorized) telescopes are gaining in popularity, and many newcomers purchase one as their first telescope. These telescopes find a target object in the sky after it is selected on a hand controller. The user then expects "instant success" in the observations. This expectation is met when bright, easy-to-observe objects such as the Moon or the planets are the targets. However, when observing faint deep sky objects (astronomical objects which lie outside our solar system) such as galaxies or nebulae, much of the initial enthusiasm is lost. These appear as fuzzy grey spots in the telescope's eyepiece and leave the observer quite unimpressed, possibly thinking about giving up the new hobby. The main reason is that the image in the eyepiece bears little resemblance to the well-known images recorded by the Hubble Space Telescope [24] (or other large-aperture Earth-bound telescopes), which are familiar to the newcomer from astronomy websites and magazines.

One possibility to attain interactivity during astronomical observation is to use an electronic eyepiece, a webcam or a custom CCD camera to display the live image formed at the telescope focus on a TV or computer screen.

∗e-mail: [email protected]
†e-mail: [email protected]

However, this approach only works for bright astronomical objects and needs a skilled observer to fit the image of the desired object on the CCD chip.

We propose a system which augments the view through the eyepiece of an astronomical telescope in order to provide additional online information to the user during observation. The system overlays the currently visible object with a long exposure image of it, as a visual aid. The present orientation of the telescope is continuously queried, and the currently visible sky section is computed in real time by a portable computer. It is also possible to overlay animations for specific objects. The system is not meant to replace traditional deep sky observation; rather, it is intended as an educational and visual aid for the novice astronomer.

The main motivation of our work is to increase interest in astronomy among the general public, who are often quite unimpressed after their first glimpse through a telescope. As the human eye has a limited capacity to integrate light over time, faint deep sky objects are perceived only vaguely, and the observer cannot make out their structure. Our system gives observers the ability to simultaneously compare the visually perceived object with a photograph acquired using a long exposure. When observing the planets, it is often difficult to identify their natural satellites because of their fast movement relative to the planet, due to short orbital periods. Our system can help the stargazer by overlaying the names of the visible natural satellites. The same approach is used when observing dim stars, which can be hard to identify in a field containing a considerable number of stars.

In the following section we present an overview of related work in the field of tracking and registration for AR applications. Section 3 provides a short system description. We describe the telescope's optical characteristics in Section 4 and discuss its pointing and tracking accuracy in Section 5. We present the custom-built projection device in Section 6. The adaptation of the human visual system is briefly reviewed in Section 7. Section 8 introduces the custom planetarium software used in our application, and Section 9 describes the input unit in use. We bring up some possible applications in Section 10, and we conclude and point out possible future work in Section 11.

2 RELATED WORK

Most AR applications that perform tracking are concerned with tracking the user's head and/or body movements in order to determine their position and orientation [2]. In contrast, in our approach, the telescope together with the projection unit moves on a fixed tripod pointing at the sky hemisphere, with two degrees of freedom. Our system constantly queries the position of the telescope and synchronizes it with the projected sky section.

The system developed by researchers at the Fraunhofer IGD [28, 36] uses a video see-through approach. This work focuses on boosting the performance of a coin-operated telescope by equipping it with a display, a standard PC, a hardware tracking device and a camera. Thus the system can be used to augment views of the attraction sites where the telescope is installed. Reeves et al. [27] describe another example of a video see-through application that allows visitors to an exhibition to gather additional information about some of the presented objects. For these applications, the telescope's optical assembly is not utilized; it is replaced by a video camera with a telephoto lens.


Conversely, in our application the optical assembly of the telescope is of major importance; its main function of light gathering is kept. The proposed system is an optical see-through application, so that the genuine visual impression given by the observed objects is maintained. A conceptual presentation of the system can be found in [19].

Birkfellner et al. [3] present a related optical see-through application. The authors propose a head-mounted display customized for medical applications, such as an online aid during surgery. The optical setup of the system uses image rectification prisms to project the visual and the display image onto a common focal plane. Because of the relatively small magnifications available (3.6x – 7.2x), the whole optical assembly is relatively small and can be used as a head-mounted display.

Optical see-through HMDs have only recently reached contrast ratios of 200:1 – 300:1 [26, 9]. The projection unit designed for our system provides an even higher contrast ratio (500:1) and has the advantage of being usable in applications that require optical magnification.

3 SYSTEM DESCRIPTION

Our system augments the view of the sky through an astronomical telescope. The hardware part consists of a telescope fitted with a goto mounting, a custom-built projection unit and a portable PC. The software part consists of the customized planetarium software and the integrated telescope communication module (see Figure 1).

Figure 1: Schematic description of the entire system. To improve the tracking accuracy, a special-purpose, high-sensitivity CCD camera can optionally be added (components marked with * and **). It can be mounted together with a smaller piggyback telescope or directly on the optical axis of the telescope using an off-axis guider.

The working principle of the system is the following: the telescope is pointed towards the desired astronomical object, selected via the hand controller unit. The actual telescope position is constantly queried by the telescope communication module. Using the current coordinates, the planetarium software focuses on the corresponding sky section, which is directly projected onto the telescope's image plane by the projection module. The projected image is then combined with the optical image seen through the telescope using a beam splitter.
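To make the polling step concrete, the sketch below shows what the communication module's position query could look like, assuming the Meade serial command set [22] (":GR#" and ":GD#" return the current Right Ascension and Declination) and a pyserial connection. The port name, baud rate and string handling are illustrative assumptions, not details taken from the paper.

```python
import serial

# Minimal sketch of the communication module's polling step, based on
# the Meade Telescope Serial Command Protocol [22]. Port name and the
# plain-string return values are illustrative assumptions.
def query_position(port: serial.Serial) -> tuple[str, str]:
    """Ask the mount for its current Right Ascension and Declination."""
    port.write(b":GR#")                       # get RA, e.g. "HH:MM:SS#"
    ra = port.read_until(b"#").decode().rstrip("#")
    port.write(b":GD#")                       # get Dec, e.g. "sDD*MM'SS#"
    dec = port.read_until(b"#").decode().rstrip("#")
    return ra, dec

if __name__ == "__main__":
    with serial.Serial("/dev/ttyS0", 9600, timeout=2) as mount:
        ra, dec = query_position(mount)
        print(f"RA {ra}  Dec {dec}")          # hand off to the planetarium view
```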

The planetarium software contains a database of locations and additional details for the planets in our solar system, a large number of stars and interesting deep space objects. It provides high-resolution photographs, 3D animations (when available) and supplementary data. The user can customize the exact information displayed for individual objects, and it is possible to switch between various eyepieces (i.e. magnifications) during observation. This approach provides information beyond that available via the standard hand controller. The telescope used in our application is a 10” aperture (250 mm) Meade LX200 GPS Schmidt-Cassegrain model (see Figure 3, left), equipped with a goto mounting, but the working principle of the system can be applied to any telescope fitted with a goto mounting.

Figure 2: Left: Standard Eyepiece View. The image shows the visual appearance of the Andromeda Galaxy (M31) through an eyepiece (from [10], © Leo Taylor). Right: Augmented Eyepiece View. A view of the same field with the overlaid image and additional information blended into the upper left corner.

Figure 2, left, shows a possible image perceived by an observer when looking through the telescope's eyepiece at the Andromeda Galaxy (M31). Figure 2, right, presents the same sky section after the long exposure photograph and additional information have been projected onto the telescope's image plane. Another important aspect of our system is the maintenance of the observer's dark adaptation when projecting the additional information into the telescope's field of view. For this reason the projection module is equipped with a knob permitting continuous brightness adjustment and a tray for inserting additional neutral density filters.

Figure 3: Left: the Meade LX200 GPS 10” astronomical telescope used in the application. Right: the hand controller provided with the telescope.


4 TELESCOPE OPTICAL CHARACTERISTICS

This section deals with the optical properties of the telescope in use. As described earlier, the telescope used in our application is a 10” aperture (250 mm) Meade LX200GPS Schmidt-Cassegrain model (Figure 3, left). It has a focal length of 2500 mm and is equipped with a goto mounting. A telescope used in astronomy mainly serves three functions: magnification, increased light grasp and increased resolution [31]. The magnification provided by a given telescope/eyepiece combination is simply the telescope focal length divided by the eyepiece focal length. With our available eyepieces we can obtain magnifications ranging from 78x (32 mm eyepiece) up to 500x (5 mm eyepiece). The light grasping capability of a telescope is determined solely by the diameter of its main objective. Knowing this, we can compute the limiting magnitude of the telescope, i.e. the faintest star which can still be observed through it. The resolution of a telescope is the angular limit at which two close stars can still be perceived as two distinct objects. The diffraction-limited resolution for a given wavelength can be computed from the aperture of the main objective D (in mm) as 115.82 / D arc-seconds. Our telescope's resolution is 0.46 arc-seconds at a wavelength of 555 nm. This theoretical maximum can hardly ever be achieved, because the atmosphere limits the seeing to a value between 2 and 10 arc-seconds, occasionally reaching 1 arc-second at good observing sites.
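Both formulas are simple enough to verify directly; a minimal check using the telescope parameters stated in this section:

```python
# Quick check of the optical quantities quoted above, for the 2500 mm
# focal length, 250 mm aperture instrument in use.
FOCAL_LENGTH_MM = 2500.0
APERTURE_MM = 250.0

def magnification(eyepiece_focal_mm: float) -> float:
    return FOCAL_LENGTH_MM / eyepiece_focal_mm

def diffraction_limit_arcsec(aperture_mm: float) -> float:
    return 115.82 / aperture_mm   # resolution formula quoted in the text

print(magnification(32.0))                     # ~78x (32 mm eyepiece)
print(magnification(5.0))                      # 500x (5 mm eyepiece)
print(diffraction_limit_arcsec(APERTURE_MM))   # ~0.46 arc-seconds
```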

An observer using the proposed system can compare the actual optical image of an astronomical object with a brighter, higher-resolution image. This way we augment the telescope's limited resolution and light gathering power.

5 TELESCOPE POINTING AND SIDEREAL TRACKING ACCURACY

The telescope used in our application (see Figure 3, left) is equipped with a GPS receiver and is capable of calibrating itself by determining the mount's tip, tilt and true north [23]. True north is computed from magnetic north, but due to magnetic disturbances and variations of the Earth's magnetic field this is not accurate enough. For a precise calibration, two bright stars have to be successively centered in the telescope's eyepiece. After calibration the user can choose to observe one of the objects available in the telescope's hand controller unit (see Figure 3, right), and the telescope slews directly to the desired object.

There are two different types of accuracy which have to be taken into account: pointing and sidereal tracking. Pointing accuracy is the precision with which the telescope aims at a given astronomical object in the sky. The most important factor on which it depends is the positioning of the two calibration stars exactly in the center of the eyepiece. This can be achieved using an illuminated reticle with cross-lines intersecting in the eyepiece's center. Also, to maximize calibration accuracy, the two chosen stars should be situated at least 90° apart in the sky. The telescope's pointing precision given by the manufacturer is 2 arc-minutes [23]. If the telescope is correctly aligned as described above and the available high precision feature is used (the telescope first slews to a nearby bright star, the user centers it in the eyepiece, and the telescope then slews to the desired object with higher accuracy), much higher precision can be attained. Also, at any point during telescope operation, the user has the option to manually correct or change the telescope's orientation. The telescope's mount sends updated position information approximately every second, with an accuracy of up to 1 arc-second.

During sidereal tracking of an observed object (i.e. following it at the rate at which the Earth rotates around its axis), accuracy is of major importance for the correct overlap of the observed and projected object.

Figure 4: Telescope tracking accuracy before and after PEC training over an 8-minute period. Note the visible accuracy improvement after a simple training session. After Greiner [15].

The actual position indication of the telescope's mounting (approximately once per second) may be precise, but there is also an inaccuracy in the mounting's gears. This inaccuracy can be reduced by performing a so-called Periodic Error Correction (PEC) [23, 15] on the telescope to "train" the drives, a procedure also used for high-precision astrophotography. The graph in Figure 4 presents the sidereal tracking accuracy of the telescope in use over a period of 8 minutes (200 intervals of 2.4 seconds each). Before PEC training the peak-to-peak tracking error is up to approximately 25 arc-seconds; after a training session it is reduced to 5 arc-seconds or less. A tracking error of 5 arc-seconds is equivalent to the size of a small crater on the Moon measuring 9 km in diameter. As a comparison, the true field of view when using the 32 mm eyepiece on our telescope is approximately 30 arc-minutes or 1800 arc-seconds (the size of the full Moon), so the maximum tracking error after PEC training equals only about 0.28% of the full field of view.
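The corrected percentage follows from a one-line computation:

```python
# Worked numbers behind the comparison above.
fov_arcsec = 30 * 60          # 32 mm eyepiece: 30 arc-minutes of true field
error_arcsec = 5              # peak-to-peak error after PEC training
print(f"{error_arcsec / fov_arcsec:.2%}")   # 0.28% of the full field of view
```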

After the first tests with the prototype version of our projection module, we were satisfied with the system's pointing and tracking accuracy. When slewing to the Andromeda Galaxy (M31) using the 32 mm eyepiece (78x magnification), the projected image and the optical image of the galaxy were both within the eyepiece's field of view.

6 PROJECTION SYSTEM

Several criteria have been taken into account for the development and design of a projection module suited to our application. Its size and weight should be minimized to preserve the freedom of movement of the telescope and to maintain its weight balance. The projection module should also have the highest possible contrast ratio, so that the user is still able to discern between black and white levels when projecting at low brightness levels.

Another important requirement is low brightness at the display's "black" level; LCDs, for instance, tend to emit a small amount of light even when set to black. After researching the currently available projection technologies, we concluded that the best fit to all these requirements is a solution using DLP (Digital Light Processing) [29]. A photograph of the designed projection module together with the 32 mm eyepiece is presented in Figure 5.

The module utilizes a 0.7”-diagonal DMD (Digital Micromirror Device) with XGA (1024×768) resolution; the whole projection unit was developed by Op-Sys Project Consulting [16]. It features a custom projection lens and a special color LED projection head. The LED light source reduces size considerably, as there is no need for the color wheel used in standard DLP projectors.


Figure 5: A photograph of the projection system together with the 32 mm eyepiece. It is mounted directly on the telescope's optical axis through the flange on the bottom side. The eyepieces in use are placed in the provided holder on the right side. The DLP engine and LED head are situated in the box on the left. The box in the middle contains the 45°-mounted beam splitter.

The maximum luminance is 1.8 cd/m², dimmable down to 0.03 cd/m². The minimum contrast of 500:1 is maintained over the luminance range from the maximum value down to 0.3 cd/m², and decreases for lower values. To dim the projection module down to values that do not affect low adaptation levels, a fixed neutral density (ND) filter is installed in front of the LEDs. There is also an optional tray for inserting additional ND filters (Figure 7, right) when still lower maximum luminance values are needed, e.g. for very faint nebulae. The advantage of using ND filters is that we can obtain lower projection luminance values without dimming the LEDs into the range where contrast loss occurs. We have a set of ND filters with transmissions ranging from 70% to 0.3%. However, for observing most objects of interest the user does not have to change ND filters; a medium transmission ND filter does the job well. To combine the projected and optical images we use a beam splitter which transmits 90% of the light coming from the telescope and reflects only 10% of the light coming from the projection engine. Figure 6 presents the projection system mounted on the telescope's optical axis.
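Combining the stated component values gives a rough upper bound on the luminance reaching the eye from the projection path. Multiplying the stages together is our assumption about how they combine, not a measurement from the paper:

```python
# Rough estimate of the luminance reaching the eye via the projection
# path, using the component values stated in this section. Treating the
# stages as independent multiplicative factors is our assumption.
MAX_LUMINANCE = 1.8             # cd/m^2, projection engine at full brightness
SPLITTER_REFLECTANCE = 0.10     # fraction of projector light sent to the eye

def projected_luminance(nd_transmission: float, dim_factor: float = 1.0) -> float:
    return MAX_LUMINANCE * dim_factor * nd_transmission * SPLITTER_REFLECTANCE

print(projected_luminance(0.70))    # lightest ND filter: ~0.126 cd/m^2
print(projected_luminance(0.003))   # densest filter, faint nebulae: ~0.00054 cd/m^2
```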

As mentioned above, we use a combining mirror to overlay the telescope image and the projected image. After optimizing the beam splitter's anti-reflective coating and placing a polarization filter in front of the projection unit, the multiple ghosting artifacts that showed up in earlier prototype versions could be minimized. In its current state, just one faint ghost image of the projected information can be observed through the eyepiece.

Because of the different aspect ratios and geometric shapes of the eyepiece view (circle, 1:1 ratio) and the projected image (rectangle, 4:3 ratio), a compromise has to be found between the number of pixels of the projected image which are invisible and the area of the eyepiece image with no projection coverage. We also have to take into account the possibility of using eyepieces of different focal lengths, which yield different magnifications. To still get acceptable coverage of the visual field by the projected image when using higher magnifications (24 mm, 104x), the field of view when using the lowest magnification eyepiece (32 mm, 78x) is not totally covered by the projected image at the margins.

As a solution to the coverage problem when using the 24 mm eyepiece, we propose the setup presented in Figure 7, left. It offers a fair tradeoff between the area of the eyepiece view with no projection coverage (small top and bottom areas) and the unused pixels of the projected image (four small corner regions) for the given eyepiece.

Figure 6: The projection unit together with an eyepiece, mounted on the telescope.

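The tradeoff of Figure 7 can also be explored numerically. The Monte Carlo sketch below estimates, for a 4:3 rectangle of a chosen half-width (eyepiece field radius normalized to 1), the fraction of the field left uncovered and the fraction of projected pixels wasted; the sizing rule and sampling are our own illustration, not the authors' procedure.

```python
import random

# Monte Carlo estimate of the coverage tradeoff: a 4:3 rectangle of
# half-width w inside an eyepiece field modeled as the unit circle.
def coverage_stats(w: float, n: int = 200_000) -> tuple[float, float]:
    h = 0.75 * w                          # 4:3 aspect ratio
    in_circle = in_rect = in_both = 0
    for _ in range(n):
        x, y = random.uniform(-1, 1), random.uniform(-1, 1)
        c = x * x + y * y <= 1.0          # inside the eyepiece field
        r = abs(x) <= w and abs(y) <= h   # inside the projected rectangle
        in_circle += c
        in_rect += r
        in_both += c and r
    uncovered_field = 1.0 - in_both / in_circle  # dark top/bottom areas
    wasted_pixels = 1.0 - in_both / in_rect      # corners outside the field
    return uncovered_field, wasted_pixels

# Rectangle width matched to the field diameter: only the four corners
# fall outside, at the cost of small uncovered areas at top and bottom.
print(coverage_stats(1.0))
```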

When overlaying the projected image on top of the optical view, we want the images of the astronomical objects displayed with the correct orientation in the frame. To find this orientation for every observed object, we only have to take into account the angle by which the projection engine is rotated in the plane perpendicular to the viewing direction. If we rotate the entire projection unit in the plane perpendicular to the optical axis of the telescope, the projection unit and the eyepiece rotate together, which does not affect the overall correct orientation of the projected data.

An important additional control on the unit is a potentiometer knob which allows continuous adjustment of the LED brightness. This ensures that low luminance values can be kept during projection, thus maintaining the user's dark adaptation. The unit is powered by a 12 V adapter, so that it can be operated together with the telescope from a standard car battery or other portable power supply when needed.

Figure 7: Left: projection coverage (rectangle) for the 24 mm eyepiece field of view (circle). Right: the tray provided for inserting neutral density filters, directly in front of the projection lens.

Even though the projection system was designed to have the lowest possible weight, the telescope needs to be balanced in order to keep the pointing and tracking accuracy unaffected and not to wear out the telescope's gears. Telescope balancing is achieved using a standard Meade accessory, a counterweight set attached to factory-drilled holes on the bottom of the telescope's optical tube. The projection unit without eyepiece weighs approximately 2 kg, the wide-angle 32 mm eyepiece approximately 500 g.


7 ADAPTATION OF THE HUMAN VISUAL SYSTEM

The range of luminance values in our surrounding environment is large. The human eye has an effective adaptation mechanism that allows perception of visual information across this range. A mix of mechanical, photochemical and neural processes is responsible for adaptation at different illumination levels. The iris is responsible for limiting the aperture and thus the amount of light entering the eye; the size of the pupil can range from 7 mm down to 2 mm. Responsible for the photochemical processes are two types of light-sensitive cells situated on the retina, the rods and the cones. At different incident luminance values, different light-sensitive cells on the retina are active [12, 5]. The rods are very light sensitive but achromatic, and are active at scotopic levels of illumination, from 10⁻⁶ to 10 cd/m². The cones are less sensitive but deliver color vision, and are active at photopic levels of illumination, from 0.01 to 10⁸ cd/m². For light levels ranging from 0.01 to 10 cd/m², called mesopic levels, both the rods and the cones are active [12]. The range of luminance values and the corresponding visual functions of the eye are presented in Figure 8.
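These ranges translate into a simple lookup that could be used when choosing a projection brightness (a sketch; the boundary handling is our choice):

```python
# The adaptation ranges quoted above as a simple classifier.
def vision_regime(luminance_cd_m2: float) -> str:
    if luminance_cd_m2 < 0.01:
        return "scotopic"   # rods only, achromatic vision
    if luminance_cd_m2 <= 10.0:
        return "mesopic"    # rods and cones both active
    return "photopic"       # cones, full color vision

print(vision_regime(1.8))   # projection engine at maximum: mesopic
```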

Figure 8: The range of luminance values and the corresponding visual parameters. After Ferwerda et al. [12].

Adaptation to different levels of illumination is not immediate, as the human visual system needs time to activate and deactivate its photosensitive cells. Light adaptation is relatively fast; 80% of sensitivity is recovered within 2 seconds. Dark adaptation takes much longer; complete adjustment requires roughly 30 minutes [12, 5]. For projection applications used during astronomical observation it is very important to maintain the observer's dark adaptation, because once the eye is exposed to higher levels of illumination, re-adaptation occurs too slowly for immediate perception of faint sky objects.

With the above facts in mind, we try to keep the projected luminance values in the range of scotopic or mesopic levels. Even though the projection module possesses a potentiometer for dimming the intensity and offers the possibility of using neutral density filters, an option for adjusting the brightness of the overlaid information relative to the observed object would be welcome. This would support the human visual system's adaptation by keeping similar levels of brightness in spatially close areas.

The planetarium software used as the basis for our customized software implements a fast version of the tone mapping algorithm by Tumblin and Rushmeier [32]. It also includes a blue shift of the color hues at scotopic levels of illumination [33], to more easily simulate night-time scenes when viewed under normal daylight illumination.

The scope of our system is not to simulate the exact levels of incident illumination during stargazing; instead, we want to provide the user with visual information about the observed object while at the same time maintaining his or her adaptation level as much as possible. In order to achieve this, we remove the blue shift of the color hues from the original tone mapping scheme and apply the same levels of tone mapping not only to the textures, but also to the additional information displayed for an observed object. This way the user can maintain a level of adaptation suited to perceiving the observed object. If the projected captions are much brighter than the observed object, the observer's detection threshold may change, making it impossible to perceive details in the observed astronomical object [12].

8 CUSTOMIZED PLANETARIUM SOFTWARE

As the basis for our custom planetarium software we use the OpenGL-driven open source software Stellarium [4]. It has an easily customizable user interface, and new custom objects can easily be added. As there is no telescope communication support, we extended it by implementing the Meade Telescope Control Protocol [22]. Support for animation playback is also implemented in our customized version of the software.

Stellarium uses the Hipparcos Catalogue, a star catalog which contains over 120,000 stars, more than sufficient for our application. Position information for objects in our solar system is also available, including the major natural satellites of the planets. To have an acceptable number of interesting nebulae and galaxies accessible, we augmented the existing list, which includes only the Messier objects, with the objects from Appendix E of Clark's book [5], also available online from [6]. This list includes 611 deep sky objects, which as Clark states are "the most interesting for amateur astronomers". Additional information about the objects has been downloaded from the NGC-IC project webpage [7]. We set the same coordinates for the objects in our database and the corresponding objects in the telescope's hand controller, to make sure no telescope pointing errors due to differing coordinates can occur.

Figure 9: Screenshot of the display output produced by the planetarium software. In the top region of the display, basic information about the observed object is shown. On the right side, the menu for specifying the current eyepiece is visible. In the middle bottom region, the selected eyepiece and corresponding magnification are displayed. The photograph of the object (M88 photograph from [13], © Rob Gendler) is displayed centered and with the correct orientation.

In order to place all objects in the database at the right position on the simulated sky hemisphere, various input data are necessary. Of major importance are the date and time of the observation. The location of the telescope on Earth is also crucial; all these data are downloaded by the software from the telescope, to assure full consistency. Using this input, the planetarium software computes the position of the objects on the hemisphere by transforming the given equatorial coordinates into local altitude-azimuth coordinates (see Appendix).


We read out the location the telescope is pointing to and focus the planetarium software on the same sky region.

One of the main attributes of an astronomical telescope is that by using eyepieces of different focal lengths, one can achieve various magnifications. Therefore a prerequisite for the correct functioning of the system is to specify the current eyepiece. A snapshot of the customized planetarium software is presented in Figure 9. On the right side the observer can select the currently used eyepiece from a menu, or an animation related to the observed object can be played. In the top region basic information about the currently observed object is visible. The information in the bottom region shows the currently used eyepiece and its corresponding magnification. The long exposure photograph of the object is placed centered and with correct orientation in the frame (see Section 6). For the deep sky objects included in our planetarium software, the sources of high resolution textures were [24, 13, 8].

After the first field tests of the prototype under night-time conditions, we found it practical for the projection approach to have the option to offset, or to turn off completely, the overlaid image. This gives the user the ability to observe and compare both images. In the current implementation of the system, the user can choose between displaying the projected image of the currently observed object overlaid, offset, or not at all. However, if the user chooses to offset the projected image, both images have to fit into the eyepiece's field of view.

8.1 Field of view calibration

The planetarium software in use has the option to specify the magnification used during observation. For practical reasons, the size of objects projected into the eyepiece view should match the size of the objects directly observed in the night sky. The main factor influencing the size of the projected image perceived by the observer is the OpenGL field of view (FOV) used in our software. The absolute size of the objects displayed in our planetarium software can only be approximated when viewed through the eyepiece, and thus we have to calibrate it. We use a simple approach: with the projection unit mounted on the telescope, we take separate photographs of the Moon and of the projected Moon texture at the magnification corresponding to the eyepiece in use (see Figure 10). We compare the two photographs to determine the scaling factor needed for the observed and approximated magnification factors to match. After matching the curvature of the Moon, we computed a scaling factor of 40% by which the approximated FOV has to be decreased.
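In code, the calibration reduces to a single ratio: an object's rendered size scales roughly inversely with the rendered FOV, so matching the two Moon silhouettes fixes the FOV directly. The function below is a sketch under that assumption; the pixel measurements are hypothetical inputs.

```python
def calibrated_fov(approx_fov_deg: float, projected_moon_px: float,
                   real_moon_px: float) -> float:
    # A rendered object's on-screen size scales ~1/FOV, so matching the
    # silhouettes means scaling the FOV by the ratio of measured diameters.
    return approx_fov_deg * projected_moon_px / real_moon_px

# e.g. a projected Moon measuring 60% of the real one's diameter implies
# decreasing the approximated FOV by 40%:
print(calibrated_fov(0.64, 300, 500))   # -> 0.384 degrees
```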

8.2 Additional Information

To help the user navigate through the sky, we blend various additional information about the observed objects directly into the eyepiece view. Basic information such as the name, position (Right Ascension and Declination; see the Appendix for details), overall brightness, surface brightness and the contrast of the observed object relative to the background is overlaid. Further useful information, such as the object's type and classification, the page where it is referenced in several widely used sky atlases, its absolute magnitude or the constellation in which it is found, can also be chosen for blending directly into the telescope's image plane. All this additional information has high practical value during astronomical observing, assisting the telescope user. Another advantage is that the information usually printed as scrolling text on the hand controller's display is available while looking through the eyepiece. This means that the user does not have to constantly switch between eyepiece and hand controller to gather information about the observed object.

Figure 10: Images used for FOV calibration. Left: picture of the Moon texture from the planetarium software before FOV calibration. Right: picture of the Moon. Both images were taken through the eyepiece. We overlaid the output of an edge detection filter on the photographs to better distinguish the curvature of the Moon's silhouette.

As a visual observation aid, there is an option of overlaying the observed optical image with a long exposure astro-photograph of the same object. Another option is to blend in animations of nebulae or galaxies for which a 3D geometry has already been reconstructed [21, 20], animations showing the object's evolution in time, or virtual flybys. With the proposed system, useful information about practically viable magnifications for the target object can also be blended directly into the user's field of view, thus improving the visual perception of the visible object.

Figure 11: Snapshot of the planetarium software focused on Saturn. Three of its major natural satellites are visible, each highlighted with the corresponding name. Thus, locating small moons is made much easier.

Our system also assists the user during planetary and dim star observation. The major planets of our solar system are quite easy to recognize, unlike their natural satellites, whose positions relative to the parent planet change very quickly (in a matter of hours) and which are therefore hard to locate for the beginner. The system also helps the observer identify the outer planets and fainter stars, which are not straightforward to identify due to their small size and low luminance. In all the cases mentioned above, the name of the currently selected object is blended into the user's field of view (see Figure 11).


9 INPUT DEVICE

The input device of our prototype system was a standard keyboard. The requirement of having a PC in the field can easily be satisfied; nowadays almost every amateur astronomer has a portable PC available during observation. It is usually used for astrophotography or as an optional control unit for pointing the telescope to objects of interest in the sky. However, the high luminance of the computer's display (even if dimmable) can result in partially losing the observer's dark adaptation. Much time is also lost by constantly switching between the PC keyboard/mouse and the telescope's hand controller. Based on these facts, it is desirable that the user be able to control both the telescope and the whole augmented reality system by means of the already provided hand controller (see Figure 3, right). The application's few and simple menus can easily be controlled with the available hand controller keys. This option was not implemented in our first prototype version of the system.

We recently built a custom cable in order to intercept the codes of the keys pressed on the hand controller. It is attached to the PC via a DB9 connector on the COM port. The most important menu action executed during normal operation is changing the magnification, i.e. the eyepiece in use. We assign the "?" key of the hand controller (see Figure 3) to this command (it is normally used to display a help message, which can be dismissed by pressing the MODE key once). The user presses "?" several times to scroll through the available magnifications and select the desired one. After future field tests of the system we will be able to assign other hand controller key combinations to frequently used menu actions.
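A listener for the intercepted key codes might look as follows. The byte value for the "?" key, the port settings and the eyepiece list are purely illustrative assumptions, since the paper does not document the cable's wire protocol.

```python
import serial

# Sketch of the input-unit listener: the custom cable delivers hand
# controller key codes on a COM port. All values here are assumptions.
EYEPIECES_MM = [32, 24, 15, 9, 5]

def listen(port_name: str = "COM1") -> None:
    current = 0
    with serial.Serial(port_name, 9600, timeout=1) as port:
        while True:
            code = port.read(1)
            if code == b"?":                  # assumed code for the "?" key
                current = (current + 1) % len(EYEPIECES_MM)
                print(f"eyepiece: {EYEPIECES_MM[current]} mm")
```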

10 FIELDS OF APPLICATION

The projection unit developed for our application was designed to meet the special requirements of observational astronomy. However, due to its high contrast and relatively small size, its applicability is not limited to this domain; it can be used in any application where augmenting the view through an optical aid is needed.

The designed projection system together with a custom software unit can be used in many other domains. It can be put into practice in fixed installations similar to the Augmented Reality Telescope [36, 28] or the Timescope [1] at cultural heritage sites. At major tourist attractions or at sites with panoramic views, an application fitted with a software projection engine like Google Earth [14] can be used to overlay information such as roads, buildings, geographic features, census data or information about stores and recreation areas. The advantage of our system over any video see-through application is the high optical resolution of detail provided by the optical aid in use.

Because of the relatively small size of the projection unit, portable systems are another target application. In systems where no optical aid is necessary [11], a head-mounted display suffices as output for an AR application. However, for mobile applications where a user is operating an optical aid such as a small telescope or spotting scope, a custom projection unit is the desired output.

11 CONCLUSION AND FUTURE WORK

We have presented a system which is capable of augmenting the image visible through the eyepiece of an astronomical telescope. The user can directly compare the visually observed object with images of the same object photographed using a larger aperture telescope and a long exposure time. The identification of small astronomical objects such as the major natural satellites of planets or dim stars is also assisted. Animations of the observed objects can be blended directly into the eyepiece view, showing their 3D distribution or their projected evolution in time. Additional information regarding the observed object is also blended into the user's field of view.

In a possible future extension, a complete mapping of the lunar surface could be added to the planetarium application, making it possible to integrate a lunar atlas such as the Virtual Moon Atlas [17] or NASA's World Wind Moon [25]. This would make lunar observing much easier, and features on the Moon could be identified with ease. The number of deep sky objects available in the planetarium software can also be extended, as can the amount of available textures, animations and additional information.

As a further improvement in accuracy, there is the possibility of attaching a highly sensitive CCD camera to the telescope for guiding (as shown in Figure 1). The guiding principle consists of detecting a relatively bright star situated near the observed object using the camera, and tracking its position in all following frames. If the star's absolute position changes in one of the frames, a command for a corresponding correction movement in the opposite direction is sent to the telescope mounting.
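The guiding principle described above boils down to a centroid comparison per frame. A minimal version, with the threshold and the (dx, dy) correction interface as our own assumptions:

```python
import numpy as np

# Sketch of one guiding step: locate the guide star's brightness-weighted
# centroid in the CCD frame and derive a correction opposite to the drift.
def star_centroid(frame: np.ndarray, threshold: float) -> tuple[float, float]:
    ys, xs = np.nonzero(frame > threshold)      # pixels belonging to the star
    weights = frame[ys, xs].astype(float)
    return (float(np.average(xs, weights=weights)),
            float(np.average(ys, weights=weights)))

def guide_step(reference: tuple[float, float], frame: np.ndarray,
               threshold: float = 100.0) -> tuple[float, float]:
    x, y = star_centroid(frame, threshold)
    # command a movement opposite to the measured drift
    return reference[0] - x, reference[1] - y
```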

Our telescope is going to be permanently mounted in a dome on the roof of our institute [18]. Because of the complicated setup and calibration, unpredictable weather and the lack of permanent housing for the telescope, we have not yet been able to conduct any studies on user feedback. Once the telescope is installed, we will schedule observing sessions with the setup described in this paper, in order to gather hints about possible improvements in the ease of use and understandability of the developed system.

ACKNOWLEDGEMENTS

The authors would first like to thank Christian Weber for his help in customizing the planetarium software. We are also indebted to Fabien Chereau, the creator of Stellarium, for making this nice piece of planetarium software publicly available. The useful advice on telescope operation and programming from the LX200GPS Yahoo group, especially from Richard Seymour and Andrew Johansen, was very helpful. Our thanks also go to Kaleigh Smith, Grzegorz Krawczyk, Timo Stich and Markus Thiele for proofreading early drafts of this paper. Many thanks to Andrei Novak as well, for his valuable help with editing the accompanying movie. The help provided by Thomas Hirtz in building the custom serial cable for the input device was also very welcome. Last but not least, we would like to thank the peer reviewers for their valuable comments and constructive critique, which improved the final version of this paper.

APPENDIX - CELESTIAL COORDINATE SYSTEMS

To uniquely determine the position of an astronomical object in the sky, we need to know its coordinates in a celestial coordinate system. The most widely used coordinate system in astronomy is the equatorial one. In this coordinate system, the projection of the Earth's equator onto the celestial sphere is the celestial equator, and the intersection points of the Earth's rotation axis with the celestial sphere are called the celestial poles. The intersection of the Earth's orbital plane with the celestial sphere is called the ecliptic. The celestial sphere is divided into 24 hour circles, each spanning 15°. The origin of the equatorial coordinate system is the vernal equinox, one of the intersection points between the celestial equator and the ecliptic (see Figure 12). An exact position in the sky can be described by its Right Ascension α and Declination δ. Right Ascension is defined as the angle between the vernal equinox and the hour circle intersecting that point, and is measured in hours, minutes and seconds, from 0 to 24 hours. Declination is defined as the angular distance to the celestial equator, measured in degrees, arc minutes and arc seconds, from −90° to +90°, with 0° at the celestial equator [30]. The equatorial coordinate system is very similar to the geographic coordinate system used to define positions on Earth.



Figure 12: The Celestial Sphere (from [35]).

For an observer situated at latitude φ on Earth, the elevation E – azimuth A coordinate system is more intuitive. In this coordinate system, the elevation is the altitude in degrees of an object above the horizon, and the azimuth is the angle of the object around the sky, measured from north. The transformation from the equatorial coordinate system to the altitude (elevation) – azimuth coordinate system is the following [34]:

sin E = sin φ sin δ + cos φ cos δ cos α
cos E cos A = cos φ sin δ − sin φ cos δ cos α
cos E sin A = −cos δ sin α
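Implemented directly, with atan2 resolving the azimuth quadrant (a sketch; following the paper's notation we pass α, which for a fixed observation time stands in for the object's local hour angle):

```python
import math

def equatorial_to_altaz(alpha: float, delta: float, phi: float) -> tuple[float, float]:
    """All angles in radians; returns (elevation E, azimuth A)."""
    sin_e = (math.sin(phi) * math.sin(delta)
             + math.cos(phi) * math.cos(delta) * math.cos(alpha))
    cos_e_cos_a = (math.cos(phi) * math.sin(delta)
                   - math.sin(phi) * math.cos(delta) * math.cos(alpha))
    cos_e_sin_a = -math.cos(delta) * math.sin(alpha)
    # cos E > 0 away from the zenith, so atan2 recovers A's quadrant
    return math.asin(sin_e), math.atan2(cos_e_sin_a, cos_e_cos_a)
```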

REFERENCES

[1] ArtCom. Timescope. Available from www.artcom.de/index.php?option=com_acprojects&page=6&id=38&Itemid=115&details=0&lang=en, 2005.
[2] Ronald T. Azuma. A Survey of Augmented Reality. Presence: Teleoperators and Virtual Environments, 6(4):355–385, August 1997.
[3] W. Birkfellner, K. Huber, F. Watzinger, M. Figl, F. Wanschitz, R. Hanel, D. Rafolt, R. Ewers, and H. Bergmann. Development of the Varioscope AR – A See-through HMD for Computer-Aided Surgery. In Proceedings of the IEEE and ACM International Symposium on Augmented Reality, pages 54–59, 2000.
[4] Fabien Chereau. Stellarium. Available from stellarium.sourceforge.net, 2005.
[5] Robert N. Clark. Visual Astronomy of the Deep Sky. Cambridge University Press, 1990.
[6] Robert N. Clark. Visual Astronomy of the Deep Sky – Appendix E. Available from clarkvision.com/visastro/appendix-e.html, 2002.
[7] Harold G. Corwin Jr. The NGC/IC Project. Available from ngcic.org, 2005.
[8] Russell Croman. Russell Croman Astrophotography. Available from www.rc-astro.com, 2005.
[9] Cybermind Interactive Nederland. Visette45 SXGA. Available from www.cybermindnl.com/images/stories/pdf/spec.sheet%20new%20visette%2045.pdf.
[10] Kevin Daly. M31. Available from members.aol.com/KDaly10475/m31.html, 2005.
[11] Steven Feiner, Blair MacIntyre, Tobias Höllerer, and Anthony Webster. A Touring Machine: Prototyping 3D Mobile Augmented Reality Systems for Exploring the Urban Environment. In Proc. ISWC '97 (Int. Symposium on Wearable Computers), pages 74–81, October 1997.
[12] James A. Ferwerda, Sumanta N. Pattanaik, Peter Shirley, and Donald P. Greenberg. A Model of Visual Adaptation for Realistic Image Synthesis. In SIGGRAPH '96: Proceedings of the 23rd Annual Conference on Computer Graphics and Interactive Techniques, pages 249–258, New York, NY, USA, 1996. ACM Press.
[13] Rob Gendler. The Universe in Colors. Available from www.robgendlerastropics.com, 2005.
[14] Google Inc. Google Earth. Available from earth.google.com, 2005.
[15] R. A. Greiner. Data on Precision Error Correction for LX200 Telescopes. Available from www.mailbag.com/users/ragreiner/PEC-WORM.html, 1998.
[16] Alfred Jacobsen. Op-Sys Project Consulting. Available from opsysconsult.com, 2005.
[17] Christian Legrand and Patrick Chevalley. Virtual Moon Atlas. Available from astrosurf.com/avl, 2004.
[18] Andrei Lintu. Astronomy at the MPI Informatics. Available from www.mpi-inf.mpg.de/~lintu/astronomy.html, 2005.
[19] Andrei Lintu and Marcus Magnor. Augmented Astronomical Telescope. In Torsten Kuhlen, Leif Kobbelt, and Stefan Müller, editors, Second GI-Workshop VR/AR Proceedings, pages 203–213. Shaker Verlag, 2005.
[20] Marcus Magnor, Kristian Hildebrand, Andrei Lintu, and Andrew J. Hanson. Reflection Nebula Visualization. In C. T. Silva, E. Gröller, and H. Rushmeier, editors, Proceedings of the IEEE Conference on Visualization (VIS'05), pages 255–262, Minneapolis, USA, October 2005. IEEE.
[21] Marcus Magnor, Gordon Kindlmann, Charles Hansen, and Neb Duric. Constrained Inverse Volume Rendering for Planetary Nebulae. In Proc. IEEE Visualization 2004, Austin, USA, pages 83–90, October 2004.
[22] Meade Instruments Corp. Meade Telescope Serial Command Protocol. Available from www.meade.com/support/CommandSet.html, 2002.
[23] Meade Instruments Corp. Instruction Manual – 10” LX200GPS Schmidt-Cassegrain Telescopes. Available from www.meade.com/manuals/TelescopeManuals/LXseries/LX200GPS_manual.pdf, 2003.
[24] NASA. Hubble Space Telescope Gallery. Available from hubblesite.org/gallery.
[25] NASA. World Wind Moon. Available from worldwind.arc.nasa.gov/moon.html, 2005.
[26] NVIS Inc. nVisor SX. Available from www.nvisinc.com/nvisor_sx.php, 2005.
[27] Stuart Reeves, Mike Fraser, Holger Schnädelbach, Claire O'Malley, and Steve Benford. Engaging Augmented Reality in Public Places. In Adjunct Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI), April 2005.
[28] Didier Stricker. Augmented Reality Telescope. Available from www.xc-01.de, 2005.
[29] Texas Instruments. DLP Technology Overview. Available from www.dlp.com/dlp_technology/dlp_technology_overview.asp, 2004.
[30] ThinkQuest Library. Coordinate Systems. Available from library.thinkquest.org/29033/begin/coordinate.htm.
[31] Stephen Tonkin. The Three Functions of a Telescope. Available from www.astunit.com/tutorials/telescope.htm.
[32] Jack Tumblin and Holly E. Rushmeier. Tone Reproduction for Realistic Images. IEEE Computer Graphics and Applications, 13(6):42–48, November 1993.
[33] Henrik Wann Jensen, Simon Premože, Peter Shirley, William B. Thompson, James A. Ferwerda, and Michael M. Stark. Night Rendering. Technical Report UUCS-00-016, Computer Science Dept., University of Utah, August 2000.
[34] Eric W. Weisstein. Altitude-Azimuth Coordinates. Available from scienceworld.wolfram.com/astronomy/Altitude-AzimuthCoordinates.html.
[35] Wikipedia. Equatorial Coordinate System. Available from en.wikipedia.org/wiki/Equatorial_coordinate_system.
[36] Michael Zoellner, Didier Stricker, and Uli Bockholt. AR Telescope – Taking Augmented Reality to a Large Audience. COMPUTER GRAPHIK topics, 17(1):19–20, 2005.


Color plate, Figure 1: Left: snapshot of the planetarium software's output focused on the Ring Planetary Nebula (M57). Right: snapshot for the Helix Planetary Nebula (NGC 7293).

Color plate, Figure 2: Screenshot of the display output produced by the planetarium software. In the top part of the display, basic information about the observed object is shown. On the right side, the menu for specifying the current eyepiece is visible. In the middle bottom, the selected eyepiece and corresponding magnification are displayed. The photograph of the galaxy NGC 4414 is displayed centered and with the correct orientation.
