


Varifocal Virtuality: A Novel Optical Layout for Near-Eye Display

Kaan Akşit, NVIDIA Corporation
Ward Lopes, NVIDIA Corporation
Jonghyun Kim, NVIDIA Corporation
Josef Spjut, NVIDIA Corporation
Anjul Patney, NVIDIA Corporation
Peter Shirley, NVIDIA Corporation
David Luebke, NVIDIA Corporation
Steven A. Cholewiak, University of California at Berkeley
Pratul Srinivasan, University of California at Berkeley
Ren Ng, University of California at Berkeley
Martin S. Banks, University of California at Berkeley
Gordon D. Love, Durham University

CCS CONCEPTS
• Human-centered computing → Displays and imagers; • Applied computing → Physics; • Hardware → Emerging optical and photonic technologies;

KEYWORDS
Near-eye displays, See-through displays, Varifocal displays, Computational displays, Augmented reality displays, Holography, Holographic optical elements

ACM Reference format:
Kaan Akşit, Ward Lopes, Jonghyun Kim, Josef Spjut, Anjul Patney, Peter Shirley, David Luebke, Steven A. Cholewiak, Pratul Srinivasan, Ren Ng, Martin S. Banks, and Gordon D. Love. 2017. Varifocal Virtuality: A Novel Optical Layout for Near-Eye Display. In Proceedings of SIGGRAPH ’17 Emerging Technologies, Los Angeles, CA, USA, July 30 - August 03, 2017, 2 pages. https://doi.org/10.1145/3084822.3084829

OVERVIEW
Augmented reality (AR) has recently gained momentum in the form of a variety of available optical see-through near-eye displays (NEDs) such as the Meta 2 and the Microsoft HoloLens. These devices are a big step forward towards Sutherland’s vision of an ultimate display [Sutherland 1968]. The device we demonstrate attempts to deal with the main limitations of current devices. First, the graphics images are at a constant virtual distance for the eyes’ accommodation mechanism, while the vergence of the two eyes working in concert places the virtual object(s) at a distance other than the accommodation distance. This vergence-accommodation conflict is one of the main problems in many AR and VR systems [Kress and Starner 2013]. The second limitation is achieving a wide field-of-view (FOV) with compact optics. Cakmakci and Rolland [2006] contend that achieving a wide FOV is the major optical design challenge in AR NEDs.
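The severity of this conflict is usually expressed in diopters (inverse meters). A minimal sketch of the arithmetic, as our own illustration rather than anything from the paper; the 0.5 m vergence distance, 2.0 m fixed focal plane, and 64 mm interpupillary distance are assumed example values:

```python
import math

def vergence_distance_m(ipd_m: float, vergence_angle_rad: float) -> float:
    """Distance at which the two visual axes cross for a given vergence angle."""
    return ipd_m / (2.0 * math.tan(vergence_angle_rad / 2.0))

def vac_diopters(vergence_m: float, accommodation_m: float) -> float:
    """Vergence-accommodation conflict in diopters (1/m)."""
    return abs(1.0 / vergence_m - 1.0 / accommodation_m)

# A virtual object converged at 0.5 m while a fixed-focus display
# sits at 2.0 m yields a 1.5 D conflict.
conflict = vac_diopters(0.5, 2.0)  # -> 1.5
```

Because the conflict is a difference of reciprocal distances, a display with a single fixed focal plane cannot drive it to zero for more than one vergence depth, which is what motivates the varifocal approach below.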

Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the owner/author(s).
SIGGRAPH ’17 Emerging Technologies, July 30 - August 03, 2017, Los Angeles, CA, USA
© 2017 Copyright held by the owner/author(s).
ACM ISBN 978-1-4503-5012-9/17/07.
https://doi.org/10.1145/3084822.3084829

We demonstrate a novel wide-FOV optical design (see Figure 1) that can adjust the accommodation depth dynamically so that the graphical object the user is looking at will be at the correct accommodation distance to match the vergence. The eye moves dynamically and can have objects at differing distances visible within the fovea. For objects not at the eye’s current accommodation distance, we computationally blur the graphics using the most up-to-date blurring technology: ChromaBlur. By accounting for high-order and chromatic aberrations of human eyes, ChromaBlur generates a blur signal that is more effective in terms of depth perception and driving accommodation.

CHROMABLUR
We include ChromaBlur in our prototype, now under review at ACM TOG [Cholewiak et al. 2017]. Blur occurs naturally when the eye is focused at one distance and an object is present at another distance. Computer-graphics engineers and vision scientists often wish to create display images that reproduce such depth-dependent blur. Typical methods correctly account for the scene geometry, pupil size, and focal distances, but do not take into account the optical aberrations of the person who will view the resulting images. This approach makes sense when the viewer’s position relative to the display is not known. But in head-mounted displays (HMDs), viewer position is known. We developed a method that, by incorporating the viewer’s optics, yields displayed images that produce retinal images close to the ones that occur in natural viewing. In particular, we create displayed images that properly take into account the chromatic aberration of the viewer’s eye. This produces different chromatic effects in the retinal image for objects farther or nearer than the current focus. We call the method ChromaBlur. Accommodation is driven accurately and quickly when ChromaBlur is used and is essentially not driven at all when conventional methods are used. Perceived realism is greater with imagery created by our ChromaBlur method than with imagery created conventionally.
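The abstract does not spell out ChromaBlur's implementation; the sketch below only illustrates the core idea of wavelength-dependent defocus. The per-channel longitudinal chromatic aberration offsets, pupil size, and display resolution are placeholder values, and a separable box blur stands in for the true defocus point-spread function:

```python
import math
import numpy as np

# Placeholder dioptric focus offsets per display primary, mimicking the
# longitudinal chromatic aberration of a typical eye (blue focuses nearer).
LCA_OFFSET_D = {"r": +0.2, "g": 0.0, "b": -0.4}

def blur_diameter_px(object_m, focus_m, pupil_m=0.004, lca_d=0.0, px_per_deg=30.0):
    """Geometric blur-circle diameter, in pixels, for one color channel."""
    defocus_d = (1.0 / object_m - 1.0 / focus_m) + lca_d  # net defocus, diopters
    blur_rad = abs(defocus_d) * pupil_m                   # small-angle approximation
    return math.degrees(blur_rad) * px_per_deg

def blur_channel(channel, diameter_px):
    """Separable box blur whose width tracks the channel's blur diameter."""
    k = max(1, int(round(diameter_px)) | 1)  # odd kernel width >= 1
    kernel = np.ones(k) / k
    rows = np.apply_along_axis(np.convolve, 1, channel, kernel, mode="same")
    return np.apply_along_axis(np.convolve, 0, rows, kernel, mode="same")

def chromablur_sketch(image, object_m, focus_m):
    """Blur each RGB channel with its own wavelength-dependent kernel."""
    out = np.empty_like(image, dtype=float)
    for c, key in enumerate("rgb"):
        d = blur_diameter_px(object_m, focus_m, lca_d=LCA_OFFSET_D[key])
        out[..., c] = blur_channel(image[..., c].astype(float), d)
    return out
```

The point of the per-channel kernels is that an object nearer than the focus distance and one farther away produce different color fringes, which is the cue the visual system appears to use when driving accommodation.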

ACHIEVING VARIFOCAL WIDE FOV AND PHYSICAL COMPACTNESS
A common approach to AR NEDs is to use a beam splitter, where a partial mirror is used to reflect an image of a computer display. Looking through the beam splitter allows the user to see the image of the display while also observing the user’s environment directly



Figure 1: We will demonstrate a see-through, varifocal, near-eye display. Left: A light engine projects an image onto a thin, light-weight, see-through screen directly in front of the eye. The resulting image is relayed to a desired optical depth determined by the distance from the screen to the curved beam combiner. Middle left: Importantly, the screen is a hologram designed so that laser beams coming from each side act differently: light from the light engine diffuses towards the beam combiner, while external light from the environment (and reflected light from the beam combiner) passes through the screen. Middle right: A wearable prototype uses two light engines through two optical folds to project onto the screens. Right: A daylight photo taken from the eye position demonstrates the brightness achieved by our design.

through the beam splitter. This necessitates placing the display to the side of the user’s field of view and angling the mirror to reflect the light from the display to the viewer’s eyes. The main innovation in what we present is to place a viewing screen directly in front of the user’s eye so that a curved beam splitter can be placed on axis with the viewer’s eyes. This can only work if the screen is transparent to the viewer (or it would block the user’s view). This is achieved by using a one-way diffusing holographic screen of our own design that is directly invisible to the viewer, while the virtual image of the screen can be seen in high quality and in focus by looking through the curved beam splitter. This design maintains the wide field of view associated with beam-splitter designs while allowing for a more compact optical layout relative to other designs. The geometry of the beam splitter and the position of the screen determine the focal plane of the virtual objects. By dynamically positioning the screen along the optical axis, we can dynamically move the focal plane. Both the holographic screen and the mirrors are light, small parts, so moving either linearly with a small actuator is straightforward and achieves this varifocal property.
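The screen-to-combiner relationship can be sketched with the standard mirror equation. This is our own illustration of the principle; the 50 mm combiner focal length is an assumed value, not a parameter of the actual prototype:

```python
def virtual_image_distance_m(screen_m: float, focal_m: float) -> float:
    """Mirror equation 1/s + 1/s' = 1/f, solved for the image distance s'.

    For a screen inside the focal length (s < f) the result is negative,
    i.e. the image is virtual and appears behind the combiner.
    """
    return 1.0 / (1.0 / focal_m - 1.0 / screen_m)

def screen_position_for_depth_m(target_depth_m: float, focal_m: float) -> float:
    """Screen-to-combiner distance that places the virtual image at the target depth."""
    return 1.0 / (1.0 / focal_m + 1.0 / target_depth_m)

# With an assumed 50 mm focal length, sweeping the screen between roughly
# 47.6 mm and 50 mm moves the virtual image from 1 m out to optical infinity,
# which is why a small linear actuator can cover the full varifocal range.
f = 0.050
s_1m = screen_position_for_depth_m(1.0, f)  # ~0.0476 m
```

The useful property this exposes is that the entire accommodation range maps onto a few millimeters of screen travel, so the actuator's stroke and mass budget stay small.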

REFERENCES
Ozan Cakmakci and Jannick Rolland. 2006. Head-worn displays: a review. Journal of Display Technology 2, 3 (2006).
S. A. Cholewiak, G. D. Love, P. Srinivasan, R. Ng, and M. Banks. 2017. ChromaBlur: Rendering chromatic eye aberration improves accommodation and realism. Submitted to ACM Transactions on Graphics (TOG) (2017).
Bernard Kress and Thad Starner. 2013. A review of head-mounted displays (HMD) technologies and applications for consumer electronics. In SPIE Defense, Security, and Sensing. International Society for Optics and Photonics.
Ivan E. Sutherland. 1968. A head-mounted three dimensional display. In Proceedings of the December 9-11, 1968, Fall Joint Computer Conference, Part I. ACM.