Snapchat AR Research - June 20th, 2016



    Snapchat will lead the Augmented Reality revolution 

The wildly popular “social media / camera” company already has most of the necessary pieces to bring augmented reality to the masses, with the help of the younger generations who are constantly using its app.

    Key Points

    ● Userbase

    ● Snapchat’s AR History

    ○ Overlays

    ○ Geofilters

    ○ Lenses

    ○ 3D Stickers

    ● Acquisitions

    ○ Vergence Labs

    ○ Looksery

    ○ Seene

    ● Public Spotting

    ● Engineering and Research Talent

● Patents

    ● Forecasts

    ● Conclusion


    Userbase

Current estimates put Snapchat’s userbase at around 150 million daily active users, up from 110 million in December 2015. This makes them more popular than Twitter in terms of daily usage. [1,2]

    Compared to Facebook, Snapchat has more video

    views per day (~10 billion) despite a drastically

    smaller userbase. [3,4,5,6]

Their main audience is between 13 and 24 years old, the demographic most likely to be early adopters of AR when the technology finally comes to fruition.[7]

https://www.snapchat.com/ads
http://fortune.com/2016/01/12/snapchat-facebook-video-views/
http://finance.yahoo.com/news/snapchat-video-viewers-grew-significantly-150840321.html
http://fortune.com/2015/11/04/facebook-video-growth/
http://www.bloomberg.com/news/articles/2016-04-28/snapchat-user-content-fuels-jump-to-10-billion-daily-video-views
http://www.foliomag.com/2016/report-reveals-snapchat-still-growing-users/
http://www.bloomberg.com/news/articles/2016-06-02/snapchat-passes-twitter-in-daily-usage


    Snapchat’s AR History

Pretty much from the beginning, Snapchat has been adding augmented reality features. Among the earliest capabilities were overlays, which let users draw over pictures to decorate them.

In addition to drawing over photos, users can also add Geofilters that show information linked to a person’s physical location. These include items like temperature, speed, and location-specific illustrations.

One of the more popular additions to the Snapchat app is the Lenses feature. This lets the app track a user’s face after they select it through the camera with the touchscreen. Sponsored lenses are added to the app every day, increasing Snapchat’s revenue.
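At its core, a lens is face tracking plus graphics drawn in face coordinates. A minimal, hypothetical sketch of that idea using OpenCV’s stock Haar cascade and a webcam (not Snapchat’s actual pipeline):

    import cv2

    def apply_lens(frame, cascade):
        # Detect faces and draw a stand-in "lens" decoration on each one.
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        for (x, y, w, h) in cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5):
            r = w // 6
            cv2.circle(frame, (x + r, y), r, (30, 30, 200), -1)      # left "ear"
            cv2.circle(frame, (x + w - r, y), r, (30, 30, 200), -1)  # right "ear"
        return frame

    cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    cap = cv2.VideoCapture(0)  # default webcam stands in for the phone camera
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        cv2.imshow("lens sketch", apply_lens(frame, cascade))
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()
    cv2.destroyAllWindows()

Snapchat’s real pipeline, built on Looksery’s technology, fits a deformable 3D face mesh rather than a bounding box, which is what lets effects stick to individual facial features.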

https://medium.com/@markracette/snapchat-s-future-lies-in-augmented-reality-afbfe1834e7a


Further experimentation along the same lines as Lenses is the recent addition of 3D Stickers. This gives users the option to track any object in a video, rather than just faces.
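In spirit, a 3D Sticker is a user-selected patch tracked from frame to frame with a graphic pinned to it. A minimal sketch of that idea using plain template matching in OpenCV (not whatever tracker Snapchat actually ships; the clip path is a hypothetical placeholder):

    import cv2

    cap = cv2.VideoCapture("clip.mp4")  # hypothetical input clip
    ok, frame = cap.read()
    x, y, w, h = cv2.selectROI("select the object to track", frame)
    template = cv2.cvtColor(frame[y:y + h, x:x + w], cv2.COLOR_BGR2GRAY)

    while ok:
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        scores = cv2.matchTemplate(gray, template, cv2.TM_CCOEFF_NORMED)
        _, best, _, (tx, ty) = cv2.minMaxLoc(scores)
        if best > 0.5:  # pin the "sticker" wherever the patch moved to
            cv2.putText(frame, "STICKER", (tx, ty + h // 2),
                        cv2.FONT_HERSHEY_SIMPLEX, 1.0, (0, 255, 255), 2)
        cv2.imshow("3d sticker sketch", frame)
        if cv2.waitKey(30) & 0xFF == ord("q"):
            break
        ok, frame = cap.read()
    cap.release()
    cv2.destroyAllWindows()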

    Acquisitions

Over the last few years, Snapchat has quietly acquired several companies to integrate their features into its app. The information has mostly surfaced through leaked emails and hacks, suggesting that Snapchat is trying to keep its AR plans quiet.

    Vergence Labs

This company started in 2011. According to their Crunchbase page, their goal was to “reinvent the future of the human-computer paradigm with stylish wearable computing, computer vision, computer graphics, biometrics, learning algorithms, and the web to enhance humanity and redefine reality.”[8] They raised over $70K through Indiegogo,[9] and were then acquired by Snapchat in late 2014 for $15 million. [10]

http://venturebeat.com/2014/12/16/snapchat-paid-15m-for-vergence-labs-a-google-glass-like-startup/
https://www.indiegogo.com/projects/social-video-glasses#/
https://www.crunchbase.com/organization/vergence-labs#/entity


The primary product that Snapchat was most likely interested in was Vergence Labs’ Epiphany Eyewear. These were stylish glasses that could lay the foundation for a Snapchat-produced AR product. In order for AR to take off into the mainstream, the glasses have to be fashionable and sexy. Currently, the AR market is flooded with clunky devices. Epiphany Eyewear, on the other hand, actually looks like something the public would wear on a regular basis.

The main problem with the Epiphany Eyewear glasses, though, is that when they were originally released, there was no AR functionality in them. Instead, they focused on video/photo capture with the ability to transmit data wirelessly via Wi-Fi. [11] They were by no means a standalone AR device.

In order to understand where Snapchat could be going, though, diving into Vergence Labs’ early experimentation is a must. A video posted on Feb 13, 2012, by Erick Miller, Vergence Labs’ CEO, shows the vision of augmented reality glasses with a working prototype.[12] The technology was developed by Jon Rodriguez during his studies at Stanford University.

    What was shown essentially looks like a Virtual Reality headset. It was the culmination of

    research and development of a hyper-realistic 3D display invented by Jon Rodriguez during his

    Computer Science Honors Thesis. A summary of his work is documented in PDF form.[13]

    Interestingly, Rodriguez was experimenting with eye-tracking in addition to solving major

    technical problems at the time. That eye-tracking ability could provide useful information in an

     AR headset.

The resulting wireless AR device was built on Android and utilized a pass-through camera to send video signals to two displays inside the wearable. It included facial recognition through computer vision technology; the software could then identify the person in front of the wearer by integrating with Facebook APIs. Erick Miller also talks about potentially integrating biometric sensors over time.

https://dl.dropboxusercontent.com/u/13352595/volumetric-world-display.pdf
https://www.youtube.com/watch?v=OECh3-R_MA8
https://en.wikipedia.org/wiki/Epiphany_Eyewear


Diving back even further, a video by Jon Rodriguez uploaded to YouTube in 2011 depicts a smaller pair of AR glasses.[14] It was a wired prototype that sent video feeds from the camera to a laptop and then back to the headset. The software could then detect the edges of objects, producing psychedelic outlines. Another video shows further experimentation with a third prototype.[15]

https://www.youtube.com/watch?v=kEJoSSs5UBo
https://www.youtube.com/watch?v=vTmhwGrIB5Q


Another example of the type of AR glasses that Snapchat could produce comes from Vergence Labs’ product designer, David Meisenholder. He designed the GL-20 Polaroid video glasses that were marketed with Lady Gaga in 2011.[20]

    Looksery

According to their website, this company was founded in 2013 and “is a San Francisco Bay Area company that specializes in facial tracking and modification technologies for real-time video communication.”[21]

Through a Kickstarter campaign, Looksery raised $46,152 to bring its product to life.[22] In late 2015, Snapchat acquired the firm for a reported $150 million.[23] This technology was then integrated into Snapchat’s app and labeled as “Lenses.”

In a sense, these features allow people to augment themselves however they want. On a philosophical level, this taps into the idea that through AR, people can define themselves with these kinds of filters. If they want to be an animal, they can share with their friends what that would look like. Or perhaps they want to show off a new hairdo, a superpower, or a new set of eyes. All of this is possible with this type of software.

    http://www.businessinsider.com/snapchat-buys-looksery-2015-9https://www.kickstarter.com/projects/looksery/looksery/descriptionhttp://looksery.com/pr/http://www.engadget.com/2011/01/06/lady-gaga-and-polaroid-launch-grey-label-instant-camera-printer/


It essentially gives people the option to look inwards and define what they want to become.

    Seene

     As mentioned on their website, Seene “has developed a portfolio of advanced computer vision

    technologies designed from the ground up for use on mobile devices in real-time applications.

    [They] enable these devices to locate themselves in space, map visual environments, and

    recreate in 3D what is seen through the camera, turning a standard smartphone into a 3D

    scanner without the use of additional hardware or off-device processing.”[24]
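As a rough illustration of the “locate themselves in space” piece, the relative motion of a single RGB camera between two frames can be recovered from feature matches alone. A minimal sketch with OpenCV, not Seene’s actual pipeline, assuming an approximate intrinsics matrix K and two hypothetical frame files:

    import cv2
    import numpy as np

    def relative_pose(img1, img2, K):
        # Match ORB features between two frames and recover the camera rotation
        # and (unit-scale) translation direction from the essential matrix.
        orb = cv2.ORB_create(2000)
        k1, d1 = orb.detectAndCompute(cv2.cvtColor(img1, cv2.COLOR_BGR2GRAY), None)
        k2, d2 = orb.detectAndCompute(cv2.cvtColor(img2, cv2.COLOR_BGR2GRAY), None)
        matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(d1, d2)
        pts1 = np.float32([k1[m.queryIdx].pt for m in matches])
        pts2 = np.float32([k2[m.trainIdx].pt for m in matches])
        E, inliers = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC,
                                          prob=0.999, threshold=1.0)
        _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K, mask=inliers)
        return R, t

    # Assumed intrinsics for a 640x480 phone camera (focal length is a guess).
    K = np.array([[500.0, 0, 320], [0, 500.0, 240], [0, 0, 1]])
    R, t = relative_pose(cv2.imread("frame1.jpg"), cv2.imread("frame2.jpg"), K)
    print(R, t)

With poses like these in hand, matched points could then be triangulated into a sparse 3D point cloud, which is the kind of on-device reconstruction Seene describes.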

Sometime in 2016, Seene was acquired by Snapchat for an undisclosed amount.[25]

A video uploaded to YouTube in 2015 shows some of the capabilities of their platform. For instance, using just the RGB camera in a phone or tablet, the software can create stunning visual animations.[26] Swirls of colors and sparkles can float around objects, and Lego characters can talk to the user; this is just a glimpse of what is possible. With technology like this, the creative potential is endless.

https://www.youtube.com/watch?time_continue=24&v=f01O3FEpfIo
http://techcrunch.com/2016/06/03/snapchat-secretly-acquires-seene-a-computer-vision-startup-that-lets-mobile-users-make-3d-selfies/
https://seene.co/


    Public Spotting

    On June 17, 2016, Business Insider revealed that Snapchat's CEO wore the company's

    secret-camera sunglasses in public — and nobody noticed. They mention that “Evan Spiegel

    was pictured wearing a very different pair of sunglasses from his usual ones while on vacation

    in Corsica with his girlfriend, the model Miranda Kerr.”

    “Compared with Spiegel's normal aviator style, the

    sunglasses photographed in August [2015] appear to

    have thicker frames and the attached cameras. They're

    similar to but aren't exactly clones of the glasses made

    by Vergence Labs, a company Snapchat acquired in

    2014.”

One can speculate that the glasses seen in the paparazzi pictures at least have the ability to change the level of tint on the lenses. The cameras on the frames suggest a version that can snap photos.

It is unclear if the glasses are a standalone headset or if they connect to a cell phone wirelessly. Based on the engineering history of Vergence Labs, the latter is most likely. It is also not known if the glasses have any augmented reality features involved. But looking into the past, even back to the time of Voxel Vision, the engineering talent at Snapchat appears capable of adding thin screens to the inside of glasses.

Jon Rodriguez has implemented a similar feature before (although the camera was the biggest part of that earlier iteration). As long as they have been able to shrink the components, it is plausible that the glasses seen on Evan Spiegel’s face include some kind of AR overlay features.

http://www.businessinsider.com/evan-spiegel-photographed-wearing-alleged-snapchat-glass-prototype-with-miranda-kerr-2016-6


    Engineering and Research Talent

Over the last couple of years, Snapchat has been snatching up a ton of talent. Several people have left other high-profile companies to make the move to the more forward-thinking Snapchat. A key division within the company is its “Snap Lab.”

    One of the people who made the transition to Snapchat is Gareth Griffiths, Ph.D. He used to

    work at Oculus as their User Research Lead and is now the Head of User Research at

    Snapchat. [27]

    Microsoft’s Hololens Recruiter, Mark Dixon, now works for Snapchat too.[28]

    His profile mentions that he loves “building teams that are passionate, driven

    and obsessed with bringing new and exciting products to market. Being able to

    help build a team creating a new product not yet seen by the public's eye is

what [he’s] about, and what excites [him] about being in tech recruiting.”

Eitan Pilipski, formerly VP of Engineering at Qualcomm on the Vuforia team, left to join Snapchat in January 2016 as its new Engineering Director. [29] His specialties include computer vision, augmented reality, virtual reality, facial recognition, gestures, wearables, HMDs, profiling, analytics, and cloud recognition.

From Vergence Labs, Jon Rodriguez has become Snapchat’s System Architect.[30] Because of his previous AR work, it can be concluded that he is working closely with the R&D teams within the company. If any of his experiments from Vergence Labs and Stanford are being used, the Augmented Reality community is in for a treat.

Another person now at Snapchat is Lauryn Morris, one of their designers. She has designed eyewear solutions for companies including Marchon Eyewear, Michael Kors, Innovega, and Zac Posen. [31] Lauryn Morris has also designed conceptual and manufactured housewares, small appliances, furniture, eyewear, jewelry, and consumer electronics. Her website lists some of the glasses she has designed.[32] Among those is a pair of AR glasses called Innovega iOptik that works with custom contact lenses.[33]

https://www.youtube.com/watch?v=H5qMCl4SgVw
http://www.lauryndesign.com/
https://www.linkedin.com/in/laurynmorris
https://www.linkedin.com/in/jonr1
https://www.linkedin.com/in/eitanpilipski
https://www.linkedin.com/in/markdixons
https://www.linkedin.com/in/gareth-griffiths-ph-d-434102b


Ning Zhang is yet another Research Scientist at Snapchat. She has a Ph.D. in Computer Science from UC Berkeley and a B.S. in Computer Science from Tsinghua University, China.[41] Her role at Snapchat involves conducting computer vision and deep learning research. Ning Zhang used to be a Research Intern at Facebook and has interests in Artificial Intelligence as well.

Linjie Luo is a Research Scientist who used to work at Disney and Microsoft. He specializes in 3D reconstruction, 3D printing, and geometry processing. He obtained his Ph.D. in the Department of Computer Science at Princeton University, advised by Prof. Szymon Rusinkiewicz.[42] He also worked on high-quality, physically based smoke simulation for SIGGRAPH projects while at Microsoft.

Another key player on the software engineering side at Snapchat looks to be Yunchao Gong, who specializes in machine learning, computer vision, and deep learning; he has been working in related fields for more than 10 years.[43] He has also worked at Facebook, Google, and Microsoft over his career. Gong now works on Snapchat’s large-scale video encoding pipeline, which could be useful in the AR architecture that is being built.

Rong Yan is an Engineering Director at Snapchat who started working for the company in late 2014. His group makes the camera application better and smarter with state-of-the-art mobile, computer vision, and AI technology. [44] Other people on the software side include Xiaoyu Wang, Jianchao Yang, and Jiayuan Ma (among others).

Snapchat obviously has a ton of software talent, and it will be interesting to see who else they will bring in on the hardware side. Granted, the technology currently used in the AR market is essentially made up of cell phone parts, but adding more computer vision cameras and other sensors will require better battery usage and faster processors. Battery technology will be critical to building a long-lasting AR device, and that will probably be developed elsewhere. They might also need a good radio engineer to broadcast signals to external devices if they want to add controllers.

It is also unclear how Snapchat would display information to users: screens with a passthrough camera (like a sleeker VR headset), reflection-based optics, or light beamed directly into the users’ eyes. Either way, Snapchat will have to build that team out more. The optics will be by far one of the most important (if not the most important) parts of the potential device they are working on.

They have the researchers and designers already in place; now they need to get hardware engineers on board if they are going to develop a pair of AR glasses. Because of this, they are currently searching for a Computer Vision Engineer and a 3D Computer Vision Engineer. [45,46]

https://www.linkedin.com/in/rong-yan-2004692
https://www.linkedin.com/in/yunchao-gong-150a32a
https://www.linkedin.com/in/linjie-luo-981b8147
https://www.linkedin.com/in/ning-zhang-16580125

    Patents 

Snapchat has applied for several patents. Although none of them directly relate to augmented reality, a couple of them could become useful if the company eventually releases an AR headset. Most of them center around accessing content based on geographical location.

Imagine a person wearing an AR headset. If they walk through a school, park, house, or another place that their friends and family frequently visit, they could essentially receive photo/video messages placed in specific areas. Those images would appear, play, and then disappear in typical Snapchat fashion.
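A minimal sketch of how such geofenced, expiring messages might be gated, with hypothetical names (GeoMessage, visible_messages) and an assumed circular fence per message:

    import math
    import time
    from dataclasses import dataclass

    def haversine_m(lat1, lon1, lat2, lon2):
        # Great-circle distance in meters between two latitude/longitude points.
        r = 6371000.0
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dlat = math.radians(lat2 - lat1)
        dlon = math.radians(lon2 - lon1)
        a = math.sin(dlat / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlon / 2) ** 2
        return 2 * r * math.asin(math.sqrt(a))

    @dataclass
    class GeoMessage:           # hypothetical message pinned to a place
        lat: float
        lon: float
        radius_m: float         # size of the circular geofence
        media_url: str
        expires_at: float       # epoch seconds; the message disappears after this

    def visible_messages(device_lat, device_lon, messages, now=None):
        # Return only messages whose fence contains the device and that
        # have not yet expired, Snapchat-style.
        now = time.time() if now is None else now
        return [m for m in messages
                if now < m.expires_at
                and haversine_m(device_lat, device_lon, m.lat, m.lon) <= m.radius_m]

The geofence check mirrors the “geo-location fence” and “access conditions” language in the patents quoted below; the expiry field is the ephemeral twist Snapchat is known for.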

    The following patents could allow this to happen:

❖  Priority based placement of messages in a geo-location based event gallery [47]
➢   “A computer implemented method includes creating an event gallery associated with an event using messages received from devices located within a geo-location fence. The messages are scanned to identify a specific brand in a message. An endorsement of the message is received from an owner of the specific brand. In response to the endorsement, the placement of the message is prioritized in the event gallery. The event gallery is supplied in response to a request from a user.”

❖  User interface for accessing media at a geographic location [48]
➢   “A system and method for accessing a media item on a mobile device are described. The mobile device includes a media placement application and a media display application. The media placement application receives a selection of a media item generated by the mobile device. The media placement application generates access conditions for the media item based on geolocation and position information of the mobile device associated with the selected media item. The media display application monitors the geolocation and position of the mobile device and determines whether the geolocation and position of the mobile device meet the access conditions of the selected media item. The media display application generates a notification that the selected media item is available to view in a display of the mobile device in response to determining that the geolocation and position of the mobile device meet the access conditions of the selected media item.”

❖  Device and method for photo and video capture [49] (a minimal sketch of this single-button timing logic follows this list)
➢   “A single user input element in an image capture device is used for both photo and video capture. Based on a first user interface activity, a timing reference is engaged at a first reference time. In a first case, photo capture is performed. In the first case, a second reference time is based on a second user interface activity, and the timing reference indicates a passage of time between the two reference times is shorter than a particular time interval. In a second case, video capture is performed. In the second case, the timing reference indicates a passage of time since the first reference time is equal to or longer than the particular time interval. Video capture is stopped based on a subsequent user interface activity. The user interface activities may comprise detecting actions based on the same type of physical manipulation of the single user input.”

❖   Geo-Location Based Event Gallery [50]
➢    “A computer implemented method may include receiving geo-location data from a device of a user; comparing the geo-location data with a geo-location fence associated with an event; determining that the geo-location data corresponds to the geo-location fence associated with the event; responsive to the determining that the geo-location data corresponds to the geo-location fence associated with the event, supplying user-selectable event gallery information, associated with an event gallery of the event, to the device for addition to a destination list on the device; detecting selection of the user-selectable event gallery information in the destination list by the user of the device; and/or responsive to the detecting of the selection of the user-selectable event gallery information by the user of the device, adding the user as a follower of the event, with access to the event gallery.”

http://appft1.uspto.gov/netacgi/nph-Parser?Sect1=PTO1&Sect2=HITOFF&d=PG01&p=1&u=/netahtml/PTO/srchnum.html&r=1&f=G&l=50&s1=20150365795.PGNR.
http://patft.uspto.gov/netacgi/nph-Parser?Sect2=PTO1&Sect2=HITOFF&p=1&u=/netahtml/PTO/search-bool.html&r=1&f=G&l=50&d=PALL&RefSrch=yes&Query=PN/9277126
http://patft.uspto.gov/netacgi/nph-Parser?Sect2=PTO1&Sect2=HITOFF&p=1&u=/netahtml/PTO/search-bool.html&r=1&f=G&l=50&d=PALL&RefSrch=yes&Query=PN/9015285
http://patft.uspto.gov/netacgi/nph-Parser?Sect2=PTO1&Sect2=HITOFF&p=1&u=/netahtml/PTO/search-bool.html&r=1&f=G&l=50&d=PALL&RefSrch=yes&Query=PN/9094137
https://boards.greenhouse.io/snapchat/jobs/118167#.V1Noy1YrKUk
https://boards.greenhouse.io/snapchat/jobs/193727#.V1NoslYrKUk

❖  User Interface to Augment an Image [51]
➢   “A system and method for a media filter publication application are described. The media filter publication application receives a content item and a selected geolocation, generates a media filter based on the content item and the selected geolocation, and supplies the media filter to a client device located at the selected geolocation.”

❖   Geolocation-Based Pictographs [52]
➢   “A system and method for geolocation-based pictographs are provided. In example embodiments, a current geolocation of a user device is determined. A pictograph is identified based on the current geolocation of the user device. The identified pictograph is presented on a user interface of the user device.”
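The “Device and method for photo and video capture” patent [49] boils down to a single timing threshold on one button: release before the threshold and a photo is taken; hold past it and video recording starts. A minimal sketch of that logic, with a hypothetical camera interface and an assumed threshold value:

    import time

    HOLD_THRESHOLD_S = 0.35  # the "particular time interval"; the actual value is unspecified

    class CaptureButton:
        # Single user input element: short press -> photo, long hold -> video.
        def __init__(self, camera):
            self.camera = camera        # hypothetical object with take_photo/start_video/stop_video
            self._pressed_at = None
            self._recording = False

        def on_press(self):
            self._pressed_at = time.monotonic()

        def tick(self):
            # Call periodically while the button is held down.
            if (self._pressed_at is not None and not self._recording
                    and time.monotonic() - self._pressed_at >= HOLD_THRESHOLD_S):
                self._recording = True
                self.camera.start_video()

        def on_release(self):
            if self._recording:
                self.camera.stop_video()   # "stopped based on a subsequent user interface activity"
            elif self._pressed_at is not None:
                self.camera.take_photo()   # released before the interval elapsed
            self._pressed_at = None
            self._recording = False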

Looksery also has a few patents that could be useful in an Augmented Reality scenario. It is assumed that the $150 million Snapchat spent to acquire the company includes the acquisition of those patents as well.

❖  Emotion Recognition in Video Conferencing [53]
➢   “Methods and systems for videoconferencing include recognition of emotions related to one videoconference participant such as a customer. This ultimately enables another videoconference participant, such as a service provider or supervisor, to handle angry, annoyed, or distressed customers. One example method includes the steps of receiving a video that includes a sequence of images, detecting at least one object of interest (e.g., a face), locating feature reference points of the at least one object of interest, aligning a virtual face mesh to the at least one object of interest based on the feature reference points, finding over the sequence of images at least one deformation of the virtual face mesh that reflect face mimics, determining that the at least one deformation refers to a facial emotion selected from a plurality of reference facial emotions, and generating a communication bearing data associated with the facial emotion.”

❖  Background Modification in Video Conferencing [54] (a minimal sketch of this background-blur idea follows this list)
➢   “Methods and systems for real-time video processing can be used in video conferencing to modify image quality of background. One example method includes the steps of receiving a video including a sequence of images, identifying at least one object of interest (e.g., a face) in one or more of the images, detecting feature reference points of the at least one object of interest, and tracking the at least one object of interest in the video. The tracking may comprise aligning a virtual face mesh to the at least one object of interest in one or more of the images. Further, a background is identified in the images by separating the at least one object of interest from each image based on the virtual face mesh. The background is then modified in each of the images by blurring, changing a resolution, colors, or other parameters.”
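The background-modification idea in [54] can be approximated in a few lines: find the face, treat everything else as background, and blur it. A rough sketch using OpenCV’s stock face detector in place of Looksery’s virtual face mesh, with a hypothetical input frame:

    import cv2

    def blur_background(frame, cascade):
        # Keep detected face regions sharp and blur everything else.
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        out = cv2.GaussianBlur(frame, (31, 31), 0)           # blurred "background"
        for (x, y, w, h) in faces:
            out[y:y + h, x:x + w] = frame[y:y + h, x:x + w]  # paste the sharp face back
        return out

    cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    frame = cv2.imread("call_frame.jpg")                     # hypothetical videoconference frame
    cv2.imwrite("call_frame_blurred.jpg", blur_background(frame, cascade))

A face mesh, as the patent describes, would give a much tighter silhouette than the rectangular regions used here.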

http://appft1.uspto.gov/netacgi/nph-Parser?Sect1=PTO1&Sect2=HITOFF&d=PG01&p=1&u=/netahtml/PTO/srchnum.html&r=1&f=G&l=50&s1=20150195491.PGNR.
http://appft1.uspto.gov/netacgi/nph-Parser?Sect1=PTO1&Sect2=HITOFF&d=PG01&p=1&u=/netahtml/PTO/srchnum.html&r=1&f=G&l=50&s1=20150286858.PGNR.
http://appft1.uspto.gov/netacgi/nph-Parser?Sect1=PTO1&Sect2=HITOFF&d=PG01&p=1&u=/netahtml/PTO/srchnum.html&r=1&f=G&l=50&s1=20160085773.PGNR.
http://appft1.uspto.gov/netacgi/nph-Parser?Sect1=PTO1&Sect2=HITOFF&d=PG01&p=1&u=/netahtml/PTO/srchnum.html&r=1&f=G&l=50&s1=20160085863.PGNR.


    Forecasts 

    Looking towards the future, Snapchat will continue to integrate innovative features into their

    smartphone applications. That’s what they have been doing since their inception, and it makes

    sense to keep the trend going. For instance, now that they have acquired Seene, 3D capture

and sharing are likely to be added soon. How long that will take is yet to be determined.

Regarding Virtual Reality, it seems unlikely that they will develop VR capabilities. However, Seene does already offer Google Cardboard support, so that could happen.

360 photos and videos are another area that could be experimented with, but there are problems with making that happen. For one, there is no really good way to capture panoramas with just a phone yet. Connecting to devices like a Ricoh Theta or Samsung Gear 360 Cam would mean partnerships with those companies, and Snapchat tends to build things in-house or through acquisitions.

With that in mind, it would be easy to add 360 video playback to the app, allowing users to scroll through with their fingers (or potentially in a headset). Still, the problem lies in recording those 360-degree images and videos. Creating their own 360 camera would be a waste of time, but it is fun to imagine overlaying objects in a 360-degree video by feeding it through Snapchat’s software.
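To make the playback idea concrete: scrolling a 360 video with a finger is just re-projecting a pinhole view out of each equirectangular frame as the drag gesture updates yaw and pitch. A minimal sketch of that projection step (gesture handling and video decoding are assumed to exist elsewhere; the input file is hypothetical):

    import cv2
    import numpy as np

    def render_view(pano, yaw, pitch, fov_deg=90.0, out_w=640, out_h=480):
        # Sample a perspective view from an equirectangular frame for the
        # given yaw/pitch in radians, e.g. driven by a finger-drag gesture.
        f = 0.5 * out_w / np.tan(np.radians(fov_deg) / 2)
        xs, ys = np.meshgrid(np.arange(out_w) - out_w / 2.0,
                             np.arange(out_h) - out_h / 2.0)
        dirs = np.stack([xs, ys, np.full_like(xs, f)], axis=-1)
        dirs /= np.linalg.norm(dirs, axis=-1, keepdims=True)
        cp, sp = np.cos(pitch), np.sin(pitch)
        cy, sy = np.cos(yaw), np.sin(yaw)
        rot_x = np.array([[1, 0, 0], [0, cp, -sp], [0, sp, cp]])   # pitch
        rot_y = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])   # yaw
        dirs = dirs @ (rot_y @ rot_x).T
        lon = np.arctan2(dirs[..., 0], dirs[..., 2])
        lat = np.arcsin(np.clip(dirs[..., 1], -1.0, 1.0))
        h, w = pano.shape[:2]
        u = np.mod((lon / (2 * np.pi) + 0.5) * w, w).astype(np.float32)
        v = np.clip((lat / np.pi + 0.5) * h, 0, h - 1).astype(np.float32)
        return cv2.remap(pano, u, v, cv2.INTER_LINEAR)

    pano = cv2.imread("equirectangular_frame.jpg")   # hypothetical 360 frame
    cv2.imwrite("view.jpg", render_view(pano, yaw=0.5, pitch=-0.1))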

The reason it wouldn’t make sense for Snapchat to focus on VR is that the industry is still in the ‘here and now’ phase; Snapchat is more forward thinking than that. Some VR features could be integrated over time, sure. But if Snapchat is going to continue to alter the way people interact with each other, focusing its attention on the Augmented Reality sector is a better way forward.

    Because of their acquisitions, starting with Vergence Labs, research and development within

    Snapchat on an AR headset is almost guaranteed. They have the software talent and that will

    keep showing up in the apps. Yet, in order for them to create the ultimate AR headset, they

    need to buckle down on the hardware.

Snapchat has already begun bringing more hardware people on board. Jon Rodriguez is one of the people to keep watching closely. He has already prototyped AR headsets over the years, so building out a team to work with him is going to happen. Mark Dixon, Paul Sledd, and Timothy Sehn will probably help with that.[55]

https://www.linkedin.com/in/timothysehn
https://www.linkedin.com/in/paulsledd


It is expected that Snapchat will also keep quietly acquiring talent rather than publicly announcing anything, because they don’t want to confuse their users by mentioning an AR headset so early on. Rather, it is better to keep quiet until they are ready, and then they will release their vision accordingly.

    Conclusion

Snapchat is primed to bring AR into the mainstream consciousness. They have already begun “training” their user base to use augmented reality capabilities. Currently, those features are stuck inside a phone. However, they have the talent to bring it all into a standalone AR headset. That device will most likely be built on Android or Windows, unless they decide to develop their own operating system; iOS is too far behind. It will also need to connect to cell phone towers.

With that in mind, there are still quite a few challenges that they need to overcome. Even if they are able to produce a spectacularly sexy headset prototype, manufacturing those devices is going to be a huge obstacle. Yet, they can learn from the virtual reality industry in that regard.

    If they focus on the hardware and get something ready for market though, they will be the

    leaders in the industry. They already have the user base needed. Now they just need the

    hardware.

It can be argued that Facebook’s social media platform is why the world has mobile phones in just about everyone’s pockets, and Facebook’s acquisition of Oculus is certainly why virtual reality is going mainstream. Following those trends, Snapchat will be the reason everyone ends up with spectacular Augmented Reality glasses on their faces, sharing content on a daily basis.