
THE GUITAMMER COMPANY

AN INTRODUCTION TO HAPTIC-TACTILE BROADCASTING — WHITE PAPER


TABLE OF CONTENTS

An Introduction to Haptic-Tactile Broadcasting — White Paper
The Difference between “Seeing”, “Hearing” and “Feeling” an Event
What is Haptic-Tactile Broadcasting?
End to End Process and System Architecture
Haptic-Tactile Broadcast End User CE Device Ecosystem
Sensors or “Capture” Devices
Live Event Production Considerations
Encoding
Professional Broadcast Transport
Broadcast Plant Processing
Distribution to the End User
End Users and Consumer Electronics Hardware
Standardization
Synchronization & Latency
IPTV and Streaming Media
The Internet of Things (IoT); and Point of View Cameras (POV)
Haptic-Tactile Broadcasting and Live Virtual Reality
The Economic Benefits of Haptic-Tactile Broadcasting - Monetization Considerations & Strategies
Conclusions
Case Studies
Case Study One: The NHRA on ESPN2
Case Study Two: The San Jose Sharks on Comcast SportsNet CA
Intellectual Property


AN INTRODUCTION TO HAPTIC-TACTILE BROADCASTING — WHITE PAPER

“Immersive”, “Personalized”, “Virtual Reality”, “Second Screen”, “Over-the-Top Content” and “TV Everywhere” are the words now being used to describe the new ways to create, produce and distribute all types of content to consumers. Continuing advances in picture quality (now up to “4K”, with “8K” not far behind), the streaming of post-produced and live content including sports, new audio formats, growing interest in and increasing adoption of Virtual Reality, and viewers at home and on the go using their smart phones and tablets as their primary or “second” screen for watching TV are together creating challenges and opportunities for new technologies that give consumers the type of personalized and immersive experience they are looking for.

For example, broadcast, home and cinema sound is changing with the creation of object based audio formats by proponents such as Dolby, DTS and the MPEG-H Audio Alliance. The coming Advanced Television Systems Committee (ATSC) 3.0 Standard will feature a new audio element giving broadcasters and viewers the ability to personalize content in new and exciting ways.

Speaking of the new ATSC 3.0 standard, Dan Daley in his August 6, 2015 “Sports Video Group” blog wrote:

“The two platforms vying to become the audio element for the ATSC 3.0 standard showed their stuff… There will also be far more flexibility for broadcasters in the presentation of multiple languages, accessibility features, immersive sound elements and other effects through the use of audio ‘objects.’ Viewers will be able to personalize their TV sound experience by adjusting the level of dialog, changing the position of certain sound elements, changing dialog language, selecting different narration (like ‘home’ vs. ‘away’ announcers for sporting events) or adding commentary tracks, and so on.”

Dan Daley, Sports Video Group, August 6, 2015

However, even with all these advancements in video and audio, one important element is still missing: the ability to let the viewer actually “feel”, “sense” or “perceive” the on-screen action, creating a truly immersive and personalized experience.

By definition, “haptic” and “tactile” are both understood as “relating to the sense of touch”. Used in the broader sense, “haptics” or “tactile” refers to the way individuals perceive the world around them through “feeling”, as opposed to other senses such as “seeing”, “hearing”, “tasting” and “smelling”.

This White Paper is an introduction to the technical, production and economic aspects and benefits of adding haptics and tactile effects to broadcasts of live events.


The Difference between “Seeing”, “Hearing” and “Feeling” an Event

Although a haptic-tactile event may have visual and auditory components (a viewer at home can typically both see and hear the impact of a hockey player hitting the boards), it is separate and distinct from both the visual and auditory portions of the event in that it relates to the actual “feeling” or “force” of the event*.

Perhaps two examples will help clarify the distinction and relationship between auditory and haptic-tactile events. Take, for instance, an airplane. If you were piloting the plane or flying in it, you would hear the sound of the engine(s), but you would also feel the hum of the engines; you would feel the plane moving up, down and sideways, and the jarring and bouncing as the plane lands. If you are a nervous flyer, hopefully all of those feelings, i.e. haptic-tactile events, would be gentle and controlled enough not to cause you alarm. Now assume you want to experience a broadcast of an airplane race and that you have a flight simulator or Virtual Reality setup at home. Certainly you would want to see out of the cockpit as if you were the pilot; you would want to hear the sounds the engines make, and even the dialogue between the pilot and co-pilot and between the pilot and the control tower, to have as realistic an experience as possible. But you would also want to feel the pitch, roll and yaw of the plane, feel the rumble of the engines and, if you are the adventurous sort, feel the plane aggressively bucking up and down in bad weather. In this scenario, a broadcast that combines the senses of seeing, hearing and feeling, i.e. a haptic-tactile enabled or enhanced broadcast, would be the solution and would provide a truly immersive experience for the viewer.

Use an American football game as the second example. The linebacker “shoots the gap” between blockers and “plants” [tackles] the running back, hard, into the ground. In today’s sports productions, multiple images and angles of the hit would be recorded by one or more cameras, some undoubtedly in slow motion; the “crack” of the hit would be recorded, perhaps with a parabolic microphone; and the announcer would excitedly exclaim, “Did you feel that hit!” The answer to that question, unfortunately, is “No, I did not feel that hit.” Now imagine that either or both of the players, the linebacker and the running back, were wearing sensors and that the actual impact, the feeling, the haptic-tactile event, was captured in addition to the sight and sound of the impact. Imagine that the feeling of that impact could be broadcast along with the video and audio and then used by a device that imparted that physical impact to the viewer at home, in a sports bar, or even in the football stadium itself in a specially equipped seat. If you can imagine that, then you have imagined what haptic-tactile broadcasting is.

*While it is true that certain frequencies of sound, combined with certain sound pressure levels, can be both heard and felt by the end user, those experiences relate to the end user’s perception of the event and do not necessarily coincide with the concept of specifically capturing the event’s feeling or impact and then separately and distinctly broadcasting and transferring that experience to the end user.

What is Haptic-Tactile Broadcasting?

Haptic-tactile broadcasting is the end to end use of technology to capture, encode, broadcast (transmit or transport by any means), decode, convert and deliver the “feeling”, “impact” or “motion” of a live event, so that a remote viewer can share the haptic-tactile experience of the broadcast event.


End to End Process and System Architecture

Figure 1 - Simplified Haptic-Tactile Signal Path: Haptic-Tactile Event Capture → Encoding → Transport → Processing and Emission for Distribution by Broadcast Facility → Distribution to Remote End User → Decoded and Used by Remote End User, Who Experiences the Haptic-Tactile Event’s Effects. (Note: The haptic-tactile signal path is synchronous with the event’s audio and video content.)


A simplified system architecture is shown above in Fig. 1, where the “event capture” represents the animate (athlete) or inanimate (race car) object whose “feeling”, “force” or “impact”, i.e. “haptic-tactile effect”, is “captured” by means of one or more electro-mechanical sensors. Once the haptic-tactile effect is captured, it is encoded for transport, transported to the broadcaster’s plant or facility, processed and then emitted for distribution to the cable, FTTH, DBS, OTA or IP providers, distributed to the end user and then decoded, so that the haptic-tactile signal can be utilized by various types of consumer electronics and experienced by the end user (viewer) in conjunction with, and synchronous with, the broadcast’s audio and video content.
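To make that signal path concrete, the short sketch below models the stages of Fig. 1 as a chain of simple functions. It is purely illustrative: the function names and the JSON payload are assumptions made for this example, not a description of Guitammer’s actual implementation or of any broadcast standard.

    # Illustrative sketch only: the Fig. 1 signal path modeled as a chain of
    # processing stages. Names and the JSON payload are hypothetical.
    import json

    def capture(sensor_reading: float) -> dict:
        """Haptic-tactile event capture: one sample from a sensor."""
        return {"force": sensor_reading}

    def encode(sample: dict) -> bytes:
        """Encoding for transport alongside the audio and video essence."""
        return json.dumps(sample).encode()

    def transport(packet: bytes) -> bytes:
        """Contribution transport to the broadcast facility (pass-through here)."""
        return packet

    def process_and_emit(packet: bytes) -> bytes:
        """Broadcast plant processing and emission for distribution."""
        return packet

    def distribute(packet: bytes) -> bytes:
        """Distribution to the end user (cable, FTTH, DBS, OTA or IP)."""
        return packet

    def decode(packet: bytes) -> dict:
        """Decoding by end user CE hardware, used synchronously with A/V."""
        return json.loads(packet.decode())

    sample = capture(0.82)
    print(decode(distribute(process_and_emit(transport(encode(sample))))))
    # {'force': 0.82}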

A more detailed look at the end to end process and system architecture in a traditional broadcast environment is shown below in Fig. 2.

Figure 2 - “Event Capture - Production Truck – Broadcast Facility – End User” Flow Chart

LIVE EVENT (I.E. RACE, HOCKEY GAME, ETC.)

• Sensors on physical animate or inanimate objects capture haptic-tactile effects.

• Sensors may be digital or analog.

“HAPTIC 1” SUB-MIX POSITION / SENSOR DATA COLLECTION LOCATION

• Located in or near the live event, most likely not in the production truck(s).

• Similar to how RF cameras and in-car video are sub-mixed.

• The operator can be called an “H1”, like an audio “A1”.

• Data from multiple sensors is monitored, managed, sub-mixed, etc. as needed, and then sent to the production truck for inclusion in the broadcast.

REMOTE PRODUCTION TRUCK

• Haptic-tactile data received from the “H1” sub-mix / sensor data collection location.

• Data included in program content either automatically or as called by the producer.

• Haptic-tactile data encoded in either the audio or video streams.

• Haptic-tactile data processed as part of the normal production workflow; it survives all processing in the truck.

BROADCAST FACILITY

• Program received from the production truck via normal backhaul means (satellite or fiber).

• Haptic-tactile data is processed at the broadcast facility along with the program’s audio and video content.

• Program content, including haptic-tactile data, is emitted for distribution to all distribution providers.

DISTRIBUTION

• The distributor - whether a cable, DBS, FTTH, IP or OTA provider - processes the program (audio, video and haptic-tactile content) with the normal processes in place.

• Haptic-tactile essence is distributed to end users and remains encoded in the program data.

END USER

• Haptic-tactile data received as part of the program from all provider types and means of transmission: cable, DBS, FTTH, IP, OTA.

• Decoded by end user supplied CE hardware.

• End user (viewer) experiences haptic-tactile effects in conjunction with, and synchronous with, the event’s audio and video content.


Haptic-Tactile Broadcast End User CE Device Ecosystem

Figure 3 - Haptic-Tactile Broadcast Data Ecosystem

Haptic-tactile broadcast signals can be decoded and used by a wide range of CE devices, including home theater, gaming, PC, mobile, TVs, STBs and IP streaming devices, creating a large potential licensing ecosystem. As shown in Figure 3, haptic-tactile broadcast data can feed:

• CE devices - activating integrated or standalone haptic-tactile or motion devices, furniture, etc.

• STBs and streaming devices.

• Console gaming.

• PCs and mobile devices - for gaming, racing and flight simulators, etc.

• Cinemas and live venues - live streaming of alternative content, including sports.

• Sports stadiums - special “fan zones” or premium seating areas upgraded to include haptic-tactile devices.

Sensors or “Capture” Devices

By definition, a sensor (from the Latin “sentire”, to perceive) is:

“a device that responds to a physical stimulus (as heat, light, sound, pressure, magnetism, or a particular motion) and transmits a resulting impulse (as for measurement or operating a control).”

http://www.merriam-webster.com/dictionary/sensor

When used as part of a haptic-tactile broadcast, sensor(s) “capture” or “record” the haptic-tactile event of the person or object to be used as part of the broadcast architecture described above. Sensors are to haptic-tactile broadcasts as microphones are to the audio and cameras are to the video.

Sensors may be placed on the athletes themselves; on physical objects that are part of the sporting event, such as a bat, a ball, a basketball rim, hockey boards, skis, etc.; or on the inanimate objects that are central to the event itself, such as a race car, a motorcycle, a surfboard or water ski, an airplane, etc.
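As a purely illustrative sketch of the capture step (the sensor interface, sampling and scaling here are assumptions for this example, not a description of any specific Guitammer or third-party hardware), raw accelerometer readings from a capture device could be normalized into a haptic-tactile sample stream roughly as follows:

    # Hypothetical sketch: turning raw accelerometer readings into a normalized
    # haptic-tactile sample stream. The full-scale range and scaling are
    # illustrative assumptions, not a published specification.
    from typing import Iterable, Iterator

    FULL_SCALE_G = 16.0  # assumed sensor full-scale range, in g

    def to_haptic_samples(raw_g: Iterable[float]) -> Iterator[float]:
        """Clamp and normalize acceleration magnitudes to a 0.0-1.0 intensity."""
        for g in raw_g:
            magnitude = min(abs(g), FULL_SCALE_G)
            yield magnitude / FULL_SCALE_G

    # Example: a hockey hit registers as a brief spike well above the
    # background level of skating vibration.
    readings_g = [0.4, 0.8, 9.6, 4.0, 0.4]
    print([round(s, 3) for s in to_haptic_samples(readings_g)])
    # [0.025, 0.05, 0.6, 0.25, 0.025]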


Live Event Production Considerations

Figure 2 above, the “Event Capture - Production Truck – Broadcast Facility – End User” flow chart, provides a relatively detailed overview of the entire haptic-tactile broadcast process in a traditional broadcast environment, from the point the haptic-tactile data is captured at an event all the way to its use by the end user.

Successful haptic-tactile broadcasting requires forethought and planning in order to integrate it seamlessly into the production. As such, it is important to note the following:

• The capture and initial processing of haptic-tactile data may (most likely will) require the use of a “sub-mix” or “haptic-data collection point” outside of, or separate from, the existing audio and video production.

• Haptic-tactile data intensity settings; choices between haptic-tactile data sources; the aggregation or combination of haptic-tactile data from multiple objects* for “entire field” type effects (i.e. the feeling of a portion of, or the entire field of, race cars as they round a turn, as opposed to the forces from one single car); and other capture related production decisions are made at the haptic-tactile sub-mix or collection point. An H1 (“Haptic 1”) would be in charge of these decisions, much as an A2 can be responsible for FX mixing and RF camera operators are responsible for their remote cameras.

• The capture and onsite production workflow of haptic-tactile data, although separate from the audio production, is more closely analogous to audio production than to video production, in the sense that haptic-tactile data is captured at the same time the video of the event is captured in order to create the initial synchronization with the program content.

• Just as with audio and video content, the producer will “call the shot” and decide whether or not haptic-tactile data will be included and whether it should be part of the live action, part of an instant replay, or both.

• As with the audio and video content, thought must be given to how much, how intense and how often haptic-tactile data is included in the broadcast by the producer.

* It is understood that the inclusion of haptic-tactile data from multiple sources will require new and creative production techniques to avoid confusion and consternation at the event and for the end user, not unlike the techniques that multiple camera angles, on-player cameras and field microphones, 3D and Virtual Reality required and continue to require.

Today, many of the items mentioned above already have sensors attached to them and are capturing data for use in the visual portions of the broadcast, or simply for the athletes’ or teams’ internal benefit. This existing sensor data can, in many instances, be used natively, or with the assistance of a data conversion routine (i.e. converting “positional” or “location based” data into “speed” or “force”), to provide the input data necessary for a haptic-tactile broadcast.

The proliferation of sensors already deployed is a significant benefit, reducing the cost and production effort needed for haptic-tactile broadcasting to scale as a natural extension of live broadcast production.
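For illustration, a data conversion routine of the kind mentioned above, turning “positional” or “location based” tracking data into “speed” or “force”, might use simple finite differences. The field names, sample rate and scaling below are assumptions made for this sketch only.

    # Hypothetical sketch: deriving speed, and a crude force-like intensity,
    # from timestamped position samples. Field names and numbers are
    # illustrative assumptions only.
    from dataclasses import dataclass
    from math import hypot
    from typing import List

    @dataclass
    class PositionSample:
        t_s: float  # timestamp, seconds
        x_m: float  # field position, meters
        y_m: float

    def speeds(samples: List[PositionSample]) -> List[float]:
        """Speed in m/s between consecutive samples (finite differences)."""
        out = []
        for a, b in zip(samples, samples[1:]):
            dt = b.t_s - a.t_s
            out.append(hypot(b.x_m - a.x_m, b.y_m - a.y_m) / dt)
        return out

    def impact_intensity(speeds_mps: List[float]) -> List[float]:
        """Crude intensity proxy: the drop in speed between samples. A sudden
        deceleration (e.g. a tackle) produces a large value."""
        return [max(v1 - v2, 0.0) for v1, v2 in zip(speeds_mps, speeds_mps[1:])]

    # Example: a player running at roughly 8 m/s who is stopped almost instantly.
    track = [PositionSample(0.0, 0.0, 0.0), PositionSample(0.1, 0.8, 0.0),
             PositionSample(0.2, 1.6, 0.0), PositionSample(0.3, 1.7, 0.0)]
    print(impact_intensity(speeds(track)))  # roughly [0.0, 7.0]: the spike marks the hit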


Encoding

Haptic-tactile data is encoded into the broadcast at the event level (either in the production truck or at the remote “haptic-tactile data sub-mix or collection point”). Data can be encoded into the broadcast per the transport schemes below, either as non-audio AES3 data or via SDI.

Professional Broadcast Transport

Encoded haptic-tactile data can be transported from the on-site event to the production truck, and then to the broadcast facility as part of the broadcast, by means of either AES3 or SDI (HANC / VANC), or a combination thereof.

Additionally, haptic-tactile data can be broadcast to the “cloud” for use by the remote user (viewer), by-passing the traditional broadcast facility. In this use case, to maintain synchronization, only a small piece of metadata needs to be transported to and through the broadcast facility and then through distribution to the end user.

Note: Already completed haptic-tactile broadcasts, combined with on-going work, are enabling robust amounts of data to fit within the existing payload constraints of either the AES3 or SDI transport methods.
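As a hypothetical illustration of the kind of payload that could ride in a non-audio AES3 channel or in SDI ancillary (HANC / VANC) data space, the sketch below packs timestamped haptic intensity samples into a small binary frame. The field layout is invented for this example; it is not the format used in Guitammer’s broadcasts, nor any SMPTE-standardized format.

    # Hypothetical packing sketch: a small binary frame of timestamped haptic
    # intensity samples of the kind that could be carried as non-audio AES3
    # data or in SDI ancillary (HANC/VANC) space. The layout is invented for
    # illustration and is not a published or standardized format.
    import struct

    MAGIC = b"HT01"  # illustrative frame identifier

    def pack_frame(pts_ms: int, intensities: list) -> bytes:
        """Frame = magic, 32-bit presentation timestamp (ms), 16-bit sample
        count, then one unsigned byte per intensity in the 0.0-1.0 range."""
        body = bytes(min(255, max(0, round(i * 255))) for i in intensities)
        return MAGIC + struct.pack(">IH", pts_ms, len(body)) + body

    def unpack_frame(frame: bytes):
        assert frame[:4] == MAGIC, "not a haptic frame"
        pts_ms, count = struct.unpack(">IH", frame[4:10])
        return pts_ms, [b / 255 for b in frame[10:10 + count]]

    frame = pack_frame(1_000, [0.0, 0.6, 1.0])
    print(unpack_frame(frame))  # (1000, [0.0, 0.6, 1.0]) within 8-bit precision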

Broadcast Plant Processing

Combined with, and now part of, the entire broadcast, the haptic-tactile data is processed in the broadcaster’s facility; provisions are made so that the encoders, decoders and other processes the broadcaster uses are updated to allow the haptic-tactile data to survive until final emission for distribution.

Distribution to the End User

As with processing at the broadcaster’s plant, the haptic-tactile data is an integral part of the entire broadcast and survives the entire process all the way to final distribution to the end user.

Either before emission from the broadcast plant, or after receipt for distribution by the cable, FTTH, DBS, IP, etc. provider, the haptic-tactile data may be embedded into the audio format. (For further information, refer to the note at the end of the following section, “End Users and Consumer Electronics Hardware.”)

End Users and Consumer Electronics Hardware

For the end user, whether at home, at a sporting venue, a cinema or another location, the haptic-tactile data is decoded and converted into a digital or analog signal that is used by the appropriate electro-mechanical haptic-tactile consumer electronics hardware, so that the end user can experience substantially the same haptic-tactile effects as the original event.

These consumer electronics devices can be used with, or integrated into, furniture; home theater type seating; cinema seats; racing and flight simulators; gaming vests; haptic enabled phones or tablets; or other such devices, and are used to provide the end user’s own haptic-tactile experience, typically in conjunction with their existing audio and video system.


In Fig. 4 below, a typical home consumer environment is shown. The haptic-tactile data is sent as part of the entire broadcast and is received by the end user (in this example, in their home). The haptic-tactile data is then:

• Decoded at the set-top-box (STB), TV or streaming device (Roku, Apple TV, Amazon Fire, Google Chromecast, etc.), by the audio video receiver (AVR), or by a smart phone, tablet, PC or any other type of device capable of receiving a broadcast regardless of transmission means, and then sent to the haptic-tactile end user CE hardware via HDMI, Optical* or Digital Coaxial* outputs, or wirelessly via Bluetooth. See Fig. 4A and 4C.

Or,

• Decoded after, or behind, the STB or similar device and the AVR. In that case, the end user haptic-tactile hardware (or a haptic-tactile decoding accessory) would connect to the AVR’s HDMI, Optical* or Digital Coaxial* outputs and decode the haptic-tactile data for use by the end user CE hardware. See Fig. 4A.

Or,

• Decoded ahead of the STB or similar device by the end user haptic-tactile hardware or an accessory haptic-tactile decoding device. In that case, the end user device is placed ahead of, or in front of, the STB, TV, etc., receives the full broadcast stream, decodes the haptic-tactile data for use by the end user CE hardware, and also acts as a pass through, so that the full broadcast stream goes on to the STB, TV, streaming device, AVR, etc. See Fig. 4B.

Additionally, as haptic-tactile broadcasting becomes more prevalent, AVR or potentially even STB and / or TV manufacturers may allow for assignable RCA “pre-outs”, currently only for subwoofer use, that could be used for both subwoofer and haptic-tactile CE end user devices.

* It may also be possible for the haptic-tactile data to be delivered with, but separate from, the broadcast audio and then separately decoded from the S/PDIF audio stream using either TOSLINK (optical) or coaxial (RCA) connections. In that case, agreements may have to be in place between the providers of the various audio formats and Guitammer to allow for combined or co-mingled encoding, transport and decoding.
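To illustrate the decoding step on the end user side, the sketch below pulls haptic frames out of an incoming program stream and forwards their samples to whatever transducer hardware is attached. The “HT01” framing reuses the invented layout from the transport sketch earlier, and the device interface is likewise an assumption; it does not correspond to any shipping STB, AVR or ButtKicker® API.

    # Hypothetical end-user decoding sketch: extract haptic frames from an
    # incoming stream and drive an attached transducer. Framing and device
    # interface are illustrative assumptions only.
    import struct
    from typing import Iterable

    class PrintingDevice:
        """Stand-in for real CE hardware (seat, platform, vest, phone, etc.)."""
        def render(self, intensity: float) -> None:
            print(f"render intensity {intensity:.2f}")

    def route_haptics(frames: Iterable[bytes], device: PrintingDevice) -> None:
        """Decode each haptic frame and forward its samples to the device;
        audio and video essence would simply pass through (not shown)."""
        for frame in frames:
            if frame[:4] != b"HT01":
                continue  # not haptic data in this sketch
            _pts_ms, count = struct.unpack(">IH", frame[4:10])
            for byte in frame[10:10 + count]:
                device.render(byte / 255)

    # Example with one hand-built frame carrying intensities of 0.2 and 0.9:
    demo = b"HT01" + struct.pack(">IH", 1_000, 2) + bytes([51, 230])
    route_haptics([demo], PrintingDevice())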

END USER’S STB, TV, STREAMING DEVICE, AVR, SMART PHONE, TABLET, PC, ETC.

• The end user’s STB, TV, streaming device, AVR, smart phone, tablet or similar device receives the haptic-tactile enhanced broadcast and decodes the haptic-tactile data from the entire broadcast stream for use with the end user hardware.

• Or, decoding of the haptic-tactile data may take place outside of the STB or similar device, by the end user hardware or even an accessory decoder device. In that case the data either simply passes through the end user device or decoder device and then goes to the STB, TV, etc., or,

• The full broadcast data stream is output via HDMI from the STB, TV, etc. to the haptic-tactile decoding accessory or device itself.

• Output from the decoding device goes to the end user hardware via HDMI, Optical or Bluetooth.

END USER CONSUMER ELECTRONICS HARDWARE

• Provided in varying form factors based on use, for example: furniture; home theater type seating; cinema seats; racing and flight simulators; gaming vests; haptic enabled phones or tablets; or other such devices.

• Manufacturers provide connectivity to the STB, flat panel, or other device via HDMI, Optical or Bluetooth.

• The manufacturer needs a license to decode the haptic-tactile data.

Figure 4 - Decoding of Haptic-Tactile Data from STB, TV or Streaming Device for Use by End User. Figure 4A: Broadcast → STB / Panel / PC → Haptic-Tactile Decoder → End User Device. Figure 4B: Broadcast → Haptic-Tactile Decoder → STB / Panel → End User Device. Figure 4C: Broadcast → Tablet / Mobile Device → End User Device.


Standardization

As of the date of this White Paper, standardization efforts are in process with SMPTE (the Society of Motion Picture and Television Engineers) in order to provide interoperability throughout the entire broadcast ecosystem.

Synchronization & Latency

Studies with end users regarding haptic-tactile effects suggest that end users associate haptic-tactile effects with both the video and the audio content of a program. For high-action video content, users may associate haptic effects even more closely with what they see on screen than with what they hear. Their expectation becomes that they should “feel” or “experience” visually depicted events as they occur, regardless of whether the event is heard.

Studies conducted with end users to determine the optimal or minimum acceptable delay (latency) between the audio and video components of a broadcast have shown that the acceptable range of delay is +/- 22 ms.* Surveys conducted by Guitammer for wireless send/receive haptic-tactile end user CE home hardware and cinema installations have shown a slightly tighter tolerance, with acceptable latency in the +/- 12 – 18 ms range.**

Therefore, all aspects of the broadcast, from haptic-tactile effect capture through insertion into the production, processing and delivery, must be carefully thought through and executed so that all the elements of the broadcast’s content – audio, video and haptic-tactile – can be used in a synchronized manner (within acceptable bounds) by the end user.

*Sara Kudrle et al. (July 2011). “Fingerprinting for Solving A/V Synchronization Issues within Broadcast Environments”. Motion Imaging Journal (SMPTE). Appropriate A/V sync limits have been established; the range considered acceptable for film is +/- 22 ms. The range for video, according to the ATSC, is up to 15 ms lead time and about 45 ms lag time.

** Guitammer internal sample testing, 2004 – 2014, as needed for product development and public venue testing.
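To make the latency budget concrete, the minimal sketch below compares a haptic sample’s presentation timestamp against the corresponding video timestamp and flags anything outside a configurable tolerance window. The 15 ms default is an assumption chosen to sit inside the 12 – 18 ms range discussed above; it is not a value taken from any standard.

    # Hypothetical sketch: checking haptic-to-video sync against a latency
    # budget. Timestamps are presentation times in milliseconds; the default
    # tolerance is an illustrative value within the range discussed above.
    from dataclasses import dataclass

    @dataclass
    class TimedSample:
        pts_ms: float   # presentation timestamp, milliseconds
        payload: bytes  # opaque haptic (or video) data

    def within_sync(haptic: TimedSample, video: TimedSample,
                    tolerance_ms: float = 15.0) -> bool:
        """True if the haptic sample would be perceived as synchronous with
        the video frame, given the stated tolerance."""
        return abs(haptic.pts_ms - video.pts_ms) <= tolerance_ms

    # A haptic effect arriving 9 ms after its video frame is acceptable;
    # one arriving 40 ms late is not.
    frame = TimedSample(pts_ms=1_000.0, payload=b"")
    print(within_sync(TimedSample(1_009.0, b""), frame))  # True
    print(within_sync(TimedSample(1_040.0, b""), frame))  # False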

IPTV and Streaming Media

Internet Protocol television (IPTV) is a system through which television services are delivered using the Internet protocol suite over a packet-switched network such as a LAN or the Internet, instead of being delivered through traditional terrestrial, satellite signal, and cable television formats. …IPTV offers the ability to stream the media in smaller batches, directly from the source.

https://en.wikipedia.org/wiki/IPTV

IPTV and streaming media are enabling broadcasters to offer an increasing range of immersive and personalized content to viewers without the bandwidth constraints of what are fast becoming legacy broadcast facilities. As such, the addition of haptic-tactile data from multiple sources (for instance, all the race cars in a race, or every player on the field*) will not be bandwidth limited, but will instead be limited only by the imagination and production prowess of the broadcaster.

*Refer to the note at the end of the section “Live Event Production Considerations” for thoughts regarding a broadcast with haptic-tactile data captured from multiple sources.

*Refer back to the section “Professional Broadcast Transport” for further discussion on sending haptic-tactile data to the “cloud” and then to the end user (viewer).


The Internet of Things (IoT); and Point of View Cameras (POV)

The Internet of Things (IoT) is a scenario in which objects, animals or people are provided with unique identifiers and the ability to transfer data over a network without requiring human-to-human or human-to-computer interaction. IoT has evolved from the convergence of wireless technologies, micro-electromechanical systems (MEMS) and the Internet.

A thing, in the Internet of Things, can be a person with a heart monitor implant, a farm animal with a biochip transponder, an automobile that has built-in sensors to alert the driver when tire pressure is low -- or any other natural or man-made object that can be assigned an IP address and provided with the ability to transfer data over a network.

Cited from WhatIs.com / Cloud Computing Definition

Combining the “Internet of Things” proliferation of sensors with the growing trend of using Point-Of-View (POV) cameras for sports production and broadcasts creates an ideal environment for the addition of haptic-tactile data, which provides the final piece of a truly immersive broadcast.

Haptic-tactile data can be derived from existing sensors already deployed on athletes, race cars, equipment, etc. and added to the broadcast allowing fans at home to see, hear and feel the action. They can be “part of the action”. They can (finally) “be the player” or “be the race car driver”.

For example, see the following references:

As “The Official On-Field Player-Tracking Provider [RFID]” of the NFL, we [Zebra Technologies] capture high-speed player data and convert it into real-time, usable statistics. Imagine the playbook redefined with every snap.

https://www.zebra.com/us/en/nfl.html

“…the [GoPro] wireless transmitter enables professional broadcasters to deliver engaging live content with the immersive POV footage and unique perspectives… Wearable, mountable, and designed for use in harsh environments, HEROCast makes capturing and broadcasting live content easier than ever.”

https://gopro.com/herocast

“GoPro has changed the way people see the world, creating an immersive viewing experience. Now with unique GoPro perspectives available to broadcasters, watching live events is like being part of the action instead of watching it from the stands.”

http://gopro.com/news/gopro-partners-with-vislink-for-live-wireless-hd-broadcast%20


Haptic-Tactile Broadcasting and Live Virtual Reality

The growing awareness and increasing adoption of Virtual Reality and its associated technologies are the next logical step toward true immersion and personalization.

However, Virtual Reality faces challenges in bringing the “Reality” fully in sync with the “Virtual”. This question and answer from the blog post “Exploring New Sonic Worlds: Sound for Virtual Reality” is quite enlightening as to the challenges faced by sound designers trying to create realistic virtual reality experiences while still limited to using sound and the auditory sense.

“As a sound designer for VR, how do you achieve presence and spatialization in your sound design?”

“As for presence — I’m not entirely sure yet. Spatialization helps, but I feel generative, procedural and dynamic content can greatly improve the experience. I’m personally interested in seeing how input devices for VR evolve. I’m always looking for ways to map sensor data to sound!”

Posted October 2, 2014 by Asbjoern Andersen in Film sound, Game audio. SFX interview with Varun Nair, founder of Two Big Ears.

http://www.asoundeffect.com/exploring-new-sonic-worlds-sound-for-virtual-reality/

VR hardware manufacturers, VR game designers, studios and creative agencies have a growing awareness and interest in combining haptics and spatialization to create the sense of “presence” which is so integral to Virtual Reality applications. In fact, they are beginning to design sound tracks optimized for, and designed to be used with, haptic-tactile hardware to further engage their audiences.

For live Virtual Reality applications, imagine a luxury car brand launching its newest model with a worldwide “live virtual” press tour where journalists in Shanghai, Beijing, London, New York, Detroit and Los Angeles all simultaneously and virtually ride along in the actual car on the Nürburgring; each in their own simulated vehicle, each seeing what the driver sees, each hearing what the driver hears and each feeling and experiencing what the driver feels – live, in real time and “present”. A broadcast that combines POV camera and haptic-tactile sensor data is Live Virtual Reality.


The Economic Benefits of Haptic-Tactile Broadcasting - Monetization Considerations & Strategies

For content providers, rights holders, athletes, broadcasters, distributors and CE device manufacturers, haptic-tactile broadcasting has the potential to create new revenue streams as well as to grow existing ones by reaching deeper into existing customers’ spending.

Because haptic-tactile broadcast technology is hardware agnostic, it does not require consumers to use only one brand of CE hardware. This creates new market opportunities for CE manufacturers to grow existing consumer, cinema and commercial haptic-tactile product sales and categories, and potentially to create new products and categories.

There are new revenue opportunities from existing fans and viewers at home, who can now “play as” their favorite player when given the ability to turn on the haptic-tactile data feed from individual players or specific race cars, etc., on a nominal per game or per season basis. This scenario is similar in concept to NASCAR “RaceView” and Verizon’s “Indy15”, which allow fans at home, for a per race or per season fee, to choose a specific driver to watch, in some instances including their in-car camera POV video, and then choose which audio feed to listen to, whether the pit crew or the track announcer.

Haptic-tactile broadcasting can also be combined with the growing amount of personalized and user selectable content that broadcasters are beginning to offer. This new content by-passes traditional distribution channels and is becoming a “direct to consumer” strategy. Offering consumers more immersive options like haptic-tactile data can help justify premium pricing for this new content.*

Distributors can offer a bundle of end user hardware combined with haptic-tactile enabled content, in a manner similar to the way TiVo and Slingbox were originally marketed and similar to DirecTV’s “NFL Sunday Ticket”, creating a differentiated offering that can help drive customer acquisition and revenue and reduce churn by increasing retention.

For Virtual Reality content creators, the addition of haptic-tactile data provides a compelling and needed technology, and it will give CE manufacturers opportunities to create new categories of Virtual Reality focused hardware: gaming vests, haptic enabled platforms, haptic VR chairs and the like.

Finally, at the event itself, “haptic-enabled” seating areas and sections (luxury boxes, fan engagement zones) can be used to generate increased revenue from fans as well as create additional brand awareness opportunities that can be sold to sponsors.

* Note: Movie theaters are validating this premium pricing model by including all types of haptic-tactile enabled cinema seats as part of their premium theaters and using these enhancements, along with new video and sound technologies, as the rationale and justification for higher ticket pricing.


Conclusions

Today’s consumers are technologically savvy and expect to be able to consume live broadcast content the way they want to, when they want to, where they want to and how they want to, in an immersive and personalized way never before possible. Smart phones and Smart TVs, IP and streaming broadcasts, new video resolutions and new audio formats, along with Virtual Reality, are providing unparalleled viewing experiences.

However, all these advances, regardless of their level of sophistication, only improve on the senses of “seeing” and “hearing” the broadcast.

Haptic-tactile broadcasts provide another sense (some would say the final and missing sense), enabling broadcasting to transition from a passive, recipient based viewer model to an active viewer and participant model in which the viewer can truly “feel what is missing.”

Case Studies

In 2013, and for the 2014/15 season, The Guitammer Company successfully implemented a national proof-of-concept for haptic-tactile broadcasting with the National Hot Rod Association (NHRA) telecasts on ESPN2 and with the NHL’s San Jose Sharks home games telecast from the SAP Center by Comcast SportsNet California. The following information is provided to give the reader insight into how the technology was deployed from the venue to the consumer, and into the reactions of fans, newscasters and players to the technology.

Note: In both examples, only Guitammer’s “ButtKicker®” brand of haptic-tactile consumer hardware was made available to fans for use with the broadcasts.

Case Study One: The NHRA on ESPN2

“The Making of Tactile Broadcasting” video shows how the entire process worked, from the race car to the ESPN2 production truck to the viewer at home. Fans at the races were shown the technology in a living room setting and their initial reactions were recorded.


Case Study Two: The San Jose Sharks on Comcast SportsNet CA

In a TV segment that aired in January 2015 on San Jose’s Fox KTVU News station, sports anchor Scott Reiss takes viewers through the technical setup at the SAP Center, interviews a fan using the technology at home, and examines the future possibilities for haptic-tactile broadcast technology. This quote from a San Jose Sharks player is quite telling: “It’s a great concept. You really feel like you’re a part of the game. You feel the bumps. Feeling the couch move, actually is pretty cool.”

Intellectual Property

The Guitammer Company owns issued and pending patents for haptic-tactile broadcasting.


Contact:

Mark Luden
The Guitammer Company
[email protected]
ex. 101 - Office
614-218-5396 - Mobile

THE GUITAMMER COMPANY

Published August, 2015© - The Guitammer Company - All Rights Reserved

Guitammer, ButtKicker®, and The Guitammer Company logo are trademarks or registered trademarks of The Guitammer Company. All other product and company names are trademarks™ or registered® trademarks of their respective holders. Use of them does not imply any affiliation with or endorsement by them.