Immersive is the buzzword for 360 VR content. VR wants to put you into the experience. But with only visual and aural elements, one thing is missing: tactile response, the ability to feel as well as see and hear. How to accomplish this? Haptics. You already have it on your smartphone, in its simplest form as vibrate mode. The actuators that alert you to a phone call or message are now being used to augment mobile video. You have probably heard of haptics in gaming, where hand controllers send subtle, and sometimes not so subtle, vibrations to your hands. Now one company has designed a beautifully streamlined system that enables you, the media creator, to incorporate haptics into your videos, and soon your 360 VR videos. Plus, rather than having to download and set up software on your studio computer, Immersion has created the Immersion TouchSense Design Cloud. Still in beta right now, in the not too distant future it will be available to all.
This company of 130 employees has accomplished something that many larger companies have not: a sophisticated yet simple-to-use tool for a very complicated task. They already have a world-class customer base for which they have created custom tactile effects, including LG, Huawei, Sony, Fujitsu, Kyocera, Volkswagen, Logitech, and Meizu. A fairly impressive list.
There is a huge push for haptic advertising. Simply put, it adds nuanced tactile effects that bring a realistic feel to mobile video and 360 VR content: the shaking of a martini glass in a Stoli ad, the power of the engine in a Peugeot ad, the flickering of lights in the American Poltergeist trailer. These are a few of the ways haptic advertising has been deployed using Immersion's TouchSense Design Cloud. Stoli broke the ice, so to speak, with the first tactile ad in 2014. Now 554 mobile apps and 481 tablet apps on the Opera Mediaworks network can run haptic ads. These ads have the potential to reach 14.9 million total impressions and 4.2 million unique viewers.
I spoke in depth with David Birnbaum, Design Director at Immersion, and he explained how the system works. There are four elements in the process. Once you sign up with Immersion, they send you an app for your Android smartphone running Android 5.0 or later. The only other software you need is an audio editor. Currently supported are Pro Tools 10.3 on OS X 10.8.5, Pro Tools v11 or v12 on OS X 10.8 and up, and Adobe Audition v8.1.
Once you download the Haptic Monitor from Google Play and connect to the interface, the app connects to the Design Cloud over the Internet. Using your audio editor, you create the haptic cues where you want them on an additional audio track; instead of normal audio content, that track carries the haptic cues you built in the Design Cloud.
Next, the audio track is output as a .wav file and combined with the video in the Design Cloud. The testing phase follows, using the Haptic Monitor to play back the content on your smartphone. This lets you make any adjustments necessary so the haptics genuinely enhance your content. Once the cues are tweaked to your satisfaction, the final product is ready to be published.
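To make the cue-track idea concrete, here is a minimal sketch of what such a track could look like as data. This is not Immersion's format; it is a hypothetical illustration, assuming cues are simple (start, length, intensity) events rendered into a mono 16-bit .wav envelope, analogous to the exported track described above.

```python
import struct
import wave

SAMPLE_RATE = 8000  # Hz; hypothetical rate for an illustrative cue track


def render_cues(cues, duration_s):
    """Render (start_s, length_s, intensity) cues into a sample buffer.

    intensity ranges 0.0-1.0; samples outside any cue stay at zero,
    so the track is silent except where a tactile effect fires.
    """
    n = int(duration_s * SAMPLE_RATE)
    samples = [0.0] * n
    for start, length, intensity in cues:
        lo = int(start * SAMPLE_RATE)
        hi = min(n, int((start + length) * SAMPLE_RATE))
        for i in range(lo, hi):
            # Overlapping cues keep the stronger effect.
            samples[i] = max(samples[i], intensity)
    return samples


def write_wav(path, samples):
    """Write the envelope as a 16-bit mono .wav, like the exported cue track."""
    with wave.open(path, "wb") as w:
        w.setnchannels(1)
        w.setsampwidth(2)  # 16-bit samples
        w.setframerate(SAMPLE_RATE)
        w.writeframes(
            b"".join(struct.pack("<h", int(s * 32767)) for s in samples)
        )


# Example: a half-second-in rumble, then a sharp tap at the two-second mark.
cues = [(0.5, 1.0, 0.4), (2.0, 0.1, 1.0)]
write_wav("haptic_track.wav", render_cues(cues, duration_s=3.0))
```

In the real workflow these cues would be authored visually in Pro Tools or Audition against the picture, but the underlying principle is the same: timing and strength information riding alongside the video as an audio-format file.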
To sample some of this haptic content, go to the Google Play store (https://play.google.com/store/apps/details?id=com.immersion.tsengageportal&hi=en) and download the app. It includes video games and other content enhanced by haptic technology.
You can also go to www.immersion.com/touch-sense-design-cloud-interest if you are interested in finding out more about using this technology for your content.