From Hollywood to Silicon Valley: I played Minority Report

I recently attended Orange Institute session #11, dubbed « When Worlds Combine: how creatives meet geeks », which took us from San Francisco to Los Angeles.

The last visit, on Friday morning, was a truly mind-blowing experience, as I not only met John Underkoffler but also discovered how Oblong Industries, Inc. had turned the 2002 gesture grammar designed for Minority Report into a fully functional image navigation technology.

In 2011 and 2012, I had used and analyzed a famous Minority Report scene for two keynote presentations built around the history of interfaces (available in French). The scene had been ripped from a DVD and edited with iMovie to add comments and subtitles.

Yet last Friday we were given the opportunity to put on Tom Cruise's gloves and not merely simulate but actually operate a gesture-based, real-time interaction with real content displayed on several screens, including a Surface. The result was amazingly fluid, with a short gesture learning curve and immediate interface feedback, conveying a very smooth and playful experience.

Enabling such an experience is an abstraction layer, dubbed g-speak™, that manages every pixel displayed across multiple screens. Oblong uses it today to address high-value, real-time, big-data, and big-workflow challenges in applications such as military simulation, logistics and supply chain management, and energy grid management.
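To make the idea of "managing every pixel across multiple screens" concrete, here is a minimal sketch in Python. It is entirely hypothetical (the `Screen` and `PixelSpace` names are mine, not Oblong's, and this is not the g-speak API): a single global coordinate space spans several displays, so content can be addressed and moved seamlessly from one screen to the next.

```python
# Hypothetical illustration of one pixel coordinate space spanning several
# screens. Names and layout are assumptions; this is NOT Oblong's g-speak API.

from dataclasses import dataclass

@dataclass
class Screen:
    name: str
    origin_x: int   # left edge of this screen in the global pixel space
    width: int
    height: int

class PixelSpace:
    """Maps a global pixel coordinate to (screen name, local coordinate)."""
    def __init__(self, screens):
        self.screens = screens

    def locate(self, gx, gy):
        for s in self.screens:
            if s.origin_x <= gx < s.origin_x + s.width and 0 <= gy < s.height:
                return s.name, gx - s.origin_x, gy
        raise ValueError("pixel outside every screen")

# Three 1920-pixel-wide screens side by side form one 5760-pixel-wide canvas;
# content dragged past x = 1920 simply crosses onto the next display.
wall = PixelSpace([
    Screen("left",   0,    1920, 1080),
    Screen("center", 1920, 1920, 1080),
    Screen("right",  3840, 1920, 1080),
])
print(wall.locate(2500, 400))   # ('center', 580, 400)
```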

Another impressive demo showcased data coming from two computers and displayed across five screens as a single continuous image.
