Thursday, February 11, 2016

Oh, patents! OSMO’s virtualized tangible interface objects

Copyright © Françoise Herrmann

Welcome OSMO, voted one of the best inventions of 2014 by Time Magazine! Welcome AR (Augmented Reality) toys!

OSMO’s virtualized tangible interface objects sit across two worlds, the virtual and the physical, and it is the user who brings them together in the fun challenges of the OSMO games. OSMO is a game system that is played outside of the screen, in a physical virtualization space. It is much more than a real-world game, because the physical activity space in which you play has the capacity to virtualize on screen, like a mirror image, which in turn augments and deepens the possibilities of off-screen interaction and activity.

Take Tangram, based on the ancient Chinese puzzle: a set of 7 small geometric wooden pieces (in the real, physical world) that you are challenged to put together in specific ways in the off-screen virtualization space of the app. You get feedback on screen with every move off screen, until you finally fit all the pieces together off screen, just like the virtual on-screen image! So, rather than moving the pieces of the puzzle on screen in capacitive touch mode, you are playing with the physical objects off screen, in a small space in front of the screen that has the capacity to virtualize your moves on screen, in turn providing you with feedback to complete the puzzle.

Take Newton, another OSMO game, and use any physical object you possess off screen, or even draw one in the virtualization space off screen, to defy the gravity of small beads falling virtually on screen and make them bounce toward a virtual target! It is what you do in the physical activity space off screen that virtualizes to control and manipulate the falling and bouncing beads of the virtual world, and it is the virtual world that sets the challenge with its falling and bouncing beads.

Take Masterpiece, yet another OSMO game, and start drawing, in the virtualization space off screen in the physical world, the image displayed on screen of any picture that you have taken! OSMO transforms your picture into a line drawing, which you are then challenged to reproduce off screen with paper and pen or color markers. Your strokes and lines off screen, on the physical activity surface, virtualize on screen so that you can easily follow the lines of the virtual image to create your masterpiece off screen!

It’s fun to play with friends in a virtualization space off screen, using physical objects that have the capacity to be virtualized on screen, in an interaction with the virtual world, or to be challenged in the virtualization space off screen by the virtual world on screen… But how does it work?

US2015339532, titled Virtualization of Tangible Interface Objects, is the patent that discloses the OSMO game system invention. The physical activity virtualization system comprises a computing device (for example, a tablet); a stand with a small mirror adaptor designed to direct the field of view of the camera onto the off-screen activity surface; an app for each game; accessories or your own physical objects; and any small surface in front of the tablet, where physical interactions with objects may be virtualized.

Virtualization of the physical actions and objects occurring in the physical world (on the physical activity surface) is achieved through a continuous video capture device. The video capture device communicates with the computing device, which is equipped with processors able to recognize and decode the motion and objects captured in the video stream. The video stream captures activity outside of the computing device, as the user manipulates objects in the virtualization space.
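
To give a rough sense of what detecting motion in such a video stream can look like, here is a minimal, purely illustrative sketch in Python with OpenCV. It is not OSMO’s code: the camera index, the threshold and the sensitivity value are assumptions, and it simply flags frames in which something changes on the surface in view of the camera.

```python
# Illustrative sketch only, not OSMO's actual implementation.
# Assumes OpenCV (cv2) and a camera at index 0 whose field of view
# is redirected, via the stand's mirror, onto the activity surface.
import cv2

cap = cv2.VideoCapture(0)                 # continuous video capture device
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

while ok:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Simple frame differencing: any change on the activity surface
    # between consecutive frames counts as "motion".
    diff = cv2.absdiff(prev_gray, gray)
    _, motion_mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
    if cv2.countNonZero(motion_mask) > 500:   # arbitrary sensitivity value
        print("motion detected: hand this frame to object detection")
    prev_gray = gray
```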

Beyond the hardware, the invention thus lies in the processing that has to occur to recognize and decode object attributes, connected to the motion and interactions captured in the video stream. Once recognized, object attributes can be represented virtually. And once represented virtually, the application can in turn generate routines and object-related information on the display of the computing device. The user can then visualize the information displayed and use it to inform further interactions with objects in the physical world, on the virtualization surface, off screen.
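
Put differently, the pipeline runs from detection to virtual presentation. The toy sketch below only shows the shape of that loop, with hypothetical class and attribute names that are not taken from the patent: a detector turns a captured frame into recognized object attributes, and an activity application renders virtual information from them.

```python
# Hypothetical sketch of the detect -> virtualize -> display loop;
# names and attributes are invented for illustration.
from dataclasses import dataclass

@dataclass
class InterfaceObject:
    label: str            # e.g. "tangram piece"
    position: tuple       # (x, y) on the activity surface
    orientation: float    # degrees

class Detector:
    def detect(self, frame):
        """Recognize tangible objects and their attributes in a frame."""
        # ...shape / contour / feature matching would go here...
        return [InterfaceObject("tangram piece", (120, 80), 45.0)]

class ActivityApplication:
    def present(self, objects):
        """Generate on-screen, virtual counterparts of the detected objects."""
        for obj in objects:
            print(f"render {obj.label} at {obj.position}, "
                  f"rotated {obj.orientation} deg")

# One pass of the loop: the user's move off screen is captured,
# decoded, and mirrored back as feedback on screen.
detector, app = Detector(), ActivityApplication()
app.present(detector.detect(frame=None))  # stub ignores the frame
```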

The beauty of this AR toy, with its virtualization space of interaction in front of the computing device screen and its special hardware configuration, is severalfold. First, it is a lot of fun to see your physical actions and objects virtualize on screen, in an interaction with the virtual world. Secondly, any number of applications may be developed and tailored to the OSMO virtualization system, which promises many more OSMO games. Thirdly, the games are produced at very reasonable processing costs, passed on to the consumer as affordable app-based games. This is primarily because the OSMO system relies on reusable parts, such as the hardware (i.e., computing device, camera, mirror adaptor and stand) and the algorithms for detection and recognition. The system also allows the use of all sorts of everyday objects, with little or no constraint. It offers compatibility with numerous devices. And it has the capacity to process many different objects and interactions simultaneously, without requiring massive computing power.

Below, the abstract for US2015339532, titled Virtualization of Tangible Interface Objects, is included. One of the patent drawings and an image of the marketed product are also included.

An example system includes a stand configured to position a computing device proximate to a physical activity surface. The system further includes a video capture device, a detector, and an activity application. The video capture device is coupled for communication with the computing device and is adapted to capture a video stream that includes an activity scene of the physical activity surface and one or more interface objects physically interactable with by a user. The detector is executable to detect motion in the activity scene based on the processing and, responsive to detecting the motion, process the video stream to detect one or more interface objects included in the activity scene of the physical activity surface. The activity application is executable to present virtual information on a display of the computing device based on the one or more detected interface objects. [Abstract US2015339532]
References
OSMO

1 comment:

Lee said...

The method used by Osmo: how is it different from other AR technology on the market (such as Qualcomm's)?
I ask mainly because the abstract sounds just like any AR recognition already on the market.