The next step, after letting users interact with augmented information through touch-screen events, is to offer a more realistic interaction: touch events on the real object itself. The idea is simple. In addition to the familiar objects placed in the real world for image or object tracking, over which a video or a 3D object is usually layered, we suggest adding buttons to the scene that the user can “virtually” press. One or more hot spots, or hot areas, are defined over the image to be tracked, and one or more actions can be associated with each of them.
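To make the idea concrete, here is a minimal sketch of such a hot area. All names here (`VirtualButton`, `on_press`, the occlusion test) are illustrative assumptions, not the ARLab SDK API: a button is a rectangle in tracked-image coordinates with an associated action, and a “press” is assumed when the area's brightness drops well below its unoccluded baseline (i.e., the user's hand covers it).

```python
# Hypothetical hot-area sketch -- names and the occlusion-based press
# test are illustrative, not the actual ARLab SDK interface.

class VirtualButton:
    def __init__(self, name, x, y, w, h, on_press):
        self.name = name
        self.rect = (x, y, w, h)   # region in tracked-image coordinates
        self.on_press = on_press   # action associated with this hot area
        self.pressed = False

    def contains(self, px, py):
        """True if a point (in image coordinates) lies inside the hot area."""
        x, y, w, h = self.rect
        return x <= px < x + w and y <= py < y + h

    def update(self, frame_mean, baseline_mean, threshold=0.35):
        """Occlusion test: assume a press when the hot area's current mean
        brightness falls more than `threshold` below its unoccluded baseline.
        The action fires once per press, not on every occluded frame."""
        occluded = frame_mean < baseline_mean * (1.0 - threshold)
        if occluded and not self.pressed:
            self.on_press(self.name)
        self.pressed = occluded


# Example: a "play" hot area over the tracked image.
events = []
btn = VirtualButton("play", 10, 10, 40, 20, events.append)
btn.update(frame_mean=120.0, baseline_mean=130.0)  # mild shadow: no press
btn.update(frame_mean=40.0, baseline_mean=130.0)   # hand covers area: fires once
```

In a real pipeline the per-area brightness would come from the tracker's rectified view of the image each frame; the containment check could equally drive presses from a projected fingertip position instead of occlusion.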
Unlike our previous entry, which also discussed virtual buttons but showed them placed statically, the next video presents virtual buttons as an addition to 3D image tracking. With this feature in the scene, the user can press the buttons directly, just as on a real device; the difference is that here the buttons are augmented objects layered over the real environment rather than physical ones, but the behavior is the same.
Like the rest of the products ARLab offers, this one will be released as a simple, easy-to-use SDK that requires no computer-vision knowledge from developers. Once the image is recognized in the scene and then tracked, the developer can layer several virtual buttons over it with this SDK, letting the end user “play” with the real image or object. The video clearly explains how the virtual buttons will work and shows the development at an intermediate stage; the SDK will soon be available as a beta version.
Tags: Augmented Reality, sdk, Virtual Buttons