By definition, augmented reality is a technology that augments the view of a real-world environment, overlaying extra information on the real objects present in that world and thus enhancing the user's current perception of reality. This extra information is computer-generated sensory input, such as graphics or location data coming from the device's GPS. It is important to note that, unlike virtual reality, where computer-generated data replaces the elements present in the real world, in augmented reality this data is added to the real world in order to improve the user's perception of the environment.
Modern augmented reality systems can be integrated into smartphones and other mobile devices. To make it possible to anchor augmented elements in the real world, a device can use one or more of the following technologies: optical sensors, accelerometers, gyroscopes, GPS, solid-state compasses, RFID and wireless sensors.
Depending on what the augmented reality application has to show, and how, one or more of the aforementioned technologies will be used. One example of linking augmented components to the real world is an augmented geolocated view (picture 1). In this type of view, the augmented reality system uses the GPS, accelerometers, gyroscopes and compass to place points of interest (POIs) according to the user's location and orientation. Once the data received from this hardware has been analyzed, the output is used to draw the POIs on the screen, together with useful information such as the distance to each POI, the direction towards it, or what it represents.
Picture 1. Example of an augmented reality geolocated view.
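The geolocation step described above can be sketched in a few lines: from the user's coordinates and a POI's coordinates, compute the great-circle distance and the compass bearing, then map that bearing against the camera heading to a horizontal screen position. This is a minimal sketch; the function names, field of view and screen width are illustrative assumptions, not part of any particular AR browser.

```python
import math

def distance_and_bearing(lat1, lon1, lat2, lon2):
    """Great-circle distance (metres) and initial bearing (degrees)
    from the user's location to a POI, via the haversine formula."""
    R = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    dist = 2 * R * math.asin(math.sqrt(a))
    y = math.sin(dl) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dl)
    bearing = (math.degrees(math.atan2(y, x)) + 360) % 360
    return dist, bearing

def screen_x(bearing, heading, fov_deg=60.0, screen_w=1080):
    """Horizontal screen position of a POI, given the compass heading
    of the camera and an assumed horizontal field of view. Returns
    None when the POI falls outside the current view."""
    offset = (bearing - heading + 540) % 360 - 180  # signed angle in (-180, 180]
    if abs(offset) > fov_deg / 2:
        return None
    return int(screen_w / 2 + (offset / (fov_deg / 2)) * (screen_w / 2))
```

A real system would repeat this for every POI on each sensor update, using the fused accelerometer/gyroscope/compass output as `heading`.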
Another widespread augmented reality approach uses optical sensors to “catch” what is happening in the environment: the system analyzes the incoming images and overlays the augmented information on the real physical world. As augmented reality becomes more popular, these AR systems, which use the camera as the “door” for input data, are playing an important role in the merging of both worlds.
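The frame-analysis step of such a camera-based system can be sketched with toy binary descriptors compared by Hamming distance. Everything here is a hypothetical stand-in for a real feature pipeline (such as ORB or SIFT descriptors extracted from each frame); the thresholds are illustrative.

```python
def hamming(d1, d2):
    """Number of differing bits between two binary descriptors."""
    return bin(d1 ^ d2).count("1")

def matches_target(frame_descriptors, target_descriptors,
                   max_dist=10, min_matches=3):
    """Naive brute-force matcher: the frame 'matches' the stored target
    image when enough target descriptors find a close counterpart in
    the frame. Toy stand-in for a real feature matcher."""
    hits = 0
    for t in target_descriptors:
        if any(hamming(t, f) <= max_dist for f in frame_descriptors):
            hits += 1
    return hits >= min_matches

# Toy 32-bit descriptors; in practice these come from a feature
# detector run on the reference image and on each camera frame.
target = [0b10110010101011001010110010101100,
          0b01001101010100110101001101010011,
          0b11110000111100001111000011110000]
frame_hit = [d ^ 0b1 for d in target]  # nearly identical descriptors
frame_miss = [0b0, 0b1, 0b10]          # unrelated descriptors
```

When `matches_target` returns `True`, the system can trigger its event, which is exactly the image-matching behaviour discussed below.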
Picture 2. Example of an augmented reality image tracking.
One of the simplest cases of these optical-sensor-based systems is known as image retrieval or image matching, where the system recognizes an image that appears in the scene and triggers an event. But where the merging of the augmented and the real physical world is most noticeable, among optical-sensor-based systems, is in image or object tracking. While in the former case, image matching, the system triggers an event when an image is recognized, in the latter, image tracking, the system not only recognizes an image or object but also estimates the camera pose. This camera pose estimation allows the system to overlay a 3D object exactly where the target is, with the perspective a real object would have, giving the perception that the overlaid object really exists in the physical world, as can be seen in picture 2.
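For a planar target, the pose the tracker recovers can be expressed as a 3x3 homography, and overlaying content then amounts to projecting the overlay's corners through it. The sketch below assumes a hypothetical homography already estimated by the tracker; note how the perspective term foreshortens the far edge, which is what makes the overlay look like a real object.

```python
def apply_homography(H, pts):
    """Project 2D points with a 3x3 homography (row-major nested
    lists), i.e. the planar pose estimated by the tracker."""
    out = []
    for x, y in pts:
        w = H[2][0] * x + H[2][1] * y + H[2][2]
        u = (H[0][0] * x + H[0][1] * y + H[0][2]) / w
        v = (H[1][0] * x + H[1][1] * y + H[1][2]) / w
        out.append((u, v))
    return out

# Hypothetical homography: scales the target 100x, shifts it to
# (300, 200) and adds a mild perspective tilt via the bottom row.
H = [[100.0, 0.0, 300.0],
     [0.0, 100.0, 200.0],
     [0.0005, 0.0, 1.0]]

# Unit-square corners of the reference (target) image.
corners = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
projected = apply_homography(H, corners)
```

Rendering the 3D overlay at `projected` each frame keeps it pinned to the target with the correct perspective; a full 6-DOF pose would be decomposed from this homography using the camera intrinsics.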
Tags: ARBrowser, Augmented Reality, Image Tracking