What is Augmented Reality?

 


Simply put, augmented reality is the merging of virtual information, such as animation and graphics, with the real world. It aims to enhance our real-world experience by making inanimate objects interactive through digital illusions, visual or otherwise. Recent years have seen this technology mature, and it is now used in a wide array of industries:

1. Medical

Where conventional X-ray and computed tomography devices fall short, AR steps in by integrating the interior and exterior views of the patient, enabling the physician to see directly inside. One example is the Camera Augmented Mobile C-arm, or CamC. The mobile C-arm provides X-ray views in the operating theatre, and CamC extends those views with a conventional video camera aligned with the X-ray optics to produce precise image pairs (Navab et al., 2010).

Visualization of an overlay of a bone over a foot. Source: (Navab N., Fallavollita P., Habert S., 2016)

2. Personal Information Display

There is currently a large variety of AR browser apps available on smartphones (e.g. Layar, Wikitude, Junaio and others). These apps deliver information related to places of interest in the user’s environment, superimposed over the live video captured by the smartphone’s camera. Using geo-coordinates identified via the phone’s sensors (GPS, compass readings) or image recognition, information about the places is presented to the user. With the proliferation of smartphones and the accessibility of app development, these AR browser apps continue to grow in number. The Google Translate app works similarly: the user simply points the camera at printed text and the translation is overlaid on top.

AR technology allows guests to explore hotel rooms in a real life setting, and see the exact options available in terms of rooms, sizes, layouts and more. Source: (Christina, Augmented Reality Applications For Hoteliers – Augment News, 2016)

3. Television

AR was first used as annotations to live camera footage in broadcast TV. Examples of these are prevalent in sports broadcasting where live scores of football matches are shown as virtual overlays over live footage in real-time.

Similar technologies are also being used in the film industry to provide a movie director and actors with live previews of a scene after effects have been added. This is termed Pre-Viz.

4. Advertising and Commerce

Holding great promise for a truly interactive customer experience, AR is already being welcomed in advertising and commerce. As highlighted in the chapter “Augmented reality applications”, corporate bigwigs like Lego and Ikea are already utilising this technology to market their products. For example, Lego allows customers to hold a toy box up to an AR kiosk, which displays a 3D image of the assembled Lego model. Customers can turn the box to view the model from any angle.

Pictofit is a virtual dressing room application that allows users to preview clothing from online stores on their own bodies (Pictofit, 2016). The clothing is adjusted to match the user’s size. To further improve the online shopping experience, body measurements are estimated and made available to help the user decide on sizes.

Source: (Pictofit, 2016)

5. Games

The Eye of Judgment was one of the first commercially available AR games. It is an interactive trading card game created for the Sony PlayStation 3 console. Conventional video games confine users to a virtual realm; AR games bring digital content into the real environment. For instance, Vuforia SmartTerrain delivers a 3D scan of a real scene and turns it into a playing field.

Vuforia SmartTerrain turns a real scene into a playing field for users. Source: (Vuforia™ Smart Terrain™, Youtube.com, 2016)

Microsoft’s IllumiRoom (Jones et al., 2013) is a projector-based AR game prototype which combines a TV set with a home-theatre projector to extend the virtual world beyond the TV. The player may concentrate on the centre screen, but the peripheral field of view is also filled with dynamic images, leading to a dynamic and immersive game experience.

An immersive game experience  using TV and projector – The IllumiRoom. Source: (Microsoft IllumiRoom Full Demonstration, Youtube.com, 2013)

The Xbox and PlayStation have also included augmented reality capabilities for the last two console generations, in the form of the Kinect and PlayStation Eye peripherals. You can “see” yourself in the game and interact with game characters who seem to be in the same room. This is achieved through a camera that captures your image, a screen where your image is shown, and motion sensors that capture your gestures (Virtual reality society, 2016).

6. Training

Assembling things, repairing them and understanding how they work are challenges in many professions, and AR makes these tasks easier. A huge amount of time is often wasted studying manuals and documentation, since it is impossible to memorise procedures and processes in detail. AR can present instructions superimposed on the worker’s field of view, making real-time instructions easy to follow and saving time.

Geo-registered view of a virtual excavation revealing a gas pipe. Source: (Schmalstieg and Hollerer 4-32, 2016)

 

An augmented reality instruction manual for a coffee machine. Source: (Reuterdahl H., Augmented Reality Instruction Manual For Jura Coffee Machine, Vimeo, 2014)

A further development would be for AR to provide a shared visual space for live mobile remote collaboration on physical tasks (Gauglitz et al., 2014a). With this approach, a remote expert can explore the scene independently of the local user’s current camera position, and spatial annotations are immediately visible to the local user in the AR view. AR telepresence combines video conferencing and remote scene exploration into a collaborative interface.

7. Military

Soldiers wearing heads-up displays (HUDs) can see information tagged to real objects in the physical world. Information such as radar details, orders or other relevant sensor data can be provided from devices on the network. Enemy and friendly positions are, of course, crucial information that can be retrieved (Virtual reality society, 2016).

A Ukrainian company, LimpidArmor, has recently developed a Microsoft HoloLens-based helmet for tank commanders. The Circular Review System (CRS) is a combat helmet equipped with a HoloLens. The system integrates feeds from cameras mounted on the tank, giving commanders a 360-degree view in both optical and thermal forms without exposing them to risk. The Israeli military has also bought two systems to test the CRS. The CRS is sophisticated, allowing the soldier to identify friends and foes, designate targets and feed other battlefield information directly to the commander. LimpidArmor is in the process of adding features such as control of the tank’s armaments and integration of feeds from drones and other aerial vehicles into the CRS (Futurism.com, 2016).


Source: (LimpidArmor, 2016)

8. Education

It is possible to use existing learning material like textbooks as targets for augmented reality. Animations can be developed for these pages to help students visualise information they learn from the textbooks. For example, you may see the model of a car engine animating through the pages of an engineering textbook or a working 3D model of a heart in a medical textbook (Virtual reality society, 2016).

A Brief History of Augmented Reality

It was Ivan Sutherland who started the field that would turn into today’s VR and AR. He mentioned in 1965 that,

“The ultimate display would, of course, be a room within which the computer can control the existence of matter. A chair displayed in such a room would be good enough to sit in. Handcuffs displayed in such a room would be confining, and a bullet displayed in such a room would be fatal. With appropriate programming such a display could literally be the Wonderland into which Alice walked.”

He also mentioned that,

“The user of one of today’s visual displays can easily make solid objects transparent – he can “see through matter!” (Sutherland, 1965)

Shortly after these bold statements, he constructed the first VR system. It was a head-mounted display named “The Sword of Damocles”. (Sutherland, 1968)

The first VR head-mounted display in the world – “The Sword of Damocles”. Source: (VR – Forensic VR, 2015)

Throughout the 1970s and 1980s, many individuals experimented with mixing human interaction and computer-generated overlays on video for interactive art experiences. Notably, Myron Krueger demonstrated interactive overlays of graphical annotations among participant silhouettes in his Videoplace installations around 1974. Ultimately, it took the technological advances of the 1980s and 1990s for AR to emerge as an independent field of research.

The term “augmented reality” was finally coined in 1992. It first appeared in work at Boeing (Caudell and Mizell, 1992), which strove to assist workers in an airplane factory by displaying wire bundle assembly schematics in a see-through HMD.

A see-through HMD at Boeing to guide the assembly of wire bundles for aircraft. Source: (David Mizell, 1992)

The medical field was an early area where AR was applied. In 1994, a medical AR application was developed by State et al. at the University of North Carolina at Chapel Hill, allowing a physician to observe a foetus directly within a pregnant patient. Even though the accurate registration of computer graphics on an organic object remains a challenge today, this seminal work hints at the power of AR for medicine and similar tasks.

Primitive medical AR scans of the womb. Source: (Andrei State, UNC Chapel Hill)

In 1996, Studierstube, the first collaborative AR system, was developed by Schmalstieg et al. Multiple users could experience virtual objects in the same shared space, each wearing a tracked HMD that showed perspectively correct stereoscopic images individually. Because the users shared a physical space, natural communication cues such as voice, body posture and gestures were unaffected. One of the applications was a geometry course (Kaufmann and Schmalstieg, 2003), which was tested with high school students with success.

In 1998, Thomas et al. published their work on an outdoor AR navigation system called Map-in-the-Hat. Its successor, Tinmith (an acronym for “This is not map in the hat”), was used for 3D surveying but is best known for delivering the first ever outdoor AR game (long before Pokemon Go), ARQuake. This game is a port of the popular first-person shooter Quake to Tinmith and places the user in the midst of a zombie attack in a real parking lot.

The first outdoor AR game – ARQuake. Source: (Bruce Thomas and Wayne Piekarski)

When ARToolKit was released in 1999, it became the first open-source AR software available outside specialized research labs (Kato and Billinghurst, 1999). Its 3D tracking library used black and white fiducial markers, which could be produced on a laser printer. ARToolKit became widely popular because of the accessibility of webcams.

Example of the black and white fiducials used by ARToolKit. Source: (ARToolKit.org, 2016)

After 2000, the rapid evolution of mobile computing played a huge part in the growth of AR. In 2003, Wagner and Schmalstieg presented the first handheld AR system on a “personal digital assistant”, a precursor to today’s smartphones. It took several years, however, for the first usable natural feature tracking system for smartphones to be introduced (Wagner et al., 2008a). This work was the ancestor of the very popular Vuforia toolkit for AR developers. In recent years, tracking has seen a great breakthrough in the parallel tracking and mapping (PTAM) system of Klein and Murray (2007), which can track in unknown environments. Newcombe et al. (2011a) developed the KinectFusion system, which builds detailed 3D models from inexpensive depth sensors (Schmalstieg and Hollerer 4-32, 2016).

Tracking, Calibration and Registration

In AR, three important terms are related to the measurement and alignment of objects. The three terms are tracking, calibration and registration. They overlap in practical use.

Registration refers to the alignment of spatial properties. Objects that are registered in AR are aligned to each other in a coordinate system. The goal of AR systems is to register virtual information accurately with physical scene objects in a user’s perception. Specifically, see-through displays should show computer graphics such that they align with real-world objects.

Calibration refers to the offline adjustment of measurements (Wloka, 1995). It correlates the readings of a sensor with those of a standard, which allows the sensor’s accuracy to be checked. Measurements from two devices, a reference device and the device to be calibrated, are compared; the reference device can also be replaced with a known reference value or a known coordinate system. The objective is to determine the parameters that let the device being calibrated deliver accurate measurements. Calibration may be done only once in a device’s lifetime. Calibration is responsible for static registration; tracking, on the other hand, is responsible for dynamic registration.

Tracking describes the dynamic sensing and measuring done by AR systems. The relative pose, that is, the position and orientation of the AR display relative to the objects in the scene, is what tracking measures. Because AR operates in real time, pose measurements are updated in real time as well. This is a major difference between tracking and calibration: while calibration may be done once in a device’s lifetime, tracking is updated continuously (Schmalstieg, Dieter and Tobias Hollerer, 2016, pp 86-87).
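The distinction can be made concrete with a small sketch (illustrative only; the function names and the simple constant-bias model are assumptions, not taken from the cited text). Calibration compares a device against a reference once, offline; tracking then applies the resulting correction to every new reading, in real time:

```python
# Illustrative sketch: calibration estimates a fixed correction once,
# offline; tracking applies it to a continuous stream of readings.

def calibrate(raw_readings, reference_readings):
    """Offline calibration: estimate a constant bias by comparing the
    device's raw readings against those of a trusted reference device."""
    errors = [ref - raw for raw, ref in zip(raw_readings, reference_readings)]
    return sum(errors) / len(errors)  # average offset -> static registration

def track(raw_stream, bias):
    """Online tracking: apply the calibrated correction to every new
    reading, in real time -> dynamic registration."""
    for raw in raw_stream:
        yield raw + bias

# Calibration happens once, e.g. at the factory or on first use...
bias = calibrate([10.0, 10.5, 9.5], [12.0, 12.5, 11.5])    # bias == 2.0
# ...while tracking runs continuously for the whole session.
corrected = list(track([20.0, 21.0, 22.0], bias))          # [22.0, 23.0, 24.0]
```

A real AR system calibrates far more than a single scalar (camera intrinsics, display geometry, sensor alignment), but the division of labour is the same: a one-time static step followed by continuous dynamic updates.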

Sensors for augmented reality

For augmented reality to work, the following sensors must be available on the device:

Global positioning system (GPS): Many AR apps, like Pokemon Go, depend on location-based searches. Accurate location tracking such as GPS is essential for a good augmented reality experience, and location services are needed for most AR apps.

Magnetic sensor (Compass): A good example of an AR app that uses the compass is Star Chart. The compass tells us the direction the user is pointing, and most AR apps depend heavily on direction-based data. If the compass is missing from the device, most AR apps may not work.

Source: (Google Play, 2016)

Orientation sensor (Gyroscope, accelerometer): These sensors determine how you are holding the device. The data helps AR apps reposition on-screen information according to the device’s orientation (David, 7 Labs, 2012-2016). Laser gyroscopes and fibre-optic gyroscopes, like those used in aviation, measure angular velocity based on the interference of light (the Sagnac effect) observed at the end of a looped fibre-optic coil.

Odometer: An odometer is frequently used in mobile robotics to incrementally measure distance travelled. A mechanical wheel encoder determines the number of turns taken by a wheel on the ground. (Schmalstieg, Dieter and Tobias Hollerer, 2016, pp 99-104)

The sensors discussed above are crucial because they work on mobile devices. However, their accuracy is usually not sufficient to achieve the high-quality registration required in AR.
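As a rough illustration of how these sensors work together, the sketch below (formula and names chosen for illustration; no particular AR SDK is implied) uses the GPS position and compass heading to decide where a point-of-interest label should appear on screen:

```python
import math

def bearing_to_poi(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing in degrees from the user's GPS
    position (lat1, lon1) to a point of interest (lat2, lon2)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return math.degrees(math.atan2(x, y)) % 360.0

def screen_offset(bearing, compass_heading, fov_deg=60.0):
    """Horizontal screen position of the overlay in [-1, 1], or None
    if the point of interest lies outside the camera's field of view."""
    delta = (bearing - compass_heading + 180.0) % 360.0 - 180.0  # signed angle
    if abs(delta) > fov_deg / 2:
        return None
    return delta / (fov_deg / 2)
```

With the user facing due east (compass heading 90°), a point of interest due east gets offset 0.0 (centre of the screen), while one due north falls outside a 60° field of view and is not drawn. GPS supplies the positions, the compass the heading, and the gyroscope/accelerometer would handle the vertical tilt omitted here.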

Types of AR displays

AR is often assumed to be the visual overlay of information onto a person’s perception of the physical world, but other sensory modalities can play an important role as well. As humans, we experience the physical world through multiple senses, so it is logical for AR to support multiple augmentation modalities. The following is a look at different sensory AR displays.

Audio displays: Museum audio guides have been around for a long time, and for much of that time these audio tours were rather linear and impersonal. Early systems broadcast taped narratives to visitors, who received their audio devices at the entrance desk and nodded along as they walked and listened (Tallon and Walker, 2008).

Starting in the early 1990s, when the first handheld GPS receivers were available, researchers at the University of California, Santa Barbara, implemented audio navigational support for the blind, using GPS and geographic information system (GIS) resources with voice synthesis and virtual acoustics to communicate navigational information. (Loomis et al., 1993), (Loomis et al. 1998)

If a virtual audio source is to be registered with a 3D location, so that the user perceives the sound as coming from that location, spatial audio technology must be used. Rendering sound in complex environments is challenging. Modern AR headsets, such as the Meta 2 and Microsoft HoloLens, natively support spatial audio. In fact, reviewers of the Microsoft HoloLens spoke highly of the spatial audio experience, which is delivered through speakers embedded in the headset rather than conventional earphones. For ease of use, it is desirable to avoid obtrusive measurements of person-specific transfer functions (Schmalstieg, Dieter and Tobias Hollerer, 2016, pp 34-35).
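The core idea can be hinted at with a deliberately simplified sketch (constant-power stereo panning; real headsets use head-related transfer functions, which this does not attempt): the left and right channel gains of a virtual sound are derived from its angle relative to the direction the listener is facing:

```python
import math

def stereo_gains(source_angle_deg):
    """Constant-power panning: 0 degrees = straight ahead, +90 = hard
    right, -90 = hard left. Returns (left_gain, right_gain); the gains
    satisfy left^2 + right^2 = 1, keeping perceived loudness constant."""
    clamped = max(-90.0, min(90.0, source_angle_deg))
    pan = (clamped + 90.0) / 180.0 * (math.pi / 2)  # map to [0, pi/2]
    return math.cos(pan), math.sin(pan)
```

A source straight ahead gets equal gains in both ears; as the listener turns their head, changing the relative angle, the gains shift. That is enough to convey rough direction, though not the elevation and distance cues that true spatial audio provides.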

Projection mapping and augmented reality

Projection mapping as an augmented reality method has a lot of potential, but it requires a carefully controlled and mapped space to work. The method most likely to supersede smartphone-based AR as a common implementation is the head-mounted system. Head-mounted systems that use smartphones often rely on “camera pass-through”: even though you cannot see anything other than the screen of the head-mounted display (HMD), it can show you the real world using the phone’s camera. Though this allows for augmented reality without a handheld device, it may leave one feeling disconnected from the experience, since the camera’s lack of depth perception pales in comparison to what the eye really sees.

The Microsoft HoloLens uses a “prism projector”, which projects digital imagery into a prism projection system that sits between the eye and the real world. The user perceives the objects as though they were really there, sitting on a table or perched against a wall (Virtual reality society, 2016).

Related field – Mixed Reality

The mixed reality continuum. This diagram shows all the possible combinations of the real and virtual worlds. Source: (Milgram and Kishino, 1994)

Augmented reality is part of the mixed reality continuum. Whilst a user immersed in virtual reality experiences only virtual stimuli, the space between reality and virtual reality, in which real and virtual elements are combined, is called mixed reality. Milgram and Kishino (1994) describe MR in this way:

[MR involves the] merging of real and virtual worlds somewhere along the “virtuality continuum” which connects completely real environments to completely virtual ones.

Benford et al. (1998) went on to say that a complex environment is often composed of multiple displays and adjacent spaces, which constitute “mixed realities”. Augmented reality, by this account, contains mostly real-world elements and thus sits closer to reality: a user with an AR app still perceives the real world in the usual way, but with additional elements presented on the smartphone. Reality still dominates. Augmented virtuality, by contrast, prevails when mostly virtual elements are present, as in an online role-playing game where the avatars’ faces are textured in real time with video of the players’ faces.

Related field – Virtual Reality

At the far right of the MR continuum is virtual reality, a technology that completely immerses the user in a virtual, computer-generated environment. Oculus, Sony, HTC, Samsung and Google use the VR headset setup, often referred to as the HMD. This simple setup usually requires only three things: a PC, console or smartphone to run the digital app or game; a headset, which secures the smartphone and places it in front of your eyes; and an input in the form of head tracking, controllers, hand tracking, voice, on-device buttons or track-pads (Charara, S., Explained: How Does VR Actually Work?, Wareable, 2016).

 

References:

  • Navab N., Fallavollita P., Habert S., Chair for Computer Aided Medical Procedures & Augmented Reality. Campar.in.tum.de. N.p., 2016. Web. 

  • Navab N., Heining, S.-M., and Traub, J. (2010) Camera augmented mobile C-arm (CAMC): Calibration, accuracy study, and clinical applications. IEEE Transactions on Medical Imaging 29, 7, 1412-1423.
  • Christina, Augmented Reality Applications For Hoteliers – Augment News. Augment News. N.p., 2016. Web.

  • Pictofit. Netted net. N.p., 2016. Web.

  • Vuforia™ Smart Terrain™. YouTube. N.p., 2016. Web.

  • Reuterdahl H., Augmented Reality Instruction Manual For Jura Coffee Machine. Vimeo. N.p., 2016. Web. 2 Nov. 2016.

  • Sutherland, I.E. (1965) The ultimate display. Proceedings of the Congress of the International Federation of Information Processing (IFIP), 506-508.

  • Futurism.com, Augmented War: New Combat Helmets Are Equipped With Microsoft’s Hololens. Futurism. N.p., 5 November 2016. Web. 

  • VR – Forensic VR. Forensic-vr.com. N.p., 2015. Web. 

  • Sutherland, I.E. (1968) A head-mounted three dimensional display. Proceedings of the AFIPS Fall Joint Computer Conference, Part I, 757-764.
  • Caudell, T.P., and Mizell, D.W. (1992) Augmented reality: An application of heads-up display technology to manual manufacturing processes. Proceedings of the Hawaii International Conference on System Sciences, 659-669.
  • Kaufmann, H., and Schmalstieg, D. (2003) Mathematics and geometry education with collaborative augmented reality. Computers & Graphics 27, 3, Elsevier, 339-345.
  • Wloka, M.M., (1995) Lag in multiprocessor virtual reality. Presence: Teleoperators and Virtual Environments 4, 1, MIT Press, 50-63
  • Kato, H., and Billinghurst, M. (1999) Marker tracking and HMD calibration for a video-based augmented reality conferencing system. Proceedings of the International Workshop on Augmented Reality (IWAR), 85-94.
  • Artoolkit.org. N.p., 2016. Web.
  • Wagner, D., Langlotz, T., and Schmalstieg, D. (2008a) Robust and unobtrusive marker tracking on mobile phones. Proceedings of the IEEE and ACM International Symposium on Mixed and Augmented Reality, 121-124.
  • Newcombe, R. A., Izadi, S., Hilliges, O., Molyneaux, D., Kim, D., Davison, A. J., Kohli, P., Shotton, J., Hodges, S., and Fitzgibbon, A. (2011a) KinectFusion: Real-time dense surface mapping and tracking. Proceedings of the IEEE International Symposium on Mixed and Augmented Reality (ISMAR), 127-136.
  • Schmalstieg, Dieter and Tobias Hollerer. Augmented Reality: Principles And Practices. United States of America: Pearson Education, Inc., 2016. Print.

  • Tallon, L., and Walker, K. (2008) Digital technologies and the museum experience: Handheld guides and other media. AltaMira Press.
  • Loomis, J.M., Golledge, R.G., and Klatzky, R.L. (1998) Navigation system for the blind: Auditory display modes and guidance. Presence: Teleoperators and Virtual Environments 7, 2, MIT Press, 193-203.
  • Loomis, J., Golledge, R., and Klatzky, R. (1993) Personal guidance system for the visually impaired using GPS, GIS and VR technologies. Proceedings of the Conference on Virtual Reality and Persons with Disabilities.
  • David, Utilize Smartphone Sensors Smartly With Augmented Reality Apps. 7labs. N.p., 2016. Web.
  • Play.google.com. N.p., 2016. Web.

  • Virtual reality society. Augmented Reality – What Is It? – Virtual Reality. Virtual Reality. N.p., 2016. Web. 

  • Milgram, P., and Kishino, F., (1994), A taxonomy of mixed reality visual displays. IEICE Transactions on Information Systems E77-D, 12, 1321-1329

  • Benford, S., Greenhalgh, C., Reynard, G., Brown, C., and Koleva, B. (1998) Understanding and constructing shared spaces with mixed-reality boundaries. ACM Transactions on Computer Human Interaction 5, 3, 185-223.
  • Charara, S., Explained: How Does VR Actually Work? Wareable. N.p., 2016.