
Monday, April 18, 2011

First CS855 Post - The 2011 Horizon Report on Augmented Reality

     The 2011 Horizon Report is produced as a collaboration between The New Media Consortium and the EDUCAUSE Learning Initiative. It is the seventh in a series dating back to 2006. Included in this report is a series of short reports on specific new media and technologies, each with an analysis of its impact and a forecast of when the authors feel it will join the "normal" technologies accepted and used daily by the majority of users.
     The short report that I wish to talk about is Augmented Reality (AR) (pages 16 - 19). The report itself is only about one-and-three-quarters pages long, followed by a page-and-a-quarter of links to sites giving examples of how AR is being used in education. The report puts AR in the two-to-three-year range for Time-to-Adoption.
    This report defines AR as adding a layer of text or picture information over the real world using computer software. In my mind, this might be better described as "Visually Augmented Reality", for, at least in this article, there is no mention of adding sounds, such as a travelogue or 'sound track', as part of the AR experience.
     Augmented Reality has been around for over thirty years, beginning with head-mounted displays (helmets, video goggles, etc.), but has moved recently to the Internet in the form of web-browser and smartphone applications.
     The report states that there are two basic modes of AR: one using visual metaphors (i.e., markers that can be seen by a camera, whether one connected to a computer or one built into the device itself); and one using spatial positioning. The marker in the visual-metaphor mode can be something as simple as a bar code on a card or in a book, or as complex as a face or a building in the real world itself. Spatial-positioning AR makes use of a mobile device's (laptop, netbook, smartphone) GPS, compass, and position/motion sensor information to determine its location while the camera looks at the objects nearby. These applications are called "gravimetric" applications, and that title caused me some cognitive dissonance, for, as an old engineer at heart, "gravity" and "metric" do not mean "location" to me! But then, as Humpty Dumpty said, "A word means just what I want it to mean! Nothing more and nothing less!" So, "gravimetric applications" are AR applications that determine their location on the earth!
     Going back to the definitions: in the visual-metaphor mode, the marker, for example a book, is held up to a computer with a camera, and additional words or pictures appear on the computer screen, on or around the live video feed from the camera. In spatial positioning, the additional words or pictures show up on the screen of the mobile device, most often the screen of a smartphone with the 'camera' function turned on. No head-mounted screen is needed, for the screen and processing power of the portable device allow for easy use and display!
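     To make the spatial-positioning mode a little more concrete, here is a rough sketch of my own (not from the report) of the arithmetic a "gravimetric" app might do: take the phone's GPS position and compass heading, work out the bearing to a nearby point of interest, and decide where on the camera screen to draw its label. The coordinates, field of view, and screen width below are made-up example values.

    import math

    def bearing_deg(lat1, lon1, lat2, lon2):
        """Initial compass bearing from point 1 to point 2, in degrees (0 = north)."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dlon = math.radians(lon2 - lon1)
        y = math.sin(dlon) * math.cos(phi2)
        x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
        return math.degrees(math.atan2(y, x)) % 360.0

    def screen_x_for_poi(device_lat, device_lon, heading_deg, poi_lat, poi_lon,
                         screen_width_px=480, fov_deg=60.0):
        """Return the horizontal pixel where a point of interest's label could be
        drawn over the camera view, or None if it is outside the field of view."""
        poi_bearing = bearing_deg(device_lat, device_lon, poi_lat, poi_lon)
        # Signed angle between where the camera points and where the POI lies (-180..180).
        offset = (poi_bearing - heading_deg + 180.0) % 360.0 - 180.0
        if abs(offset) > fov_deg / 2.0:
            return None  # POI is not in front of the camera right now
        # Map the angular offset onto the screen: center of screen = camera heading.
        return int(screen_width_px / 2 + (offset / (fov_deg / 2.0)) * (screen_width_px / 2))

    # Example: a phone at a made-up spot, facing due east (90 degrees), with one
    # hypothetical point of interest slightly north of east.
    x = screen_x_for_poi(41.8781, -87.6298, 90.0, 41.8790, -87.6200)
    print("draw label at x =", x)

     The real applications obviously do far more (tilt sensing, 3-D registration, and so on), but the core idea is just this comparison of "where am I pointed" against "where is the thing".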
     The authors go on to explain that this has great relevance in education, gaming, and general life. AR can make books and museums more interactive and less passive. Working backwards in games, it allows the user to become a part of the game (a video avatar, for example); so, as with the Wii and the Kinect, user motions become the 'input interface' and the video camera adds the user into the game, i.e., instead of adding information to reality, it adds information from the real world to the 'unreality' of the gaming landscape! In the third case, using a gravimetric application, locations in a city or a building can be scanned by the smartphone camera, and additional information, pictures, or even movies can be added to the picture on the screen.
     As I am working on haptic, or touch-based, systems, I was especially interested in the article's discussion of an AR application in use by the J. Paul Getty Museum, where visitors can explore its Augsburg Display Cabinet, a 17th century collector's cabinet of wonders (or early museum of oddities and collections), without actually touching its delicate objects. If it were possible to add a haptic/tactile component, such as a glove, users could not just look at the different objects, but FEEL them as well - feel their textures, weights, etc.!

2 comments:

  1. OK, I just noticed that I forgot to mention the trends that this is associated with.

    The move to create AR-enhanced applications for cell phones and web-browser applications is affecting quite a number of trends:

    * Educational - instructors can produce applications to show specific information or instructions. Think, for example, of a class in engine repair: the AR app shows a text line identifying the parts, demonstrates removal or adjustment, and gives the instructions.
    * Marketing - (happening now) - toys and cards include AR markers that show characters and movies on the computer when 'taken' to the seller's website.
    * Publishing - interactive books - maybe even a reader for young children, examples for a science or math text, etc.
    * Advertising - ads show up when the user scans the front of a store, or the menu and specials of the day for a restaurant. Also, this could be individualized so that a vegan sees vegetarian items only.
    * Gaming - user avatars added to games, even with special 'overlays' of the avatar for character clothes, devices, weapons, etc.
    Probably more, but that seems pretty good!

  2. Comment on a trend in the 2011 Horizon report.

    I selected:

    The abundance of resources and relationships made easily accessible via the Internet is increasingly challenging us to revisit our roles as educators in sense-making, coaching, and credentialing.

    I teach in class and online, so I am aware of doing things differently in the classroom. I use videos from YouTube, go and get good images of technology and such, plus use websites as examples.

    On the other side, I have to submit nearly all term papers to TURNITIN.com, for the students are also using the web to create their work for them!

    We need, as instructors, to make use of as much of the new tech as we can, but lecture is still mostly lecture.
