Kevin Bonsor

Video games have been entertaining us for nearly 30 years. Computer graphics have become much more sophisticated since then, and game graphics are pushing the barriers of photorealism. Now, researchers and engineers are pulling graphics out of your television screen or computer display and integrating them into real-world environments. This new technology, called augmented reality, blurs the line between what’s real and what’s computer-generated by enhancing what we see, hear, feel and smell.
On the spectrum between virtual reality, which creates immersive, computer-generated environments, and the real world, augmented reality is closer to the real world. Augmented reality adds graphics, sounds, haptic feedback and smell to the natural world as it exists.

7 thoughts on “Kevin Bonsor”

  1. shinichi Post author

    Video games have been entertaining us for nearly 30 years, ever since Pong was introduced to arcades in the early 1970s. Computer graphics have become much more sophisticated since then, and game graphics are pushing the barriers of photorealism. Now, researchers and engineers are pulling graphics out of your television screen or computer display and integrating them into real-world environments. This new technology, called augmented reality, blurs the line between what’s real and what’s computer-generated by enhancing what we see, hear, feel and smell.

    On the spectrum between virtual reality, which creates immersive, computer-generated environments, and the real world, augmented reality is closer to the real world. Augmented reality adds graphics, sounds, haptic feedback and smell to the natural world as it exists. Both video games and cell phones are driving the development of augmented reality. Everyone from tourists to soldiers to someone looking for the closest subway stop can now benefit from the ability to place computer-generated graphics in their field of vision.

    Augmented reality is changing the way we view the world — or at least the way its users see the world. Picture yourself walking or driving down the street. With augmented-reality displays, which will eventually look much like a normal pair of glasses, informative graphics will appear in your field of view, and audio will coincide with whatever you see. These enhancements will be refreshed continually to reflect the movements of your head. Similar devices and applications already exist, particularly on smartphones like the iPhone.

  2. shinichi Post author

    Augmenting Our World

    The basic idea of augmented reality is to superimpose graphics, audio and other sensory enhancements over a real-world environment in real time. Sounds pretty simple. Besides, haven’t television networks been doing that with graphics for decades? However, augmented reality is more advanced than any technology you’ve seen in television broadcasts, although some new TV effects come close, such as RACEf/x and the superimposed first-down line on televised U.S. football games, both created by Sportvision. But these systems display graphics for only one point of view. Next-generation augmented-reality systems will display graphics for each viewer’s perspective.

    Some of the most exciting augmented-reality work is taking place in research labs at universities around the world. In February 2009, at the TED conference, Pattie Maes and Pranav Mistry presented their augmented-reality system, which they developed as part of MIT Media Lab’s Fluid Interfaces Group. They call it SixthSense, and it relies on some basic components that are found in many augmented reality systems:
    ・ Camera
    ・ Small projector
    ・ Smartphone
    ・ Mirror

    These components are strung together in a lanyard-like apparatus that the user wears around his neck. The user also wears four colored caps on his fingers, and these caps are used to manipulate the images that the projector emits.

    SixthSense is remarkable because it uses these simple, off-the-shelf components that cost around $350. It is also notable because the projector essentially turns any surface into an interactive screen. The device works by using the camera and mirror to examine the surrounding world, feeding that image to the phone (which processes the image, gathers GPS coordinates and pulls data from the Internet), and then projecting information from the projector onto the surface in front of the user, whether it’s a wrist, a wall or even a person. Because the user is wearing the camera on his chest, SixthSense will augment whatever he looks at; for example, if he picks up a can of soup in a grocery store, SixthSense can find and project onto the soup information about its ingredients, price, nutritional value, even customer reviews.
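
    As a rough illustration of that loop (camera in, processed data out, projection back onto the world), here is a minimal Python sketch, assuming OpenCV for camera capture. The lookup_product() and project() functions are hypothetical stand-ins for the phone’s recognition-and-lookup step and for the worn projector; none of this is the actual SixthSense code.

    # Minimal sketch of the SixthSense-style loop described above (not the real code).
    # Assumes the opencv-python package and an attached webcam.
    import cv2

    def lookup_product(frame):
        """Hypothetical recognition + web lookup: identify the object in view."""
        # A real system would recognize the object (say, a soup can) and pull
        # its price, ingredients and reviews from the Internet.
        return {"name": "unknown object", "info": "no data (stub)"}

    def project(frame, info):
        """Stand-in for the worn projector: draw the text onto the camera frame."""
        cv2.putText(frame, f"{info['name']}: {info['info']}", (10, 30),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.7, (0, 255, 0), 2)
        cv2.imshow("augmented view", frame)

    cap = cv2.VideoCapture(0)            # the chest-worn camera in the real device
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        info = lookup_product(frame)     # the phone processes the image and fetches data
        project(frame, info)             # the projector paints the result onto the surface
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()
    cv2.destroyAllWindows()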

    By using his capped fingers — Pattie Maes says even fingers with different colors of nail polish would work — a user can perform actions on the projected information, which are then picked up by the camera and processed by the phone. If he wants to know more about that can of soup than is projected on it, he can use his fingers to interact with the projected image and learn about, say, competing brands. SixthSense can also recognize complex gestures — draw a circle on your wrist and SixthSense projects a watch with the current time.
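
    The fingertip tracking behind those gestures comes down to picking the colored caps out of each camera frame. A hedged sketch of that single step, using OpenCV color thresholding in HSV space (the color range below is illustrative, not a value from SixthSense):

    # Sketch of locating one colored finger cap in a frame by HSV thresholding.
    # A real system would calibrate a range per cap and track all four caps
    # over time to decode gestures such as the circle-on-the-wrist "watch".
    # (Assumes OpenCV 4.x, where findContours returns two values.)
    import cv2
    import numpy as np

    LOWER_RED = np.array([0, 120, 80])      # assumed HSV range for a red cap
    UPPER_RED = np.array([10, 255, 255])

    def find_cap(frame):
        """Return the (x, y) centroid of the largest red blob, or None."""
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, LOWER_RED, UPPER_RED)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        if not contours:
            return None
        m = cv2.moments(max(contours, key=cv2.contourArea))
        if m["m00"] == 0:
            return None
        return int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])

    # Illustrative use on a single webcam frame:
    ok, frame = cv2.VideoCapture(0).read()
    if ok:
        print("cap centroid:", find_cap(frame))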

  3. shinichi Post author

    Augmented Reality on Cell Phones

    While it may be some time before you buy a device like SixthSense, more primitive versions of augmented reality are already here on some cell phones, particularly in applications for the iPhone and phones with the Android operating system. In the Netherlands, cell phone owners can download an application called Layar that uses the phone’s camera and GPS capabilities to gather information about the surrounding area. Layar then shows information about restaurants or other sites in the area, overlaying this information on the phone’s screen. You can even point the phone at a building, and Layar will tell you if any companies in that building are hiring, or it might be able to find photos of the building on Flickr or to locate its history on Wikipedia.

    Layar isn’t the only application of its type. In August 2009, some iPhone users were surprised to find an augmented-reality “easter egg” hidden within the Yelp application. Yelp is known for its user reviews of restaurants and other businesses, but its hidden augmented-reality component, called Monocle, takes things one step further. Just start up the Yelp app, shake your iPhone 3GS three times and Monocle activates. Using your phone’s GPS and compass, Monocle will display information about local restaurants, including ratings and reviews, on your cell phone screen. You can touch one of the listings to find out more about a particular restaurant.

    There are other augmented reality apps out there for the iPhone and other similar phones — and many more in development. Urbanspoon has much of the same functionality as Yelp’s Monocle. Then there’s Wikitude, which finds information from Wikipedia about sites in the area. Underlying most of these applications are a phone’s GPS and compass; by knowing where you are, these applications can make sure to offer information relevant to you. We’re still not quite at the stage of full-on image recognition, but trust us, people are working on it.
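
    Under the hood, that GPS-plus-compass approach is mostly simple geometry: compute the bearing from your position to each point of interest, compare it with the direction the phone is facing, and place a label at the matching horizontal position on screen. Here is a small sketch of that calculation; the coordinates, heading and field-of-view value are made up for illustration, and this is not code from Layar, Monocle or Wikitude.

    # Sketch of the GPS + compass math behind apps like Layar, Monocle and Wikitude:
    # given your location and heading, decide where on screen a point of interest
    # (POI) should be drawn. All numbers below are illustrative.
    import math

    def bearing_deg(lat1, lon1, lat2, lon2):
        """Initial compass bearing from point 1 to point 2, in degrees."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dlon = math.radians(lon2 - lon1)
        y = math.sin(dlon) * math.cos(phi2)
        x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
        return (math.degrees(math.atan2(y, x)) + 360) % 360

    def screen_x(poi_bearing, heading, fov_deg=60, screen_width=320):
        """Horizontal pixel position for the POI label, or None if off screen."""
        offset = (poi_bearing - heading + 540) % 360 - 180   # relative angle, -180..180
        if abs(offset) > fov_deg / 2:
            return None
        return int((offset / fov_deg + 0.5) * screen_width)

    # Illustrative use: a restaurant north-east of a user who is facing north-east.
    user_lat, user_lon, heading = 40.7580, -73.9855, 45.0
    poi_lat, poi_lon = 40.7614, -73.9776
    b = bearing_deg(user_lat, user_lon, poi_lat, poi_lon)
    print(f"bearing {b:.1f} deg -> label at x = {screen_x(b, heading)}")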

    We’ve looked at some of the existing forms of augmented reality. On the next page, we’ll examine some of the other applications of the technology, such as in video games and military hardware.

  4. shinichi Post author

    Augmented Reality in Video Games and the Military

    Video game companies are quickly hopping aboard the augmented-reality locomotive. A company called Total Immersion makes software that applies augmented reality to baseball cards. Simply go online, download the Total Immersion software and then hold up your baseball card to a webcam. The software recognizes the card (and the player on it) and then displays related video on your computer screen. Move the card in your hands — make sure to keep it in view of the camera — and the 3-D figure on your screen will perform actions, such as throwing a ball at a target.
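
    Total Immersion’s recognizer is proprietary, but the general technique is the one used in most marker-based AR: find a known pattern in each webcam frame, work out where it is, and anchor the overlay to it. The sketch below swaps the baseball card’s artwork for an OpenCV ArUco fiducial marker (it assumes OpenCV 4.7 or newer, where the ArucoDetector class is available), which keeps the detection step to a few lines.

    # Marker-based AR in miniature: detect a printed ArUco marker in a webcam
    # frame and note where the 3-D overlay would be anchored. This is not
    # Total Immersion's software, just the same general idea.
    import cv2

    dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
    detector = cv2.aruco.ArucoDetector(dictionary)

    ok, frame = cv2.VideoCapture(0).read()
    if ok:
        corners, ids, _ = detector.detectMarkers(frame)
        if ids is not None:
            # A full system would estimate the marker's pose and render a 3-D
            # player model on top of it; here we just outline what was found.
            cv2.aruco.drawDetectedMarkers(frame, corners, ids)
            print(f"found marker(s) {ids.flatten().tolist()} - render overlay here")
        else:
            print("no marker in view")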

    Total Immersion’s efforts are just the beginning. In the next couple of years, we’ll see games that take augmented reality out into the streets. Consider a scavenger-hunt game that uses virtual objects. You could use your phone to “place” tokens around town, and participants would then use their phones (or augmented-reality enabled goggles) to find these invisible objects.
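
    A hedged sketch of how such a hunt could be wired up: the “hidden” tokens are nothing more than stored GPS coordinates, and finding one means the player’s reported position falls within a small radius of a token. The token names, coordinates and pickup radius below are invented for illustration.

    # Sketch of a GPS scavenger hunt: virtual tokens are rows of coordinates,
    # and a find is simply being within a few meters of one (haversine distance).
    import math

    EARTH_RADIUS_M = 6_371_000

    def distance_m(lat1, lon1, lat2, lon2):
        """Great-circle distance between two points, in meters."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlon = math.radians(lon2 - lon1)
        a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlon / 2) ** 2
        return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

    tokens = [
        {"name": "golden key", "lat": 47.6097, "lon": -122.3331},
        {"name": "silver coin", "lat": 47.6205, "lon": -122.3493},
    ]

    def nearby_tokens(player_lat, player_lon, radius_m=10):
        """Tokens close enough for the player to 'pick up'."""
        return [t["name"] for t in tokens
                if distance_m(player_lat, player_lon, t["lat"], t["lon"]) <= radius_m]

    print(nearby_tokens(47.60971, -122.33309))   # standing next to the golden key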

    Demos of many games of this order already exist. There’s a “human Pac-Man” game that allows users to chase after each other in real life while wearing goggles that make them look like characters in Pac-Man.

    Arcane Technologies, a Canadian company, has sold augmented-reality devices to the U.S. military. The company produces a head-mounted display (the sort of device that was supposed to bring us virtual reality) that superimposes information on your world. Consider a squad of soldiers in Afghanistan, performing reconnaissance on an opposition hideout. An AR-enabled head-mounted display could overlay blueprints or a view from a satellite or overhead drone directly onto the soldiers’ field of vision.

    Now that we’ve established some of the many current and burgeoning uses of augmented reality, let’s take a look at the technology’s limitations and what the future holds.

  5. shinichi Post author

    Limitations and the Future of Augmented Reality

    Augmented reality still has some challenges to overcome. For example, GPS is only accurate to within 30 feet (9 meters) and doesn’t work as well indoors, although improved image recognition technology may be able to help [source: Metz].

    People may not want to rely on their cell phones, which have small screens on which to superimpose information. For that reason, wearable devices like SixthSense or augmented-reality-capable contact lenses and glasses will provide users with more convenient, expansive views of the world around them. Screen real estate will no longer be an issue. In the near future, you may be able to play a real-time strategy game on your computer, or you could invite a friend over, put on your AR glasses, and play it on the tabletop in front of you.

    There is such a thing as too much information. Just as the “CrackBerry” phenomenon and Internet addiction are concerns, an overreliance on augmented reality could mean that people are missing out on what’s right in front of them. Some people may prefer to use their AR iPhone applications rather than an experienced tour guide, even though a tour guide may be able to offer a level of interaction, an experience and a personal touch unavailable in a computer program. And there are times when a real plaque on a building is preferable to a virtual one, which would be accessible only by people with certain technologies.

    There are also privacy concerns. Image-recognition software coupled with AR will, quite soon, allow us to point our phones at people, even strangers, and instantly see information from their Facebook, Twitter, Amazon, LinkedIn or other online profiles. With most of these services, people willingly put information about themselves online, but it may still be an unwelcome shock to meet someone only to find that he instantly knows so much about your life and background.

    Despite these concerns, imagine the possibilities: you may learn things about the city you’ve lived in for years just by pointing your AR-enabled phone at a nearby park or building. If you work in construction, you can save on materials by using virtual markers to designate where a beam should go or which structural support to inspect. Paleontologists working in shifts to assemble a dinosaur skeleton could leave virtual “notes” to team members on the bones themselves, artists could produce virtual graffiti and doctors could overlay a digital image of a patient’s X-rays onto a mannequin for added realism.

    The future of augmented reality is clearly bright, even as it already has found its way into our cell phones and video game systems.

  6. shinichi Post author

    Columbia University Computer Graphics and User Interfaces Laboratory
    http://www.cs.columbia.edu/graphics/

    Georgia Tech’s Augmented Environments Lab
    http://www.cc.gatech.edu/projects/ael/

    Tracking Project at UNC-Chapel Hill
    http://www.cs.unc.edu/~tracker/

    Games Alfresco
    http://gamesalfresco.com/

    Total Immersion and the “Transfigured City”
    http://www.ugotrade.com/2009/09/26/total-immersion-and-the-transfigured-city-shared-augmented-realities-the-web-squared-era-and-google-wave/

