by Minda Zetlin
Drivers can use an augmented reality (AR) owner’s manual to maintain their Audi. Cosmetics giant Elizabeth Arden uses augmented reality to market new fragrances. An augmented reality app brings the Guinness Book of World Records to life. Clearly, the promise of AR is gaining traction in the consumer space. But what does the future hold for the augmented enterprise?
Augmented reality is any technology that adds contextual information to a user’s surroundings in real time. A typical use of AR might be to superimpose information about a landmark as it’s viewed through smart glasses or a mobile device camera. But AR’s current uses go way beyond tourism. And with the technology gaining steam and an agreed-upon standard language in place, AR is coming into its own.
AR has existed in various forms since the 1990s, but has been waiting for devices to catch up. That time has come: Data from analyst firm International Data Corporation (IDC) shows that 1.3 billion smartphones and 233 million tablets were shipped in 2014—each equipped with GPS and a camera. Smart eyewear from Google, Epson, Sony, and others delivers an always-on heads-up display. BI Intelligence expects the smartwatch market to expand by 41 percent a year until the end of the decade, putting another handy screen at workers’ disposal.
These devices deliver on the promise of AR, giving users easy access to the data they need in their current time, place, and situation. “It’s about bringing the context of your surroundings to you in an immediately usable way,” explains John Yopp, senior manager for applied research in the Oracle Retail Global Business Unit.
Here’s a look at some uses for AR that are possible right now. Some could readily be built with current technology; you might even recognize others that are already in use.

Reducing Medical Error
Oracle recently held a Design Jam where 55 Oracle engineers were challenged to come up with compelling enterprise AR applications. The winner was Vivek Narayan, outbound product manager at Oracle, with a use case designed to make hospitals safer for patients.
Narayan began with the premise that AR provides the greatest value in stressful situations where people need large amounts of contextual information, which led him to the healthcare arena. Nurses, he notes, work in a chaotic setting, dealing with panicked patients while having to synthesize data on the fly. “Every patient has a different reason for being there,” Narayan says. “Different allergies, different ages. Everything is different.”
In this intense setting, errors are surprisingly common. A 2013 study in the Journal of Patient Safety estimated that preventable harm may contribute to more than 400,000 patient deaths in the United States every year.
In Narayan’s model, a nurse could use Google Glass as a guide through the workday. It could provide medical information about a patient along with any notes from a previous nurse’s shift. A low-energy Bluetooth beacon could confirm that the nurse is in the correct room (because GPS is inaccurate indoors), and Glass’ facial recognition capabilities could confirm that the nurse is observing the correct patient. Glass could perform retinal scans to check vital signs and gather other information by simply looking into the patient’s eyes.
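The gating logic in Narayan's model can be sketched in a few lines. This is a hypothetical illustration only: the chart data, identifiers, and contraindication pairings are invented, and a real system would query the hospital's records and a clinical drug database.

```python
# Hypothetical sketch of the bedside checks described above. All data
# and identifiers are invented for illustration.

PATIENT_CHART = {
    "patient-42": {
        "room": "3B",
        "face_id": "face-42",
        "conditions": ["diabetes"],
    },
}

# Drugs flagged against conditions; invented pairing for the example.
CONTRAINDICATIONS = {"drug-x": ["diabetes"]}

def can_display_chart(patient_id, beacon_room, recognized_face_id):
    """Show the record only if the beacon-reported room and the
    recognized face both match the chart."""
    record = PATIENT_CHART.get(patient_id)
    return (record is not None
            and record["room"] == beacon_room
            and record["face_id"] == recognized_face_id)

def medication_warning(patient_id, drug):
    """Return the patient conditions that make this drug risky."""
    conditions = set(PATIENT_CHART[patient_id]["conditions"])
    return sorted(conditions & set(CONTRAINDICATIONS.get(drug, [])))

print(can_display_chart("patient-42", "3B", "face-42"))  # True
print(can_display_chart("patient-42", "3A", "face-42"))  # False: wrong room
print(medication_warning("patient-42", "drug-x"))        # ['diabetes']
```

The point of the double check is that either signal alone can fail: indoor positioning can drift, and facial recognition can misfire, but requiring both to agree before showing a chart narrows the window for a wrong-patient error.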
Perhaps most important, Glass could recognize medications and flag potential drug interaction problems. “It may be able to do a real-time verification and say, ‘This medication you’re going to administer may be harmful because this patient is diabetic,’” Narayan says. “And it would tell you where to find an alternative medication.”

Making Warehouses More Efficient
Warehouse operations can also benefit from putting data into context within a user’s physical environment. More than half the work in warehouses consists of “picking”—selecting items to be shipped to a customer or location.
In most modern warehouses, pickers work from orders printed on paper. The picker scans a barcode on the paper and matches it with the barcode on each item to be picked. Once items have been picked, they are brought to a packing area, where a second employee rescans every barcode to confirm the order is correct before packing it. This reduces errors, which are always costly, but consumes a lot of employee time.
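The packing-stage check described above amounts to comparing two lists of scans. A minimal sketch, with invented SKUs (in the AR pilot, each scan would come from the picker's glasses or visor rather than a handheld):

```python
# Illustrative sketch of the second-stage barcode validation described
# above. SKU values are invented for the example.

def validate_packing(order_barcodes, picked_barcodes):
    """Rescan everything at the packing area and report mismatches."""
    missing = sorted(set(order_barcodes) - set(picked_barcodes))
    unexpected = sorted(set(picked_barcodes) - set(order_barcodes))
    return {"complete": not missing and not unexpected,
            "missing": missing,
            "unexpected": unexpected}

order = ["SKU-1001", "SKU-1002", "SKU-1003"]
picked = ["SKU-1001", "SKU-1003"]          # one item was never picked
print(validate_packing(order, picked))
# {'complete': False, 'missing': ['SKU-1002'], 'unexpected': []}
```

An AR visor running a check like this can surface the result the moment the packer looks at the bin, instead of requiring a separate scan-everything pass.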
In 2015, Oracle is planning a pilot program using AR to improve picking efficiency, both to learn about this technology for customers and for the benefit of Oracle’s own warehouses. In the pilot, operators will use glasses or visors that direct them to the right location and then scan and confirm each item as they pick it up. They may also use haptic gloves that confirm the correct item as each box is lifted. “Once the shipment is taken to the packing area, the person packing will be able to look at all the items through a visor and immediately confirm that everything’s ready, or that something is missing,” says Paolo Juvara, group vice president at Oracle.
It is the packing step that could provide the greatest efficiency gains, he says. And those could be substantial. According to a recent report by the logistics company DHL, using AR for “constant picking validation” can decrease errors by as much as 40 percent.

Serving Retail Customers Better
The retail industry has already been quick to embrace AR, and developers at Oracle are exploring how contextual visual information can make a big difference to customers and merchants. In retail, possible AR uses are highly varied. A store manager could hold up a tablet, point its viewfinder at an item on a shelf, and see supply chain information about that item. “That lets you know whether it’s in danger of going out of stock, which is a big problem,” Yopp says. As in the hospital use case, the store could use low-energy Bluetooth beacons to track the user’s location.
Sales associates and browsing customers could point a mobile device at an item and learn which colors, styles, and sizes are available in the store, at other locations, or on the retailer’s website. The device could also bring up detailed information such as where the item was sourced, its content, and care instructions. “Let’s say it’s a fashion item that’s part of a curated outfit,” Yopp says. “It would give you that information. So the sales associate would have all the information that consumers get on the web.” For the consumer, he adds, “it satisfies the desire to bring the web experience into everyday life.”

Seeing What Isn’t Yet There
AR lets users receive information about objects as they look at them. But it’s also a powerful tool for visualizing what objects would look like in a particular context. One example is the IKEA Catalog App, which allows users to point a smartphone or tablet camera at a room and see how a piece of furniture would look there. Thus far, the app has been downloaded more than 10 million times.
In manufacturing, consultancy Tangible Solutions is using a more sophisticated version of the same idea to prototype items before creating them. Customers can draw ideas on a piece of paper and use Tangible’s design software and AR to display the item on top of the drawing on a mobile device screen.
It’s a significant benefit, says Adam Clark, vice president of strategic development at Tangible, because 3-D printing a single prototype can cost Tangible’s customers US$3,000 or more. “We’re cutting that out before they’ve even spent that money, if they just want to see what it’s going to look like,” Clark says. “And we’re not doing rounds and rounds of 3-D prints. We’ve seen an increase in people getting it right the first time.”

The Future of Augmented Reality
What will our world look like when this technology is mature? “Our vision is that at some stage you’ll be able to use a smartphone, glasses, or contact lenses, and look around you. And you will be able to augment a very complex environment,” says Andy Gstoll, CMO at Wikitude, based in Salzburg, Austria. Wikitude provides the Wikitude Software Developer Kit for developers working in AR. “Imagine you are in Times Square and you look around,” he says. “You’ll be able to get information about everyone and everything in your field of vision.”
For this to become reality, AR needs interoperability that—so far—is not quite in place. “One of the problems is that we have silo apps everywhere,” Gstoll says. “But let’s say you’re a content owner. You want to get your content in front of as many eyes as possible. You can publish it to each augmented reality platform, but you will have to adapt your content for different formats.” It’s similar to when web pages had to be optimized for different browsers, he says.
That may be changing soon. Augmented Reality Markup Language (ARML) 2.0 is in the process of being adopted as a general standard for AR programming. ARML 2.0 drives the three biggest AR browsers: Wikitude, Layar, and Junaio. These browsers have demonstrated that they are interoperable among themselves.
Standards adoption is only the first step toward AR going mainstream. “There are technical challenges that we are about to overcome,” observes Thomas Alt, CEO at Munich, Germany–based Metaio, which created the Junaio browser as well as the IKEA Catalog App. “It’s all about processing speed, battery life, and so on. Then there are user behavior challenges, informing people that they have these capabilities in their devices.”
From a long-term industry perspective, Alt finally sees AR fulfilling its potential. “Keep in mind that when we started as a company in 2003, there was one augmented reality application and it required three or four personal computers,” he says. “If you look at an application like the IKEA Catalog App, you can see we are experiencing almost unheard-of uptake.”
Photography by Shutterstock