By David Dorf-Oracle on Dec 18, 2014
Remember the first time you tried Shazam on the iPhone? I was blown away. Even with ambient noise the thing was accurate. Then I recall John Yopp, our head of research, saying we should create a fashion Shazam that identifies clothing for people. When you see a cool tie at lunchtime, snap a picture and buy your own. Wait a second, songs are one thing, but fashion would be impossible. Patterns, shadows, creases -- it would never work.
Then I came across GetFugu and Google Goggles, which both made good attempts at recognizing products. Amazon's Flow was also very good, but it heavily leveraged optical character recognition to get hints about the product. I suppose that's fine when shopping in stores, but it wasn't the real-world scenario I was looking for. (Flow has undergone many upgrades over the years and now it can create a shopping list.) Pounce was pretty good at marrying traditional advertising with digital, allowing the user to snap a picture of a product in a circular/flyer and then see the product on the website.
In a Customer Advisory Board meeting, one of my customers showed me a very cool app for recognizing sneakers. NetShoes, a Brazilian e-commerce company, released an app that I found to be very accurate. (I went around the conference snapping pictures of people's sneakers. Luckily it was the last day so most were wearing comfortable shoes for the plane ride home.) I later contacted the engineers and found there was a pretty exhaustive process for training the application to recognize the objects, but it could be used for almost anything given some degree of context.
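To make the idea of "training the application to recognize objects" concrete, here is a minimal sketch of how such a matcher could work in principle: each catalog product is represented by a feature vector (a toy color histogram here), and a query photo's vector is matched to its nearest neighbor by cosine similarity. The product names, vectors, and the histogram approach are all illustrative assumptions, not NetShoes' actual pipeline.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two feature vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

# "Training" step: store one feature vector per catalog product.
# These hypothetical vectors stand in for color histograms extracted
# from product photos; a real system would use far richer features.
catalog = {
    "red-canvas-sneaker": [0.70, 0.10, 0.05, 0.15],
    "blue-running-shoe":  [0.05, 0.15, 0.65, 0.15],
    "white-leather-boot": [0.20, 0.20, 0.20, 0.40],
}

def match(query_vector):
    """Return the catalog product whose vector is closest to the query."""
    return max(catalog, key=lambda name: cosine_similarity(catalog[name], query_vector))
```

A snapshot of a reddish sneaker would yield a query vector close to the first entry, so `match([0.68, 0.12, 0.06, 0.14])` returns `"red-canvas-sneaker"`. The "degree of context" the engineers mentioned maps naturally onto this design: restricting the catalog to one product category (sneakers) keeps the nearest-neighbor search tractable and accurate.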
Five years or so after my first experience with Shazam, I think we're getting closer. Companies like Slyce are investing heavily in the necessary technology, but we've still got a ways to go. I downloaded and tried Neiman Marcus' implementation of Slyce and tested a few handbags. Close but no cigar.