In 1988, Mark Weiser, Chief Technology Officer of Xerox’s Palo Alto Research Center, coined the term “ubiquitous computing”. 1988, just four years after the launch of the Apple Macintosh, was a time of heady excitement about the potential of desktop computing. The democratization of computing – now called personal computing – and the freedom from mainframes were tantalizing. Weiser went further: he proposed unchaining the computer user even from the desktop. He argued:
“The most profound technologies are those that disappear. They weave themselves into the fabric of everyday life until they are indistinguishable from it.”
We knew that this vision would become reality, one day.
Things started feeling real in 1999, when a paper authored by Pister, Katz, and Kahn argued that, with innovations in silicon technology, power consumption, and networking, the time had come to realize Weiser’s vision of ubiquitous computing by developing tiny mobile nodes called smart dust. Kris Pister even raised funding in Silicon Valley to start a company called Dust Networks, which solved many of the technical challenges he had identified in his seminal paper. However, the target markets – oil & gas, manufacturing, and others – felt that it was still too futuristic. The market was not there, and the company never reached its potential. In addition, in 1999, Mark Weiser passed away from complications of cancer.
For many, 1999 was the year when “the hope for ubiquitous computing died…”
Then, at the SPS IPC Drives 2016 event in Nuremberg, Rexroth launched an IoT gateway and demonstrated how it could be used to connect an original Robert Bosch lathe (built in the 1870s) to the cloud for real-time analytics.
Figure 1: From the demo at SPS IPC Drives event in Nuremberg
This time, the demo felt different… We were no longer talking about futuristic presentations of smart cities, smart cars, or even smart jeans – we were talking about enabling a 150-year-old factory with IoT by using the existing IT infrastructure inside the factory (the broadband), a few cheap sensors (under $10), an access point, and a cloud service that can be set up in minutes.
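To make that concrete, the whole pattern fits in a few lines of code. The sketch below is purely illustrative – the endpoint URL, sensor reading, and machine name are all hypothetical stand-ins (the actual demo used Rexroth’s commercial IoT gateway) – but it shows how little is needed to get a reading from a cheap sensor onto a cloud service over the factory’s existing broadband:

```python
import json
import time
import urllib.request

# Hypothetical cloud ingestion endpoint; a real deployment would use
# the URL and credentials issued by the cloud service.
CLOUD_ENDPOINT = "https://example.com/iot/ingest"

def read_vibration_sensor():
    """Stand-in for polling a cheap (sub-$10) vibration sensor on the lathe.

    A real gateway would read this value over GPIO, I2C, or a radio link;
    here we return a fixed sample for illustration.
    """
    return {"machine": "lathe-01", "vibration_mm_s": 2.4, "ts": time.time()}

def publish(reading, endpoint=CLOUD_ENDPOINT):
    """POST one JSON reading to the cloud service over plain HTTPS."""
    req = urllib.request.Request(
        endpoint,
        data=json.dumps(reading).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    return urllib.request.urlopen(req)
```

A gateway process simply loops over `read_vibration_sensor()` and `publish()` on a timer; the cloud side handles storage and real-time analytics.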
IoT had now become mainstream.
I have by now built many products in the IoT and big data markets – our systems have been standardized for the US Navy (DDS) and used in over 7% of NYSE transactions; our technologies have even moved robots on Mars! But each implementation seemed custom, rarely broad enough to address a market. These days, however, as VP of IoT and Big Data at Oracle, when I visit our customers I see us reaching a tipping point that many in the industry have long been waiting for – a monetizable business opportunity that can be fulfilled with ubiquitous computing technology as it exists today!
Here is why (and we will skip the usual “Google answers” on why IoT is real now):
First, the technologies behind building sensors (the MEMS chip) finally became cheap enough that we can now literally sprinkle sensors across our factory floors. So, yes, silicon became cheap. Most of us in IoT know that by now (and yes, that is a “Google answer”).
Second, IoT technologies have been allowed an incubation period to attract new, world-class talent and to let technologies such as Zigbee, Raspberry Pi, and Bluetooth mature. Yes, sometimes the cooling-off period during the “trough of disillusionment” is necessary. Anecdotally, as a mentor at Stanford University’s startup incubator StartX, I have recently started meeting students creating new sensors and services around sensors. IoT is not just for kids with EE majors anymore; I see more employees with MBAs working in IoT startups than with EE or ME degrees, and that is a good thing (see the upcoming section on creating new business models with IoT).
Third, the sensor ecosystem has matured; it no longer requires extensive support from defense and research funding to survive – and that, again, is a good thing. Concepts evolve faster when markets move them. We have multiple companies competing to sell beacons, RFID tags and readers, and RTLS tags, sometimes at prices cheaper than a hammer! We have solution providers that can economically design sensors customized to your business – whether that is a sensor tracking carbon dioxide in a poultry farm or an industrial-grade temperature sensor that tolerates 800°F (at Oracle, we are using both, for use cases in agriculture and manufacturing)! IoT engineers can now cheaply prototype end-to-end systems with Raspberry Pi boards, and then efficiently deploy them with custom sensors manufactured in small volumes.
None of this was possible until now – the startup capital costs used to be too high to get into the hardware business. Just as cloud infrastructure services and open-source software reduced the expense of starting a new software company, robust, commodity-priced sensing technologies (OS, sensors, networking, power) are now improving the economics of creating commercial-grade sensor networks.
Finally, IoT has started getting attention beyond the futurists. To, unfortunately, borrow a cliché, building a new market is a chicken-and-egg (or multi-sided platform) problem. We need a large number of customers to drive innovation and attract new talent, but customers need to see a compelling value proposition and new offerings to get engaged.
Here is where the value of hype kicks in – line-of-business owners (in factories and elsewhere) are now getting worried about becoming uncompetitive if they do not engage with IoT. What does IoT mean for me? Which startup (the next “Uber”) will disrupt my industry? How do I quickly show some innovation so that the market still views me as relevant and modern? These are questions I now help our customers answer on a weekly basis! Enterprises – in manufacturing, health care, energy, and construction – have started engaging with big data and IoT proactively. Each such engagement gives us the oxygen and the fuel to create transformative technologies, nourish an ecosystem of entrepreneurs, and offer robust, scalable cloud services such as Oracle’s.
This time, it does not feel like 1999…