I have a confession to make. I am a technologist at heart. I think fans that talk to thermostats that can then be remotely controlled by iPhones are cool. I get excited thinking about Fitbits linked to diet tracking and cars that can tell me when I am drifting into another lane. The variety of Internet of Things devices and offerings entering the market, and the speed with which they are arriving, continue to impress and intrigue me. Where these systems are headed is positively mind-blowing, and at the epicenter of it all is something very simple and extraordinarily powerful: data.
The Internet of Things will evolve through three not entirely distinct phases. They are all pointed at what I see as the Holy Grail of the IoT: leveraging the utility value of the data. So what is the utility value of the data and why should we care? Let’s go through the evolution.
Closed Loop Systems
Let’s start with basic systems.
- Phase One: A closed loop message response system where a Coke machine is instrumented and informs the owner/operator when it needs to be refilled or the temperature is too high. This is a familiar example everyone understands. It’s cool, but only to a point. The data created by the sensors is simply used in the corresponding application by the servicing operation.
- Phase Two: We are beginning to see examples of enhanced data for richer applications. Here, the historical data on the usage of the machine is collected, along with the historical information on the temperature of the machine, inventory levels, mix, location, and even the forms of payment. By collecting this information, the owner/operator is able to do a much better job understanding what is happening, as well as, to some degree, why it is happening. Now some modeling can begin and, to some extent, optimization of everything from the servicing to the content to the energy used. This is still a closed loop system, but it is now becoming data-rich. The quality of the information gathered, and thus of the decisions that can be made, is substantially improved. All good.
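The two phases above can be sketched in a few lines of code. This is a hypothetical illustration, not any real vending-machine API: the field names (`inventory`, `temperature_c`) and thresholds are assumptions. Phase one is the closed-loop alert; phase two is simply keeping the same readings as history so questions can be asked of them later.

```python
# Hypothetical sketch of phases one and two of an instrumented Coke machine.
from dataclasses import dataclass
from statistics import mean

@dataclass
class Reading:
    machine_id: str
    inventory: int        # cans remaining (assumed unit)
    temperature_c: float

history: list[Reading] = []  # phase two: telemetry retained for analysis

def handle_reading(r: Reading, min_inventory: int = 10, max_temp_c: float = 6.0) -> list[str]:
    """Phase one: emit alerts for the servicing operation to act on.
    The reading is also archived, which is what enables phase two."""
    alerts = []
    if r.inventory < min_inventory:
        alerts.append(f"{r.machine_id}: refill needed ({r.inventory} cans left)")
    if r.temperature_c > max_temp_c:
        alerts.append(f"{r.machine_id}: too warm ({r.temperature_c} C)")
    history.append(r)
    return alerts

alerts = handle_reading(Reading("vend-42", inventory=4, temperature_c=7.5))
handle_reading(Reading("vend-42", inventory=30, temperature_c=4.0))

# Phase two: a simple question the accumulated history can now answer.
avg_temp = mean(r.temperature_c for r in history)
```

The point of the sketch is the structural difference: in phase one each message is consumed and forgotten, while in phase two the same messages accumulate into a record the operator can model against.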
Phase Three: The Holy Grail
Now we get to the “utility value of the data.” Using the Coke machine example, the data is now abstracted from its system of origin. The information generated from the sensors on the Coke machine can be cleansed, enriched, and provided to a master data repository. Once there, it might be combined with weather and environmental data, traffic data, demographic data, health data; the combinations are virtually endless.
The convergence of this variety of data then works to further enhance the original application (in this case the Coke machine operation application). Now other, disparate applications could also consume that combined information and layer in more varieties of data, increasing their effectiveness. This data could service a common set of analytic tools, ranging from operational analytics (what is going on?) to investigative analytics (why is it going on?) to predictive analytics (what will go on?). This could then lead to feeding machine learning algorithms that would ultimately create truly adaptive systems. Finally, the same set, or subset, of underlying data could service enterprise applications such as CRM or ERP, subsequently enhancing their value.
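A minimal sketch of phase three, with made-up numbers throughout: machine telemetry is abstracted from its source and joined with an outside data set (here, invented daily high temperatures standing in for a weather feed), and the enriched repository then serves the three kinds of analytics the text names. The dates, vend counts, and the 30 C "hot day" cutoff are all assumptions for illustration.

```python
# Hypothetical phase-three sketch: enrich machine telemetry with external data.

# Daily vend counts from the machine's sensors (made-up data).
sales = {"2024-07-01": 120, "2024-07-02": 95, "2024-07-03": 160}
# Independently sourced daily high temperatures (also made-up).
weather = {"2024-07-01": 31.0, "2024-07-02": 24.0, "2024-07-03": 35.0}

# The "master repository": sensor data joined with external context.
enriched = [
    {"date": d, "vends": sales[d], "high_c": weather[d]}
    for d in sales if d in weather
]

# Operational analytics: what is going on?
total_vends = sum(row["vends"] for row in enriched)

# Investigative analytics: why is it going on? A crude association between
# heat and sales -- average vends on hot (>= 30 C) vs. cooler days.
hot = [r["vends"] for r in enriched if r["high_c"] >= 30.0]
cool = [r["vends"] for r in enriched if r["high_c"] < 30.0]
hot_avg = sum(hot) / len(hot)
cool_avg = sum(cool) / len(cool)

# Predictive analytics: what will go on? A naive forecast for the next hot
# day, standing in for the machine-learning models the text anticipates.
forecast_hot_day = round(hot_avg)
```

None of the individual steps is sophisticated; the leverage comes from the join. Neither data set alone can answer "why" or "what next," but the combined repository can serve all three questions, and any other application that consumes it.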
Greater Than the Sum of Its Parts
By approaching the baseline data from a utility standpoint, the reach, value, and effectiveness of any single piece of data is multiplied as it contributes to revealing emergent patterns that might otherwise remain opaque. It is a situation where the whole is greater than the sum of its parts. In the Internet of Things, the parts are plentiful, the potential is enormous, and the leverage to be gained by exploiting the utility value of the data is nothing short of tremendous. We don’t yet understand what thresholds we will cross or what discoveries will be made. The adaptive systems of tomorrow may be beyond our imagination today but will be commonplace for our grandchildren, if not our children. It is almost certain that these gains will be directly tied to leveraging the utility value of this data.