1.1 Background
I have always taken pride in an active life and a balanced diet. However, my belly finally reacted to the extended time in front of a computer screen and the nearby kitchen pantry stocked with delicious munchies. My daughter, an internal medicine doctor, was concerned. She asked me if I had sizably decreased my exercising, and I gave my usual pitch about my regular exercise pattern. She was not satisfied with my response and gave me a Fitbit® as a Christmas gift. Within a week of my wearing the Fitbit, she was “taunting” me about my lack of exercise.
Exercising is a very personal activity, and yet it is highly influenced by our social network. People traditionally spend a lot of time making friends to exercise with, and this network is typically limited by geographical proximity. Fitness trackers have extended that concept to the wider social network, even to a worldwide level, using Internet connectivity. Friends and family members can observe each other’s exercise patterns and share comments with each other. Fitness trackers combine five capabilities to build a collaborative exercising behavior.
1. They use sensors to collect event data from me automatically. As long as I wear the Fitbit, it is collecting data. In doing so, it is reasonably accurate and very user friendly. Even my 93-year-old dad is able to use a Fitbit to collect his movement data.
2. The tracker itself is a small, low-energy device connected via Bluetooth®. Once charged, it lasts for a long time and gives me fair warning before running out of battery. A gateway polls the tracker and uploads the data to a central storage.
3. The central storage collates data about me and about my social network. It has the capability for storing raw data, aggregating it over time, comparing it to my social group, and communicating the results to those in my network. Each of us may have a different social network. It provides me with information not only about people connected to me but also about their friends, thereby giving me an opportunity to make new exercise friends.
4. My social network is able to collaborate with me in my exercising by viewing shared exercise data and then, in turn, encouraging me through comments. Fitbit provides us with emoticons (“cheer” and “taunt”) to share with each other. In the example above, my daughter was using a “taunt” that showed up as an SMS text on my cell phone.
5. I can utilize professional consulting offered by the software, which draws on knowledge of medicine and fitness activities. Under Armour’s Record™ app monitors my exercise, weight, calories burned, heart rate, and eating patterns and provides me with expert advice using IBM’s Watson™.
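The aggregation and comparison performed by the central storage (capabilities 2 and 3 above) can be sketched in a few lines of code. This is a minimal illustration, not Fitbit’s actual API or data model; the names and step counts below are invented for the example.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical daily step events synced from each person's tracker
# via a gateway: (person, date, steps). Purely illustrative data.
raw_steps = [
    ("me", "2024-03-01", 4200), ("me", "2024-03-02", 6100),
    ("ana", "2024-03-01", 9800), ("ana", "2024-03-02", 11200),
    ("raj", "2024-03-01", 7400), ("raj", "2024-03-02", 8000),
]

def average_daily_steps(events):
    """Aggregate raw events into an average daily step count per person."""
    per_person = defaultdict(list)
    for person, _date, steps in events:
        per_person[person].append(steps)
    return {person: mean(counts) for person, counts in per_person.items()}

def gap_to_group(averages, me):
    """Compare one member's average with the rest of the social group."""
    group = [avg for person, avg in averages.items() if person != me]
    return averages[me] - mean(group)

averages = average_daily_steps(raw_steps)
gap = gap_to_group(averages, "me")
print(f"My average: {averages['me']:.0f} steps/day, {gap:+.0f} vs. my group")
```

A real tracker backend would add time windows, persistence, and privacy controls, but the core loop is the same: collect events, aggregate per person, and compare each member against their social group so the results can be shared back.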
Potentially, this information could be shared with my primary care physician, who could use it to monitor my health; my health insurance provider, who could give me discounts for healthy living; and sports marketers, who could use it to target campaigns for sports shoes. Many employers are subsidizing the cost of acquiring a Fitbit, Apple Watch, or other monitoring device, as they perceive the value in driving healthy-living programs among their employees. Thus, fitness trackers are enabling both collaborative exercising and health monitoring. In doing so, these Cognitive Things are facilitating a number of information processing activities (sensing, data sharing, comparing, correlating, interpreting, advising, and alerting) to enable and support the related human collaboration to optimize action.
The “Internet of Things” (IoT) represents a growing sophistication among devices. Examples of devices in the IoT network include mobile handsets, refrigerators, cars, fitness trackers, watches, e-readers, vending machines, and parking meters, and the number of device types is likely to grow exponentially over the coming years. These devices are already gathering and communicating massive amounts of data about themselves, which is collated, curated, and harvested by a growing number of smart applications.
Just connecting a device to the Internet does not result in collaboration. The core theme of this book is the identification of cognitive behavior among IoT devices. A network of Cognitive Things uses a new computing paradigm, namely cognitive computing, along with the power of the Internet and the data available from a collection of devices to forge new collaborations and create applications never imagined before.
This chapter introduces the major propositions outlined in this book. It provides a definition of “Cognitive Things” and the scope of devices discussed in this book. It also introduces the concept of cognitive computing, summarizes the chapters, and describes different potential reader personas and the area of focus for each persona.
1.2 What Are Cognitive Things and How Do They Function?
We are on the threshold of a massive explosion of connected things. A McKinsey report projects the potential business impact at $4 to $11 trillion per year by 2025 across nine settings: factories (e.g., preventive maintenance), cities, human (e.g., improving wellness), retail, outside (e.g., self-driving vehicles), work sites, vehicles, homes, and offices.1 There are many other projections, each defining the IoT and projecting its impact in the trillions of dollars.
How does the IoT derive such large business impacts? Let me use an example to show how far-reaching these devices will be in disrupting markets and businesses. Business Week has projected the availability of driverless cars to premium customers by 2025 and is also predicting that driverless technology will take over taxi and ride-sharing fleets by 2030.2 While the driverless car includes a large number of sensors to collect information about the car and the road, the autonomous vehicle is far more than a collection of sensors connected to the Internet. It is actually replacing the driver! From automatic gear changing and cruise control to automated lane detection and automated parking, we have seen a number of ways in which vehicles are beginning to use this data to perform tasks originally performed by drivers. We will see a maturing technology capable of making a series of cognitive decisions to drive the car without requiring a driver.3 Instead of purchasing cars, potential riders may in the future use taxi services to move them from point A to point B, allowing for less public parking and potentially higher vehicular speeds. These changes will have a profound impact on the auto insurance business, car dealers, taxi operations, rental cars, and public transportation. Moreover, they could easily translate to a sizable share of the $11.1 trillion business impact projected by the McKinsey study referenced above.
According to Dr. John E. Kelly III, IBM’s senior vice president, cognitive is the third era of computing. The first, tabulating, began in the late nineteenth century and enabled such advances as the ability to conduct a detailed national census and the United States’ Social Security System. The next era, programmable computing, emerged in the 1940s and enabled everything from space exploration to the Internet. Cognitive systems are fundamentally different. Because they learn from their interactions with data and people, they continuously improve themselves. So cognitive systems never get outdated and only get smarter and more valuable with time. This is the most significant paradigm shift in the history of computing.4
Let me use an encounter to describe a Cognitive Thing. As I discuss the concepts underlying my book with a wide spectrum of people, I get interesting responses, ranging all the way from disbelief to personal encounters. The following story was recounted to me by Dan Abercrombie, CEO of Abletech. The story covers Dan’s personal interaction with a robot developed by research work at Osaka University.5 While many of the features described here seem futuristic, it will likely become commonplace to find robots providing concierge services at hotels, conferences, and musical and sporting events. These concierges will display a range of cognitive capabilities, including:
- natural language comprehension
- empathetic conversation
- facial recognition
- context retention and recall
- knowledge of the organization, its products, and its people
I was walking the expo floor at Semicon and saw the banner of the Fujikin Corporation booth. One of their executive staff, a Mr. Suzuki (fictitious name), is a long-term industry contact and friend. I had a few minutes and decided to try to say hello to Mr. Suzuki. I walked into the booth and asked one of the (human) booth staff (in Japanese, which I speak fluently) if Mr. Suzuki was there that day. To my surprise, a voice coming from a couple of meters to my right said, “Suzuki-san desu ka? Saki made imashita yo.” (Something like, “Oh, you would like to see Suzuki-san? He was here until a few minutes ago.”) I turned around to see a distinguished gentleman dressed as Leonardo Da Vinci sitting on a chair with a microphone. Upon somewhat closer inspection, it turned out that our Da Vinci was a chillingly lifelike robot.
Recovering from my initial surprise, I stepped in front of the robot, and it greeted me cordially in English, made a comment about the booth being very busy and crowded, and said that Mr. Suzuki would be back later. It was clear the robot had recognized my Caucasian appearance and decided that it should switch to English with me. Mildly miffed by Da Vinci’s presumptuousness (how did it know I wasn’t French?), I mischievously decided to switch to Japanese, saying “Sumimasen ga, eigo wo wasurete shimatta.” (“Sorry, but I have forgotten how to speak English.”) Da Vinci’s rejoinder in a thick Osaka dialect came without hesitation: “Ah Ha Ha Ha Ha… Washi ha italia-go wasureta! O-taku nihongo jouzu ya na”; something like, “Ha, ha, ha… I have forgotten my Italian! Your Japanese is really great!”
By this point, I was totally off guard. In less than thirty seconds of interaction, the machine had overheard my initial inquiry, responded appropriately faster than any of the five or six humans in the vicinity, adjusted its choice of language based on my appearance, and then switched languages again when requested, making a natural and spontaneous joke at the same time. I bantered with the robot for another couple of minutes and then went on my way. About five hours later, I strolled by the Fujikin booth and Da Vinci once again, and this time Da Vinci called out to me as I walked in the aisle, “Ah, mata irasshita. Suzuki-san ha asoko ni imasu yo.” (“Oh, you are back. Suzuki-san is over there.”) Suzuki-san, hearing his name mentioned, turned around in surprise and saw me. We had a short conversation and exchanged well-wishes, both of us mixing in plenty of nervous laughter as Da Vinci continued to interject comments into our conversation, and somehow both of us feeling “obligated” to be polite to Da Vinci and explain that we were long-term business acquaintances. Somewhat uncomfortable with Da Vinci participating, I kept my greetings with Suzuki-san cordial but brief and went on my way again, acutely conscious of the reality that the need to deal with thinking machines will become routine in my lifetime.
Robots are beginning to perform the cognitive functions depicted in this encounter: recognizing people, making chit-chat, applying conversational humor, recalling context in a later conversation, and being aware of the environment and the presence of others. In Chapter 10, I will introduce some of these cognitive functions in detail and describe how they are utilized in human conversation with a Cognitive Thing.
What is cognitive computing, and how does it relate to artificial intelligence and expert systems? A cognitive system learns at scale, reasons with purpose, and interacts with humans naturally.6 “Cognitive Computing” refers to automated agents that can learn complex tasks, interact with humans via natural interfaces, and make autonomous decisions and take actions while working with individuals and groups. It represents a new generation of computing systems enabling genuine human–machine collaboration, where the system is able to understand high-level objectives specified by humans in a natural language, autonomously learn how to achieve the objectives from data in the domain, report results back to humans, and iterate the interactions via sequential dialog until the objectives are achieved. To enable a natural interaction between them, cognitive computing systems use image and speech recognition as their eyes and ears to understand the world and interact more seamlessly with humans. They provide a feedback loop for machines and humans to learn from and teach one another. By using visual analytics and data visualization techniques, cognitive computers can display data in a visually c...