To set the stage, we were first given a brief introduction to the history of Artificial Intelligence at IBM:
1956 - Machine learning emerges as a field of study
1997 - Deep Blue beats Kasparov at chess
2011 - Watson beats human world champions on Jeopardy!
2014 - Watson Oncology is designed to inform clinical decision-making
Over the past several years, tech companies have shifted their focus from programming to machine learning: from coding a computer to coaching or training it. For instance, to train a neural network to recognize a dog, you do not program it to identify whiskers, fur, and eyes. Instead, you feed it thousands of images of dogs so it can learn to identify dogs on its own.
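The coding-versus-coaching idea can be sketched in a few lines. The example below is a toy illustration, not IBM's method: a tiny perceptron learns to separate "dog" from "not dog" from labeled examples, with no hand-written rules about fur or whiskers. The feature vectors and labels are made-up toy data standing in for real image features.

```python
def train_perceptron(samples, labels, epochs=20, lr=0.1):
    """Learn weights from (features, label) pairs; no rules are hand-coded."""
    n = len(samples[0])
    w = [0.0] * n   # weights start at zero and are shaped by the data
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = y - pred                     # 0 when correct, ±1 when wrong
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

def predict(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

# Toy "dog-ness" features: [has_fur, has_whiskers, barks]
samples = [[1, 1, 1], [1, 0, 1], [0, 0, 0], [0, 1, 0]]
labels = [1, 1, 0, 0]  # 1 = dog, 0 = not a dog

w, b = train_perceptron(samples, labels)
print(predict(w, b, [1, 1, 1]))  # a furry, whiskered, barking animal → 1
```

The point is that the decision rule lives in the learned weights, not in the code: show the same loop different examples and it learns a different rule.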
Cognitive learning will also make an augmented world possible, and here's what it will increasingly allow us to do:
1. Engagement: create speech and language capabilities to communicate with users
2. Discovery: glean insights from huge volumes of data
3. Decision: justify a decision based on evidence
4. Exploration: pull unstructured data and provide a 360° view of the result
Artificial Intelligence (AI) is one of the key drivers of the Fourth Industrial Revolution. AI now connects to the Internet of Things (IoT) and is pervasive across industries and functions.
Does that mean AI will eventually eliminate many of our current jobs? It is a possibility, but better jobs will also open up. Moreover, jobs that require a strong grasp of context, perception, or emotions such as empathy will still need a human touch. AI will also serve as an aid to informed decision-making; the decisions themselves, however, will still need to be made by human beings. It is certainly an area of huge possibility and one to watch closely.