Human Cognition, the Final Frontier
Jan 1, 2014

Despite advances in technology, computers still can’t come close to the power of the world’s most remarkable computer – the human brain.

Computing machines have passed through three phases: the tabulating phase, the programmable phase, and now a new era of computing, the cognitive phase [1]. Tabulating machines performed a fixed task, whereas programmable machines could be reprogrammed to execute different tasks without any change in the hardware. Cognitive machines, however, promise learning and reasoning capabilities.

The possibility of such machines raises a host of questions: Is cognition the ultimate test of conscious existence? What do we know about cognition? How do we define intelligence? The questions go on and on. One thing everyone seems to agree on, however, is that understanding the mechanism of human cognition is the key to developing advanced artificial intelligence and cognitive machines.

Cognitive machines have the ability to learn, and they employ artificial intelligence to "reason." Artificial intelligence has been defined as "the science of making machines do things that would require intelligence if done by men" [2]. The good news is that cognitive computing is no longer an esoteric pursuit of a few futurists. It is, and has been, essential to the operation of many big-data-driven processes. The 21st century has been flooded with data from almost every aspect of our lives, and technology has made it possible to generate that data at an exponential rate. The temperature distribution throughout our buildings, the number of people diagnosed with cancer in the last six months, real-time changes in customer preferences, the ethnic profiles of college applicants, and the top three words trending in online conversations at this very moment are some examples of the kind of data available today.

As the amount of data generated increases, it becomes harder and harder to process and make sense of what is collected. Our "greedy and ambitious" human nature does not want to let any of it go to waste; it wants to use every bit of available data. This is where it becomes imperative to have a computing machine that goes beyond performing pre-programmed tasks and learns as it goes, without human intervention. For this kind of computing, we need cognition.
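As a toy illustration of what "learning as it goes" means (a minimal sketch in Python, not a description of any particular product's algorithm), consider an online learner that adjusts its own parameters from each new data point in a stream, without ever being reprogrammed:

    # Minimal sketch of online learning: a perceptron that adjusts its own
    # weights from each example in a data stream, with no reprogramming.
    # The task and data here are made up for illustration.

    def train_online(stream, lr=0.1):
        w, b = [0.0, 0.0], 0.0
        for x, label in stream:                  # label is +1 or -1
            pred = 1 if w[0]*x[0] + w[1]*x[1] + b > 0 else -1
            if pred != label:                    # learn only from mistakes
                w = [wi + lr * label * xi for wi, xi in zip(w, x)]
                b += lr * label
        return w, b

    # Example stream: points above the line y = x are labeled +1
    stream = [((1, 2), 1), ((2, 1), -1), ((0, 3), 1), ((3, 0), -1)]
    print(train_online(stream))

The program's behavior after training is determined by the data it has seen, not by rules its programmer wrote in advance – which is the essential difference between a programmable machine and a cognitive one.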

The biggest challenge in imitating human cognition is understanding how cognition happens in the brain. We cannot yet even monitor the activity constantly taking place in our own heads, let alone recreate such marvels. Nevertheless, it would be a great achievement to imitate human cognition even partially, and it would open a whole new era in what computing can accomplish. For instance, a cognitive machine could process the entire curriculum of a college degree in a fraction of a second; it could digest the whole of the medical literature in a short period of time and provide human doctors with second opinions on their diagnoses [3].

Although there have been substantial improvements in designing "intelligent" computing machines, mimicking the hardware of the human brain and simulating its decision-making processes pose three fundamental challenges: a software algorithm that implements intelligent behavior, hardware with comparable processing power and memory, and the need for both to be self-adapting and self-improving.

First of all, human intelligence has not been fully characterized – its capabilities and limitations are still unknown. This lack of knowledge makes it difficult, and perhaps even impossible, to reduce such intelligence to smaller, or simpler, modules. Therefore, we don’t have a good handle on how to mimic the human brain in a behavioral sense.

The second major problem is that we are still far from having the hardware on which our "intelligence" software could run. Implementing intelligence in conventional computing machines seems to be a futile undertaking. Programmable machines are no match for human brains: even the fastest supercomputers, taking advantage of thousands of processors, can mimic just one percent of one second's worth of human brain activity, and even that takes 40 minutes [4]. Therefore, cognitive computing machines must adopt a hardware architecture different from that of conventional computers to achieve cognition comparable to a human's. IBM's SyNAPSE chip is one example of brain-inspired hardware with the potential to carry out the required, intensive computations.
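A back-of-the-envelope calculation shows how large that gap is. The sketch below assumes the cost of the simulation scales linearly with the fraction of the brain simulated, which is almost certainly optimistic:

    # Rough estimate of how far the simulation result in [4] is from
    # real-time, whole-brain simulation. Assumes cost scales linearly
    # with the fraction of the brain simulated (a generous assumption).

    simulated_fraction = 0.01        # 1% of the brain's neuronal network
    simulated_seconds = 1.0          # 1 second of brain activity
    wall_clock_seconds = 40 * 60     # the run took 40 minutes

    # Slowdown at 1% scale: seconds of computing per second of activity
    slowdown_at_1pct = wall_clock_seconds / simulated_seconds      # 2,400x

    # Naive extrapolation to the whole brain (100x more network)
    slowdown_full_brain = slowdown_at_1pct / simulated_fraction    # 240,000x

    print(f"1% scale: {slowdown_at_1pct:,.0f}x slower than real time")
    print(f"Whole brain (linear extrapolation): {slowdown_full_brain:,.0f}x")

Even under this generous assumption, such a machine runs hundreds of thousands of times slower than the brain it imitates.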

Lastly, the human brain and its cognitive power are constantly changing. Depending on various factors and experiences, our brains, and with them our cognitive abilities, can improve or deteriorate. Such improvement or deterioration can take the form of a change in the brain's physical structure or in how much of its capacity is utilized [5]. This dynamic flexibility, known as brain plasticity [6], is essential to our intelligence. No self-evolving computing hardware has yet been worked out. However, promising developments have been reported in cognitive computing machines that can learn – that is, machines that can make deductions and reach conclusions that are not preprogrammed.
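The textbook caricature of such plasticity is the Hebbian learning rule, often summarized as "cells that fire together wire together": a connection strengthens in proportion to the correlated activity of the neurons it joins. Here is a minimal sketch, with made-up activity values:

    # Minimal sketch of a Hebbian weight update: the synaptic weight w
    # grows in proportion to the correlated activity of the pre- and
    # post-synaptic neurons. Values here are illustrative only.

    def hebbian_update(w, pre, post, lr=0.05):
        """w: current weight; pre/post: activity levels of the two neurons."""
        return w + lr * pre * post

    w = 0.2
    for pre, post in [(1.0, 0.9), (0.8, 1.0), (0.0, 1.0)]:  # last pair: no change
        w = hebbian_update(w, pre, post)
    print(round(w, 3))  # the repeatedly co-active connection has strengthened

Repeated co-activation strengthens the connection; in real brains, of course, plasticity also involves structural change that no current hardware reproduces.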

Along the way, human supervision will be the ultimate guide in perfecting such imitation. Human cognition, taken for granted in our daily lives, thus remains the final frontier of our thousands-of-years-long technological journey. Once again, the creation sets the boundaries for human development.

Adem G. Aydin holds a PhD in Electrical and Computer Engineering. He works as an engineer scientist at IBM.

References

[1] Virginia Rometty, 2013, http://smarterplanet.tumblr.com/post/32816006311/i-b-m-chief-on-watson-cognitive-computing-and-her

[2] Marvin Minsky, 1968, http://www.akri.org/ai/defs.htm

[3] "WellPoint and IBM Announce Agreement to Put Watson to Work in Health Care", http://www-03.ibm.com/press/us/en/pressrelease/35402.wss

[4] "Largest neuronal network simulation achieved using K computer" http://www.riken.jp/en/pr/press/2013/20130802_1/

[5] William James, The Principles of Psychology, 1890.

[6] Bryan Kolb and Ian Q. Whishaw, "Brain Plasticity and Behavior," Annual Review of Psychology, Vol. 49 (1998): 43-64.