In 1997, the world witnessed the defeat of world chess champion Garry Kasparov by IBM’s Deep Blue. On July 25, 2007, Jonathan Schaeffer announced that he had succeeded in solving the game of checkers. Using clever computer algorithms that reduced the number of positions to be searched from the 10²⁰ possible board configurations to 10¹⁴, he was able to prove that checkers (like tic-tac-toe) will always end in a draw if neither player makes a mistake, i.e., makes a less than optimal move. The sheer number-crunching capability of computing machines made it inevitable that humans would look to them to perform cognitive tasks at, or well beyond, their own capability.
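The claim that perfect play forces a draw can be verified directly for a game as small as tic-tac-toe. The sketch below is not Schaeffer’s checkers solver; it is a minimal Python illustration of the same idea, an exhaustive minimax search over the few thousand reachable tic-tac-toe positions (helper names such as `winner` and `value` are illustrative), which reports a game value of 0, i.e., a draw under optimal play by both sides.

```python
# Minimal sketch: exhaustive game-tree search shows tic-tac-toe is a draw
# under optimal play. A board is a 9-character string of 'X', 'O', or ' '.

from functools import lru_cache

LINES = [(0, 1, 2), (3, 4, 5), (6, 7, 8),   # rows
         (0, 3, 6), (1, 4, 7), (2, 5, 8),   # columns
         (0, 4, 8), (2, 4, 6)]              # diagonals

def winner(board):
    """Return 'X' or 'O' if that player has three in a row, else None."""
    for a, b, c in LINES:
        if board[a] != ' ' and board[a] == board[b] == board[c]:
            return board[a]
    return None

@lru_cache(maxsize=None)
def value(board, player):
    """Game value with 'player' to move: +1 X wins, -1 O wins, 0 draw."""
    w = winner(board)
    if w == 'X':
        return 1
    if w == 'O':
        return -1
    moves = [i for i, cell in enumerate(board) if cell == ' ']
    if not moves:
        return 0                      # board full, no winner: draw
    nxt = 'O' if player == 'X' else 'X'
    scores = [value(board[:i] + player + board[i + 1:], nxt) for i in moves]
    return max(scores) if player == 'X' else min(scores)

if __name__ == '__main__':
    print(value(' ' * 9, 'X'))        # prints 0: a draw with optimal play
```

Schaeffer’s proof followed the same logic but required pruning, endgame databases, and years of distributed computation to tame checkers’ vastly larger search space.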
As Toffler had predicted, changes are occurring at an exponential rate in the first decades of the 21st century. Ray Kurzweil observed:
Artificial intelligence is all around us … If all the AI systems decided to go on strike tomorrow, our civilization would be crippled: We couldn’t get money from our bank, and indeed, our money would disappear; communication, transportation, and manufacturing would all grind to a halt. Fortunately, our intelligent machines are not yet intelligent enough to organize such a conspiracy.
Given the triumphs of AI in so many areas that depend on computational speed and accuracy, artificial intelligence was showing itself to exceed human capability by many orders of magnitude. This prompted many people to assert that it would not be long before computers surpassed human intelligence in all domains (a belief referred to as “strong AI”). However, advances in artificial intelligence began to run into great challenges in performing tasks, such as face recognition and the interpretation of metaphor, that come much more easily to humans. Many began to question whether all concepts could be represented digitally, challenging the hypothesis that human thinking could be replicated by computers, an opinion described as “weak AI.”
Though artificial intelligence is ubiquitous in today’s world, some AI researchers suggest that the dream of a sentient computer like the HAL 9000 presented in the movie 2001: A Space Odyssey will never become a reality. Roger Penrose argues that consciousness is distinct from algorithmic processing and that human thought has a non-algorithmic dimension not accessible to computers of any processing power:
There is as much mystery and beauty as one might wish in the precise Platonic mathematical world, and most of this mystery resides with concepts that lie outside the comparatively limited part of it where algorithms and computations reside.
American biologist Edward O. Wilson argues that computers lack the lifetime of interactions that a human accumulates as unknown knowns in the unconscious mind, and therefore, could never successfully mimic human thinking:
To be human, the artificial mind must imitate that of an individual person, with its memory banks filled with a lifetime’s experience–visual, auditory, chemoreceptive, tactile, and kinesthetic, all freighted with nuances of emotion. And social: There must be intellectual and emotional exposure to countless human contacts. And with these memories, there must be meaning, the expansive connections made to each and every word and bit of sensory information given [in] the programs. Without all these tasks completed, the artificial mind is fated to fail Turing’s test. Any human jury could tear away the pretense of the machine in minutes. Either that, or certifiably commit it to a psychiatric institution.
The jury is still deliberating the case of strong AI vs. weak AI. Is there a distinct demarcation between consciousness and intellectual processing, or will artificial intelligence ultimately develop a self-awareness as its power increases? If the latter, machines could develop self-interest, and with it all of the politics that characterize human instincts. As we learn more about human cognition and computer capability, we will be able to make better judgements about the relative merits of the strong and weak AI positions, but the debate goes on as we explore the power of quantum computers.