Cooper: [When Cooper tries to reconfigure TARS] Humor 75%.
TARS: 75%. Self-destruct sequence in T minus 10, 9, 8...
Cooper: Let's make it 65%.
TARS: Knock, knock.
TARS: I have a cue light I can use to show you when I'm joking, if you like.
Cooper: Hey TARS, what's your honesty parameter?
TARS: 90 percent.
Cooper: 90 percent?
TARS: Absolute honesty isn't always the most diplomatic nor the safest form of communication with emotional beings.
Cooper: Okay, 90 percent it is.
Nolan lovers must surely have understood by now what I am hinting at. For the few of you who haven't watched Interstellar, the above is an excerpt from a dialogue between Cooper, the astronaut, and TARS, the remarkably intelligent space robot. The splendid display of wit and humour by this man-made robot is an amusing glimpse of artificial intelligence from the future.
I don't really want to bore you with the science behind AI. The enthusiastic geeks know how to find the right links to quench their techno-cravings, and the uninitiated and unconcerned will only be pleased by my decision. Now, done with the preface, let me seal off the introduction with a one-line definition, courtesy of Wikipedia:
“AI is the intelligence exhibited by machines or software.”
We are well aware of the roller-coaster advancements in technology. Technology no longer works only as it was designed to; it has an intelligence of its own, smart enough to not always rely on its maker's orders. We are regularly bombarded with news of smarter phones, smarter robots, smarter cars and smarter whatnots! In this context, I can't help but recall the statement made by Edsger Dijkstra, quoted in Mechatronics Volume 2: Concepts in Artificial Intelligence:
“The question of whether a computer can think is no more interesting than the question of whether a submarine can swim.”
Well, think again. Is it human beings who are becoming smarter every day, or the technology they are building? What if AI outsmarts the smartest? What if the robot you bought for your son, who plays with it dearly, begins emoting? What if, green with jealousy, the once-a-toy-now-a-villain robot crafts an evil plan to harm you and your son? Worst of all, what if all the robots join hands and wage war against humanity, the very masters who planted the elixir of intelligence into their systems? One cannot even dare to imagine the consequences, given the advanced drones and war robots being built nowadays.
AI does not hate you, nor does it love you, but what if your atomic building blocks are what your robot is after? Of course, we don't want to find ourselves in a Terminator or an I, Robot sequence. We don't want to see our creation bring an end to our race. AI does carry with it the big promise of a better future, but are the consequences worth the prize?
We do want intelligent computers to help us crack unresolved paradoxes. We do want them to find the fault in our stars! We do want them to enlighten us with a more sophisticated interpretation of science. However, consider this: what if such a powerful tool landed in the wrong hands? Imagine an artificially intelligent computer virus that can smartly modify itself according to its target. AI is an extremely powerful weapon, and any misuse could hazard catastrophic outcomes.
AI certainly has the potential to become the hot cake, the gazillion-dollar medium of power and success. However, the probability of its being used as a weapon of mass destruction stands in stark contrast to its promise as a beacon of hope. The tardy realisation of how lethal Einstein's discoveries could become forced the human race to witness the Second World War in the deadliest way imaginable. Hence the concern. Presently, the scientific community is abuzz with debates, discussions and raised eyebrows over where this technology is headed. We would unquestionably prefer man to control the man-made and not the other way round. Wouldn't we?
One hopes, fingers crossed, that history does not repeat itself.