It is undeniable that computer technology has advanced enormously in recent years. It has evolved so much that some researchers estimate that computers could even "take over the world," becoming more intelligent than all of humanity. And they have even set a year in which this could happen: 2045.
Imagine a computer that can write a great book, solve the mysteries of the universe, and unravel the meaning of life in a matter of seconds. Fiction? We don't know. Fifteen years ago, Deep Blue (an IBM supercomputer) beat Garry Kasparov, the chess grandmaster. It is true that the game follows a purely mathematical logic, something machines handle better than humans do.
Then, in 2011, Watson (another IBM supercomputer) beat the two best human players on Jeopardy!, a question-and-answer game. Questions like "What object, even when broken, is still right twice a day?" are easily solved by the supercomputer. Remember that the answers were not programmed into the machine by humans, nor was it connected to the Internet during the game. What Watson does is collect data from millions of sources, such as websites and books. Even so, its intelligence is still far below that of humans.
However, that should change soon. According to Ray Kurzweil, artificial intelligence researcher and founder of Singularity University, from the year 2045 onward we run a serious risk of facing a technological singularity, in which machines will be able to build ever more powerful new machines in an endless process, without the need for algorithms devised by humans.
Although this is only a hypothesis, Kurzweil points out that we are building computers increasingly similar to our brain, a similarity that comes down to two fundamental qualities.
The first is processing capacity. Scientists estimate that our brain is capable of performing 36.8 quadrillion operations per second, the equivalent of 1 million computers running simultaneously. The most powerful supercomputer today, Sequoia (also from IBM), runs at almost half the capacity of the human brain. Even so, it is not able to do even a tenth of the things we do. Why?
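As a rough sanity check on those figures, here is a back-of-the-envelope sketch in Python. The per-computer performance number is an assumption of ours (not from the article), and Sequoia's throughput is taken as roughly its reported value at the time:

```python
# Back-of-the-envelope check of the figures quoted above (all values approximate).
BRAIN_OPS_PER_SEC = 36.8e15        # 36.8 quadrillion operations per second (the article's estimate)
ORDINARY_PC_OPS_PER_SEC = 36.8e9   # assumed ~37 billion ops/s for an ordinary computer of the era
SEQUOIA_OPS_PER_SEC = 16.3e15      # Sequoia's reported performance, roughly 16 quadrillion ops/s

# How many ordinary computers would it take to match the brain's estimated capacity?
pcs_equivalent = BRAIN_OPS_PER_SEC / ORDINARY_PC_OPS_PER_SEC
print(f"Computers needed to match the brain: {pcs_equivalent:,.0f}")  # ~1,000,000

# What fraction of the brain's estimated capacity does Sequoia reach?
sequoia_fraction = SEQUOIA_OPS_PER_SEC / BRAIN_OPS_PER_SEC
print(f"Sequoia vs. brain: {sequoia_fraction:.0%}")  # ~44%, i.e. "almost half"
```

Under those assumptions the arithmetic lines up with the article's claims: about a million ordinary computers to equal the brain, and Sequoia sitting at a little under half its capacity.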
Here comes plasticity, the second basic quality of the human mind. As the human brain evolved, several of its regions assumed different responsibilities, such as language and control of body parts. Our gray matter is able to adapt as it is used, as if it were designed to always be learning new things. It is plasticity that allows a person to become a great pianist, for example.
The most powerful supercomputers today operate similarly, that is, they are capable of learning new things. And as the plasticity of computer chips develops, there is a chance that one day they will become as complex as our brain. This should happen in 2029, according to Kurzweil's calculations. The technological singularity itself would only begin in 2045, when a single machine would gain greater intelligence than all of humanity.
Although such a computer would be capable of performing many human tasks in a short time, like writing a book or even unifying quantum mechanics with Einstein's relativity (a feat that seems impossible for a human), it could become equally dangerous. Would we be creating an extremely advanced form of life with which we could share our ethical and moral values? Kurzweil thinks so. So much so that in 2008 he founded Singularity University in Silicon Valley (USA), a university that offers graduate courses aimed at training people who can guide the development of artificial intelligence, so that computers have the "common sense" not to kill us and take over the world.
Well, friends, if you were leading the world these days, facing both this power and this danger, what would you do?
So, comment!