As Irving Good realised in 1965, machines with superhuman intelligence could repeatedly improve their design even further, triggering what Vernor Vinge called a 'singularity.'
By definition, the Singularity means that machines would be smarter than us, and, in their wisdom, they could innovate new technologies. These innovations would come so quickly, and increasingly quickly, that they would make Moore's Law seem as antiquated as Hammurabi's Code.
Since I was a kid, I have been motivated by the idea that if we invented machines created the way people are - aware, with free will, inventive, machines that would be geniuses - they could potentially reinvent themselves. They would not just apply their intelligence to other things - they could actually redesign themselves.
We're making progress, but getting machines to replicate our ability to perceive and manipulate the world remains incredibly hard.
As machines become more and more efficient and perfect, so it will become clear that imperfection is the greatness of man.
This is a man who was 23 years old when he conceived the idea of a programmable machine, and in doing so, Turing foresaw computers and artificial intelligence. These were revolutionary ideas at the time.
When people speak of creating superhumanly intelligent beings, they are usually imagining an AI project.
Some of the world's greatest feats were accomplished by people not smart enough to know they were impossible.
What seems like a crazy idea today eventually grows. It's a 'with hindsight' thing. One day, someone will turn around and say, 'That was genius.'
There was a failure to recognize the deep problems in AI; for instance, those captured in Blocks World. The people building physical robots learned nothing.
All of the biggest technological inventions created by man - the airplane, the automobile, the computer - say little about his intelligence, but speak volumes about his laziness.