I will argue that in the literal sense the programmed computer understands what the car and the adding machine understand, namely, exactly nothing.
My car and my adding machine understand nothing: they are not in that line of business.
The computer would do anything you programmed it to do.
People don't understand computers. Computers are magical boxes that do things. People believe what computers tell them.
Every new car, you open the door, and you look at all those internal mellifluous swoopy bits, and they have no meaning.
We have to make machines understand what they're doing, or they won't be able to come back and say, 'Why did you do that?'
Computers, like automobiles and airplanes, do only what people tell them to do.
I don't really know much about cars.
You can know or not know how a car runs and still enjoy riding in a car.
We can't really know ourselves because we have not created ourselves. But we can know computers, we can know cars, because anything that we made, we can understand.
We often attribute 'understanding' and other cognitive predicates by metaphor and analogy to cars, adding machines, and other artifacts, but nothing is proved by such attributions.