The data in the carrots

Some argue that the term “computational intelligence” is a misnomer because computing machines can only process data, not information that carries semantic meaning. This is a mistake.

Data becomes information when it satisfies a need. Any system, organic or synthetic, that has no needs processes only data. Give it needs, and the same data becomes information.

What transforms data into information are the needs of a system, not its construction.

Here is a story to illustrate this:

Albert, a municipal clerk living in Baton Rouge, had three pets: a cat, a rabbit, and a robotic vacuum cleaner that regularly plugged itself into available electrical outlets to recharge.

Returning from work, Albert bought a bunch of carrots and some other groceries. Entering his apartment, he said: “Hello, my pets!” Then, as the three pets watched, Albert dropped the bag of groceries in front of the kitchen’s electrical outlet, took the bunch of carrots out, and placed it on the counter. All three saw this. Two with their eyes, one with its camera.

“There is more food in the kitchen; this is useful information,” thought the rabbit.

“I lost a source of energy; this is useful information,” computed the robot.

The cat yawned and returned to his couch. “No information here,” he reflected.

Same data, different information.

3 thoughts on “The data in the carrots”

  1. Dear Jean,
    Are you familiar with John Searle’s argument against the possibility of the capacity for consciousness (or at least–or even– intentionality) in digital systems? (I’m referring to his well-known “Chinese Room” argument.) Do you have a response to this argument (as expressed in his paper ‘Minds, Brains, and Programs’)? On what grounds would the content, I lost a source of energy, be content that the vacuum possesses rather than content we attribute to its functional configuration? (Of course this question presumes there is a real distinction here. If merely being disposed to behave as though the way to the plug is blocked is regarded as “thinking that” the way is blocked, then we deny the distinction. But advocates of “first personal point of view” notions of consciousness want to claim that consciousness, and intentionality even, requires something more and other than a mere functional and dispositional account.)

    1. Hello Ian,

      I did a short video on exactly this topic: More can be said about it but it
