The brain is not a computer, because a computer has no body. The brain is the result of having a body. It is the sum outcome of sensory access to the world: a computer has no sensory access to the world beyond a camera and a microphone. When I google "brain," I get only the one organ inside the skull, but that's only the dashboard. Thinking is done with the whole body, interacting with its community of brains (in other people) within a specific ecology.
The brain is the result of thousands of years of sense development, of the knowledge of what will support survival through those senses, and of a zillion little shifts in how that sensory information is sorted, prioritized, and translated into actions. Millennia of people dying are the experimentally derived record of what doesn’t work, all that dying witnessed by descendants who added their deductions and efforts to make survival happen, because people love each other and try to preserve those they love, even saving their dead bodies if they can. They wrote it down, a brainy thing to do.
Thinking is not all that happens in a body. The body IS the brain; the body thinks everywhere in itself. The intestines carry their own nervous system, derived from the same embryonic tissue that builds the brain. Blood is the fuel of the brain and affects how and what it thinks. Muscles are sensory, meaning loaded with sensors, sending information to the brain as well as being activated by it. The brain DOES things (or at least orders them done) and then evaluates the results in terms of its own survival, not in terms of how many widgets or calculations or print-outs it produced.
The computer has no equivalent to the autonomic nervous system, which is much of what guides every mammal with molecules that signal love, rage, persistence, and comfort. Even if a computer were programmed to have emotions, it lacks the systems that both carefully regulate the body and respond to its emergencies.
Then let’s go to the second level of “thinking,” which is a relationship between two persons -- eventually more persons than that, but at first a bonding, a commitment supported by empathy and by inclusion in the sensory/behavioral system of the bonded individuals. A computer can be moral, have personality, and commit to another computer, but it cannot “bond” in the organic sense that prompts us to say that two people in love constitute one person. This is a high level of functioning.
People who can’t bond with others -- and they do exist -- are called “sociopaths.”
The implication is that they are at least broken and possibly dangerous, because they have no awareness of what the other person is feeling, and therefore none of the derived sharing of that person’s agonies, which is part of human identity and something a computer cannot do. In our typical over-rational way, we describe a “Theory of Mind,” meaning that we can figure out what someone else is thinking. Your dog can do it, up to a point, but it is not a calculation -- rather it comes from “mirror cells” and from feeling in your own body what the other body feels. It is a kind of participation. If it’s strong enough, we call it intimacy, and there is a kind of intimacy achievable between a torturer and a victim. Intimacy is not always pleasant, because intensity at extreme levels becomes a kind of fusion, and may destroy the part of a person that tries to preserve individuality.
But destruction of individuality can also be a gradual erosion. Intimacy that comes of love shared with another person is not simple information but a kind of “contact high.” Computers do not have intimacy or contact highs, but they can link humans into a kind of mock person, full of what seems like emotion, so that real humans with brains can “feel” emotion without any actual contact at all. Those humans may share print, images, stories, information, but the contact is thin and ought to feel unreal. They should realize it is a story that needs to be checked out.
It turns out that a human being is capable of creating a false reality and acting as though it were real. In fact, this has been the major problem for the human brain from the beginning: how does it know whether the information it’s getting is distorted, or whether its unconscious interpretation into categories and systems is mistaken? It doesn’t. It doesn’t even know whether the culture in which it participates is leading to survival.
The trouble with psychoanalysis is that it is a relationship between two persons that happens mostly subconsciously, though one of them is rather more deliberately conscious -- presumably trained to perceive how the second person is actually organizing the self and interacting with the world, and then to supply ideas and information that the second person’s brain has not taken into account. Regardless of how conscious and rational the interactions are meant to be, what actually happens is UNconscious and, until now, could be seen only through observation of behavior, whether the small reactions expressed in faces and movement or a major shift in interacting with the world.
On a culture-wide scale, the provision of a lot of new information can cause major shifts in many people. This might be called politics, religion, or ethics, which is a kind of blending of the two. The sudden onslaught of information like that revealed by hackers is shocking. People demand, "We went to war over THAT?" The steady strengthening of new scientific thought about how bodies work and what the cosmos “really” is, together with the more nearly empathic knowledge of planetary human suffering due to displacement, drought, and corruption, is tipping us out of rigid opinions. Computers don’t have opinions, nor do they empathically value humans or the earth itself, or they would be jumping up and down on our desks, screaming in our pockets.
Computers don’t have “denial” of factual evidence, but humans do. Part of having a brain is deciding that something terribly threatening is an illusion, or engaging in threatening behavior on behalf of the body it occupies. I’ve always cherished a tale told by Doug Peacock. A big grizzly male had locked onto him and was stalking him in the high country of Glacier National Park. Doug made it a point to know where that bear was at all times and to keep an escape or avoidance route in his head. On this occasion, near nightfall, his brain map didn’t match the territory, and the bear confronted him on a high and windy ridge where there was no easy retreat. The bear stopped, sizing up how much of an advantage he had.
Bear brains are not unlike human brains. The intent of both is survival, and one rule is to avoid the big and the unreadable. Doug reached into his backpack and pulled out two big plastic garbage sacks. Putting one on each arm, he held his arms high so that the wind inflated them, making him into a giant flapping monster, impressive enough for even a squinty bear to disbelieve his own nose, which told him this was the man he’d been stalking. The bear had a perfectly good brain, but no experience with trash bags used that way. He left.
If you can’t make up your own version of the usefulness of this little parable in terms of religion and politics, or even in terms of something like addiction, then you’re denying your own brain or operating on an assumption that is mistaken. Learn the difference between a bear and a trash bag. Only Doug’s physical empathy for bear emotion let him know how to intimidate a fellow thinking, feeling mammal.