Neuroscientist Michael Egnor can be relied on to cut to the heart of materialist confusions. He does so again over at Mind Matters in an exchange with computer scientist Jeffrey Shallit. Egnor is too smart, and too much of a gentleman, to engage in childish insults as Shallit does. Instead, he nails it by noting that Shallit, like many others, falls victim to the mereological fallacy: attributing to a part what belongs only to the whole.
The issue is whether machines, or brains, or any material object, however complex, can “learn.” Egnor:
> Brains are extraordinarily complex, and brain function is a necessary condition for ordinary mental function.
>
> But brains don’t have minds, and brains don’t have knowledge, and brains don’t learn. Reinforcement and suppression of neural networks in the brain are not learning. They are a necessary condition for learning, but learning is an ability of human beings, considered as a whole, to acquire new knowledge, not an ability of human organs considered individually. Human organs don’t “know” or “learn” anything. This error is the mereological fallacy. It is the same mereological fallacy to say that my brain learns as it is to say that my lungs breathe or my legs walk. I learn and I breathe and I walk, using my brain and lungs and legs.
>
> And it is just as much a fallacy to say that machines learn. Human beings learn, using brains, eyes, hands, and books — and machines. We use many things to learn, but only we do the learning, not our organs nor our tools.
Find the whole series of posts here:
- “Can Machines Really Learn?”
- “Machines Really Can Learn!”
- “Do Either Machines — Or Brains — Really Learn?”
Mind Matters is our sister publication, produced by Discovery Institute’s recently launched Walter Bradley Center for Natural and Artificial Intelligence.
Image credit: geralt, via Pixabay.