Neuroscience & Mind
George Gilder: Can a Computer Model a Brain?
Our colleague George Gilder is out with a new book on the potential and the “apocalyptic” hype of artificial intelligence, Gaming AI: Why AI Can’t Think but Can Transform Jobs. You can get it free here. From Mind Matters, “The Brain Is Not a Computer and Big Data Is Not a Big Answer”:
In Gaming AI, George Gilder sets out six assumptions generally shared by those who believe that, in a Singularity sometime soon, we will merge with our machines.
Some of these assumptions seem incorrect, and all of them are certainly open to discussion. So let’s look at the first one:
• “The Modeling Assumption: A computer can deterministically model a brain.” (p. 50)
That would be quite difficult because brains don’t function like computers:
“As neuroscientist Yuri Danilov said last year, ‘Right now people are saying, each synaptic connection is a microprocessor. So if it’s a microprocessor, you have 10¹² neurons, each neuron has 10⁵ synapses, so you have… you can compute how many parallel processing units you have in the brain if each synapse is a microprocessor.
‘But as soon as you assume that each neuron is a microprocessor, you assume that there is a programmer. There is no programmer in the brain; there are no algorithms in the brain.’”
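The back-of-the-envelope arithmetic in the quote can be sketched in a few lines. This is only an illustration of the count Danilov describes (roughly 10¹² neurons times 10⁵ synapses each), not an endorsement of the synapse-as-microprocessor assumption the quote goes on to reject:

```python
# Back-of-the-envelope count from the quote: if the brain had roughly
# 10^12 neurons, each with about 10^5 synapses, and each synapse were
# treated as a "microprocessor", the number of parallel processing
# units would be their product.
neurons = 10**12
synapses_per_neuron = 10**5
parallel_units = neurons * synapses_per_neuron
print(f"{parallel_units:.0e}")  # prints 1e+17
```

Even taken at face value, the figure only counts hypothetical units; as the quote notes, it says nothing about who, or what, would program them.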
If there is a programmer, it’s your mind, which works through your brain, often in unexpected ways. But all proposals to give computers minds have failed.
The comparisons offered by experts are not useless. They often borrow ideas from computer science to explain how some aspect of the brain works. But it helps to see that these ideas are only partial. The brain, for example, is not billions of little computers, nor is it a meat computer. It vastly exceeds the most powerful computers in efficiency, but it is not directed to calculation as such. That is why computers can so easily beat us at calculations, not because they are smarter than we are.
Read the rest at Mind Matters, published by Discovery Institute’s Bradley Center for Natural and Artificial Intelligence.