I agree with you in principle; however, their simulations are approximations, not exact replicas of the brain.
From their website FAQ:
Q: Do you believe a computer can ever be an exact simulation of the human brain?
A: This is neither likely nor necessary. It will be very difficult because, in the brain, every molecule is a powerful computer and we would need to simulate the structure and function of trillions upon trillions of molecules as well as all the rules that govern how they interact. You would literally need computers that are trillions of times bigger and faster than anything existing today.
Because the simulation will only be an approximation, certain hard-to-define grammatical properties of spike trains (e.g., "neural codes" as opposed to "rate codes") may be completely lacking from it. Do these "neural codes" exist? At this point we do not know.
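To make the distinction concrete (a minimal sketch of my own, not taken from their FAQ, using made-up spike times): the two spike trains below carry exactly the same rate code over the window, yet have completely different temporal structure, which is where a finer "neural code" would live.

    # Two hypothetical spike trains (spike times in seconds) with identical
    # firing rates but very different temporal patterns.
    spike_train_a = [0.010, 0.012, 0.014, 0.210, 0.212, 0.214]  # bursty
    spike_train_b = [0.040, 0.090, 0.140, 0.190, 0.240, 0.290]  # regular

    window = 0.3  # seconds
    print(len(spike_train_a) / window, len(spike_train_b) / window)  # 20.0 Hz, 20.0 Hz

    # The inter-spike intervals differ completely, so any information carried
    # in precise spike timing distinguishes these trains even though a
    # rate-based approximation treats them as the same signal.
    isi_a = [b - a for a, b in zip(spike_train_a, spike_train_a[1:])]
    isi_b = [b - a for a, b in zip(spike_train_b, spike_train_b[1:])]
    print(isi_a)
    print(isi_b)

An approximation that reproduces the rates would match the first printout while losing everything the second one measures.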
If it turns out that the brain does use complex neural codes, and that they are not accounted for in the simulation, then the simulation can be no more than gibberish that happens to share many low-order statistical features with the real thing.
To illustrate this point, here are some examples of pseudo-English sentences automatically generated by a purely statistical bigram model (an n-gram model with n = 2); a sketch of such a generator follows the examples:
Richard Beesemyers, formerly raised sagging candidate to the Friday officially forecast at the project.
A hearing appeals.
That bank handles most notable exceptions to buy time for reconsideration Wednesday in decades, is the B& Coopers, to discuss international theme for the violin away as the wages of the power plant near classic chemise.
It could have been observing the 65 to back Tuesday.
And fourth, there is a full report, Wagner said.
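For anyone curious how such sentences are produced, here is a minimal sketch of a bigram generator (my own toy example with an invented three-sentence corpus; the sentences above were presumably generated from a much larger news corpus):

    import random
    from collections import defaultdict

    # Tiny stand-in corpus; a real model would be trained on millions of words.
    corpus = (
        "the bank raised the forecast on friday . "
        "the project was discussed at the hearing on wednesday . "
        "the power plant near the bank could have been observing the rules ."
    ).split()

    # For each word, record every word that was ever observed to follow it.
    followers = defaultdict(list)
    for prev, nxt in zip(corpus, corpus[1:]):
        followers[prev].append(nxt)

    def generate(start="the", length=15):
        """Sample each next word only from the words that followed the
        current word in the training text."""
        words = [start]
        for _ in range(length):
            nexts = followers.get(words[-1])
            if not nexts:  # dead end: this word was never followed by anything
                break
            words.append(random.choice(nexts))
        return " ".join(words)

    print(generate())

Every adjacent pair of words in the output occurred somewhere in the training text, so the result is locally plausible but globally incoherent, which is exactly the property the quoted sentences exhibit.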