The human brain also forgets, which may be a feature instead of a bug. Also, beyond compression, brains are simulation machines: they imagine new scenarios. Curious to understand whether ML provides anything analogous to simulation that isn't rote interpolation.
I don't think that's true. I can imagine a lot of aspects of systems around me that I cannot possibly experience in any way, except maybe through their leading to some outcome that I might experience as well. I sometimes do verify this experimentally, but that comes later.
GP said something about compression being a component of intelligence, the parent said brains also simulate, and then I said yeah, I agree, and that I believe the content of the simulation is the experience itself, whereas people often think there are two things: themselves and the world. I don't believe that at all. There is only one thing happening: the experience you are having now.
Ah ok, that doesn't work for me because I want to call dreams and hallucinations experiences as well. And those are disconnected from some idea of reality.
The primary aspect is first person perspective, not congruence with the external world.
I am quite a novice in ML topics, but isn't the concept of simultaneously training a generator and a validator sort of like this?
I don't know the exact term, but I'm thinking of deep-fake generators with an accompanying deep-fake recognizer working in tandem, constantly bettering each other?
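If it helps, the setup you're describing is usually called a GAN (generative adversarial network). Here's a rough sketch of the training loop, assuming PyTorch, with toy data and layer sizes purely for illustration (not anyone's actual setup):

    # Minimal GAN sketch: a generator and a discriminator trained against each other.
    # Assumes PyTorch; the "real" data here is just a shifted Gaussian stand-in.
    import torch
    import torch.nn as nn

    latent_dim, data_dim = 16, 2

    # Generator: maps random noise to fake samples.
    G = nn.Sequential(nn.Linear(latent_dim, 64), nn.ReLU(), nn.Linear(64, data_dim))
    # Discriminator ("recognizer"): scores how real a sample looks.
    D = nn.Sequential(nn.Linear(data_dim, 64), nn.ReLU(), nn.Linear(64, 1))

    opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
    opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
    bce = nn.BCEWithLogitsLoss()

    for step in range(1000):
        real = torch.randn(64, data_dim) * 0.5 + 2.0   # stand-in "real" data
        fake = G(torch.randn(64, latent_dim))

        # Train D: label real samples 1, generated samples 0.
        d_loss = bce(D(real), torch.ones(64, 1)) + bce(D(fake.detach()), torch.zeros(64, 1))
        opt_d.zero_grad(); d_loss.backward(); opt_d.step()

        # Train G: try to make D label the fakes as real.
        g_loss = bce(D(fake), torch.ones(64, 1))
        opt_g.zero_grad(); g_loss.backward(); opt_g.step()

The key bit is the two opposing losses: the discriminator is rewarded for telling real from fake, and the generator is rewarded for fooling it, so each improves as the other does.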
>In fact, she was not very good at memorizing anything at all, according to the study published in Neurocase.[1] Hyperthymestic individuals appear to have poorer than average memory for arbitrary information.
Absolutely. Generative methods are all the rage now. Those methods work by learning information-rich representation spaces. You could argue it's still "interpolation", but instead of interpolating in data space per se, you are interpolating in representation space.
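To make the distinction concrete, a rough sketch of the two kinds of interpolation (assumes PyTorch and a hypothetical trained autoencoder with `encoder` / `decoder` modules; not a claim about any specific model):

    # Data-space vs. representation-space interpolation between two samples x_a, x_b.
    import torch

    def interpolate_data_space(x_a, x_b, steps=8):
        # Naive blend directly between the raw samples.
        alphas = torch.linspace(0, 1, steps).view(-1, 1)
        return (1 - alphas) * x_a + alphas * x_b

    def interpolate_latent_space(x_a, x_b, encoder, decoder, steps=8):
        # Encode both samples, blend in the learned representation, then decode.
        # `encoder` and `decoder` are hypothetical modules from some trained autoencoder.
        z_a, z_b = encoder(x_a), encoder(x_b)
        alphas = torch.linspace(0, 1, steps).view(-1, 1)
        z_blend = (1 - alphas) * z_a + alphas * z_b
        return decoder(z_blend)

Blends taken in the learned latent space tend to decode into plausible intermediate samples, whereas raw data-space blends often just look like the two inputs superimposed.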