most of what passes for "understanding" (where 'our culture' is the agent actually doing the 'understanding') is really compression of information (abstraction being the form the compression takes)
I thought about this possibility years ago, and the more I see of what neural nets are doing, the more certain I am that I'm onto something (not that it makes any meaningful difference to me, i.e. being right about what these deep models are is useless to me)
in any case, yeah, sure: neural nets are a kind of lossy compression, but hardly anybody thinks about them this way.
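here's a toy sketch of what I mean (every name, shape, and number below is made up purely for illustration, it's not anyone's actual model): a linear autoencoder squeezing 8-dim data through a 2-dim bottleneck, trained with plain backprop. the reconstruction is lossy on purpose.

```python
import numpy as np

rng = np.random.default_rng(0)

# fake data that mostly lives near a 2-dim subspace of an 8-dim space
latent = rng.normal(size=(500, 2))
mixing = rng.normal(size=(2, 8))
X = latent @ mixing + 0.05 * rng.normal(size=(500, 8))

# encoder/decoder weights: 8 numbers in -> 2 kept -> 8 reconstructed
W_enc = 0.1 * rng.normal(size=(8, 2))
W_dec = 0.1 * rng.normal(size=(2, 8))

lr = 0.01
for _ in range(2000):
    code = X @ W_enc        # compress: throw away 6 of 8 dimensions
    X_hat = code @ W_dec    # reconstruct from what was kept
    err = X_hat - X         # the part the compression lost
    # plain backprop through the two linear layers (squared-error loss)
    grad_dec = (code.T @ err) / len(X)
    grad_enc = (X.T @ (err @ W_dec.T)) / len(X)
    W_dec -= lr * grad_dec
    W_enc -= lr * grad_enc

X_hat = (X @ W_enc) @ W_dec
print("mean squared reconstruction error:", np.mean((X_hat - X) ** 2))
# never exactly zero: the bottleneck makes the compression lossy by design
```

worth noticing: the thing that actually finds the compressed representation in there is backpropagation.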
and my point is that creating abstract theories which explain lots of things (physics, say) is also this kind of 'lossy compression'.
it's over these theories that we say we "understand" stuff: it means we can recall things about whatever the theories describe, reconstruct scenarios, and predict the outcomes if/when the scenarios match up.
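same idea on the theory side, again as a made-up toy (all the numbers are invented): compress a thousand noisy measurements into a two-parameter 'law', then use the law to predict a case that was never stored.

```python
import numpy as np

rng = np.random.default_rng(1)

# a thousand noisy measurements of some process that is secretly linear
t = rng.uniform(0, 10, size=1000)
y = 3.0 * t + 1.5 + rng.normal(0.0, 0.2, size=1000)

# the "theory": the whole dataset compressed down into two numbers
slope, intercept = np.polyfit(t, y, 1)

# "understanding" in the sense above: reconstruct/predict from the theory
t_new = 42.0                                     # a case we never measured
print("predicted:", slope * t_new + intercept)   # ~127.5
# we kept 2 floats instead of 2000; the residual noise is what we lost
```

two floats instead of two thousand, and what got thrown away is exactly the part the theory calls noise.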
maybe I'm gearing up to say that 'backpropagation' is a creative action?
shrugs