
> The various measures of "structure" that we have today (Shannon entropy, algorithmic information content, etc.) fall short for measuring the kind of hierarchical structure (multiple levels of function composition) that is prevalent in high-dimensional problems of interest.

You don't think Kolmogorov complexity captures hierarchical structure? (Ignoring the fact that it isn't computable).




As you correctly point out, it's not computable, so we can't use it for measuring anything in practice. But let's ignore that.

The issue I have with it is that it theoretically measures the lossless compressibility of some sequence of bits, instead of the kind of "hierarchical decomposition that is robust to data loss" that I'm talking about here.
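To make the "lossless compressibility" notion concrete: the usual computable stand-in for Kolmogorov complexity is the output length of a general-purpose compressor, which upper-bounds it up to an additive constant. A minimal sketch (the specific byte strings are just illustrative assumptions):

```python
import os
import zlib

# Two 10,000-byte sequences: one highly structured, one random.
structured = b"ab" * 5000
random_bytes = os.urandom(10000)

# Compressed length is a crude, computable upper bound on
# Kolmogorov complexity: structure compresses, randomness doesn't.
print(len(zlib.compress(structured)))    # much smaller than 10000
print(len(zlib.compress(random_bytes)))  # close to 10000
```

Note that this measures exact reconstructability of the bit sequence, which is precisely the property that doesn't capture the lossy, part-based structure discussed below.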

Think about it this way: we can randomly zero out half the pixels in a cat photo, and a human being can still see it's the same cat, because it's composed of parts that "approximately look like" cat parts (pointy ears, whiskers, etc.), and those parts in turn look like cat parts because they're composed of things that "approximately look like" cat part parts, all the way down to pixels.
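The corruption step itself is trivial to write down; a sketch with numpy, where the "cat photo" is just a random array standing in for real image data (an assumption for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for a cat photo: a 64x64 grayscale image
# with no zero-valued pixels, so zeros below are all corruption.
photo = rng.integers(1, 256, size=(64, 64), dtype=np.uint8)

# Randomly zero out roughly half the pixels.
mask = rng.random(photo.shape) < 0.5
corrupted = np.where(mask, 0, photo)

print((corrupted == 0).mean())  # roughly 0.5
```

What's easy is producing the corruption; what's hard, and what the parent comment is pointing at, is a measure that predicts the corrupted image remains recognizable.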

AFAIK, there's no way of measuring whether cat photos have more or less of this kind of robust-to-loss hierarchical structure than, say, dog photos, or fruit photos.



