
Not really. Depends entirely on how general-purpose (abstract) the learned concept is.

For example, detecting the possible presence of a cavity inside an object X, and whether that cavity is large enough to hide another object Y. Learning generic spatial properties like that can greatly improve a whole swath of downstream prediction tasks (e.g., in a transfer-learning sense).




That's exactly the problem: the learned "concept" is not general-purpose at all. It's (from what we can tell) a bunch of special cases. While the AI may learn, as special cases, cavities inside cardboard boxes, barrels, and foxholes, it still has no general concept of a cavity, nor does it have a concept of "X is large enough to hide Y". This is what children learn (or maybe innately know), but which AIs apparently do not.


> It still has no general concept of a cavity, nor does it have a concept of "X is large enough to hide Y". This is what children learn (or maybe innately know), but which AIs apparently do not.

I take it you don't have any hands-on experience in the field, because I've built systems that detect exactly such properties: either directly, through their mathematical constructs (sometimes literally via a single OpenCV function call), or through deep classifier networks. It's not exactly rocket science.
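
For the curious, here is a minimal 2D sketch of that kind of check, assuming binary masks for X and Y. The hole detection via contour hierarchy and the bounding-box/area heuristic are just illustrative; they are not a description of the systems mentioned above.

  import cv2
  import numpy as np

  def find_cavities(mask_x: np.ndarray) -> list:
      """Return contours of holes (candidate cavities) in a binary uint8 mask of object X."""
      # RETR_CCOMP builds a two-level hierarchy: outer boundaries and the holes inside them.
      # (OpenCV 4.x signature: returns contours, hierarchy.)
      contours, hierarchy = cv2.findContours(mask_x, cv2.RETR_CCOMP, cv2.CHAIN_APPROX_SIMPLE)
      if hierarchy is None:
          return []
      hierarchy = hierarchy[0]  # shape (n, 4): [next, prev, first_child, parent] per contour
      # A contour with a parent (parent index != -1) is a hole, i.e. a candidate cavity.
      return [c for c, h in zip(contours, hierarchy) if h[3] != -1]

  def can_hide(cavity, mask_y: np.ndarray, margin: float = 1.0) -> bool:
      """Crude necessary-condition test: is the cavity's bounding box and area at least as large as Y's?"""
      _, _, cw, ch = cv2.boundingRect(cavity)
      ys, xs = np.nonzero(mask_y)
      if len(xs) == 0:
          return True  # an empty object fits anywhere
      yw = xs.max() - xs.min() + 1
      yh = ys.max() - ys.min() + 1
      fits_box = cw >= margin * yw and ch >= margin * yh
      fits_area = cv2.contourArea(cavity) >= margin * len(xs)
      return fits_box and fits_area

  # Usage: mask_x, mask_y are binary uint8 images of objects X and Y.
  # cavities = find_cavities(mask_x)
  # could_hide_y = any(can_hide(c, mask_y) for c in cavities)

This only says Y might fit (a bounding box and area check is necessary, not sufficient), which is the point: the geometric property itself is easy to compute once you have segmentations.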



