
AI doesn't even pretend to get anything 'right'. As far as I understand it, it produces statistically common phrases related to the subject words.
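That "statistically common phrases" intuition is roughly next-token prediction: the model repeatedly samples a likely next word given the words so far, with no lookup or verification step anywhere. A toy Python sketch of that sampling loop (the words and probabilities below are invented for illustration, not taken from any real model):

  import random

  # Invented next-word distributions; a real model learns these from text.
  next_word_probs = {
      ("caffeine", "has"): [("four", 0.4), ("two", 0.35), ("several", 0.25)],
  }

  def sample_next(context):
      # Pick a next word in proportion to its probability. Note there is
      # no fact-checking step: fluency is the only criterion.
      words, weights = zip(*next_word_probs[context])
      return random.choices(words, weights=weights)[0]

  print(sample_next(("caffeine", "has")))  # plausible-sounding, never verified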

I asked for a chemical-bond description of caffeine. It began with the formula C8H10N4O2 (which is in fact correct), then proceeded to blather about the number of bonds of this type and that, getting absolutely everything wrong, egregiously wrong.
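For anyone who wants to check that kind of output, the real bond counts are easy to compute with RDKit (assuming it is installed; the SMILES string below is the standard one for caffeine, and the exact tallies depend on RDKit's aromaticity perception):

  from collections import Counter
  from rdkit import Chem

  # Parse caffeine from its standard SMILES into a molecule graph.
  caffeine = Chem.MolFromSmiles("CN1C=NC2=C1C(=O)N(C(=O)N2C)C")

  # Tally bonds by type, as ground truth to compare against the AI's claims.
  print(Counter(str(b.GetBondType()) for b in caffeine.GetBonds()))
  print(caffeine.GetNumBonds(), "bonds total")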

I believe it was just spouting chemistry-sounding verbiage lifted from descriptions of the chemical bonds of other substances.

AI doesn't 'look things up' or 'disgorge something it's heard'. It makes shit up wholesale from fragments, always spouting it out with complete confidence.
