> I see the problem is that people look at the output of these things that are (say) 70% correct and in their mind they fill in the other 30%.

Q: Is there also some element of survivorship bias in the mix?

If you prompt GPT-3 with something and the answer is garbage, you probably don't write it up on your blog. If you get something that makes sense, then you do.

That's true for most people. It's the opposite for me!