Hacker News

Seems like there's a high potential for the AI to hallucinate or get confused by ingredient measurements; last I checked, LLMs struggle with numbers.
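One way around this is to keep the arithmetic out of the model entirely: let the LLM handle free text, and scale quantities deterministically in code. A minimal sketch (the parsing pattern and function name are my own, not from any particular recipe app) using exact fractions so "1 1/2 cups" doubled comes out as 3, not 2.9999:

```python
from fractions import Fraction
import re

def scale_quantity(line, factor):
    """Scale a leading quantity like '1 1/2 cups flour' exactly; leave non-numeric lines alone."""
    # mixed number, e.g. "1 1/2 cups flour"
    m = re.match(r"(\d+)\s+(\d+)/(\d+)\s+(.*)", line)
    if m:
        qty = Fraction(int(m.group(1))) + Fraction(int(m.group(2)), int(m.group(3)))
        rest = m.group(4)
    else:
        # plain fraction, e.g. "3/4 cup sugar"
        m = re.match(r"(\d+)/(\d+)\s+(.*)", line)
        if m:
            qty = Fraction(int(m.group(1)), int(m.group(2)))
            rest = m.group(3)
        else:
            # whole number, e.g. "2 eggs"
            m = re.match(r"(\d+)\s+(.*)", line)
            if not m:
                return line  # "a pinch of salt" — nothing to scale
            qty = Fraction(int(m.group(1)))
            rest = m.group(2)
    return f"{qty * Fraction(factor)} {rest}"

print(scale_quantity("1 1/2 cups flour", 2))  # → 3 cups flour
```

The model can still generate or rewrite the recipe text; it just never does the multiplication.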


