Hacker News

Just started using Gemini and it has never been correct. Literally not once. It's just slightly better than a Markov chain.



Which model? And could you share an example of some of the things you've asked it and gotten wrong answers for?


Gemini. I asked it to remember my name. It said it would. I then asked it what my name was. It responded that it can't connect to my workspace account. It did this twice.

I asked it what was in a picture. It was a blue stuffed animal. It described it as such. I asked it what kind of animal it thought it was supposed to be. It responded with "a clown fish because it has a black and white checkerboard pattern". It was an octopus (at least it got a sea creature?).

I asked it for directions to the closest gas station. It wanted to take me to one over a mile away when there was one across the street. I asked why it didn't suggest the one nearest to me. It responded with "I assumed proximity was the primary criteria" and then apologized for calling me names (it didn't).

This model is bonkers right now.



