Hacker News

> You can easily use the OpenAI API with the temp=0 and a predefined seed and you'll get very deterministic results

Does that mean that, in this situation, OpenAI will always give the same wrong answer to the same question?
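For reference, the setup quoted above amounts to a request like the following (a sketch only — the model name and prompt are placeholders, and even with `temperature=0` and a fixed `seed` the API's determinism is best-effort, not guaranteed):

```python
# Request payload for a (mostly) deterministic chat completion,
# assuming the OpenAI chat completions API.
request = {
    "model": "gpt-4o-mini",  # placeholder model name
    "messages": [{"role": "user", "content": "What is 2 + 2?"}],
    "temperature": 0,  # greedy decoding: no sampling randomness
    "seed": 1234,      # predefined seed for best-effort reproducibility
}
```

The same payload would be passed as keyword arguments to the official Python client's `client.chat.completions.create(...)`.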




temp=0 means that no randomness is injected into the response: for any given input you will get the exact same output, assuming the context window is also the same. Part of what makes an LLM feel more like a "thinking machine" than purely a "calculation machine" is that it will occasionally choose a less-probable next token over the statistically most likely one, as a way of making the response more "flavorful" (or at least that's my understanding of why). The likelihood of the response diverging from its most probable outcome is controlled by the temperature.
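A minimal sketch of what temperature does at the sampling step, using toy logits in plain Python (a real model chooses over tens of thousands of tokens, but the mechanism is the same):

```python
import math
import random

def sample_token(logits, temperature, rng=random):
    """Pick a next-token index from raw logits.

    temperature == 0 degenerates to greedy decoding (argmax), so
    repeated calls on the same logits always return the same token.
    """
    if temperature == 0:
        return max(range(len(logits)), key=lambda i: logits[i])
    # Softmax with temperature: dividing logits by a larger temperature
    # flattens the distribution, making less-probable tokens more likely.
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    return rng.choices(range(len(logits)), weights=probs)[0]

logits = [2.0, 1.0, 0.5]  # token 0 is the statistically most likely
print(sample_token(logits, temperature=0))  # always 0: greedy decoding
```

With temperature > 0, tokens 1 and 2 get a real chance of being picked, and raising the temperature further makes those "flavorful" divergences more frequent.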



