
At first I was upset and disappointed by LLMs' hallucination problem, but I have since come to terms with it. I no longer rely on them as I would a reference book or a search engine; instead, I use them primarily for tasks that do not have a single right or wrong answer.

One such use is translation assistance. While the translation of an individual word or phrase might be right or wrong, any longer text can be translated in many different ways, none of which is strictly correct or incorrect, only better or worse. I translate professionally, and now when I am translating a text I first explain its origin and purpose to several LLMs (as of yesterday: ChatGPT-4o, Claude 3 Opus, and Gemini 1.5 Pro) and have each translate it for me. I borrow from those three translations for my first draft, which I then spend plenty of time polishing and comparing against the source text. Overall, my translation time is about a third shorter when I use LLMs. The quality of the final product is also somewhat improved, as the LLMs sometimes suggest better renderings than I would come up with on my own.
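
For anyone who would rather script that first step than paste the same request into three chat windows, here is a rough sketch in Python. It assumes API access to all three providers; the example context, placeholder source text, and model IDs are illustrative, not my actual workflow (I work through the chat interfaces).

    # Rough sketch only: gather draft translations from three models.
    # Assumes OPENAI_API_KEY, ANTHROPIC_API_KEY, and GOOGLE_API_KEY are set.
    import os
    from openai import OpenAI
    import anthropic
    import google.generativeai as genai

    context = "This text is a museum brochure for general visitors."  # illustrative
    source_text = "..."  # the text to be translated goes here

    prompt = (
        f"{context}\n\n"
        f"Please translate the following text into English:\n\n{source_text}"
    )

    gpt = OpenAI().chat.completions.create(
        model="gpt-4o",
        messages=[{"role": "user", "content": prompt}],
    )

    claude = anthropic.Anthropic().messages.create(
        model="claude-3-opus-20240229",
        max_tokens=4096,
        messages=[{"role": "user", "content": prompt}],
    )

    genai.configure(api_key=os.environ["GOOGLE_API_KEY"])
    gemini = genai.GenerativeModel("gemini-1.5-pro").generate_content(prompt)

    # Three draft translations to borrow from for the first human draft.
    drafts = [
        gpt.choices[0].message.content,
        claude.content[0].text,
        gemini.text,
    ]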

Another useful, though sometimes frightening, application is brainstorming. I am currently teaching a discussion-based class to a group of 25 undergraduates. Last weekend, I had to come up with discussion topics for the next class session based on the reading text (a book I edited) and the homework assignments the students have submitted so far. I started doing what I normally do, which is to read those assignments and try to think up new topics that the students would find interesting and useful to discuss in class. I expected to spend most of the weekend on that.

But then I remembered the huge context window of Gemini 1.5 Pro. So I uploaded all of the information about the class—the textbook, my notes and the Zoom transcripts for each class session, and all of the papers that the students have submitted so far—and I asked it to generate 50 new discussion topics as well as 20 ideas for the next homework assignment.
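
For anyone curious what that request looks like done programmatically, here is a minimal sketch using the google-generativeai Python SDK. The file names and prompt wording below are placeholders, not my actual course materials.

    # Minimal sketch: load course materials into Gemini 1.5 Pro's long
    # context window and ask for topic ideas. File names are placeholders.
    import google.generativeai as genai

    genai.configure(api_key="YOUR_API_KEY")  # assumes a Gemini API key

    materials = [
        genai.upload_file("textbook.pdf"),
        genai.upload_file("class_notes_and_zoom_transcripts.txt"),
        genai.upload_file("student_papers.pdf"),
    ]

    model = genai.GenerativeModel("gemini-1.5-pro")
    prompt = (
        "Based on these course materials, generate 50 new discussion topics "
        "for the next class session and 20 ideas for the next homework assignment."
    )
    response = model.generate_content(materials + [prompt])
    print(response.text)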

In a couple of minutes, Gemini was able to produce the requested topics and ideas. A handful were off the mark or showed signs of hallucinations, but overall they were excellent—better and much more varied than what I could have thought up by myself. It really cut to the core of my identity as a teacher. I ended up not using any of those ideas this time, but I did share them with the students to show them what is now possible with LLMs. They seemed a bit stunned, too.



