
I think a generative transformer can be said to generate text, but it doesn't generate new words. The words it uses to build that text are copied verbatim from its input; what's emergent is the combination of those words into text.

The generated output is made of building blocks (characters, words) that are not themselves "generated" in the sense of being novel.

A random number generator generates numbers, but it doesn't generate the digits used to represent those numbers. Those are taken from a fixed set (such as 0..9) and simply reused.
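The analogy can be made concrete with a toy sketch (purely illustrative, not how any real transformer works): a generator that samples from a fixed set produces novel combinations even though every building block is reused verbatim. The vocabulary and function names here are invented for the example.

```python
import random

# Fixed building blocks: never "generated", only reused.
VOCAB = ["the", "cat", "sat", "on", "mat", "dog", "ran"]
DIGITS = "0123456789"

def generate_text(n_words, rng):
    # The combination is novel; each word is copied verbatim from VOCAB.
    return " ".join(rng.choice(VOCAB) for _ in range(n_words))

def generate_number(n_digits, rng):
    # Same idea: the number is new, but the digits 0..9 are just "used".
    return "".join(rng.choice(DIGITS) for _ in range(n_digits))

rng = random.Random(0)
print(generate_text(5, rng))
print(generate_number(6, rng))
```

Both outputs are (very likely) strings that never appeared in the source sets, yet every character and word in them did.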




