I think a generative transformer can fairly be said to generate text. It doesn't generate new words: the words it emits are drawn verbatim from a fixed vocabulary. What's emergent is the combination of those words into text.
The generated output has building blocks (characters, words) which are not themselves "generated" in the sense of being novel.
A random number generator generates numbers. It doesn't generate the digits used to represent the random numbers. Those are taken from a set (such as 0..9) and just "used".
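The analogy can be made concrete with a toy sketch (the vocabulary and function here are made up for illustration): every token in the output is reused verbatim from a fixed set, and only the sequence is new.

```python
import random

# Fixed "building blocks": none of these are invented by the generator.
vocabulary = ["the", "cat", "sat", "on", "a", "mat"]

def generate_sentence(n_words, seed=None):
    """Emit a sequence of words, each drawn verbatim from the vocabulary."""
    rng = random.Random(seed)
    # Each word is merely "used"; novelty lives in the combination.
    return " ".join(rng.choice(vocabulary) for _ in range(n_words))

sentence = generate_sentence(5, seed=42)
print(sentence)
# Every word in the output already existed in the vocabulary.
assert all(word in vocabulary for word in sentence.split())
```

The same point holds for a random number generator: the digits 0..9 are reused, and only the number they spell out is generated.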