> Unlike tokenization, BLT has no fixed vocabulary for patches.
If I understand correctly, this means the vocabulary of patches is not known prior to training.
My guess is that once training has established a vocabulary of patches, that same fixed vocabulary is used at inference (if that's not true, I don't see how it could work).
Right?
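
To make my mental model concrete, here's a toy sketch of what I'm imagining: a patch vocabulary that grows during training and is then frozen for inference. All names (`PatchVocab`, `train_step`, `inference_step`) are made up for illustration; I'm not claiming this is how BLT actually works, it's just the assumption behind my question.

```python
from typing import Dict, List

class PatchVocab:
    """Hypothetical vocabulary of patches discovered during training."""

    def __init__(self) -> None:
        self.patch_to_id: Dict[bytes, int] = {}

    def add(self, patch: bytes) -> int:
        # During training, new patches get added as they are encountered.
        if patch not in self.patch_to_id:
            self.patch_to_id[patch] = len(self.patch_to_id)
        return self.patch_to_id[patch]

    def lookup(self, patch: bytes) -> int:
        # At inference the vocabulary is frozen: only patches seen during
        # training can be looked up (this is the part I'm unsure about).
        return self.patch_to_id[patch]


def train_step(vocab: PatchVocab, patches: List[bytes]) -> List[int]:
    # Training: the vocabulary grows as patching carves up the byte stream.
    return [vocab.add(p) for p in patches]


def inference_step(vocab: PatchVocab, patches: List[bytes]) -> List[int]:
    # Inference: the same, now-fixed vocabulary is reused.
    return [vocab.lookup(p) for p in patches]


vocab = PatchVocab()
train_step(vocab, [b"the ", b"cat ", b"sat"])
print(inference_step(vocab, [b"the ", b"cat "]))  # [0, 1]
```

If the paper instead means that patches never go through any vocabulary lookup at all, even after training, then my sketch above is exactly the misconception I'd like corrected.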