Yuck though! Imagine if you were writing a compiler. Would you make it accept “unsinged”, “unnsigned”, “unssined”, and “unsined” as keywords, just to catch spelling mistakes? I'm not sure I like that pattern.
It's a little different in that case, since the person using the parser is also the person writing the input to the parser. So if the parser rejects the input, the author of the code can simply correct it. As I understand it, there's no single standard that captures how all robots.txt files are formatted, so there's no "standard parser" that the authors of these files could be expected to pass.
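For what it's worth, the lenient-parsing pattern being debated here can be sketched pretty cheaply with fuzzy matching instead of hand-listing every misspelling. This is a hypothetical illustration, not how any real robots.txt parser actually works; the directive list and the `0.8` cutoff are my own assumptions:

```python
from difflib import get_close_matches

# Canonical robots.txt directive names (illustrative subset, my assumption).
DIRECTIVES = ["user-agent", "disallow", "allow", "sitemap", "crawl-delay"]

def normalize_directive(raw: str, cutoff: float = 0.8):
    """Map a possibly misspelled directive to its canonical form.

    Returns the canonical name, or None if nothing is close enough.
    Hypothetical sketch of the lenient-parsing pattern under discussion,
    not any real parser's implementation.
    """
    key = raw.strip().lower()
    if key in DIRECTIVES:
        return key
    # Fall back to closest known directive above the similarity cutoff.
    matches = get_close_matches(key, DIRECTIVES, n=1, cutoff=cutoff)
    return matches[0] if matches else None
```

So `normalize_directive("Disalow")` would recover `"disallow"`, while something unrecognizable like `"foobar"` returns `None`. The trade-off is exactly the one raised above: a compiler wants to reject `"unsinged"` loudly, while a crawler reading millions of hand-written files may prefer to guess.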