We can think of many other strings used by programmers that are not common English, and many strings used by mathematicians that are.
I think the difference is that you are a programmer and not a mathematician (I'm guessing) and are saying, effectively, that what you are subjectively familiar with is objectively more universally understood.
> We can think of many other strings used by programmers that are not common English, and many strings used by mathematicians that are.
Are you saying special symbols aren't more common in mathematics than in programming? I simply disagree. Mathematicians hardly use multi-character strings at all, e.g. for function names or variables, while they are very common in programming. Mathematicians mostly use single letters from the Roman or Greek alphabets, sometimes in various unusual styles like fraktur, double-struck (blackboard bold), etc.
> Are you saying special symbols aren't more common in mathematics than in programming?
No, I agree that programming uses more ASCII. I'm saying that using a smaller alphabet (e.g., hex) doesn't make it easier to understand. Programming is just as arcane and difficult to understand: even programmers have trouble understanding each other's code, and it's generally believed that understanding it requires documentation in English.
Yes, it does work both ways. Any mathematician or programmer who makes that argument is, afaict, just imagining that their subjective perspective is some objective universal truth.