Mostly unrelated, but I wish the field of math would step away from Greek letter notation and just make variable and function names readable, as programmers do. I know there are historic reasons, and I'm sure that mathematicians' time is so valuable that they can't be bothered to write more than 1 character, but it's a real barrier to entry in my opinion.
The use of the Greek alphabet is mostly driven by convention. You get used to it. They aren't just sprayed around randomly wherever you look.
As for your code, urgh. A mathematician would carefully define what the function and operands are and the domain of each. Moreover, and I'm vastly simplifying here as I don't have LaTeX available. I don't know what your code does so I've invented some words.
Let f(x,y) represent the rate of change of doodads where x is the number of doodads and y is the amount of doodads per gronk.
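In code terms, that style of definition maps naturally onto a short, documented function. A hypothetical sketch in Python, reusing the invented doodad/gronk names above (the body is a placeholder, since the original comment never says what f computes):

```python
def f(x: float, y: float) -> float:
    """Rate of change of doodads.

    x: number of doodads (x >= 0)
    y: doodads per gronk (y > 0)
    """
    # Placeholder body: the thread never specifies the actual relationship.
    return x / y
```

The point being that the single-letter names are fine once the domain of each is stated up front, exactly as the mathematician's prose definition does.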
The code is unclear because it uses single-letter variables instead of names, exacerbated by their being Greek. Mathematicians reached for Greek because they were already using single-letter variables, ran out of them, and weren't bold or imaginative enough to break from convention. Then they used up the best of the Greek letters too and moved on to things like Gothic. This is stupid.
No, they didn't use them because they ran out. Clearly you know nothing about mathematics. There are conventions that keep it concise. Some definitions of simple things would be unreadable if you used longer names. Consider the quotient rule:
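(The formula the comment refers to, in standard notation:)

```latex
\left(\frac{f}{g}\right)' = \frac{f'g - fg'}{g^2}
```

Compare spelling it out: "the derivative of a quotient is the derivative of the top times the bottom, minus the top times the derivative of the bottom, all over the bottom squared." The symbolic form is the one you can actually manipulate on paper.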
> they can't be bothered to write more than 1 character
You try handwriting all of your code and let's see how long until you start abbreviating everything.
Mathematical notation is all abbreviations. We used to write mathematics without abbreviations. It was absolutely horrid. Try reading some 13th century mathematics, translated to your language (e.g. Fibonacci https://archive.org/details/laurence-sigler-fibonaccis-liber... ), and see how much of it you understand without the benefit of symbolic notation. We would even write aaa instead of a^3.
The point with mathematical notation is that it can all be sounded out and it's extremely general and abstract. Generally, x is not a measurement, a quantity with a unit, a meaningful anything. It's just a number, and x is a better name than front_server_count or whatever thing you're programming about.
A massive amount of day to day pure mathematics work is still done by hand, on paper, whiteboard, or even chalkboard (and there's a preferred brand of chalk). Of course it will all be typeset before sharing, but mathematicians typically think by writing by hand, not think by typing.
The vast majority of mathematical equations/terms would become completely unreadable if you replaced single symbols with descriptive terms. You are going to have to internalize what the symbols refer to anyway to understand the formula, and once that is accomplished any additional description is a waste of space and cognitive bandwidth.
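A concrete illustration of that trade-off (my own example, not from the parent comment): the chain rule in symbols, then with descriptive names substituted in.

```latex
% Symbolic form:
(f \circ g)'(x) = f'(g(x)) \, g'(x)

% The same statement with descriptive names:
% derivative_of(compose(outer, inner))(point)
%   = derivative_of(outer)(inner(point)) \cdot derivative_of(inner)(point)
```

Both say the same thing; only the first is something you can rearrange on a whiteboard without running out of room.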
We did try it. We tried it for a couple of millennia. It was much harder to understand, and our collective mathematical output as a human species was much slower than it is now.
You know when you go to a foreign country, proudly ask for a particular thing in a restaurant in their language, which you picked up on Duolingo the week before, and the waiter starts talking back quickly in their language and you lose it completely?
We are just that waiter. You didn't learn the language or get the prerequisite skills. So do that or don't bother. It's not easy and there are no shortcuts.
Even if I agreed with you that full words/phrases were easier to parse, any gains made there are easily offset by how much harder and more laborious it is to manipulate them on paper.
It's only a small part of the barrier to entry, though. I could write Einstein's equations out in words, and most people still wouldn't be able to do GR.