Lots of people make use of relational databases without formally understanding set theory or normal forms. They just learn by example and by hacking around. What you're seeing here with deep learning is the same situation.
Sure. And I didn't at all mean that people who aren't good at math should stop doing deep learning.
I just meant that it's weird to me that people seemingly tiptoe around using simple math to explain what is, at heart, simple math, and instead invent a whole new terminology and a dictionary of imprecise analogies to get by.