That's funny, I looked into the whole neural net stuff the other day and thought: "Wow, you basically don't need any math."
I guess it depends a lot on perspective; to me the whole field looks more like alchemy than science. The results cannot be denied, but we understand so little.
This is true. In the current scheme of things, you don't need to know any math to get things going. In theory, if you really need to write a custom model from scratch, you'd need the math, but in practice, not really.
Now you have the likes of Hugging Face, which is another abstraction on top of PyTorch. Heck, you don't even need to code a transformer from scratch. They have done it for you:
from transformers import AutoTokenizer, AutoModelForMaskedLM
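# download the pretrained BERT tokenizer and masked-LM weights from the Hub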
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")
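And actually running it is only a couple more lines. A rough sketch, assuming the same library's fill-mask pipeline (the prompt string is just an arbitrary example):
from transformers import pipeline
# the pipeline bundles tokenization, the forward pass, and decoding into one call
unmasker = pipeline("fill-mask", model="bert-base-uncased")
# ask BERT to fill in the blank; "[MASK]" is its mask token
print(unmasker("Paris is the [MASK] of France."))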
I was like, wow, this level of abstraction totally breaks down barriers to entry, and we will see many, many flavours of Hugging Face in the years to come.
I think this has more to do with the way software ecosystems develop in general. You don't need to know much about the actual implementation of databases or low-level programming to build a data-heavy application, and you don't need to know the complexities of graphics programming to render complex visuals to a screen.
If you want to do research in any branch of ML, you are almost certainly going to need some strong mathematical foundations. At the same time, if you're more interested in applying existing architectures/models, the ecosystem has a lot of really amazing tools that will take you a very long way without requiring you to study statistical learning theory.