What if I told you I have a machine that takes a 20k-dimensional vector and turns it into another meaningful 20k-dimensional vector, but the machine is made of proteins and fats and liquids and gels? Would you be willing to call it sentient now?
Sorry to tell you, but your brain is doing millions of dot products - that's what the biochemical reactions in the synapses between neurons amount to. We already know how the brain works at that level; we just don't know how it works at a higher level.
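For anyone following along, here's what the "dot product" claim actually refers to in the artificial-neuron abstraction - a minimal NumPy sketch, with the 20k dimension borrowed from the comment above and all the specific numbers made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: the "20k-dimensional vector" from the thread.
d = 20_000
x = rng.normal(size=d)  # input activations
w = rng.normal(size=d)  # one unit's "synaptic" weights
b = 0.1                 # bias

# The artificial-neuron abstraction: one dot product plus a nonlinearity.
pre_activation = w @ x + b
output = max(0.0, pre_activation)  # ReLU

# A layer is just many of these dot products at once, i.e. a matrix multiply.
W = rng.normal(size=(4, d))
layer_out = np.maximum(0.0, W @ x)
print(layer_out.shape)  # (4,)
```

Whether real synapses reduce to anything like this is exactly what's being disputed below; the sketch only shows what the metaphor is pointing at.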
Sorry to tell you, but neurons do not follow a dot-product structure in any way, shape, or form beyond basic metaphor.
I mean, fine, I'll play along: is it whole numbers? Floating point? What precision? Are we even certain that neurons are deterministic?
The point I'm making is that this overuse of metaphor (I agree it's an OK metaphor) belittles both what the brain and what these models are doing. I get that we call them perceptrons and neurons, but friend, don't tell me that a complex chemical system we don't understand is "matrix math" because dendrites exist. It's kind of rude to your own endocrine system, tbh.
Transformers and your brain are both extremely impressive and extremely different things. Adding biologically inspired noise and biologically inspired resilience might even make Transformers better! We don't know! But we do know that oversimplifying the model of the brain won't help us get there!
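To make the "biologically inspired noise" idea concrete: one simple (and purely speculative) version is injecting Gaussian noise into a layer's pre-activations during training, loosely mimicking synaptic stochasticity. The function name, noise scale, and layer shape here are all invented for the sketch:

```python
import numpy as np

rng = np.random.default_rng(0)

def noisy_forward(W, x, noise_std=0.05, training=True):
    """Hypothetical sketch: add Gaussian noise to pre-activations,
    loosely inspired by the stochasticity of real synapses.
    Whether anything like this actually helps Transformers is,
    as said above, an open question."""
    z = W @ x
    if training:
        z = z + rng.normal(scale=noise_std, size=z.shape)
    return np.maximum(0.0, z)  # ReLU

W = rng.normal(size=(8, 16))
x = rng.normal(size=16)
out = noisy_forward(W, x)
print(out.shape)  # (8,)
```

At inference time (`training=False`) the noise is switched off and the layer reduces to the plain deterministic matrix multiply, which is the usual pattern for noise-based regularizers.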
The people who insist that numerical vectors can't possibly be involved in consciousness seem to me to be the ones you should direct this ire at.