Why is it called TensorFlow? Do the multi-dimensional matrices that exchange data between the nodes transform like tensors? If so, when does the need arise to transform them?
> Do the multi-dimensional matrices that exchange data between the nodes transform like tensors?
Yes, if you design the model/graph that way.
> If so, when does the need arise to transform them?
The need arises whenever your application actually calls for tensor transformations. For deep learning, most people treat them like multidimensional arrays. TensorFlow is an excellent name.
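To illustrate the "multidimensional array" usage: in deep-learning parlance a "tensor" is just an n-dimensional array with a shape, e.g. a batch of 2x2 grayscale images is a rank-3 tensor of shape (2, 2, 2). A minimal sketch in plain Python (TensorFlow's `tf.constant` would build the equivalent object from the same nested list):

```python
# A rank-3 "tensor" as a nested list: 2 images, each 2x2.
batch = [
    [[0.0, 1.0], [2.0, 3.0]],   # image 0
    [[4.0, 5.0], [6.0, 7.0]],   # image 1
]

def shape(t):
    # Recover the shape of a nested-list "tensor" by walking its
    # first element at each nesting level.
    s = []
    while isinstance(t, list):
        s.append(len(t))
        t = t[0]
    return tuple(s)

print(shape(batch))  # (2, 2, 2)
```

No transformation law in sight: it's the shape and the elements that matter here, which is exactly the point of the answer above.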
Multidimensional arrays are a thing of the past. Now we call them tensors. Get with the program or become an aging, forgotten physicist not involved in deep learning.
A lot of people have heard of tensors as something used in quantum physics, which many consider the most advanced/difficult hard science.
So using the word "tensor" suggests highly advanced stuff used by very smart people.
Expect much more Tensor stuff in the future.
Other physics terms have the same high branding potential. "Gauge" comes to mind. However, almost nobody outside of physics/maths has heard of this one (in the "gauge symmetry" sense, not the "wire gauge" one), so it would need some time to grow.
It uses Tensor in the computer-sciencey-we-abuse-terms sense of "a multidimensional matrix", not in the physics sense. It could be called multidimensionalmatrixflow, but I'm glad I don't have to type that on a daily basis. :)
Doesn't the word 'tensor' "abuse" the terminology in exactly the same way as 'matrix' does?
Sure, you can be mathy and insist that these are all abstract things and transformations between them, but meanwhile CS people will keep calling arrays "vectors", "matrices", and "tensors".
Some of the operations that are performed on the tensors in a neural network are non-linear. An example might be taking the tanh of all of the elements of the tensor. For these steps, you won't have invariance (or covariance) under change of basis.
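A minimal sketch of that point in plain Python (no TF dependency): if `tanh` applied elementwise respected a change of basis, rotating and then applying `tanh` would give the same result as applying `tanh` and then rotating. It doesn't, so the output of the nonlinearity doesn't transform like a vector.

```python
import math

# R is a 45-degree rotation (a change of basis); v is an arbitrary 2-vector.
c = math.cos(math.pi / 4)
R = [[c, -c], [c, c]]
v = [1.0, 2.0]

def matvec(M, x):
    # Plain matrix-vector product.
    return [sum(M[i][j] * x[j] for j in range(len(x))) for i in range(len(M))]

def tanh_elementwise(x):
    return [math.tanh(t) for t in x]

# tanh first, then rotate:
a = matvec(R, tanh_elementwise(v))
# rotate first, then tanh:
b = tanh_elementwise(matvec(R, v))

print(a)
print(b)
# The two results disagree, so elementwise tanh does not commute with
# a change of basis.
```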
Even in physics, there are applications of tensors which essentially treat tensors as multidimensional arrays (see for example, tensor networks) with no predefined transformation properties. But the operations done on tensors are always linear.
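A toy "tensor network" contraction in plain Python, to show what "always linear" means here: summing over a shared index between two rank-2 tensors (which is just matrix multiplication) is linear in each tensor, unlike the elementwise tanh mentioned above. `contract` is a hypothetical helper name, not an API from any library.

```python
def contract(A, B):
    # Contract the shared index j: C[i][k] = sum_j A[i][j] * B[j][k].
    return [[sum(A[i][j] * B[j][k] for j in range(len(B)))
             for k in range(len(B[0]))] for i in range(len(A))]

A = [[1.0, 2.0], [3.0, 4.0]]
B = [[0.0, 1.0], [1.0, 0.0]]   # swaps the two columns of A
C = contract(A, B)
print(C)  # [[2.0, 1.0], [4.0, 3.0]]
```

Because contraction is linear, `contract(A, B1 + B2)` equals `contract(A, B1) + contract(A, B2)` (elementwise), which is the property the tanh step lacks.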