
This is probably wrong, but I always think of tensors as n-dimensional generalizations of matrices with units attached.

Edit: After some wikipediaing, "bases" might be a better word than "units."




Well, intuitively, an n-dimensional generalization of a matrix would just be a big multi-dimensional table. But a tensor is different in that some of its indices are covariant and some are contravariant, and the two kinds transform oppositely under a change of basis.
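For example, contravariant components (a vector's) transform with the inverse of the change-of-basis matrix, while covariant components (a covector's) transform with the matrix itself, so the pairing between them stays the same. A rough NumPy sketch - the basis-change matrix P here is an arbitrary made-up example:

    import numpy as np

    # Columns of P are the new basis vectors written in the old basis
    # (any invertible matrix works; this one is made up for illustration).
    P = np.array([[2.0, 1.0],
                  [0.0, 1.0]])

    v = np.array([3.0, 4.0])   # contravariant components (a vector)
    w = np.array([1.0, 2.0])   # covariant components (a covector)

    v_new = np.linalg.inv(P) @ v   # contravariant: transforms with P^-1
    w_new = P.T @ w                # covariant: transforms with P (transposed)

    # The pairing <w, v> is basis-independent:
    print(w @ v, w_new @ v_new)    # both print 11.0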

Additionally, you've sort of got it backwards. A matrix with units (and a set of basis vectors) attached is one representation of a rank (1, 1) tensor. But it's not really a unique representation of the tensor - you could choose a different set of basis vectors and come up with a different matrix representation of the exact same tensor. The tensor is an entity, while the matrix is a representation of an entity within a given coordinate system.
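Concretely, if P is a change-of-basis matrix, the same rank (1, 1) tensor that is represented by the matrix A in one basis is represented by inv(P) @ A @ P in another. A small NumPy sketch, with arbitrary illustrative values for A and P:

    import numpy as np

    # A linear map (a rank (1, 1) tensor) written out in the standard basis.
    A = np.array([[1.0, 2.0],
                  [3.0, 4.0]])

    # A different basis: columns of P are the new basis vectors.
    P = np.array([[1.0, 1.0],
                  [0.0, 1.0]])

    # The same tensor, represented in the new basis.
    A_new = np.linalg.inv(P) @ A @ P

    print(A)       # one matrix representation
    print(A_new)   # a different matrix, same underlying tensor

    # Basis-independent quantities still agree, e.g. the trace (a contraction):
    print(np.trace(A), np.trace(A_new))   # both print 5.0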



