Larry chose to focus on the big data part because it is intuitive. But I think you're absolutely correct that it has applications in physics/chemistry (and machine learning too). We're actually talking to people in our theoretical physics department who may want to use taco for their QCD computations. There's also a new issue on our tracker about adding complex number support for nuclear computations and quantum computing: https://github.com/tensor-compiler/taco/issues/116.

The tensor contraction engine is great work, and it focuses on dense tensors. We currently optimize for sparse tensors, so TCE will do better than us on purely dense expressions. We want to bridge this gap next semester, though.
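For concreteness, a dense contraction like A(i,j) = B(i,k) * C(k,j) reduces to a plain loop nest over every element, which is the shape of computation that dense-focused systems like TCE optimize (tiling, vectorization, and so on). A minimal Python sketch, not anyone's generated code:

```python
def dense_contract(B, C):
    """A(i,j) = sum_k B(i,k) * C(k,j), written as the naive dense
    loop nest. Every element is visited, zero or not -- this is the
    regime where dense tensor frameworks shine. Sketch only."""
    m, kdim = len(B), len(C)
    n = len(C[0])
    A = [[0.0] * n for _ in range(m)]
    for i in range(m):
        for k in range(kdim):
            for j in range(n):
                A[i][j] += B[i][k] * C[k][j]
    return A
```

Sparse-focused compilation instead specializes these loops to skip the zeros entirely, which is why the two approaches win in different regimes.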

The Cyclops framework is also great. We discuss it in our related work, but we did not directly compare against it in our evaluation. The first version, like TCE, focused on dense tensors, and their main focus is distributed computing, which we don't support yet (we do shared-memory parallel execution at the moment). They have some follow-up work on sparse computations. The difference from our work is that they, if I read their paper correctly, transpose the data until they can call pre-existing matrix multiplication routines, which incurs data-movement overhead. Our work compiles expressions to work on the data at hand, without moving it.
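To illustrate the distinction: a compiler that works on the data at hand emits a kernel that walks the stored sparse format directly. Here is a minimal Python sketch of a sparse matrix-vector multiply over CSR arrays (the variable names are mine, and this is not taco's actual output):

```python
def csr_spmv(pos, crd, vals, x):
    """y = A * x where A is stored in CSR form: pos holds the start
    offset of each row's nonzeros, crd the column coordinates, and
    vals the nonzero values. The loop touches only stored nonzeros,
    in place, with no transposition or densification. Sketch only."""
    y = [0.0] * (len(pos) - 1)
    for i in range(len(pos) - 1):          # for each row
        for p in range(pos[i], pos[i + 1]):  # for each stored nonzero
            y[i] += vals[p] * x[crd[p]]
    return y
```

The transpose-then-call-BLAS strategy instead reshuffles the operands until a pre-existing dense or matrix routine applies, trading data movement for reuse of tuned kernels.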
