Hacker News

TensorFire is up to an order of magnitude faster than keras-js because it doesn't have to shuffle data back and forth between the GPU and CPU. TensorFire can also run on browsers and devices that don't support the OES_texture_float extension.
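To illustrate the point about OES_texture_float: WebGL 1 only guarantees 8-bit RGBA textures, and floating-point textures are an optional extension. A library that can't assume the extension typically packs each 32-bit float into the four byte channels of a regular RGBA texel. Here is a minimal sketch of both the detection and the packing trick — these helpers are hypothetical and are not TensorFire's actual API:

```javascript
// Hypothetical helper: report whether the context supports float textures.
// getExtension returns null when the extension is unavailable.
function supportsFloatTextures(gl) {
  return gl.getExtension('OES_texture_float') !== null;
}

// Fallback technique: store a 32-bit float in the four 8-bit channels
// of an ordinary RGBA texel, and reassemble it on readback.
function packFloat(x) {
  const buf = new ArrayBuffer(4);
  new Float32Array(buf)[0] = x;
  return Array.from(new Uint8Array(buf)); // [r, g, b, a]
}

function unpackFloat(bytes) {
  const buf = new ArrayBuffer(4);
  new Uint8Array(buf).set(bytes);
  return new Float32Array(buf)[0];
}
```

In practice the pack/unpack step also has to run inside the shaders themselves, which is where most of the complexity (and the performance cost of the fallback) lives.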

We will probably release it under the MIT license.




I'm really interested in using smartphones / mobile devices for inference. Can this work with React Native so that I can build it without a bridge? I assume I would create a WebView that loads a local page.


How does it compare to WebDNN[0]? It seems like a closer comparison, especially with WebGPU.

It would be good if you had a comparative benchmark on the website.

[0]: https://mil-tokyo.github.io/webdnn/


At the moment WebDNN only runs models on the GPU in Safari Technology Preview, falling back to CPU on all other platforms / browsers: https://mil-tokyo.github.io/webdnn/#compatibility



