
For reference, the GPU they're using for this paper is the NVIDIA V100, a datacenter GPU costing $8,000.



To be fair, while the V100 performs very well for machine learning, for that price you could buy almost a dozen 1080 Tis or a few Titans (whichever the current one is), which would certainly be much faster (rough math below).

They say they used a V100 but not how many; if they needed a large number, then never mind.
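
A quick back-of-the-envelope check of the "almost a dozen" figure, assuming the 1080 Ti's $699 launch MSRP (the $8,000 V100 price is from the comment above); just a sketch, actual street prices vary:

    # Rough price comparison; the 1080 Ti MSRP is an assumed figure.
    V100_PRICE = 8000        # from the parent comment
    GTX_1080_TI_MSRP = 699   # assumed launch MSRP

    cards = V100_PRICE / GTX_1080_TI_MSRP
    print(f"One V100 buys ~{cards:.1f} GTX 1080 Tis")  # ~11.4, i.e. almost a dozen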


The paper says they only ran it on a single V100; I was expecting multiple GPUs as well.



