argonaut on Dec 12, 2015 | on: Introducing OpenAI
I'm not aware of any research lab that uses AWS for these things. It's cheaper to just buy the GPU yourself.
yeukhon on Dec 12, 2015
But you don't have to manage the GPU yourself :-) you use it as a service. That's the good part, although the specs are not very good.
jedberg on Dec 12, 2015
AWS is a sponsor of this, which probably means a bunch of free resources.
argonaut on Dec 12, 2015
The g2.8xlarge also has only 4GB of VRAM per GPU, which is too small for most recent deep learning models. The TitanX GPU has 12GB, by comparison.