
I want to experiment with deep learning by training on the GPU, but the prices even for used cards keep making me put it off. :-O If they don't come down soon, though, I might cave and buy one.



Have you considered cloud services? For example, Google Cloud's Compute Engine offers GPU instances: https://cloud.google.com/compute/gpus-pricing


The GCE free tier is nice, but it ran out for me after about a month of continuous use of a single (cheap) GPU. Maybe some people have deeper pockets than I do, but I can't afford $300 CAD/month for a single cloud GPU for hobbyist purposes.
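
Just to put rough numbers on the rent-vs-buy trade-off, here's a back-of-the-envelope sketch. Both figures are assumptions for illustration, not quotes: the ~$300 CAD/month mentioned above, and a hypothetical ~$700 used 16 GB card.

    # Rough break-even estimate: renting a cloud GPU vs. buying a used card.
    # Both numbers are assumptions for illustration, not real quotes.
    cloud_cost_per_month = 300.0   # CAD/month for one cloud GPU (figure above)
    used_card_price = 700.0        # CAD, hypothetical used 16 GB card

    months_to_break_even = used_card_price / cloud_cost_per_month
    print(f"Buying pays for itself after ~{months_to_break_even:.1f} months "
          f"of continuous use")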


Kaggle seems to offer a fair amount of free GPU time if you're willing to make your notebooks public.


vast.ai is also an option; it's cheaper, though not as polished as GCP.


Well, how much VRAM do you feel you need for your projects? There are a lot of 16 GB cards out there that are still perfectly serviceable for that.

Vega Frontier Editions (16 GB HBM2) are going for around $600-700 on eBay and Craigslist, and on Facebook Marketplace in some areas.

Radeon VIIs (16 GB HBM2) are going for around $650-1000 on the same platforms.

You can easily put two of those into a machine and have a pretty impressive deep learning rig.
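
Note those are AMD cards, so you'd be on ROCm rather than CUDA; ROCm support for older Vega parts varies by release, so it's worth checking compatibility before buying. Once the rig is together, here's a minimal sanity-check sketch (assuming the ROCm build of PyTorch, which reuses the torch.cuda API) to confirm the framework actually sees the cards and their VRAM:

    # Minimal check that PyTorch can see the GPUs and how much VRAM each has.
    # With the ROCm build of PyTorch, AMD cards show up through torch.cuda.
    import torch

    if not torch.cuda.is_available():
        print("No GPU visible to PyTorch -- check the ROCm install")
    else:
        for i in range(torch.cuda.device_count()):
            props = torch.cuda.get_device_properties(i)
            print(f"GPU {i}: {props.name}, "
                  f"{props.total_memory / 1024**3:.1f} GiB VRAM")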


Thanks for the suggestion, will check those out!


My pleasure and happy hunting.

Here's the original article that reminded me of this. It's four years old, but I still feel it's a worthwhile read: https://medium.com/intuitionmachine/building-a-50-teraflops-...



