Hacker News

He claims there's a $999 AMD card that gives 123 TFLOPS, and that his tinybox will cost $15k for 738 TFLOPS. In other words, the tinybox will have 6 of these GPUs, i.e. about $6,000 in GPU cost. That seems like a steep markup from $6k to $15k, and if the software is open-source, I'm not sure why you wouldn't build your own. Or is it worth $9k for a custom motherboard that can fit that many GPUs? Or can you buy 6-GPU motherboards off the shelf? Just curious what people think. Not being disparaging, kudos to geohot :)
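The back-of-envelope math in the comment above can be sketched like this (all figures are the claimed ones from the comment; the 6-GPU count is inferred from 738 / 123):

```python
# Sanity-check the parent's numbers (all inputs taken from the comment).
GPU_TFLOPS = 123      # claimed per-card throughput
GPU_PRICE = 999       # USD per card
BOX_TFLOPS = 738      # claimed tinybox total
BOX_PRICE = 15_000    # claimed tinybox price

num_gpus = BOX_TFLOPS // GPU_TFLOPS    # 738 / 123 = 6 cards
gpu_cost = num_gpus * GPU_PRICE        # roughly $6k in GPUs alone
non_gpu_budget = BOX_PRICE - gpu_cost  # what the other ~$9k has to cover

print(num_gpus, gpu_cost, non_gpu_budget)
```

So roughly $9k is left for the motherboard, CPU, storage, power, chassis, and margin.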



As someone who has built their own deep learning rigs in the $10k BoM range: frankly, it's a pain in the ass, and I would gladly pay that markup in the future. I'll probably end up paying lambdalabs a much larger one.


As someone who bought a deep learning rig for about $12k from lambdalabs years ago, I can't recommend them strongly enough. The support (and not having to deal with building it out myself) was well worth the markup. They're also just really great to deal with.


Logic boards and CPUs with enough PCIe lanes to feed the GPUs, storage fast enough to hit the 30 GB/s in the spec sheet, power supplies and a case that can handle such a power-hungry system: none of that is cheap.
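To get a feel for the storage side, here is a rough sketch of what sustaining 30 GB/s implies. The ~7 GB/s per-drive figure is my assumption (a typical PCIe 4.0 NVMe sequential read rate), not something from the spec sheet:

```python
import math

# Assumption: ~7 GB/s sequential read per PCIe 4.0 NVMe drive, striped together.
TARGET_GBPS = 30
PER_DRIVE_GBPS = 7

drives_needed = math.ceil(TARGET_GBPS / PER_DRIVE_GBPS)  # 5 drives
lanes_needed = drives_needed * 4  # each NVMe drive wants an x4 link

print(drives_needed, lanes_needed)
```

That's on the order of five fast NVMe drives and 20 PCIe lanes dedicated to storage alone, before the GPUs get any.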

$15k seems kind of low, actually. If you've tried to spec out a server with a similar number of accelerators recently, you'd have trouble hitting that figure, even with consumer-grade GPUs.


You can buy 6+ GPU motherboards, but the consumer ones are built solely for mining: no normal person has that many cards, and no consumer CPU exposes enough PCIe lanes to properly feed that many GPUs. Plenty of enterprise vendors will sell you that kind of system, but you should expect to pay five figures at minimum. $15k seems like the low end to me.
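The lane argument above can be made concrete with a rough budget. The CPU lane counts here are my assumptions (roughly what a desktop Ryzen vs. a single-socket EPYC exposes), not figures from the thread:

```python
# Rough PCIe lane budget for a 6-GPU box.
LANES_PER_GPU = 16   # full-bandwidth x16 link per card
NUM_GPUS = 6

gpu_lanes = NUM_GPUS * LANES_PER_GPU  # 96 lanes just for the GPUs

consumer_cpu_lanes = 24  # assumed: typical desktop CPU
server_cpu_lanes = 128   # assumed: typical single-socket server CPU

print(gpu_lanes)
print(gpu_lanes <= consumer_cpu_lanes)  # consumer platform: not even close
print(gpu_lanes <= server_cpu_lanes)    # server platform: fits, with room for storage/NICs
```

Mining boards sidestep this by running each card at x1, which is fine for hashing but useless for training workloads that need real bandwidth.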



