>Amazon EC2

I'm not sure how they compare in practice, but it might be worth calculating how many hours an Amazon G2 instance would take for the same computation, using their high-end graphics cards as CUDA processors. I think the cost-per-performance ratio is much lower, and that could change the equation in the other direction.




I really don't know how well the required operations execute on a GPU. I've only skimmed the RSA 768 paper to find that line.


Fair enough. I've read that using CUDA (or another GPU-based language) you can get at least 10x the GFLOPS of a 4-CPU Xeon [1], though, and RSA cracking should parallelize easily, if I'm understanding the process correctly. And the high-end NVidia cards in the G2 instances have 1,536 CUDA cores each. No, I'm not kidding. The card benchmarked in the link above has about 1/3 the GFLOPS of the one in the G2 instances.

And it looks like an on-demand G2 instance is $0.65/hour (though it can be lower on the spot market or with reserved instances). So if there's a 120x speed improvement over the "single core 2.2GHz AMD Opteron" (and that's assuming the Opteron core is about as fast as the Xeon core above), for only 11x the cost... well, it gets a lot cheaper.

In fact, if I haven't done my math wrong, it comes out to about $94,900 of full-price instance time (less if you get spot or reserved instances). [2] To win the $200k prize. Hmm....

[1] http://archive.benchmarkreviews.com/index.php?option=com_con...

[2] "the equivalent of almost 2000 years of computing on a single core 2.2GHz AMD Opteron": That's 17,520,000 hours. If the G2 instance gets you 120x performance improvement, that's 146,000 hours. At 0.65/hour, that's $94,900.



