
Why not use smartphones these days?

Just plug your iPhone in at night, and the app will run in the background when it detects user inactivity.




In theory you can (on Android). There is an app called BOINC that can load work for some projects onto your phone. I tried it a few years ago, but phones are really not made for a sustained workload. Even with work assigned to only half the cores and not 100% processor time, the phone got very, very hot. What works well are Raspberry Pis and the like, at least with a heatsink and a fan.


Indeed, there was a story yesterday [0] about an initiative Vodafone are supporting to compute a cure for cancer while your phone is charging.

[0] https://news.ycombinator.com/item?id=22479644


One word:

Heat.

Smartphones are decently powerful, but they typically only use the CPU in short bursts. This has allowed them to get away with being extremely skimpy on the cooling, which is good because a phone that needed a heat sink and fan would be incredibly bulky.

If you ran something with sustained CPU usage like F@H, within a minute the phone would get considerably warm and probably hit heavy thermal throttling.
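If you want to watch it happen, here's a rough sketch in Python (assuming a Linux-ish environment such as Termux where the usual sysfs paths exist; zone names, units, and permissions vary by device) that pegs every core and samples temperature and clock speed:

    import glob, multiprocessing, os, time

    def burn():
        # Busy-loop to pin one core at 100%.
        while True:
            pass

    def read_int(path):
        with open(path) as f:
            return int(f.read().strip())

    if __name__ == "__main__":
        # One burner process per core (processes, not threads, to sidestep the GIL).
        for _ in range(os.cpu_count() or 1):
            multiprocessing.Process(target=burn, daemon=True).start()

        zones = glob.glob("/sys/class/thermal/thermal_zone*/temp")
        freq_path = "/sys/devices/system/cpu/cpu0/cpufreq/scaling_cur_freq"

        for _ in range(120):  # sample once a second for two minutes
            temps = []
            for z in zones:
                try:
                    temps.append(read_int(z) / 1000)  # usually millidegrees C
                except OSError:
                    pass  # some zones aren't readable without root
            mhz = read_int(freq_path) // 1000 if os.path.exists(freq_path) else "?"
            print(f"max temp: {max(temps, default=0):.1f} C   cpu0: {mhz} MHz")
            time.sleep(1)

On most phones you'd expect the temperature to climb and cpu0's frequency to drop within the first couple of minutes.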


True, and while phones are considerably more powerful than they used to be, modern ARM SoCs still don't come close to their x86 counterparts.


Yes, but we have millions of phones sitting idle at night, constantly on and with insignificant power draw. Together they could easily match the idle PCs that people are happy to keep running and pay the electricity for.


One of the weird things about being a programmer is that we constantly deal with quantities spanning many orders of magnitude, and it can be very difficult to internalize how that works. Sometimes lots of little things don't add up to anything significant. I don't know the exact numbers, but it's not hard to guess that a cell phone, by the time it's done thermally limiting itself and dealing with a massively smaller cache and all its other disadvantages, could well be literally only 1/1,000th as effective as a single dedicated machine running this stuff. At that point, it can easily be the case that a million cell phones add only single-digit percentages of improvement to the system as a whole, which, for the amount of human intervention involved, is not a very useful result. (That is, the same effort could be much more usefully spent; if nothing else, on installing the native clients.)

I'm not saying that's the exact number; that depends on a lot of things. My point is that this is a great deal more plausible an outcome than your intuition may think. (It could even be worse.)
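To make it concrete, here's the back-of-the-envelope version, where every number is an assumption you should swap for your own:

    # Every figure here is an illustrative assumption, not a measurement.
    dedicated_machines = 100_000    # assumed native clients already in the pool
    phones = 1_000_000              # phones you manage to recruit
    phone_effectiveness = 1 / 1000  # assumed throughput of one hot, throttled
                                    # phone relative to one dedicated machine

    added = phones * phone_effectiveness  # machine-equivalents gained
    gain = added / dedicated_machines     # relative improvement
    print(f"{added:.0f} machine-equivalents, a {gain:.1%} gain")
    # -> 1000 machine-equivalents, a 1.0% gain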

A similar idea was in vogue in the oughties: load JS onto web pages and have visitors compute things while reading CNN or something. But especially with the much slower JS engines of the time, no access to GPUs, all the multiplicative slowdowns involved, and a badly broken communications topology (radically slower than in-memory bandwidth, etc.), you were looking at loading down millions of people's machines to get the equivalent of a single-digit number of high-powered dedicated machines for all but the most embarrassingly parallel problems. You were better off just buying the CPU power.
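The multiplication is what kills you. Plugging in some plausible-for-the-era factors (all assumed, none measured):

    # Plausible-for-the-era factors; all assumed, none measured.
    visitors = 1_000_000
    utilization = 0.001   # fraction of the day a visitor's tab is actually open
    js_penalty = 10       # 2000s-era JS vs native code
    single_thread = 4     # one JS thread vs a multi-core native client
    comm_penalty = 10     # broken topology, for anything not embarrassingly parallel

    equivalents = visitors * utilization / (js_penalty * single_thread * comm_penalty)
    print(f"~{equivalents:.1f} dedicated-machine equivalents")
    # -> ~2.5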

("Well then, what about Seti@Home and such?" All embarrassingly parallel problems running native clients, at high rates of utilization. That's a several orders of magnitude more efficient than running things when a visitor happens to visit some web page (terrible utilization), on a slow JS VM, in a single thread.)


It looks like FAH has GPU support, but I don't see an AArch64 version that I could install on, say, an NVIDIA Jetson Nano.


And someone correct me if I'm wrong, but I imagine that would be much more energy-efficient than a desktop computer.


Energy efficiency doesn't necessarily come from being better at full load. Smartphone chips provide tons of power-saving mechanisms, true, but they are also quite limited in how they operate.

I assume that generic code on an x86 desktop CPU should be more efficient in work per watt than on a mobile chip. I'm also not sure a smartphone is even rated for 100% CPU load 24/7.
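For a rough sense of what "efficient per watt" means, here's the comparison as arithmetic; the figures below are placeholder assumptions, not benchmarks:

    # Placeholder figures; substitute real benchmarks for real hardware.
    desktop_gflops, desktop_watts = 500, 100  # assumed sustained x86 desktop
    phone_gflops, phone_watts = 20, 5         # assumed sustained (throttled) phone SoC

    print(f"desktop: {desktop_gflops / desktop_watts:.1f} GFLOPS/W")
    print(f"phone:   {phone_gflops / phone_watts:.1f} GFLOPS/W")

On paper the phone can come out ahead per watt; the open question is whether it can sustain that 24/7, which is exactly what its thermal design isn't rated for.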


That's most likely the opposite of what is needed here: those mechanisms are about doing as little as possible and sleeping as much as possible, whereas what the project actually needs is as much raw computing power as you can provide.

Yes, there are some ways to get more computing bang for less energy buck, but that's a comparatively minor issue here.


Not really, actually: an ARM CPU is not as powerful as an x86_64 one, so I'd guess the ratio of compute power to electrical power isn't really any better.



