
I can't believe you're not mentioning the super high power draw of 8k monitors. It's so bad that I'm not even considering getting one.



The power is not so bad, especially compared to the graphics cards you would want to use (and I use my GPU as a toe warmer). Samsung's 8k models specifically come with low-power presets, which are probably usable in this scenario. Of course, with so many more pixels in 8k than in 4k there is a need for more power, but the EU regulation allows selling them if they also support an eco mode.
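Back-of-the-envelope on the pixel count, assuming the standard UHD and 8K resolutions (this is only the scale of the panel's workload, not a power measurement):

    # Standard resolutions assumed; says nothing about actual panel power draw.
    pixels_4k = 3840 * 2160       # ~8.3 million pixels
    pixels_8k = 7680 * 4320       # ~33.2 million pixels
    print(pixels_8k / pixels_4k)  # 4.0, i.e. four times the pixels to drive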

I am old enough to recall 100W as the typical single light bulb and I still use an electric tea kettle that touches the multi kW range daily.

https://www.tomsguide.com/news/eu-8k-tv-ban-goes-into-effect...


> I am old enough to recall 100W as the typical single light bulb

I'm regularly in a museum where they showcase some of the 1800s/1900s wares of households in my area. One is a kerosene/gasoline-powered clothes iron. Just because something was once common doesn't mean it was good.

> I still use an electric tea kettle that touches the multi kW range daily

How many hours a day is your tea kettle actually drawing multiple kW? The more useful comparison is how many kWh/day these appliances are using.


Fair enough, though I didn't live during kerosene times. My tea kettle uses 0.06 kWh per session, so it takes one to two weeks of tea for me to reach the energy of a day on such a monitor (see my other comment for best guesses on this monitor's energy use). On the other hand, a typical pasta recipe on an electric stove would be about 2 kWh, so several days of use of such a monitor.
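Rough math behind that, where the monitor figure is only the upper-bound guess from my other comment, not a measurement:

    # Back-of-the-envelope; the kettle figure is measured, the monitor figure is a guess.
    kettle_kwh_per_session = 0.06
    monitor_kwh_per_day = 1.0      # upper-bound guess from the other comment
    pasta_kwh = 2.0                # one typical recipe on an electric stove

    print(monitor_kwh_per_day / kettle_kwh_per_session)  # ~17 boils, i.e. one to two weeks of tea
    print(pasta_kwh / monitor_kwh_per_day)               # 2 monitor-days at the upper bound, more at lower brightness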


I realize my previous comment might have come across as more adversarial than I intended. Sorry if you took it that way.

And yeah, as your comment shows, it's really kind of an odd comparison to make in the end. Ultimately I'm of the mind that if the 8K screen really gives you a lot of value, then it's probably worth it. You're the one dealing with that energy cost, it's up to society to properly price externalities into energy prices, and you can then decide whether the energy cost is really offset by the extra value you get.

But an 8K screen does use considerably more energy than, say, a 4K one. Back in the day people really started to care about the energy use of CRTs as they kept getting bigger and fancier. Then LCDs came out and slashed that energy usage massively for an equivalent size, to the point of being practically negligible compared to what a decent workstation would use. Now we're finally back to the screen itself using a pretty big fraction of a setup's energy, and IMO consumers should pay attention to it. It's not nothing; it's probably not the single biggest energy use in their home, but the cost might be around the same as other subscriptions they're questioning in a given month.

And yeah, in the end I think that energy metric should be based on how many kWh you end up using on it in a month or whatever your billing cycle is. Compare it to the value you get for other things. Is it worth a week of tea to run it for a day, cost-wise?

I had a period of time where I bought a car for $3k. I then valued every big-ticket purchase against the utility I got from a whole car: "That's like 0.75 Accords, is that worth it?" Kind of an odd way of looking at things, but it really put the value of things into perspective.


No worries at all. And thanks for the additional color.


The eco mode is not usable; it's the manufacturers' way around a ban on 8k monitors. These monitors use at least twice what other monitors of the same size use, sometimes four times as much. And those measurements are probably taken in eco mode, so it could be worse.

> I am old enough to recall 100W as the typical single light bulb and I still use an electric tea kettle that touches the multi kW range daily.

Not sure why you mention this here. Just because we had horribly inefficient light bulbs, it's fine for our monitors to use twice as much?


I’m guilty of this as well. Folks of a certain age will always tend to measure energy consumption in “light bulbs.”

Sort of like how Americans always measure size in “football (gridiron) fields.”

The energy consumption of a traditional incandescent bulb, while obviously inexact, is nonetheless a useful rough relative measure. It is a power draw insignificant enough that we don't mind running a few of them simultaneously when needed, yet significant enough that we recognize they ought to be turned off when not needed.


I always turn my monitors to the lowest possible brightness for long work sessions, so I assumed (perhaps mistakenly) that this eco mode would already be close to my settings out of the box, and if anything, too bright. Assuming 20c per kWh (California rates, mostly solar during the day) and one kWh per day (8h at 130W average draw, which is much higher than the allowed EU limit and the eco mode), the monetary cost comes to about 4 USD per month. So definitely not negligible, but also not a reason to avoid being able to tile 64 terminals if one wanted to do that.

[edit: the above estimate is almost certainly an upper bound on the energy I would ever use myself with such an item; I would be curious to measure it properly if/when I upgrade to one, and curious if the OP has a measure of their own usage. My best guess is that in practice I would average between 2 and 3 kWh every week (2 USD/month) rather than 5 kWh, because I tend to work in low light.]
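For what it's worth, the arithmetic behind those numbers, with all inputs being the assumptions stated above rather than measurements:

    # Monthly cost sketch; rate and draw are assumed values, not measurements.
    rate_usd_per_kwh = 0.20        # assumed California daytime rate
    avg_power_w = 130              # assumed average draw, above the EU limit / eco mode
    hours_per_day = 8
    workdays_per_month = 20

    kwh_per_day = avg_power_w * hours_per_day / 1000            # ~1.04 kWh
    print(kwh_per_day * workdays_per_month * rate_usd_per_kwh)  # ~4.2 USD/month at the upper bound

    # Lower-light estimate from the edit above: 2-3 kWh per week.
    print(2.5 * 4.3 * rate_usd_per_kwh)                         # ~2.2 USD/month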


I mean, considering the value you'd get out of a single monitor compared to a single light bulb, I'd say yes.



