
> In 2018, the most popular resolution for users of Firefox is still 1366x768. And only 1920x1080 is making any headway against the dominance of 1366x768. As much as I am surrounded by the culture of multiple 4K+ displays, apparently this is not at all commonplace. 4K doesn't even get listed, presumably lumped into the "Other" category.

Most tasks in the real world don't need multiple 4K displays, including low-level and systems development. Most people just read text, and as a developer and system administrator, if the text looks good, the resolution is enough.

> In 2018, the most popular memory amount for users of Firefox is still 4GB, trailed by the also disappointingly small 8GB. In my world, I consider 16GB a bare minimum, but at least with memory I haven't been deluding myself—I know many people economize on memory. Still, I would have thought 8GB was the most common memory size by now.

8 GB is more than enough for most people. My family's Win10 desktop is happy with 4GB, and my office desktop is cozy with 8GB. My own desktop has 16GB of RAM, but it runs many, albeit small, virtual machines. The "hardware is cheap, let's waste it" mentality doesn't help anyone, and it's wrong. I've written some state-of-the-art algorithms which use 1.5MB of RAM and make the CPU scream for cooling (I develop high-performance computing software as a side academic gig), so like every resource, RAM should be used sparingly.

Edit: I have no comment on Flash. I think it's a forgotten remnant of old systems.




As someone who reads text all day, going up to a 27" 150 PPI monitor was huge. There is a tangible improvement in how that text looks, especially on a color-accurate monitor.

The other footnote should be that display prices have been crashing recently. You can get an IPS 24" FHD monitor for like $90, and a QHD version at 27" for about $150. Those would have been twice as expensive a few years ago.

That being said, all those 768p screens are on crappy plastic laptops with really slow hard drives. That, I guess, is what we ended up with because Intel took what should have just been the natural evolution of notebooks (small SoCs driving a high-PPI display in a metal frame) and made it into a premium brand-name product, with huge margins on chips that cost peanuts to manufacture, because they didn't have any real competition in the space for a very long time (and even then, their monopolistic behavior keeps AMD out of all the major brands' premium notebooks anyway).


> As someone who reads text all day, going up to a 27" 150 PPI monitor was huge.

You're right, but not everyone has the desk space to accommodate a 27" panel. I can barely fit a 24" on my desk, and 1440p monitors start at 25".

> The other footnote should be that display prices have been crashing recently.

When I was in the US at the end of 2014, one of my friends said the same thing about flash drives when I pointed out a $44 PNY 128GB flash drive. Unfortunately, other parts of the world don't work the same way. The EUR and other currencies are not fixed against the US$, so in most parts of the world prices fluctuate, if they don't outright increase. So no, in some parts of the world technology unfortunately doesn't get cheaper as it matures.

Addendum: BTW, you are right that 768p screens are generally found in entry-level laptops or netbooks. These devices were the most affordable ones when they first came out. Now they are bottom-end cash cows which are virtually free to build.


Lenovo still sells 768p screens in their high-end X-series laptops, including the very recent X280.


Which IMO is criminal; if the laptop starts at >$700, they should "splurge" on the nicer display.


I could swear 1080p was the default on my X270 that I bought a few months ago, but it looks like it's a $150 upgrade now, on an already expensive, nearly $1200 base model. I paid just over $800 for my X270, including the 1080p screen and upgraded memory.


1440p is a resolution for gaming, not work.

You can get 24" 4k (2160p) 180ppi 60Hz for $300 e.g. LG 24UD58.


Just purchased two 1440p monitors and I love doing work on them. 25" and the PPI is just perfect for me.

Anything higher resolution-wise requires a much larger display to be readable at 100% scaling. I'm adamantly against using scaling.


A 4K screen with 150% scaling is liquid smooth and other-worldly


Don’t you find it a little small at only 150%? What size/model?


> I'm adamantly against using scaling.

Why?


I guess I just don't see the point unless you're gaming, watching 4K content, or doing graphic design. I tested out a 4K 28" and unless I had it at 200% scaling, I couldn't use it for longer periods of time. Sure, it looks a little smoother, but now you have to render 2.25x the amount of information (vs. a 2560x1440 monitor) for what I think is fairly little gain. I get more screen real estate and still-readable text with the 25" 1440p.
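
For reference, that ratio is easy to check; a quick arithmetic sketch in Python:

    # 4K UHD vs. QHD pixel counts
    uhd = 3840 * 2160  # 8,294,400 pixels
    qhd = 2560 * 1440  # 3,686,400 pixels
    print(uhd / qhd)   # 2.25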

Perhaps my experience would have been better on a desktop, but this was for work, where I have a Surface Pro: going from a docked to an undocked state (or vice versa) with monitors that didn't match the Surface's scaling resulted in graphical issues that could only be resolved by logging out and back in again.

I also still come across apps that don't know how to scale, which can be really frustrating.


> I guess I just don't see the point unless you're gaming, watching 4K content, or doing graphic design.

I have to say you have it exactly backwards!

Gaming on 4K is extremely expensive and still basically impossible at refresh rates higher than 60 Hz. In fact, you’ll be lucky to get even that much. 1440p/144Hz is a much better and more realistic target for even the most enthusiastic gamers.

Also, a most welcome recent trend has been for games to ship with novel temporal antialiasing techniques, completely redefining how games can look at lower resolutions.

Temporal artifacts have always been the bane of 1080p, forcing higher resolutions either directly, or indirectly via supersampling. Once you take that out of the equation, the benefit of native 4K is much more modest.

4K movies are nice, but as with games, it’s more of a linear progression. I doubt most people could even tell the difference in a blind test.

Full-range HDR is, in my opinion, a much better investment if you want to improve your TV viewing experience (and lately gaming as well) in a noticeable way.

I don't know much about graphic design, but I doubt 4K is all that essential. Everyone has been using 96 dpi displays to create content for very high density media for a long time. Even the most craptastic inkjet printer is 300+ dpi. All you need is a zoom function. Color reproduction is, I think, much more important than resolution.

Where HiDPI displays really shine is actually in the most mundane: font rendering.

For anyone that works in the medium of text, programmers, writers, publishers, etc., a 4K display will be a considerable and noticeable quality-of-life improvement.

Even the most Unix-neckbeardy terminal dwellers will appreciate the simply amazing improvement in visual fidelity and clarity of text on screen[1].

> I tested out a 4K 28" and unless I had it at 200% scaling, I couldn't use it for longer periods of time.

That’s what you are supposed to do! :)

It’s only HiDPI at 200% scaling. Otherwise it’s just 2160p, or whatever the implicit resolution is for some partial scaling value.

For 4K at 100% scaling you'd need something like a 45" screen at minimum (3840 px at 96 dpi is 40" of width, which works out to roughly a 46" diagonal at 16:9), but that's not actually practical once you consider the optimal viewing distance for such a screen, especially with a 16:9 ratio.

> I get more screen real estate and still-readable text with the 25" 1440p.

A 4K display should only provide extra space indirectly. With text on the screen looking so much sharper and more readable, it might be possible to comfortably read smaller font sizes, compared to an equivalent 96 dpi display.

If you need extra space as well, then that's what 5K is for.

Though for things like technical drawings or detailed maps you can actually use all the extra 6 million pixels to show more information on the screen.

A single-pixel–width hairline is still thick enough to be clearly visible on a HiDPI display[2].

> but now you have to render 2.25x the amount of information (vs. a 2560x1440 monitor) for what I think is fairly little gain.

Yes, that’s an issue with things like games. However you can still display 1080p content on a 4K screen, and it looks just as good[3], and often even better[4].

Most graphics software will also work with 1080p bitmaps just fine. Vector graphics necessitates a little extra work, but for a very good payoff.

Overall though, for things like programming or web browsing, it shouldn’t matter. I have a netbook with a cheap Atom SoC (Apollo Lake) and it can handle 3K without breaking a sweat. That much more capable[5] GPU on your Surface Pro should easily handle even multiple 4K displays.

Pushing some extra pixels is not a big deal, if all you’re doing is running a desktop compositor with simple effects.

> going from a docked to an undocked state (or vice versa) with monitors that didn't match the Surface's scaling resulted in graphical issues that could only be resolved by logging out and back in again.

Yeah that must suck. Still, it’s only a software bug, and you mustn’t let it keep you from evaluating HiDPI on its merits.

> I also still come across apps that don't know how to scale, which can be really frustrating.

That’s life on bleeding edge ;)

Sure, it’s annoying, but the situation is a lot better than it used to be. Even Linux is doing fine, at least if you stick to recent releases. Some distros like to ship outdated software for some reason :/

Still, in my opinion, the quality-of-life improvements of a HiDPI display very much outweigh the occasional inconvenience. Though obviously, YMMV.

[1] https://www.eizo.be/eizo-pixeldichte-im-zeitalter-von-4k-e5....

[2] Assuming you’re viewing at optimal distance.

[3] With the notable exception of 96dpi native pixel art.

4K has exactly 4 times as many pixels as 1080p, so it shouldn’t be an issue in theory. Nearest-neighbor will give you exactly what you want.

However, in practice you need to force scaling in software; otherwise graphics drivers, and most monitors' postprocessing, tend to default to bicubic scaling. That said, pixel art is not computationally expensive, so it's mostly just an inconvenience.
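
A minimal sketch of forcing nearest-neighbor scaling in software, here using Python with Pillow (the file names are just placeholders):

    from PIL import Image

    # Integer 2x upscale: each 1080p pixel becomes a crisp 2x2 block at 2160p.
    img = Image.open("sprite_1080p.png")
    img_4k = img.resize((img.width * 2, img.height * 2), Image.NEAREST)
    img_4k.save("sprite_2160p.png")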

[4] You can use advanced scaling algorithms to upscale 1080p to 4K and it usually looks great, e.g. MPV with the opengl-hq profile, or MadVR on Windows. For that you'll need something a notch above integrated graphics though, e.g. an RX 560 or GTX 1050, and on mobile a Ryzen 2500U or equivalent.
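
As a concrete usage example of the mpv route (the file name is a placeholder; newer mpv builds renamed this profile to gpu-hq):

    mpv --profile=opengl-hq movie.mkv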

[5] http://gpu.userbenchmark.com/Compare/Intel-HD-4000-Mobile-12...


2560x1440 on a 27 inch at a reasonable distance is pretty darn close to optimal IMO, so 4K, to me, is for 34" monitors (but I feel 27 inch is optimal on 60-75 inch desks, which is what I usually work with, so 4K rarely matters).

I'm with you on accurately calibrated monitors though! God most of them suck out of the box.


If you don’t see any (spectacular!) difference between 4K & 1440p you need to have your eyesight checked.

I’m not being sarcastic. The last time there was a thread like that on HN a bunch of people figured out they need glasses.

I have a 4K @ 24in monitor (180ppi) and a 267 ppi netbook and when I switch between them the 4K starts looking like a blurry mess!


> The last time there was a thread like that on HN a bunch of people figured out they need glasses.

It's fair advice, but some eyesight issues cannot be solved with glasses, if they can be solved at all.

Also worth noting that with TVs the distance matters a lot. With monitors, laptops, and gadgets it is relatively stable.


For TV/Multimedia HDR makes much more difference than 4K in my experience.


No, my eyesight both far and near was quite a bit better than average as of last week (I just had it checked). We have a lot of 4K and 5K displays at work, and most people who say they can tell a (significant) difference when we compare (the topic comes up a lot) usually seem to either be on >27 inch, have scaling higher than expected, or just fail to see it when we really test it out. Your mileage may vary :)

Don't get me wrong, I can see a difference, but it's not nearly night and day, especially when it comes at the cost of other features (e.g. refresh rate, which isn't the end of the world for coding, so if that's the only thing you do on the monitor it could be worse; otherwise, ouch, my eyes).


> have scaling higher than expected

What scaling is that? 200% scaling is what you should have; 4K is exactly 4x as many pixels as FullHD. If someone is using lower scaling, then they are trading sharpness for virtual space.


I never get comments like these, as if the text just gets smaller as the resolution increases, rather than what actually happens (the text gets crisper). 5120x2880 is so nice because you can’t see the pixels and words almost look like they are on paper.


Seconded. 2560x1440 on a 27" panel is only 109 pixels per inch. I use a ThinkPad with that same resolution on a 14" display, with a 24" 4K UHD next to it in portrait mode.

Both displays are around 200 pixels per inch, plus or minus. It's great not having to see the pixels, so much more pleasant and easy on the eyes.

Also the combination of a portrait display with the landscape display is really nice. I can read an entire PDF page without scrolling.


I agree that having higher PPI is great, but are you using scaling to make text larger? I was barely able to use a 28" 4K at 100%; I can't imagine doing that at 24".


Yes, I should have mentioned that I'm using Windows 10 with 225% scaling on both the 4K UHD 24" display (187 DPI) and the WQHD 14" (210 DPI). Some people like a bit less scaling, some more, but in general you want a scaling factor that roughly matches your display's pixels per inch.

The original Windows "standard display" was assumed to be around 96 DPI. That's the monitor that 100% scaling (i.e. no scaling) is intended for. Round the 96 up to 100 and we can say that in rough terms, the percentage scaling should be in the neighborhood of the monitor's DPI.

So monitors in the 200 DPI range are best at around 200% scaling.

A 28" 4K UHD has 157 DPI, so I wouldn't want to try it at 100% scaling - ouch. It ought to be running in the 150-175% scaling range.

The idea with a high-DPI monitor isn't to make everything smaller on the screen, it's to make everything sharper and more detailed. When you double the DPI and scale appropriately, you get four times the number of pixels for everything you put on the screen.


> A 28" 4K UHD has 157 dpi, so I wouldn't want to try it at 100% scaling - ouch. It ought to be running in the 150-175% scaling range.

That’s not how it works. Lower dpi does not somehow give you more real estate!

You should still be running with ~200% scaling because you are viewing it at a greater distance.

Optimal viewing distance, assuming 16:9 ratio, is 120 cm vs 140 cm for 24" vs 28", respectively[1]. Accounting for the difference gets you ~155 ppd with both monitors[2][3], maintaining 25.0° horizontal viewing angle.

The closer your viewing distance, the more ppi you need for the same density. That 28" is not inferior to the 24" when you account for distance, despite the lower ppi, because the greater viewing distance makes the pixels look smaller, thus creating a denser image.
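
A minimal sketch of that calculation in Python (the 120 cm / 140 cm figures are the optimal viewing distances from [1]):

    import math

    def ppd(px_w, diagonal_in, distance_cm, aspect=(16, 9)):
        # Pixels per degree over the horizontal viewing angle.
        w, h = aspect
        width_in = diagonal_in * w / math.hypot(w, h)
        distance_in = distance_cm / 2.54
        angle = 2 * math.degrees(math.atan(width_in / (2 * distance_in)))
        return px_w / angle, angle

    print(ppd(3840, 24, 120))  # ~154 ppd, ~25.0 degree viewing angle
    print(ppd(3840, 28, 140))  # ~154 ppd, ~25.0 degree viewing angle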

[1] https://en.wikipedia.org/wiki/Display_size

[2] http://phrogz.net/tmp/ScreenDens2In.html#find:density,pxW:38...

[3] http://phrogz.net/tmp/ScreenDens2In.html#find:density,pxW:38...


I guess the problem is that I value the amount of information I can fit on the screen over the quality of that information.

Also, apps that don't scale properly are a pain, haha.


Scaling is usually on by default in most modern operating systems.


Bitmap text can look clear as crystal on a very low pixel density display.

You need a higher PPI to make anti-aliasing work on screen (finally looking nearly as nice as print).


Last year I had a 45" 4K screen with a 150% scaled UI and was able to develop in VS Code with two code windows open side by side, all with the crispiest text I'd ever seen. It's the dream.


It's not about needing 4K, it's about the prices. While it's normal in the USA to get a couple of these monitors, for me the expense is impossible. And I really would use a 4K monitor. It's a big world, and most of it is not the USA.


Actually, it's not always the price. Serious developers and seasoned computer enthusiasts don't change rigs every couple of years. If one's system is performing well enough and the user is used to it, a system upgrade can be deferred until some technology or computing resource becomes necessary. When something breaks, the broken part is replaced and upgraded in most cases.

Personally, I've just upgraded from 1680x1050 to 1920x1080. My old monitor was 10 years old and was performing relatively well. I bought a new one because it started to show its age (mostly backlight aging).


I would __much__ rather have TWO 1080p or 1920x1200 displays than a single 4K display. For me, the quantity of visible 'glass' matters most. It's surprisingly difficult to drive two 4K monitors.

Maybe as prices come down and more transistors end up in basic chipsets, we'll see setups with 2 and 3 heads of 4K displays become common.


Agreed. I've been using a Commodore or PC desktop since ~1986. I've been through lots of hardware iterations, and seen and pondered many different configurations. I found the 24" 1080P display is the best, and the more, the better[0]. And no larger than 24" either; that's crucial. I've downsized from larger panels to 24".

I wouldn't trade my three 1080P 24" panels for 4K panels, unless the goal was to sell the 4K panels and rebuy the 1080P 24" panels. I don't do a lot of gaming anymore, but they're hardly a horrible experience in that regard either.

[0] https://pcpartpicker.com/b/dFmqqs


I almost agree, except that I definitely prefer 24" 1920x1200 over 1920x1080.


So I wasn't able to get this out of my mind and went to order 3 new 1200P panels today, but I backed out at the last moment after further consideration. I think it would drive me nuts to effectively have a 21.6" panel for 16:9-optimized content, which is everything; that would bother me more than the additional 10% of viewspace for work would help. Especially considering I have 3 panels, I have plenty of workspace.

I think at this point in the market, I'm going to stick with 16:9 native resolutions. If I do any swaps, I'll probably try out dual 27" 4K panels (also 16:9, great for 1080P content), with one mounted below the other. That'll be pretty nice in time as 4K becomes better supported.


Agreed. I couldn't find (modern, thin-bezel + DisplayPort) 1920x1200 panels when I was shopping last time. I do see a few on the market right now; when one of my 3 goes out, I'll sell off the other two and order three 1920x1200 panels for sure.


Why do you prefer a 24 inch display to a larger one such as a 27 inch?


The dot pitch is too big, at least at 1080P, and it's just not as sharp as I prefer. I had a couple of 27" 1080P panels at one point and got rid of them. 1440P@27" is good, but I've just been happiest overall with my three 1080P panels; the main factor is that fitting three 27" panels on most desks (and mine is rather large) is harder than fitting three 24s. Also, there's less neck movement to focus on a panel in the periphery. It took some trial and error to reach this point, but as long as I run three panels, I'll never go beyond 24".



