4K for $649: Asus' PB287Q monitor reviewed (techreport.com)
88 points by ismavis on May 30, 2014 | 90 comments



Here's a chart showing the ideal viewing distance for various resolutions, based on the smallest detail the human eye can discern with 20/20 vision.

http://cdn.avsforum.com/4/4c/600x376px-LL-4cd4431b_200ppdeng...

This monitor is pretty close to retina-level DPI at the typical viewing distance, but I guess a 24-inch 4K would be even better.
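
For reference, the math behind such charts: a 20/20 eye resolves about one arcminute of angle, so the PPI at which pixels stop being individually visible is roughly 3438 divided by the viewing distance in inches. A small sketch (the 24" viewing distance is my assumption for a typical desk, not a figure from the chart):

  import math

  def retina_ppi(distance_inches):
      # PPI beyond which a 20/20 eye (one-arcminute resolution) can no
      # longer resolve individual pixels at this viewing distance
      return 1 / (distance_inches * math.tan(math.radians(1 / 60)))

  ppi = math.hypot(3840, 2160) / 28          # PB287Q: 28" diagonal, ~157 PPI
  print(round(retina_ppi(24)), round(ppi))   # ~143 vs ~157

So at a two-foot viewing distance the panel just clears the 20/20 threshold, and a 24-inch 4K panel (~184 PPI) would have even more margin.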


Note also that 20/20 vision isn't actually that high, especially if you're young, so the chart isn't necessarily definitive.

"a subject diagnosed as having 20/20 vision will often actually have higher visual acuity because, once this standard is attained, the subject is considered to have normal (in the sense of undisturbed) vision and smaller optotypes are not tested."

http://en.wikipedia.org/wiki/Visual_acuity


It's probably better to have a larger screen further away, particularly for older people who can't focus close up any more. That occurs sometime in your 40s, for you youngsters who may think "older" means quite a few more years than that.


Your information is wildly inaccurate. Take a 30' black screen vs. one with a single white pixel, and someone can tell the difference from across a football field if it's dark enough. Do the same thing with one white pixel vs. two next to each other and you can't tell the difference. The important point is that screens showing normal video have aliasing effects, so in some situations with unedited video you get artifacts such as flickering at fairly long distances. Edit: Basically, if you have 480p and 720p video, having a 720p monitor is worse than a 720x4 monitor at fairly long distances.

Toss in compression artifacts and you want a screen with at least 4x the resolution that chart shows.


The chart is just about how much DPI you need at a specific distance to reach the point where the average human eye stops seeing benefits from even higher DPI.

Your points are more about refresh rates, video compression and lighting. For example, in gaming, antialiasing and other "smoothing" techniques are widely used to improve image quality, but when playing at 4K resolution on a 24-inch screen you wouldn't need those anymore because your eye can't see the difference.


Aliasing is very easy to notice even on a 4K screen. Look at the second image: it's got way too much white, and you can easily have the same issues at 4K or 8K.

http://en.wikipedia.org/wiki/Aliasing

In the end, increasing resolution does help in most cases, but with the right fractal pattern there is no 'safe' resolution.

PS: It's basically the same reason that QuickSort is an O(n^2) sorting method in the worst case. Pick the wrong data and your assumptions fall apart.
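
A quick sketch of that worst case, assuming the textbook first-element-pivot variant (real library sorts avoid this with randomized or median-of-three pivots):

  import sys

  def quicksort(xs):
      # naive quicksort: first element as pivot
      if len(xs) <= 1:
          return xs
      pivot, rest = xs[0], xs[1:]
      return (quicksort([x for x in rest if x < pivot])
              + [pivot]
              + quicksort([x for x in rest if x >= pivot]))

  sys.setrecursionlimit(5000)
  # already-sorted input: every partition is maximally lopsided, so the
  # recursion does n + (n-1) + ... + 1 comparisons, i.e. O(n^2)
  print(quicksort(list(range(2000)))[:5])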


Note that you want no pixellation effects when image features are at the limit of your visual acuity. So you want the pixels to be 3-5 times smaller than you can see.

To demonstrate: draw a pair of vertical black lines 1px wide, with 1px white space between them. Then, tilt them at 30° or 45°. (Or, draw a pair of circles that are 1px thick and have 1px between them at the top, bottom, and sides; then look at various other positions.) Then try the same thing with a line and space thickness of 3px and 5px.
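
If you want to try this without a drawing program, here's a minimal sketch using Pillow (the canvas size, angle and filenames are arbitrary choices of mine):

  from PIL import Image, ImageDraw

  img = Image.new("L", (400, 400), 255)    # white grayscale canvas
  draw = ImageDraw.Draw(img)
  for x in (200, 202):                     # two 1px black lines, 1px gap
      draw.line((x, 100, x, 300), fill=0, width=1)
  img.save("straight.png")                 # the gap renders cleanly

  # nearest-neighbour rotation: the 1px gap breaks into stair-steps
  img.rotate(30, fillcolor=255).save("tilted.png")

Zoom into tilted.png and the gap vanishes and reappears along the lines; repeat with 3px and 5px thicknesses and the effect largely disappears.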

See also: https://en.wikipedia.org/wiki/Moir%C3%A9_pattern


This chart gets trotted out in seemingly every resolution-related discussion... does anyone know where the numbers come from? I've always assumed they were pulled from somebody's ass, but I'd be interested to find out if that's not the case.

Very curious, since it looks dubious to me but it gets thrown out as cold hard fact every single time.


Do you have something like this but on a log scale? The most interesting part of the graph is the most unreadable.


I believe that chart is for video viewing. That's important for a TV, but on a computer monitor you also spend a lot of time reading text and higher resolutions tend to provide benefits for longer with that.


> Web browsers can be a problem. You may want to choose Internet Explorer rather than Chrome, since Microsoft has clearly done more work to support high-PPI configs. However, note that IE ditches the ClearType sub-pixel antialiasing scheme and snap-to-grid GDI font rendering in favor of simple greyscale antialiasing. As a result, the effective text resolution with IE at high PPIs isn't a huge leap from other browsers with ClearType on conventional displays. [emphasis mine]

Umm.. I know Firefox is not fashionable these days, but ignoring it completely seems a bit odd, especially if both Chrome and IE produce suboptimal results.

Also, can't you force compatibility bitmap scaling these days for applications like Fraps that apparently do not work correctly with HiDPI? Sure, it is one extra step that ideally shouldn't be necessary, but it is not like you need to live with broken UIs.


Firefox does not handle high-DPI screens. It renders fonts at a fixed 96dpi, and your only option is to crank up the devPixelsPerPx pref, which is akin to page zoom.

The bug has been reported multiple times. It always ends up being closed as wontfix, which is just burying the head in the sand. My laptop is 210dpi, and I guess we'll just go up from here.

One instance of the bug: https://bugzilla.mozilla.org/show_bug.cgi?id=512522


> your only option is to crank up the devPixelsPerPx pref, which is akin to page zoom

Since Firefox 22 you shouldn't need to adjust it manually. See eg this bug: https://bugzilla.mozilla.org/show_bug.cgi?id=844604 (Status: RESOLVED FIXED). I'm not sure why you seem to think that "akin to page zoom" is not the proper result; what else did you expect?

Quoting Anandtech:

> Chrome is scaled 150% as Windows asked, but it is hazy and blurry. Disabling DPI scaling for the application and then scaling to 150% inside Chrome produces crisp, clear text. Firefox also didn’t scale automatically, but it has a setting to adjust to make it follow the Windows DPI scaling rules. Once set, Firefox looks very nice and crisp. For most people, that setting should already be set to follow DPI scaling.

http://www.anandtech.com/show/7157/asus-pq321q-ultrahd-4k-mo...

Note that the review predates the release of Firefox 22.

edit: This is the tracking bug for Firefox HiDPI support on Windows: https://bugzilla.mozilla.org/showdependencytree.cgi?id=82067... . Hardly burying their heads in the sand.


  > I'm not sure why you seem to think that
  > "akin to page zoom" is not the proper result;
  > what else did you expect?
It is not what I expect because it does not affect the browser chrome. See a screenshot of my current Firefox:

https://dl.dropboxusercontent.com/u/461223/screenshot.png

Notice that the i3 bar at the top has the font at the correct scale. The page itself too (via devPixelsPerPx). Chrome elements such as tab names or the URL bar are at less than half the correct size and barely usable without a hack plugin (I use Theme Font and Size Changer).

Everything other than FF or Thunderbird scales correctly. The X server is set to 210dpi and GTK reports the correct density too. It is Firefox that is misbehaving.

This is FF nightly (34, I believe), so way way after FF 22. It is not fixed, and it is not a loved bug. Time will make the bug bite the developers and by then it will be fixed, I'm sure.

Edit: Here's a second screenshot, this time without devPixelsPerPx, showing FF completely ignoring the system-set DPI: https://dl.dropboxusercontent.com/u/461223/screenshot2.png


Oh, Linux. Yeah, the story there might be significantly worse, possibly even more so if you are not running GNOME. I'll have to test that myself, but I can imagine that it is not as smooth as on Win/OSX. I'll just say that it is not representative of Firefox as a whole.

edit: Here is Firefox on Windows 7, Windows being configured at "150%", no tweaks on Firefox: http://zokier.net/stuff/firefox_150.png

here is 100% for comparison: http://zokier.net/stuff/firefox_100.png

Some of the icons etc. are a bit blurry, but at least they are scaled.


From what I gather from several submissions of the bug in the past, the problem is that properly setting display density in Linux has changed over time. So it is inconsistent, and thus in many instances wrong. Firefox then concludes that X lies about display density and ignores it.

It is a wrong assumption, and one that pegs FF to 96dpi on Linux. It is not a sustainable decision, but I guess it will only be reverted when the noise about it is loud enough, which will only happen when high-DPI displays become more common.

Anyhow, it is not correct to assume X lies about the display density. Modern distros set the density correctly, ever since the infrastructure for xrandr was put into place. Its only fault, today, is that display density can't be set per display. It is a setting of the root window, I believe, so it is shared by all displays.

Side note: This is i3 running on top of a gnome3 session. For the sake of this problem, it can be seen as a gnome session.


> the problem is that properly setting display density in Linux has changed over time

Yeah, I just tested this on my GNOME3 system, and there is no simple DPI setting anywhere in sight. `xrandr --dpi 144`, which seemed like the logical solution, had absolutely zero effect on anything (GNOME or Firefox). Anyway, this is what I ended up with: GNOME scale factor at 1.5 and layout.css.devPixelsPerPx also at 1.5: http://zokier.net/stuff/firefox_linux_150.png . Looks pretty good to me.

edit: and just for completeness' sake, I also tested this on KDE4. Before changing sessions I reverted the GNOME scale factor to 1.0, so it did not affect my results. I again started with `xrandr --dpi 144` and launched Firefox, and the fonts actually had been rescaled! The UI was not, but that was fixed with layout.css.devPixelsPerPx=1.5. End result: http://zokier.net/stuff/firefox_kde_150.png

In conclusion, it seems like Firefox on Linux queries the DE for fonts (size and family). KDE4 seems to use X DPI settings in determining the font sizes, while GNOME3 is ignoring them and instead using its own "scaling factor".

edit2: this is getting a bit out of hand. I also tested this with just a WM (in this case Fluxbox), without a DE. `xrandr --dpi 144` had no effect, but setting Xft.dpi to 144 with xrdb did adjust the font size in Firefox. Like the others, devPixelsPerPx also needed to be set to 1.5 for the UI to scale: http://zokier.net/stuff/firefox_fluxbox_150.png

So in further conclusion, I'd say Firefox is handling the situation relatively well, considering the mess that DPI scaling seems to be on Linux.


devPixelsPerPx is much, much better than "page zoom." For example, it displays 2x content correctly. In fact, Firefox is the only browser that supports high-DPI displays on Linux. Chrome doesn't bother.


It is good. Not perfect, but usable. Anyhow, web development has a problem with high-DPI displays. On the one hand, the px unit in CSS is not a real pixel, so it can be scaled with respect to screen pixel density without breaking the spec. On the other hand, images are really measured in screen pixels.

The end result is that layouts break at densities other than 96dpi, where one CSS px is one screen pixel. The solution would be for image files to embed the display density, but from what I gather, browsers discard that information.

As I posted in the other comment, the major problem now is that the browser UI is not drawn at the correct density and thus ends up tiny on high-DPI displays.

Anyhow, as these displays become pervasive, it must get fixed.


Firefox handles the high DPI (227) display on my Macbook just fine…


Presumably because it uses native libraries that have support for high DPI displays.


I'm bothered by the usage of "whore of babylon". I feel like that sort of language doesn't belong in a review like this.


So, I think this phrase may benefit from some context. Specifically, the cultural backdrop to the statement is that we[1] have a recent tradition of using the transition from "pure, sweet good-girl teen" to "raunchy, sexualised young woman" as a kind of marketing event for female pop stars who began their careers as teenage TV stars. Part of their marketing value derives directly from the public and "shocking" nature of the transition. It's essentially exploiting the Madonna-whore complex[2] for marketing purposes.

There are ambiguities, of course. Perhaps the "shock" value of female sexuality is a good thing and is helping us all to get out of outmoded views about female purity. Perhaps the presentation of female sexuality as being about raunch and nudity is catering to male fantasies and is thus bad. Perhaps the problem lies with the excessively "pure" image that teenage female entertainers need to maintain in order to be deemed "family-friendly".

Personally, I interpreted the comment in the article as being one about the exaggerated nature of the image change that stars like Miley Cyrus go through once they hit the age of consent. She's neither the whore of Babylon nor Hannah Montana, but it suited advertisers to portray her as both at different times.

Is this a suitable topic for a joke or a metaphor? I didn't interpret it negatively, so for me it was fine. I can appreciate why other people would disagree, but I thought adding some context might be useful[3].

[1] Actually this is mostly an American thing, so I'm not entirely entitled to use "we" here

[2] http://en.wikipedia.org/wiki/Madonna%E2%80%93whore_complex

[3] Who am I kidding? I'm just giving my 2c on an internet forum like everyone else who has nothing better to do right now


Your explanation is reasonable as far as it goes. Yes, the two states are being used to contrast the shocking change in the nature of the product. However, this doesn't speak to whether it's OK.

1) commodifying women is not ok.

2) comparing women as an object/product to a piece of tech is again problematic.

3) half your potential audience for your review is women. Do they want to read about other women being talked about like this?

4) using language like this makes it easier to accept it's reasonable and inoffensive. It's not inoffensive.

Just saying "oh well, this is advertising and marketing" is not enough. Questioning its validity is worthwhile. Questioning whether we want to see this kind of exclusionary and sexist language in professional copy is worthwhile.


So much argument... I can't believe some people get this offended by simple words. In this case, Miley doesn't even deny it; she's trying hard to give that image, so it's really childish to complain about it.

I hate this attitude that makes people watch their words. It may be true of public and influential personalities, but who cares about your choice of words as long as you convey the message? I believe that everything and everyone can be laughed at.

Sure, you may feel offended inside. But you should at least understand that it makes no sense and refrain from sharing your "I'm offended" feelings. Same thing goes for the other end of the spectrum by the way: you can be sexist but as long as you don't act on it, I don't see a problem. That's what freedom is about.


> So much argument... I can't believe some people get this offended by simple words.

I'm a man; I can walk down the street without being wolf-whistled at, leered at or otherwise made to feel threatened.

Every single woman I know has been hassled in public by men using sexist, threatening language like this. Walking down the street they will get unwanted comments about their appearance, and if they ignore them or complain, they get called the kind of things you call "simple words".

Because this kind of language is in the everyday lexicon of some men, it's used to hurt, threaten and intimidate women.

but y'know, please don't get upset by my simple words yeah.


You are straining to take offense.

Some women have criticized Ms. Cyrus's image-transition as cynical and self-destructive – see for example Sinead O'Connor's "open letter" [1]. Those women might appreciate, rather than take offense at, a sly nod to the accelerated, commercial nature of Cyrus's sex-it-up-for-a-buck makeover. The review's throwaway line, to the extent it expresses any viewpoint at all, can equally be seen as embracing one particular feminist critique of sexualized-marketing.

So if some people find a word choice "offensive" based on a simple checklist of dos-and-don'ts, but others find the same phrasing a usefully vivid and possibly even progressive turn-of-phrase, which side should have its preference respected in future writing? Do we take a majority vote? Does one iota of declared offense, from the most easily-offended, always win, ensuring gray committee-vetted prose from here to eternity?

[1] http://gawker.com/everyone-needs-to-read-sinead-o-connors-op... – One of O'Connor's points is: "The message you keep sending is that its somehow cool to be prostituted.. its so not cool Miley.. its dangerous. "


>I'm bothered by the usage of "whore of babylon". I feel like that sort of language doesn't belong in a review like this.

If you're complaining about obscenity, you should know that that was a reference to the Bible, of all things.


I was all set to agree with you and criticize OP for being overly sensitive; but yeah, the writer was less making a biblical reference than calling a young woman a slut. Uncalled for.


Actually, it doesn't belong anywhere. In the book "Women's Infidelity" the author makes a compelling argument that a lot of relationship issues today stem from the double standard society uses to shame women's sexuality. Promiscuous men are revered, while women are shamed. There's a LOT more to it and it's really interesting reading. Society is a-changin', and this language needs to go, not just on tech sites.


Thank you. I came here specifically to say just that. I'm trying to do more to speak up in cases like these and call it out when I see it. That usage was jarring and, yes, completely out of place / uncalled for. Glad others feel the same. Of all the analogies in the world, that was the choice?


You're right to be; casual sexism in a tech review is not OK (not that it would be anywhere) and seems extremely unprofessional.


It's not sexism. There's nothing gender-specific about it.

EDIT: As an aside, I've been refreshing the page frequently and the dynamics of which comments are rapidly going gray and back to black are very interesting. Not to mention the order of comments on this page. Clearly there's some strong opinions here. I feel like an analysis of post voting frequency and vote type could be performed and would be very insightful.


If calling a woman a "whore" (which is a gendered slur in itself[0]) is not sexism, what, to your mind, is?

[0] https://www.google.co.uk/search?q=whore count the women depicted, count the men depicted.


What, to my mind, is? Cheerio chap! How about you misread my post? Try some reading comprehension.


Try defending your assertion that the word 'whore' is not gendered instead of giving up after a single comment.


if calling miley cyrus a whore of babylon isn't sexist within your mindspace, your mindspace is weird.


Opinions about subsets map so well to the whole set.


the actual usage is "whore o' Babylon". is it really something to get your panties in a bunch? get over yourself


Choosing to brand someone who is comfortable with publicly displaying their sexuality as a whore (however they choose to spell the sentence) shows either a low intellect, a meanness of spirit or simply someone who rushes to draw conclusions without thinking.

It's pertinent to the topic, in any of the above cases, as they all make you wonder if you can thus trust the rest of his review/opinion.


> shows either a low intellect, a meanness of spirit or simply someone who rushes to draw conclusions without thinking.

I disagree. And so do a lot of other people. Look up the word in the dictionary. It's an accurate definition.

I'm more bothered by the fact that as soon as I read that sentence in the review, I knew someone who loves to be offended by everything would have already commented about it on HN.


"whore of babylon" has a specific connotation that doesn't really seem to apply to Miley Cyrus. Either the author thinks Cyrus is the herald of the antichrist, or he's trying to use a biblical expression to make his naming and shaming seem clever.


Oh, please... the author makes funny turns of phrase in every second paragraph. But sure, it is "racist" to joke about female celebrities?

(That said, I have no clue what Hannah Montana was. But Miley Cyrus is hardly extreme, even for mainstream music.

[Edit: And if copying black music culture should be condemned, I don't know how much will remain? :-) Not only in the US. I guess the jokes are about the contrast with Montana? Sure, women often get the short end of the stick in criticism, but don't throw the baby out with the bathwater. Enough discussion, please.]

)


> It's an accurate definition.

Let's google "whore definition" :

> noun, derogatory, a prostitute.

She's not a prostitute (you can google that definition too), so it's not an accurate definition. And, as you're keen on definitions, it's defined as derogatory, so that really does indicate at least meanness of spirit in the writer, yourself, and "a lot of other people".

> I knew someone who loves to be offended by everything...

Personally, I'm not offended by "everything", and when I am offended, I don't enjoy it. His statement simply was offensive.

> as soon as I read that sentence in the review...

You clearly show a level of awareness of what is potentially offensive. Perhaps you could try using a little empathy towards others when you come across something that triggers that awareness, and you might find yourself enlightened.


I may be a nitpicker, but for me 4K is 4096x2160, or am I wrong? This screen is UHD, or 2160p, but not 4K.


For me, it's 7822x4096. Or even 7112x4000 if you must. This really should be called 2160p, but you know, marketing. Also really loving my 976GiB "terabyte" hard drive, and my "ten meg" 1.25MiB/s broadband.


____P is a broadcast specification and was never meant to become a display measurement. It intentionally defined only the vertical resolution, as that was the only specific resolution requirement; for example, 1440x1080 is a valid 1080p broadcast resolution (it would still be displayed at 16:9, however). Moreover, specifying "progressive" is utterly redundant on modern displays (and is thankfully becoming universally so).

____P is no less of a marketing label than 4K, just a different one. Any monitor should be defined by its resolution; however, this is clunky as hell, as demonstrated by the sheer mass of people keen to inappropriately use the ____P nomenclature.

UHD seems a valid shorthand for this resolution, and to be fair it tends to get used, although often alongside 4K.


I would be quite happy with the demise of interlaced display modes as well. But I'm not so hopeful that we won't see a "4Ki" resolution for cable/satellite UHD channels in the future. All of the channels I get are still 1080i.


The thing is, consumer displays are just reaching the resolutions that the cinema world has been working with for years, and cinema always referred to the horizontal pixel count.

CRTs of course never had meaningful horizontal pixel counts, but they did have very meaningful line counts, so that is what the TV industry used.

The broadband situation is sort of similar and has always historically been measured in bits.

With hard drives, I don't know if there is a reason they can't put 1000GiB on their "terabyte" drives, which might be reasonable.


Interesting info on the cinema resolutions, wasn't aware of that, thanks.

As for hard drives, it's unfortunately not even 1000GiB. It's 1,000,000,000,000 bytes, or 931.32GiB. Of course it's usually actually a bit higher based on the design, and they 'round down'. My OS reports my WD Reds as 976GiB, so I'm 'only' missing around 48GiB.

And on that note, I understand the whole 'humans like base 10' thing, but computers like base 2. And now half of my apps use base 10 and half use base 2. And I pretty much have to use base 2 for all the older stuff, like ROM sizes from old game cartridges.

And I doubt people actually cared about 1000-vs-1024 when they said their file was "three megs", but now instead of consistency we have a mess.
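
The arithmetic, as a quick sanity check in Python:

  marketing_tb = 10**12        # the box's "1 terabyte": 10^12 bytes
  print(marketing_tb / 2**30)  # ~931.32 GiB, what a base-2 OS reports
  print(1000 * 2**30)          # a true 1000 GiB would be 1,073,741,824,000 bytes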


It is quite widely accepted that 3840x2160 is "4K", e.g. quoting Wikipedia:

> The SMPTE first released Standard 2036 for UHDTV in 2007. UHDTV was defined as having two levels called UHDTV1 (3840×2160 or 4K UHDTV) and UHDTV2 (7680x4320 or 8K UHDTV). http://en.wikipedia.org/wiki/4K_UHDTV

and

> The television industry has adopted ultra high definition television (UHDTV) as its 4K standard. http://en.wikipedia.org/wiki/4K_resolution


You're not wrong, but I think the argument was lost to the marketers long ago.


Double disappointment (it also looks fairly ugly, as do all the monitors I've seen apart from Apple's, which don't work with non-Apple computers :-/)


My Chromebook Pixel has spoiled me. A 28" 4K monitor isn't sharp enough. Dell has a 24" that yields 180 ppi. I'd be interested to see that in person, but it probably still isn't sharp enough comparatively.


You typically sit closer to a laptop screen than to a desktop screen, so this should be pretty close to retina-level DPI, depending on the viewing distance:

http://cdn.avsforum.com/4/4c/600x376px-LL-4cd4431b_200ppdeng...


$649 in US

$699 in Canada

$1000 in UK <- What on earth is going on here - even with 20% VAT?


Isn't the canonical answer to these kinds of questions that you're missing the sales tax that would inevitably be added on to the US/Canada prices?

Although, having said that, checking Quebec[1] gives me CAD803 which is ~GBP443 -- ~GBP150 lower than Amazon's UK price of GBP599.99

[1] http://helpsme.com/tools/free-gst-hst-pst-sales-tax-calculat...


A trend that appears to be consistent across a lot of markets (electronics, fashion, etc.)... just swap that '$' out for a '£' and call it job done!

Why? Because they can.


$649 in US (without sales tax)

$721 in Sweden (without VAT, http://www.prisjakt.nu/produkt.php?p=2596044)

$836 in UK (without VAT, http://pricespy.co.uk/product.php?p=2596044)


Just look at the difference in prices for laptops, those are nuts. I'm importing everything from the US, even if it's more expensive (and it never is), because I will not support this kind of price gouging.


I can't find it right now, but I read about a study that concluded British people would choose £x over $x. That's probably being exploited here.


But still, at £600 it's much less expensive than most other available 4K displays (that I've seen)


There's a Samsung 4k pre-order for £500 on Amazon, Overclockers etc.


You wouldn't happen to have a link or know the model number, would you?


There's also a Dell UHD for £450

http://www.amazon.co.uk/gp/product/B00IOUBOB2/


U28D590 - You can get it on eBay from Korea for £450 including VAT and customs duty.

Although it's probably better to buy in-country for warranty reasons.


700 Euro at Redcoon.


Import tax + VAT.


Good to see the price of big high resolution screens coming down, after so many stagnant years.

> The one thing that may freeze you from pulling the trigger right now on the PB287Q is, oddly enough for the monitor market, the promise of better things coming soon.

Any guesses about how the market will progress in the next year or two? I have an old 30" 2560x1600, bought for £1200 six years ago - good enough for my uses (coding). I'd like to get a second similar screen when they're cheap. At the moment I see e.g. a 27" 2560x1440 for £420 [1] - I would buy it today, except maybe I can get something cheaper and better soon...

[1] http://www.cclonline.com/product/95902/U2713HM/Monitors/Dell...


I'd buy one if I knew it would work. Heck, I'd buy two or four for a multi-monitor setup. The process of figuring out whether a given laptop or graphics card will drive 4K over a particular standard is daunting. Knowing whether it will work with Ubuntu, in particular, is beyond me.

I wish there was a standard -- perhaps over USB -- where ordinary people who don't play games and just want a machine to work on (emacs, xterms, web browsers, word processors, rather than gaming) could make many monitors and large monitors work plug-and-play.


> I wish there was a standard -- perhaps over USB -- where ordinary people who don't play games and just want a machine to work on (emacs, xterms, web browsers, word processors, rather than gaming) could make many monitors and large monitors work plug-and-play.

You mean like DisplayPort? If your system supports DisplayPort (1.2 or later), you should have no issues getting an image on screen. EDID/DisplayID can be used to autoconfigure DPI/PPI scaling, so ideally the experience should indeed be just plug-and-play.


USB display adapters do exist; people have been using USB 2 adapters with 1080p displays, and USB 3 with 4K will have more bandwidth per pixel available.
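
A rough back-of-envelope check on that claim, using nominal signalling rates and ignoring protocol overhead and the compression these adapters actually apply:

  usb2_per_px = 480e6 / (1920 * 1080)  # USB 2 + 1080p: ~232 bit/s per pixel
  usb3_per_px = 5e9 / (3840 * 2160)    # USB 3 + 4K: ~603 bit/s per pixel
  print(usb3_per_px / usb2_per_px)     # ~2.6x more headroom per pixel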


> The sRGB option does produce colors that appear to be closer to our post-calibration settings than the default "standard" mode, for what it's worth.

Hmmmm... I need something that can do a little better than "closer to". I guess $650 and color accuracy is too much to ask for at 4K.


Dell's UP2414Q was on sale for $765 a couple of times this month. It's a smaller screen, but has really great color. Details from Tom's Hardware: http://www.tomshardware.com/reviews/dell-up2414q-monitor-rev...


So how long till we get 4K touch displays?

These displays are fantastic, but I'd worry about touch models being released in a couple of years: if you use Windows, or if OS X gets good touch support, you might regret the purchase.


I'm still holding out for the ThinkVision 28:

http://news.lenovo.com/images/20034/ThinkVision%20Spec%20She...


The Miley comment is totally unprofessional and unnecessary.



Err, what about the 30Hz refresh rate at 4K that the review mentioned?


Can someone explain how this is 4k? Apple's Thunderbolt Display has a resolution of 2560x1440 [1] and ASUS' PB287Q also has 2560x1440 according to Amazon [2].

What am I missing? UPDATE: I transposed digits and looked at the PB278Q, not the PB287Q.

[1] https://www.apple.com/displays/specs.html

[2] http://www.amazon.com/PB278Q-27-Inch-LED-lit-Professional-Gr...


I believe you're looking at the 278Q. The 287Q (note the digit changes) has a resolution of 3840 x 2160 according to the article.


Ah yes. Thanks!


So it's 3.84K then. Almost 4K.

A bit like how a carrier once sold me a phone with EDGE as 3G. I mean, 2.75G is pretty much 3G, right?


Um, anything that's marked as "4K" has 3840x2160 resolution.

Pretty much any "<something>p" or similar marking is misleading marketing anyway.


Don't blame it on bad intentions. It just happens to be four 1920x1080 screens for scaling purposes. Much like bytes and kibibytes. I'm okay with this.



The review is about the PB287Q; your Amazon link is for the PB278Q.


Ah, that was stupid of me :) Thanks!



Good to see prices of 4K coming down. Are 4K displays better at rendering text, for those who read and work with code a lot?



