Dell Ultra Sharp 24" 4K display (dell.com)
199 points by seanmcdirmid on Dec 26, 2013 | 149 comments



Note that this display has no hardware scaling; if sent a signal that's less than the native resolution, it's displayed centred and unscaled. Last I checked, drivers for nVidia at least don't yet support cheap scaling for it. That means playing GPU-taxing games on it is unfeasible for the moment.

I also expect prices to come down, as 4K TV panels start to create economies of scale (even though the price isn't extortionate for what it is in the current market). This panel size shouldn't stay a niche the way 1920x1200 has. So I reckon this one is for early adopters.


This is a color-calibrated professional display. IMO it's a great price, about the same as what name-brand 30" 2560x1600 displays cost 2 years ago. Compare to a white label IPS 2560x1600 and it still seems reasonable: http://www.monoprice.com/Product?c_id=113&cp_id=11307&cs_id=...


Yep, you can already get a 39" 4k TV that can serve as a monitor for $499:

http://www.amazon.com/Seiki-Digital-SE39UY04-39-Inch-Ultra/d...

But it can only do 30Hz at 4K due to its limited inputs. You can game on it at 1080p@60Hz (hopefully without any scaling issues since 4K has exactly 4 pixels for every 1080p pixel).
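To make the "exactly 4 pixels" point concrete, here's the arithmetic (a trivial Python check, nothing assumed beyond the resolutions themselves):

    # Each 1080p pixel maps onto an exact 2x2 block of UHD pixels, so 1080p
    # content can in principle be integer-scaled with no interpolation blur.
    uhd = 3840 * 2160                    # 8,294,400 pixels
    fhd = 1920 * 1080                    # 2,073,600 pixels
    print(uhd / fhd)                     # 4.0 -> four UHD pixels per 1080p pixel
    print(3840 / 1920, 2160 / 1080)      # 2.0 2.0 -> a clean 2x in each direction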


Indeed. If you are interested in prior discussion on using the Seiki television as a computer monitor, see the following thread: https://news.ycombinator.com/item?id=6835521


Thanks for the link and reference. Just today I picked it up for CA$499 from a TigerDirect.ca store just outside of Toronto. Very, very happy, though the only device I have at home that can run it at full 4K is a new US$299 Acer c720p Chromebook! Photo: https://plus.google.com/u/0/+LouisStAmour/posts/Fejys3in1S3


I'm wondering: if I don't play games, would 30Hz @ 4K be sufficient for web browsing, coding, and movies?


Most people can't tell the difference between 30 and 60Hz, except in certain situations. Most TV programs are filmed at 30Hz, and until very recently, most movies were filmed at 24Hz.

You may notice a little bit of stuttering during long, slow pans, but for everything else, it should work pretty well.


The reason this works for movies and not computer games is simple. Movie cameras let in light in a fashion similar to your eye, and thus at 24 fps motion blur is perceived the same way you see a blur when waving your hand rapidly in front of your face. Games however are rendered one frame at a time, and each frame the positions of everything on the screen are updated with no blurring or interpolation between frames. Thus a higher framerate is needed to achieve a "smooth" motion effect.
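To put rough numbers on that, here's a back-of-the-envelope sketch in Python; the 180-degree shutter, frame width, and object speed are illustrative assumptions, not figures from the thread:

    # Why 24 fps film reads as smooth while a 24 fps game reads as stutter:
    # the camera integrates light over its exposure, so fast motion leaves a
    # blur trail that bridges the gap between frames; a game samples a single
    # instant, so the object simply jumps.
    fps = 24
    speed_px_per_s = 3840 / 2.0            # object crossing the frame in 2 s
    jump_px = speed_px_per_s / fps         # distance moved between frames: 80 px
    exposure_s = 0.5 / fps                 # 180-degree shutter = half the frame time
    blur_px = speed_px_per_s * exposure_s  # film's motion-blur trail: 40 px
    print(f"per-frame jump: {jump_px:.0f} px, film blur trail: {blur_px:.0f} px")
    # The game shows discrete 80 px jumps with no trail in between.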


Motion blur is a very common effect used in games nowadays. Luckily it can usually be turned off.


I didn't want to believe this, but the other day I watched the new Hobbit movie in 48fps with some friends and apparently I was the only one that noticed anything different.

The obvious difference in the scenarios you mentioned is interactivity. I find something charming about the 24hz look of traditional film (often even lower in animation, where "animating on twos or threes" can drop the effective framerate down to 12 or even 8 FPS), but I sure as hell wouldn't want to use a computer at 24FPS. Even outside of games, 120hz feels significantly better than 60hz just moving a mouse cursor around on a desktop.

I'm holding onto the idea that it's something that has to be pointed out to you before you really care, but that even before it's pointed out there is still an unconscious negative effect. Say, you're talking about the steps you took to eliminate a frame of lag from a game, and it seems ridiculous and pointless to a layman, but then they turn around and complain that their game feels "unresponsive" or "floaty" ("I pressed the button and he didn't do anything!") The problem still affected them, they just didn't know what to attribute it to (or perhaps having not experienced, say, a PC FPS played on a CRT monitor, they don't have a frame of reference for how responsive a game should feel).


Almost all movies are still filmed at 24fps.

The thing is, movies aren't shown at 24fps - they actually flash each frame on and off a few times to reduce flicker.

It used to be done with a multiple bladed shutter on film projectors, but I believe digital projectors and TVs can do the same thing (on the TV if you turn off that awful motion interpolation that they all have turned on by default for some reason).

But one thing movies don't typically have is a very fast moving mouse pointer on the screen all the time... The mouse, and fast scrolls on web pages and stuff will likely have issues at 30Hz.


As a long time Amiga user who had a 'flicker fixer' card I can assure you that some people can and do notice significant degradation of the monitor experience at 30Hz. Now granted part of that was the fact that 30Hz interlaced had really annoying flicker on single pixel horizontal lines, but even the 'turn your head and it blinks' stuff was pretty annoying too.

I'd be interested to see this display in action (not quite interested enough to experiment with $500 though) to see how it affected my vision of things.


There's a massive difference between a CRT at 30 Hz and an LCD at 30 Hz. A 30 Hz CRT is terrible in direct proportion to how bright it is, and it very visibly flickers. 60 Hz CRTs also visibly and annoyingly flicker with eye saccades, or if you wave your fingers in front of the screen. CRTs need to approach 75-85 Hz before they are all-day tolerable with white backgrounds.

LCDs have a continuous backlight and the liquid crystals need to change orientation to change frame. There's no flicker as long as they're showing the same frame. Here the low refresh rate shows up as jerky motion instead, and is mostly just a problem for video, games, mouse cursor motion, smooth scrolling, etc.


I'm not sure if you're misremembering the Amiga flicker-fixer but the FF converted the 15kHz interlaced display of the Amiga to 30kHz (VGA style). It was the 15kHz interlace flickering that was noticeable, not the upgraded 30kHz signal from the FF.


I didn't write my comment clearly: with the FF it was not an issue, but without the FF the display in interlaced mode was painful. I had a long-persistence phosphor monitor which I used to mitigate that, but until I got the FF it was an issue.

That said, the points above about the mechanics of LCDs being different enough that I might not object are reasonable. I've really not tried using a 30Hz LCD display.


It could be, but there's a very important caveat: the HDMI output on many computers that also have Displayport/Thunderbolt is limited to 1920x1280.

If you look on eBay for QHD (2560x1440) monitors, the ones that include a Displayport input are about $100 more than the ones with only HDMI, because they can be used by so many more computers.


You can test it out by running a regular monitor at 30Hz. I've tried 24Hz before and mousing was definitely too laggy/painful, but 30 might be enough.


Any idea how to force the monitor to 30Hz on OS X? The lowest I could set on my monitor is 50Hz.


>Last I checked, drivers for nVidia at least don't yet support cheap scaling for it.

Are you sure that's correct? I was under the impression that virtually all GPUs have hardware scalers that wouldn't harm performance.


Of course it's theoretically possible; he's saying that there isn't an option in the NVidia drivers to automatically scale up low-resolution display modes to a monitor's native resolution. Purely a software problem.

I dunno if that's true or not, but my aging ATI card definitely has an option for this, so I'd be kinda surprised if it was.


> That means playing GPU-taxing games on it is unfeasible for the moment.

3.6 times as many pixels as a 1920x1200 display. The difference isn't all that profound. I would think that those games should run, since they tend to target a range of graphics features such that they can run on old and new hardware. And the kicker: you shouldn't need to turn on anti-aliasing with this monitor (although I don't know how much performance that saves you, it's certainly something).


I currently have a GTX 680, and generally buy the fastest single-chip graphics card available when I buy a new system. But I still generally don't enable anti-aliasing over 2x unless it's a very old game, because it still costs too much.

I like a solid 60fps. Any time I see dropped frames, or lagginess in a big scene, I pop into the graphics options and start cutting away fidelity until I get back to smoothness. Many games, even 5+ years old, don't run 100% at 60 fps on highest detail on the 680. Almost all games these days are GPU limited, as they're tuned for consoles. Asking to pump out 4x the pixels isn't cheap.


I like a solid 60fps. Any time I see dropped frames, or lagginess in a big scene, I pop into the graphics options and start cutting away fidelity until I get back to smoothness.

This is where Nvidia G-Sync technology will come in handy. It will allow a game to run at a variable frame rate without the graphics card being forced to send repeated frames so that the monitor can maintain 60hz. Instead, every frame rendered will be sent to the monitor as soon as it's available and the monitor will display it right away.

This will be doubly crucial for 4K displays, as otherwise it'll require far too much performance to maintain 60hz.
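To see why variable refresh matters, a simplified sketch of the frame-pacing arithmetic, assuming double-buffered vsync on a fixed 60Hz panel (the 45fps figure is just an example):

    import math

    # With vsync, a frame can only be swapped on a refresh boundary, so a GPU
    # averaging 45 fps effectively drops to 30 fps: each 22.2 ms frame misses
    # the next 16.7 ms refresh and is held for two. A variable-refresh panel
    # simply waits and scans out each frame as soon as it is ready.
    refresh_ms = 1000 / 60                 # 16.7 ms per refresh at 60 Hz
    render_ms = 1000 / 45                  # 22.2 ms per frame at 45 fps
    refreshes_held = math.ceil(render_ms / refresh_ms)   # 2
    print(f"fixed 60 Hz + vsync: ~{1000 / (refreshes_held * refresh_ms):.0f} fps effective")
    print(f"variable refresh:    ~{1000 / render_ms:.0f} fps, one scanout per frame")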


I just assume that a lot of the gamers who would consider dropping $1300 for a monitor are the same ones who are using two, rather than just one, GPU. So there's always that.

> But I still generally don't enable anti-aliasing over 2x unless it's a very old game, because it still costs too much.

You probably won't need it at all with this monitor, so you'll hypothetically be able to get some performance back that way.


It doesn't matter. Neither GTX 780 Ti 2-way SLI nor R9 290X 2-way crossfire are able to average 60fps on modern games at 4K HD, even games that are nearly a year old.

http://www.guru3d.com/articles_pages/gtx_780_ti_sli_geforce_...

And these cards are 600-700 USD each.


If you find your experience completely ruined by frame rates ever dropping below 60fps, you're going to find most of the newest games kind of a bummer.


The experience isn't completely ruined, but immersion is affected. It adds the GPU limitation as another difficulty factor in the gameplay. In twitchy games, the moments of maximum screen chaos are simultaneously those where you need to react quickly, and you are less likely to be able to react quickly if you're dropping frames.


>Many games, even 5+ years old, don't run 100% at 60 fps on highest detail on the 680

I have to wonder how much that's got to do with unoptimised drivers and unoptimised engines. It annoys me that there are some games I'll never be able to run (on PC) as well as they ran on the devs' PCs.


When I ran Linux, every time I updated my nVidia driver and did performance measurements, they seemed better, sometimes by a surprisingly large amount. I never did measurements with other systems. But if I were a graphics artist, the first thing I'd do with an optimized driver and engine is increase my polygon budget until I started to stress the engine/drivers again. At least, that's what I do with my own 3D programs... :)


Anti-aliasing is only needed because of low pixel density. With higher res screens it becomes irrelevant.


So what you are saying is I probably shouldn't get three of these to run in EyeFinity


That's an excellent and often forgotten point: higher display resolution is the original form of "hardware anti-aliasing".


Don't believe Apple's line about pixels not being visible at retina dpis. Single-pixel slivers of geometry are still very visible in non-AA 2x resolution, and much more annoying (for me at least) than the smoother lower res with antialiasing.


I can't see any pixels on the 1080p Nexus 7 screen, even holding it up close to my face and staring, but that's higher DPI than Retina.


Well, supersampling is the naive form of anti-aliasing, which is effectively rendering at a higher resolution and then resizing down.
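A minimal sketch of that idea (2x2 ordered supersampling with a box filter, using NumPy; the array shapes are just examples):

    import numpy as np

    def downsample_2x(img):
        """Average each 2x2 block of a (2H, 2W, C) supersampled render to (H, W, C)."""
        h, w = img.shape[0] // 2, img.shape[1] // 2
        return img.reshape(h, 2, w, 2, -1).mean(axis=(1, 3))

    hires = np.random.rand(2160, 3840, 3)   # stand-in for a render at 2x the target
    final = downsample_2x(hires)            # 1080p output with 4x supersampling
    print(final.shape)                      # (1080, 1920, 3)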


You'll also apparently be stuck with colour capabilities that are inferior to those of the new TV 4K standard.


Do you have more data or references on that? Frankly I don't believe it.


http://us.hardware.info/reviews/5100/4/dell-ultrasharp-up241...

One last side-note. The Dell UltraSharp UP2414Q does not contain a built-in scaler, and none of the AMD and Nvidia cards we tried were able to display lower resolutions on a full screen. Nvidia cards used a quarter of the screen in the middle for Full HD 1920x1080 resolution, with AMD cards we were only able to select 1920x2160, which is half the screen. Until the GPU manufacturers adapt their drivers for this, the UP2414Q can only run its native UHD resolution full-screen. That does limit how you can use this screen right now.

See also this thread on dpreview, Roland Wooster has the monitor and has posted opinions:

http://www.dpreview.com/forums/thread/3591245

And screenshots:

https://drive.google.com/folderview?id=0B7re6a9_3U2nOFhZNU1N...

from:

http://www.dpreview.com/forums/post/52771357

Viewing the screenshots full-screen on an existing 24" approximates the usability, albeit with blurrier text.


I got mine in today, and hardware scaling seems to work just fine.


Does it bother anyone else that we've moved from describing displays by vertical resolution (1080/720) to horizontal resolution (4K)?

It seems confusing for the semi-informed.

Is there an explanation other than marketing reasons?


It's the difference between the movie industry and TV industry.

TV has always described resolution in "lines", because only the vertical direction had a fixed number of lines; the horizontal was analog and thus there wasn't a specific number of lines (there was a maximum frequency it could display accurately, and thus a minimum feature size, but no discrete "lines"). So, SD NTSC is 480i (i for interlaced), PAL is 576i. Actually, NTSC was really 525 lines, but some of them were not visible and part of the blanking interval. When you started moving into HD, they kept this convention, referring to 720p, 1080i, 1080p, etc.

When movies started being distributed in digital formats, the movie industry chose a different standard of naming the resolutions, calling it 2K for 2048x1080. I'm not sure why they chose to focus on the horizontal rather than vertical resolution like the TV industry, but there you go. So the 2K/4K terminology comes from the movie industry, 720/1080 from TV.

Just be glad that we're using this term and not the computer industry term for it, which is WQUXGA (edit: sorry, QFHD, WQUXGA has 2400 vertical pixels, not 2160, see how memorable these names are?), though it is frustrating that "4K" is used for something that doesn't actually have over 4000 pixels in either dimension. In fact, this doesn't meet the official definition of 4K, it's actually just 4 times the size of 1080p, so it would be more appropriate to name it 2160p or QFHD instead of 4K. But for some reason, 4K has become the de-facto buzzword for this generation of resolutions, so it's what we're stuck with.


I wonder if the film industry refers to horizontal resolution because they've historically referred to film by the horizontal dimension of the film: a 35 mm film strip is 35 mm wide, 8 mm is 8 mm wide, etc.

Admittedly, those are the widths of the film strips rather than the frames, so there is no direct correspondence here, and there are formats (such as IMAX) which run the strips horizontally, so the width of the strip constrains the height of the frame, rather than its width. But this is my story and I am sticking to it.


Ah, that's a good point. On film, the horizontal size is the constraining factor, and they can choose how much vertical real estate to use per frame to give the aspect ratio they would prefer.


It's confusing, because 4k has been used as a term (not actually that often in practice) in the VFX industry for 5/6 years for the following resolutions:

4096 x 2160 (standard)
4096 x 3112 (Super 35)

Then for 4k cameras, the following resolutions have become fairly common for "4k":

3840 x 2160
4096 x 2304

so these days, it's difficult to know what's being talked about...


What's even more confusing is that after we've settled that 3840x2160 is 4k, which is 4x the pixels of 1920x1080, just like 8k is 4x the pixels of 4k, instead of calling 1920x1080 "2k", everyone is starting to call 2560x1440 "2k", even though 4k is only ~2x the pixels of 2560x1440.

I actually prefer UHD being promoted, because I wouldn't want us to switch to another weird resolution now, and I want videos to just scale up perfectly.


Actually, I've seen 2560x1440 called "2.5K" (at least Blackmagic calls it that in their Cinema Camera which shoots 2432 x 1366), since 2K has a well established definition of 2048x1080, or some crop of that to meet a desired aspect ratio. In film, 4K is defined as 4096x2160, which is exactly 4 times the number of pixels as 2K.


Marketing is a huge reason. "HD" laptop screens, for example, were meaningless but attached a moniker that the average consumer was familiar with ("HDTV") to a platform they are less knowledgeable about (laptops). I think the whole HD/720/1080 marketing push created stagnation within the display industry, since there was little reason to build better displays beyond 720/1080.

I am hopeful that with Retina and 4K, we'll swing back into a push for progressively better & progressively cheaper displays -- much like we had with the initial LCDs when they first started coming onto the market.


It bothers me.

But (just a guess) I hypothesize 4K comes from the film industry, while 720p and 1080[ip] come from TV and computer industries. So the two systems of measuring resolution were created independently and both have historical precedents, and the marketing mishap stems from blithely conflating the two. So I guess my explanation is: coincidence


Yes. Representing resolution by horizontal pixel count has been standard in digital cinema for years[1][2]. So they have 2k, 4k and a number of subvariants.

So I guess we're just seeing a conflation of cinema, TV and computing, at least when it comes to displays and resolutions, so the marketing terminology is conflating too.

Especially since 4k is an existing standard, I'm willing to give them a pass for keeping the naming convention. (Although the 4k TV standard, which most 4k monitors will be using, is slightly different: It's the cinema standard cropped to a 16:9 aspect ratio.)

[1] http://en.wikipedia.org/wiki/Digital_cinema

[2] http://en.wikipedia.org/wiki/Digital_Cinema_Initiatives#Imag...


Dell might agree. 4K does not appear anywhere on that webpage.


AFAICS it doesn't meet all aspects of the TV world's 4K spec, specifically the new colour regime. (99% of https://en.wikipedia.org/wiki/Adobe_RGB_color_space has to be a smaller colour space than https://en.wikipedia.org/wiki/Rec._2020 surely.)


The article you linked on Rec. 2020 says:

> In coverage of the CIE 1931 color space the Rec. 2020 color space covers 75.8%, the digital cinema reference projector color space covers 53.6%, the Adobe RGB color space covers 52.1%, and the Rec. 709 color space covers 35.9%.

So yes, Adobe RGB covers less of the reference color space than Rec. 2020 (4k/UHD), but more than Rec. 709 (HDTV).


No display of any kind meets the Rec. 2020 gamut spec. It's more a future proofing step than a realistic target. The fact that the primaries are sat right on the single-wavelength line indicates an eye towards future laser projectors though - the gory details of how the decisions were made start at page 27 of http://www.itu.int/dms_pub/itu-r/opb/rep/R-REP-BT.2246-2-201...


Well they do mention it indirectly

Ultra HD 3840 x 2160 packs in four times the resolution of Full HD


4K means ~4000 pixels wide, not "four times HD".

Full HD has a defined pixel size, and they're simply saying that this screen has twice the resolution in both directions.


Ultra High Definition or UHD includes 4K UHD and 8K UHD.

4K UHD is 2160p which is 3840 pixels wide by 2160 pixels tall, which turns out to be four times as many pixels as 1920 × 1080


which is quite a long way from actually using the term "4K"


Anyone trying to sell monitors won't use the term 4K.

Dell is smart and won't ever use 4K while selling monitors; that's why they said four times HD instead of 4K.

Which makes sense, because 4K resolution exists in the fields of digital cinema, digital cinematography and digital television.


So far the only explanation I've seen is the number of syllables. "Ten-eighty pee" and "seven-twenty pee" are relatively easy to say on the radio, TV, and sales floor, but "twenty-one-sixty pee" starts to get unwieldy. Note that 2K (i.e. 1080p) has been used as a definition in the past [1], just very rarely, so it really is just the marketing of it that's changing.

[1] https://en.wikipedia.org/w/index.php?title=2K_resolution&act...


I have no idea.

I would prefer "2X", which Apple calls retina. We could also call it 4X with respect to area, but 2X would be the scaling factor for applications to hit.


I find it kind of annoying we still use diagonal dimension to describe the size, particularly when the monitors come in different aspect ratios.


I have one of those IBM T221s, which do 3840x2400 (16:10) on 22.2" (204dpi). See https://en.wikipedia.org/wiki/IBM_T220/T221_LCD_monitors for details.

These monitors are old, they have a fan, they require 4x DVI to achieve this resolution and they are limited to 41Hz screen refreshes (not flickering though). Still, once you get used to truly sharp fonts, you don't want to go back. Can't wait for more of these new monitors to arrive in different sizes, with 60Hz refresh (preferably also offering 1920x1080@120Hz) and with HDMI 2.0. I believe 27" is the perfect size for a single desktop monitor.
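For anyone comparing panels, the pixel density figures quoted in this thread all come from the same formula; a quick helper (the third line is just a typical 30" 2560x1600 for contrast):

    import math

    def ppi(width_px, height_px, diagonal_in):
        """Pixels per inch: diagonal pixel count divided by diagonal size."""
        return math.hypot(width_px, height_px) / diagonal_in

    print(f"{ppi(3840, 2400, 22.2):.1f}")   # ~204.0 -> IBM T221
    print(f"{ppi(3840, 2160, 24.0):.1f}")   # ~183.6 -> this Dell UP2414Q
    print(f"{ppi(2560, 1600, 30.0):.1f}")   # ~100.6 -> a typical 30" panel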


Glad to see the pixel count finally increasing. I remember having a moderately-high-quality CRT when I was in high school that easily did 1600x1200. It was quite a while before LCDs did anything like that. Then resolution went backwards as companies realized they could sell you a 100" monitor with a resolution of 1920x1 ("super wide screen true HD!!111!") for $20. And there we've been stuck for like 5 years, with monitors getting wider every day but with the vertical resolution decreasing.

I think I'm going to get the 32" 4k variant and replace my TV with it. This is especially relevant since shows are starting to be released in 4k resolution, though I worry that 32" is kind of big for computer work and kind of small for TV. Sigh. Small apartments :(


You will love the 32" monitor especially with such high resolution. Finally one have enough pixel real estate for the monitor to serve as a desktop. Now you can have your documents arranged on that desktop without having to maximize their windows to do anything useful.

I love my 2560x1600 30" and will certainly buy a 32" with a higher resolution. However, that 30" is just big enough for two A4 documents side by side. So the higher resolution is very welcome.


It would have been nice if this monitor was 16:10 and not 16:9.


Actually I still miss 4:3.


I personally love my 1280x1024 desktop LCD screen. I am going to hate it when that thing dies.


Same here ... who are 24" 16:9 displays for? I work with my display, not gaming or movie viewing.


...and besides, any self-respecting movie buff knows that all the best movies are shot in anamorphic widescreen, with no less than a 2.35:1 aspect ratio.


They're starting to get closer to that ratio now: http://www.dell.com/ed/business/p/dell-u2913wm/pd


Is it really that big of a difference?


16:9 has won, and will continue to be the standard for some time.

Get the Canon 4096x2560 or the Chromebook Pixel if you need something closer to square.


"In November 2013, Canon entered a new market: high-resolution display monitors. The new 4K Reference Monitor, the DP-V3010, a 30-inch 4K IPS LCD flat panel display, is priced at $40,000 and will be available in the first quarter of 2014."

The Chromebook Pixel isn't even a monitor.

Your advice is not at all helpful.


I'm still a bit annoyed at how the 16:9 content consumer has kept displays stuck at 1080p for many years now. But I can't say I don't benefit from those economies of scale. So I'll just have to wait politely for prices to come down.


Price from Dell USA: $1299. Price from Dell UK: £1270. sigh


...and you get free health care. I wouldn't complain.


He doesn't get free healthcare, he gets socialized healthcare that he pays for with taxes. It's not a gift out of magic, it's a political and economic choice.


The US and the UK do actually spend the same percentage of GDP in government taxation on healthcare. It's just that for the same amount of money Americans get treatment for the poor and the elderly, whereas the British get universal coverage. So, relative to the US, the UK coverage of the wider population is actually free. There is no greater sacrifice, or higher level of taxation to pay for the extra coverage.

References are quite easy to find, but here's one, from the Telegraph, quoting data from the OECD:

http://blogs.telegraph.co.uk/finance/edmundconway/100006775/...


That depends on whether or not he's paying taxes ;)


If he's not paying the tax then his parents either are or did. Same here in Australia, it's just a part of the basic standard we are prepared to be taxed for.


I love the free healthcare, which is paid for by our higher taxes, which I am very happy to pay because of the benefits I (and, more importantly, other people worse off than me) get as a result.

Higher taxes in the UK do not, however, explain why the same product is 1.6x more expensive here. Part of the difference will be accounted for by VAT (20%) and import duty (14% for this product, I think), but nowhere near all.

Incidentally, the US has import duty too. (But I think it's generally lower than for the UK.)


Haha what?


Taxes and regulations.


That causes a 60% price increase? That's hard to believe.

(1 gbp = 1.64 usd according to google)


You're forgetting VAT (20% in GB) is included in the price in GB/EU.


I didn't - so I wouldn't be surprised by a 20-30% difference. This is double that.


Import taxes are huge; protectionism at its finest.


Also simply that the UK is a smaller market with less competition, and UK consumers are largely used to being charged at these levels (I lived in the UK in the early '90s, and the "$ = £ for imported tech goods" equation was exactly the same then). Importers are going to charge as much as they can until people stop buying, and apparently, people aren't.

I think this is particularly the case for specialized niche equipment, as it tends to be bought only by those who really need/want it, so demand tends to be relatively price-inelastic...


1474€ for Germany (VAT included here). That's not actually expensive, it's just more than I paid for my Korean 1440p displays.


And it's actually sold for around 1100 €: http://geizhals.de/dell-ultrasharp-up2414q-a1039346.html


1270 / 1.2 = 1058 (removing VAT)
1058 British pounds = ~1700 USD

Doesn't seem that bad, or any worse than usual.
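The arithmetic behind that comparison, using the VAT rate and exchange rate quoted above (a rough sketch that ignores import duty):

    gbp_inc_vat = 1270.0
    gbp_ex_vat = gbp_inc_vat / 1.20          # ~1058 GBP without VAT
    usd_equiv = gbp_ex_vat * 1.64            # ~1736 USD at the quoted rate
    us_price = 1299.0
    print(f"UK price ex-VAT: ~${usd_equiv:.0f}, "
          f"about {usd_equiv / us_price - 1:.0%} above the US price")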


Potentially dumb question, but will this work with a MacBook? (Like with some sort of thunderbolt adapter )


Based on this[1] discussion, it seems like Thunderbolt 2 equipped Macs have DP1.2 and should be able to drive it at 60hz. It seems like someone with an Asus 4K screen had trouble with this in OS X (Win 8 worked).

TB1/Display Port equipped Macs will be limited to 30hz.

[1] https://discussions.apple.com/thread/5475430?tstart=0


The most recent retina macbook pros have serious driver problems with 4k monitors. Yes, after the December update too.

Here's my Amazon review of a Seiki 4k display: http://www.amazon.com/review/R4OJ5D5RPILCH


Only the latest MacBooks with Haswell chips support 4k over Thunderbolt 2


Do you know if they will run the screen in a scaled mode? (i.e., 1920x1080 logical pixels, while using retina scaled assets)


Yes they should be able to. OS X can already do that on regular screens.


This monitor doesn't support scaling. If you push 1080p, it will not fill the screen.


That's not what HiDPI mode does; rather than rely on the (uniformly crappy) on-board scaling logic on the display, the computer sends a full resolution picture that it has scaled already. And there's no reason to think that this won't be available with this display.
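In other words, a "looks like 1920x1080" HiDPI mode never asks the monitor to scale anything. Roughly (this is just the bookkeeping, not any OS's actual API):

    # The OS lays out the UI on a 1920x1080 point grid, renders it at 2x into a
    # full-resolution framebuffer, and sends that native-resolution image to the
    # display, so the panel's missing scaler never comes into play.
    logical = (1920, 1080)      # what applications see, in points
    scale = 2                   # backing scale factor ("retina" / HiDPI)
    framebuffer = (logical[0] * scale, logical[1] * scale)
    print(framebuffer)          # (3840, 2160) -> matches the panel exactly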


I want to believe!


The Late 2013 Haswell Retina MacBook Pro could support two, maybe three, 4K external displays, I think. If Windows 8 works at 4K with 60Hz, just wait for a new OS X 10.9 update; it should bring 4K at 60Hz support. Over an HDMI cable, it only supports 4K at 30Hz.


As an additional question: Are there any Macs with support for two such monitors?


The Mac Pro.


Thanks!


I'm almost more excited about 30-bit color than the 4K res. No more ugly banding in gradients!!!


10 bits per channel is nothing new, but requires specific OpenGL application + driver + video card support, so you don't get access to it in standard photo viewing/editing apps.

Are you sure the banding you are seeing isn't just because, like most users, you have a 6 bit panel?
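To put numbers on the banding question, a quick sketch (assuming a full-width, 0-to-1 gray ramp on this 3840-pixel-wide panel):

    # A gradient bands when it spans more pixels than the channel has levels:
    # neighbouring pixels quantise to the same value and form visible stripes.
    width_px = 3840
    for bits in (6, 8, 10):
        levels = 2 ** bits
        print(f"{bits:2d}-bit: {levels:4d} levels -> ~{width_px / levels:.1f} px wide bands")
    #  6-bit:   64 levels -> 60.0 px bands (clearly visible)
    #  8-bit:  256 levels -> 15.0 px bands
    # 10-bit: 1024 levels -> ~3.8 px bands (usually invisible)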


Indeed.

However, many "prosumer" monitors have supported greater than 24-bit color for a while now. Still, my nVidia drivers don't allow me to select anything higher than 24-bit.


Is the lack of hardware scaling an attempt to keep the costs down or to 'encourage' people to upgrade their PCs?

It's a beautiful piece of kit. I'm guessing any hardware scaling would seriously impact the image quality, and this type of screen would show up any imperfection.


Hardware scaling on this monitor would actually be better than most 1080p screens, since you have more pixels to approximate the original image with. If your target resolution was exactly half the pixels on vertical and horizontal, it would be "perfect" (albeit with pixels four times as large as normal).
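A sketch of that "perfect" case, i.e. plain pixel doubling (nearest-neighbour, no interpolation), using NumPy for illustration:

    import numpy as np

    def upscale_2x(frame):
        """Duplicate each pixel of an (H, W, C) frame into a 2x2 block."""
        return frame.repeat(2, axis=0).repeat(2, axis=1)

    src = np.random.rand(1080, 1920, 3)     # a 1080p frame
    out = upscale_2x(src)
    print(out.shape)                        # (2160, 3840, 3) -> fills the UHD panel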


Help! I don't understand much about such monitors.

E.g., currently my computer has a dedicated video adapter card and is sending 1024 x 768 pixels to a CRT with a 15.5" diagonal. It appears from Windows > Start > Control Panel > Display > Settings that the most pixels is 1280 x 960.

For me, 1024 x 768 pixels are plenty, but on my screen with diagonal only 15.5", a lot of text is too small to read, even if I used a magnifying glass.

So, I'd like a bigger screen, but still with only about 1024 x 768 pixels.

That a screen can display more pixels ranges, for me, from useless down to a concern: My guess is that, now that CRTs are out, the current solid state displays have some actual physical pixels, independent of the signal from the computer, and that when the computer sends, say, 1024 pixels on a line somehow the display (or video card?) has to fit the 1024 pixels across the 1200 or however many pixels the screen has. Sounds like a good opportunity for some moire effects.

Also, my old video card may be sending a signal that newer monitors can't use.

So:

Q 1. What should I do to display the 1024 x 768 pixels I want but on a screen with a diagonal longer than 15.5"?

Q 2. What about moire effects or other issues from displaying, say, 1024 pixels from my video card on a screen with more than 1024 pixels, say, 1200 or many more?

Q 3. How can I use the signal from my old video card, or do I need to get a new video card?

Thanks!


Q 1. The problem is, the DPI you are using is 82.6, which is already lower than any modern display I know of. You could try to find a 21 inch CRT [~19.5 inch viewable] (should be cheap or free). Another option might be to buy a second-hand 19 inch LCD running at 1024x768. Another option would be to get a bigger LCD, run it at native resolution and use the font size/DPI settings in Windows to make the text bigger.

Q 2. I don't think the issues to do with non-1:1 scaling are that bad if you have poor eyesight anyway.

Q 3. I don't think there is enough information to answer this, though even graphics cards made 10 years ago should support more than 1280x960. Though often I find windows isn't very good at giving you those options. Most likely 1280x960 is the limit of the CRT you have already connected.


> For me, 1024 x 768 pixels are plenty, but on my screen with diagonal only 15.5", a lot of text is too small to read, even if I used a magnifying glass.

> So, I'd like a bigger screen, but still with only about 1024 x 768 pixels.

Why? You can have bigger text without having bigger pixels (and, at any given size, readability should be better with more pixels because the lines making up the characters will have better definition.)

Get a monitor the size you want, with as many pixels as you can (given your budget), use all of them, and use the text scaling setting in your OS to get text the size you want. Don't hobble yourself with a low-resolution device (or low-resolution setting on a high-resolution device) to get bigger text, that's the worst possible way to do that.


If the characters of the alphabet are larger on my screen, then the maximum number of characters per line is lower which means that at a lot of Web sites I can't read a full line of text without horizontal scrolling. In that case, commonly I copy the text to the system clipboard, pull it into my favorite text editor, 'reflow' the lines, and then read them. Bummer.

But a larger screen with each character taking the same amount of area on the screen as now would make the maximum number of characters per line larger letting me see whole lines more often without scrolling and/or let me have each character take up more screen area without scrolling.

E.g., here at HN I'm seeing 90 characters per line which is way too many. Why HN wants such long lines I don't know. What does HN want, screens 10 feet wide? Traditional typing at 10 characters per inch horizontally on 8 1/2 x 11" paper with 1" margins gives maximum line lengths of 6 1/2" or 65 characters per line. Newspapers commonly had many fewer characters per line. 90 characters per line is just wacko in several respects -- there's no good reason for it.

On my Windows XP SP3 system (yes, I recently got a DVD of Windows 7 Professional 64 bit, for my next computer when I finally give up on XP), it appears that nearly all the text from Microsoft's programs, e.g., essentially everything that comes from the tree rooted at Start > Control Panel and Start > My Computer, was designed for a screen with diagonal about my 15.5" but with 640 x 480 pixels. Using the Microsoft software to set the screen resolution down to 640 x 480 makes the old Microsoft software much easier to read but, then, commonly has too few characters per screen to display all the screen contents -- e.g., the screen from Start gives a message that some content could not be displayed.

So, keeping my present 1024 x 768 pixels per screen but having a screen with diagonal larger than 15.5" should basically just magnify what is shown on the screen, that is, each character would take up more screen area and be larger and easier to read. But a screen with a diagonal larger than 15.5" and, say, 1280 x 960 pixels or more, say, 2048 x 1536, would, unless I got a huge screen, say, diagonal 31", make each character on the screen take up less screen area and, thus, be still more difficult to read.

Yes, more pixels per character would give nicer looking characters, but characters with just 10 x 18 pixels are easy enough to read if they take up enough screen area. That is, for reading text, I don't really need more pixels per character, as nice as that could be.

E.g., in my Web site development, commonly I am specifying font sizes in terms of pixels, e.g.,

     font-size:  25px;
     line-height: 30px;
Then, with such a Web page, many more pixels per screen would make each character take up less screen area and, thus, be still more difficult to read. Yes, when using a Web browser I can ask for it to magnify the whole image and commonly do. But other software need not have such magnification, say, nearly all the software from Microsoft in the tree rooted at Start, so that more pixels per screen could make the text smaller and still more difficult to read.

Net, I'd be happy with just my present 1024 x 768 pixels but displayed on a larger sheet of glass. Or, a larger sheet of glass with, say, 4 times as many pixels, could be more difficult to read instead of easier. Basically all I need is just a magnifying glass none of my computer hard/software knows about.


> On my Windows XP SP3 system (yes, I recently got a DVD of Windows 7 Professional 64 bit, for my next computer when I finally give up on XP), it appears that nearly all the text from Microsoft's programs, e.g., essentially everything that comes from the tree rooted at Start > Control Panel and Start > My Computer, was designed for a screen with diagonal about my 15.5" but with 640 x 480 pixels.

IIRC -- and it's been a while since I used the display settings on XP since all my regularly-used Windows systems are on either 7 or 8.1 now -- you can change this by adjusting the DPI setting for your display in the control panel in WinXP. Win7 brings scaling more to the front and actually makes it the primary setting on the Display control panel.


Yes, with the suggestions from you expert guys here, I went to

Start > Control Panel > Display > Settings > Advanced > DPI Setting > Custom

which looks nice. So, converting their 100% to 150% gives characters that are much easier to read. I don't know just what software would honor such a setting. And to get the setting I got a message that I had to install the fonts and reboot Windows. Gee, once my HP laser printer died and I got a Brother printer, the Brother was not fully compatible with the old HP/GL commands, so for one of my favorite DIY printer utilities, e.g., just to do a nice job printing flat ASCII files, I wrote some Windows GPI code -- a real pain. If the font changes from reinstalling fonts would change the spacing in that program, then I'd have to revise that program -- bummer.

It looks like 'scaling' has been work in progress for the industry for a long time. The easy approach to scaling for me is just a larger piece of glass with the same number of pixels, filling the glass, and with my hard/software not knowing about the change.


I have a 2560x1440 27" monitor; with a higher resolution than that and a 3" smaller screen, wouldn't the pixels be too small?


Depends on how you use it. If you pull Apple's trick with retina displays (or an equivalent mode in other OSes) then no, to the user it'll appear the same as a 1920x1080 display, but with crisper text. The point is that pixels are supposed to be invisible, so in the theoretical ideal future, everything would be resolution independent so that decreasing pixel size doesn't make the display unreadable.


It's chicken and egg. The software is set up for bigger pixels, especially on Windows. Higher DPI can mean higher fidelity font rendering in particular. But someone needs to come out with these monitors so that developers can adjust their rendering. MS should possibly even consider pixel-doubling apps, with perhaps some hacks to keep fonts hi-res.


The DPI adjustment in Windows has worked since XP, and worked well since Vista, at least for my purposes; in 8 it's a per-monitor setting.


The DPI adjustment doesn't affect everything. For example, from what I've read things like Photoshop don't scale well, as the toolbar buttons remain the same size even if the menu etc. scales up.

Edit: here's some screenshots

https://drive.google.com/folderview?id=0B7re6a9_3U2nOFhZNU1N...

from:

http://www.dpreview.com/forums/post/52771357

Download full resolution and view full-screen on a 24" monitor to approximate the experience.

Don't get me wrong, I think this is an awesome monitor, and my current secondary monitor is on the fritz so I've been researching replacements for the past few days. But even though I have the need, and I can easily afford it, I've not yet made the leap to buy.


In Windows 7 I tried it for a few minutes, but it seemed to screw up scaling in certain controls in most Windows Forms app, which is what our main product was, so I haven't used it since.


Too many Windows applications do drawing while ignoring that setting.


Absolutely.

Still, it has been way too long since there has been any progress in display resolution, so by pushing into "4K" territory it means something like 2560x1440 might just become mainstream.

Win/win: early adopters get a shiny new 4K display for Christmas, and I get 2560x1440 for less :)


If the applications are designed for the resolution, it would mean a very good PC computing experience where you couldn't discern the pixels at all. I'm pretty excited that this exists, I would even buy one at this price if I was in the states :p.


183 DPI is "too small" for native viewing, but it'll make prettier pictures and more readable text.


I have that very same setup, Dell u2711, and I already find pixels too small. Take hacker news frontpage for instance. Un-freaking-readable.


It's the letters and text you find too small, not the pixels. Turn up your UI scaling.


If you're shopping for a 4K display to do professional video editing, I suspect this Sharp 32" might be preferable: http://store.apple.com/us/product/HD971LL/A/sharp-32-pn-k321...


I'm excited to see Dell trying to break new ground in the high-end. The race to the bottom of the pit of cheap plastic crap is tired and boring, and Apple is really starting to kick the crap out of the competition in the mid-range after having thoroughly dominated the high-end.


Isn't there going to be a 27" 4K from Dell? Rumors were saying so, IIRC.


http://www.legitreviews.com/dell-announces-new-32-inch-24-in...

Dell announced the one discussed here ($1399) and a 32" version ($3499). A 28" consumer model for under $1000 is to be available in early 2014. All "Ultra HD", which is 3840 x 2160.


I think there's another dell 4k screen due 1st quarter next year.


24" with 3840x2160 should work perfect in HiDPI mode.

Although I'm waiting for an OS X Mavericks Update to support 4k at 60Hz via Thunderbolt 2 (i.e. Display Port 1.2).


Shame the design looks so cheap and ugly. If I'm paying $1,299.99 for a display I expect more than this.

Why is Apple the only company producing beautiful computers still?


It's a Dell, why are you surprised? All the Dell products I've seen have the same amount of design sense as the Windows XP Fisher-Price UI. Dell is a company that assembles computers. If you want fine architecture, you don't get a prefab house.

Sadly, none of the major computer manufacturers besides Apple seem to have any sense of design. The old ThinkPads (when they were still IBM) are the nearest competitor; at least they were solid, square, and black looking. Stylish, not so much, but at least they looked professional. (I guess Microsoft gave it a good try with the surface; the hardware at least looks kind of cool)


I disagree. I have an old IBM ThinkPad from the era when IBM still made PCs. Dell is pretty close.

Their laptops are on the shitty Toshiba-ish side of things, granted (although higher quality build). But their monitors and keyboards are top notch. Simple, black, non-glossy bezel. USB ports on the side. Swivel, portrait, and height base adjustments. Their keyboards remain consistent with the 1980s standard PC arrow/home/numpad configuration and not that weird Microsoft/Logitech shit of putting tiny arrow keys in random locations, or F-lock.


I'd guess Dell simply doesn't care, and doesn't have to, since most of its revenue isn't related to looks at all. Maybe they look at it this way: it's a screen, so 99.9999% of the time you'll be looking at the screen, and not at the frame around it. So we don't spend any time on designing a nice looking frame (besides, nice looking is highly subjective) and don't let customers pay even more for the product because of that.


I would have to disagree. Displays have shifted to black bezels to facilitate the disappearance of them. When looking at a bright object, darker colors fade away, helping shift focus to the center (the screen) and reduce distraction. Giving a high-end screen a silver bezel illustrates how little they care about the end user experience. It ultimately is distracting. What is even worse in this case is that the case is reminiscent of the cheap monitors they've been selling to businesses for years.


I am surprised you cannot get a choice of snap-on bezels; Dell have done this for the backs of notebooks on and off for years.

I doubt there would be a lot of interest in lurid colours. However, 'stealth black', 'piano black', 'paper white' and 'cubicle grey' might increase the USP and appeal to some customers. The white might make sense to those doing desktop publishing, the stealth black might appeal to those doing video in a darkened, distraction free environment, the grey might appeal to Apple users and the shiny black could be the default option.

In corporate and customer facing environments a muted bright colour that happens to be consistent with branding might make sense so logo-free bezels in red, green and blue could sell too.


Although the Apple commentary put this in troll territory, I'd have to otherwise agree. The monitor definitely doesn't say 'high end' the way the Apple or other expensive monitors in the market do. They really couldn't give it slicker design that doesn't match their $299 monitors? http://accessories.us.dell.com/sna/category.aspx?c=us&l=en&s...


If you look carefully you'll notice that the only difference between the $299 monitors and the Ultra HD monitor in term of design is a thinner bezel. Maybe it would have increased the price much more for virtually nothing to have a thinner bezel on their new screen. Also, they don't have any competitor for this screen, so they don't need to distinguish themselves from the competition like they do with the "borderless" 24" screen.

I personally don't find them cheap looking and I'd rather have a "plastic" looking bezel with a matte screen over a sleek looking screen with an all-glass shiny front.


I agree. Dell monitors' matte screens are really nice.


I'm not gonna pay for this but it feels good to know in 2 years my office will sport these sharp monitors


Happy and sticking with my 27" LG IPS for 1/5th the price. Honestly my vision is too poor for 4k 24" screens.


That's not 4k. It's like sticking HD on phones with 720p.


Dell never actually uses the term "4k" on the page, it's just in the title.


What are you talking about? It says "Ultra HD" in the description, which is the marketing name for 4K


http://en.wikipedia.org/wiki/Ultra_high_definition_televisio...

    "Ultra HD", would be used for displays that have an aspect ratio of at least 
    16:9 and at least one digital input capable of carrying and presenting native 
    video at a minimum resolution of 3,840 × 2,160 pixels.
This display fits those criteria.


I know. Was positively surprised actually, but that made the title even more puzzling.



