Dell Wasn't Joking About That 28-Inch Sub-$1000 4K Monitor; It's Only $699 (forbes.com/sites/jasonevangelho)
335 points by pugz on Jan 8, 2014 | 231 comments



Excellent. Again, a big thank you to Seiki for dropping a bomb on the display industry, whether or not Dell acknowledges it.

3840x2160 at 28 inches is quite fantastic. A step closer to 10,000+ horizontal pixels in a single ~50 inch display, which I consider an ideal for desktop computing.

The burning question for me right now is who will sell me a GPU with three DisplayPorts to drive three of these? The top-end nVidia cards provide only a single DisplayPort [1]. I don't particularly care about 3D performance at this resolution—at least not yet—I just want three 3840x2160 capable ports from a single PCI Express slot. Ideally with a GTX 650 style short-length form-factor [2]. Again, 3D is of secondary concern to me, and the GTX 650 can already power 1x 3840x2160 (via HDMI 1.4) and 2x 2560x1600 (via DVI) without breaking a sweat—some of my colleagues and I are doing that presently. A 3x DisplayPort card for predominantly 2D productivity work is not unreasonable.

[1] http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-780...

[2] http://www.geforce.com/hardware/desktop-gpus/geforce-gtx-650...


If you're happy with the HDMI port driving your 4K monitor (which means you're running at 30Hz), it seems that a GTX 660 or better will work. (AMD cards have similar port layouts)

DP: 4K @ 60Hz

HDMI: 4K @ 30Hz

2xDVI: use dual-link DVI->HDMI adapters (which are expensive, granted)

If your monitor supports being driven by 2 HDMI cables, you can use 2 of the HDMI ports to drive one of the monitors at 60Hz.

Of course, I haven't actually tried this.


I am presently happy with 30 Hz—very happy in fact because the resolution is so spectacular. However, if I owned three of these Dell monitors, I would definitely want them to all function at 60 Hz. And I don't believe the dual-link DVI ports on the nVidia cards will do greater than 2560x1600.

I think the only way to drive three of these Dells at 60 Hz would be a 3x DisplayPort GPU. nVidia and AMD, are you listening? Given the wide spectrum of GPU options on offer, especially from nVidia, why is this not a thing?


It's probably worth testing if you can borrow a dual-link DVI->HDMI adapter, but I would be surprised if it didn't work. Those adapters don't actually do a full conversion; they work by telling the video card to send HDMI signals out of the DVI pins. It's obviously physically possible: 4K@30Hz needs roughly the same raw pixel rate as 2560x1600@60Hz.
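A back-of-the-envelope check of that comparison, counting raw pixels only (a rough sketch; actual link bandwidth also depends on blanking intervals and encoding overhead):

  # Raw pixel rates, ignoring blanking and encoding overhead
  def pixel_rate(width, height, refresh_hz):
      return width * height * refresh_hz

  uhd_30 = pixel_rate(3840, 2160, 30)     # 248,832,000 px/s
  wqxga_60 = pixel_rate(2560, 1600, 60)   # 245,760,000 px/s
  print(uhd_30 / wqxga_60)                # 1.0125 -- essentially the same pixel rate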

nVidia is supposed to release its next generation of cards (AKA Maxwell) in the spring. They will likely support HDMI 2.0, which will support 4K@60Hz. I'm really surprised that the new AMD cards that came out this fall don't support HDMI 2.0.


Apparently AMD makes one at least, https://news.ycombinator.com/item?id=7024581

And so does NVIDIA; I'd never heard of their NVS line though.

http://www.newegg.com/Product/Product.aspx?gclid=CPr5wO6T77s...


+1 for the idea of the single 50-inch 16K display. I would also love something like that. Running a single 40-inch 4K display should be possible quite soon and would give me roughly the same DPI as my current 2x 1920x1200 24-inch setup.

But since I'm a coder, separating windows across different monitors might be better than using one super large display with windows cluttered everywhere. Still, I'd be willing to try.

What do you use your displays for? A display like that should be epic for video/photo work.


> separating windows in different monitors might be better than using one super large display with windows cluttered everywhere

There are lots of desktop environments that will provide you with nice window tiling, so you can simulate any monitor split you want. Even without full tiling, many environments let you use a halfscreen-left / halfscreen-right placement.


> who will sell me a GPU with three DisplayPorts to drive three of these?

Visiontek makes a couple of Eyefinity cards with 6 mini-DPs. I didn't see the measurements, but it doesn't look too long.

http://www.newegg.com/Product/Product.aspx?Item=N82E16814129...


I so want to believe you're right, but I don't see anything on that page that suggests those DisplayPorts can drive 3840x2160. The footnotes imply, but do not say specifically, that 2560x1600 is the maximum resolution supported by each port. Am I wrong?


I'm not sure what resolutions it supports. Looking at it again it seems like a 2560x1600 max is a safe assumption. Looks like you're stuck with full-length, double width cards for now: http://www.newegg.com/Product/Product.aspx?Item=N82E16814129...


Awesome! That fits the bill just fine since it's still just right for the board in my PC, which has a single PCI Express x16 slot. Thank you for pointing me to this!


All the 7000 series Radeons and many of the earlier ones support DisplayPort 1.2, which gives them enough bandwidth per connection to handle 4k, and the total pixel count of 3 4k displays is only slightly higher than 6x2560x1600. The 7000 series Radeons increased the maximum dimension of an Eyefinity array from 8k to 16k, so 3 4k monitors in any rectangular layout should work.
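For the curious, the totals behind that comparison (plain arithmetic, nothing vendor-specific):

  # Total pixel counts: three UHD panels vs. six 2560x1600 panels
  three_uhd = 3 * 3840 * 2160     # 24,883,200 pixels
  six_wqxga = 6 * 2560 * 1600     # 24,576,000 pixels
  print(three_uhd / six_wqxga)    # 1.0125 -- only about 1% more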


The AMD W600 [1] has six mDP ports that each support 4K. Also it's half-length and single-width.

[1] http://www.amd.com/us/products/workstation/graphics/firepro-...


Quoting from http://www.displayport.org/embedded-systems/microsoft-pushes... :

Microsoft Senior Program Manager Gavin Gear created his newest setup with a pixel rate that is truly jaw-dropping – 1.5 billion pixels per second (yes, that’s a billion with a “B”). The overall 12K x 2K resolution (across the three screens) on the PC gaming rig was built by three Sharp PN-KN321 4K Ultra HD displays connected with DisplayPort 1.2 to register the best refresh rates enabled through Multi-Stream Transport. Gear also took advantage of ASUS’ HD 7970 DirectCUII’s four DisplayPort outputs.


You're looking at top-end desktop GeForce cards, but in this case you probably want top-end workstation nVidia cards: http://www.nvidia.com/object/quadro-desktop-gpus.html

Quadro K6000 for example has 2xDP + 2xDVI available. It's probably not something you want to buy for home though.


Mac Pro supports three 4k monitors.


Whatever Apple is putting in the Mac Pro I think will do it.


You can't buy those off the shelf. The cards used in the Mac Pro are a cross between AMD Radeon and FirePro cards. You can step up to "full" FirePro cards, and get 2, 4 or 6 display ports, depending on model.


Something to keep in mind is that this is effectively a Windows monitor, not a Mac one.

The reason for this is PPI: Most Apple displays are in the 100-110 PPI range, with Retina Display Macs doubling that to 220 PPI.

At 28", a 3840 x 2160 panel has a PPI of 157, which sits right between Retina and non-Retina densities. This means that on a Mac, you’ll have to use it one of two ways: Either at 1x, where the higher PPI means everything will be much smaller than it is on a normal monitor, or at 2x, where the lower PPI means everything will be much bigger than normal.

Windows doesn't have anything nearly as neat and simple as OS X's 2x mode, but it's had a rougher 150% scaling mode for years, which will probably look just about right on this screen.

The best 4K monitor for Macs will be a 24”, which will have a PPI of 184, just about right for something sitting a bit further back from the viewer than a 220 PPI Retina Macbook Pro display.
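The PPI figures above fall straight out of the panel diagonal; a quick sketch of the arithmetic (the sizes are the ones discussed in this thread):

  import math

  def ppi(width_px, height_px, diagonal_in):
      # pixels along the diagonal divided by the diagonal length in inches
      return math.hypot(width_px, height_px) / diagonal_in

  print(round(ppi(3840, 2160, 28)))    # 157 for this Dell
  print(round(ppi(3840, 2160, 24)))    # 184 for a 24" 4K panel
  print(round(ppi(2880, 1800, 15.4)))  # 221, i.e. the ~220 PPI Retina MacBook Pro ballpark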


> This means that on a Mac, you’ll have to use it one of two ways: Either at 1x, ... or at 2x

Huh. Are you maybe thinking of iOS, or have you confirmed this with a high-PPI external monitor on a Mac? Because it's not true of the built-in screen for retina MBPs. I typically use the 1.5x resolution (1920x1200) or 12/7x resolution (1680x1050), which are both supported by the built-in system prefs, and both look perfectly sharp as far as I can tell. Other resolutions are available via 3rd-party hacks and seem to work fine.

The sharpness is supposedly because things are rendered at double resolution and then scaled down, which required some clever GPU work; whatever it is, it works for me. I'm prepared to believe that things don't work as well for third-party external monitors, though. Or maybe we're not ready yet to handle 60% more pixels, when the rMBP is already pushing performance limits?


I think mortenjorck is talking about HiDPI mode, which doesn't provide as many options as retina mode: http://apple.stackexchange.com/questions/107846/how-to-enabl...


Ah hah! I actually think that HiDPI mode and retina mode are the same thing, but the issue you're getting at is that OS X doesn't yet have drivers for external 4K monitors that support HiDPI mode.[1]

Here's what's going on: on a retina MBP, the screen basically always runs in HiDPI mode. That means everything is rendered twice as large as it should be, for a screen twice as large as the virtual resolution you want, and then scaled down to native resolution. Icons that would be 32px are 64px, fonts that would be 12pt are 24pt, etc., and then they're shrunk down to their proper size.

This is really cool because it lets us run at _any_ resolution between 1x and 2x with minimal loss of quality, even though apps are only coded to run at one of the two extremes.

So, say I have a 2880x1800px screen on my rMBP, but I want a virtual extra-sharp 1920x1200px screen. I render the screen twice the size I want it (3840x2400) in HiDPI mode where everything is twice as large as it should be, and then I scale it _down_ in the graphics card to my native resolution of 2880x1800. This technique allows me to pick _any_ resolution between 1x (2880x1800) and 2x (1440x900) and have it look pretty good, because I'm only scaling down to my native resolution rather than up. Some stops along the way are nicer than others, and 1440x900 looks the best because it requires no scaling at all, but I have lots of options along the way that look good.

The first thing you need for this to work is, obviously, for your apps to know how to render at 1x or 2x. 95% of Mac apps seem to be there by now. (The ones that aren't fixed still work fine -- they just look obviously low-resolution because they've been scaled up and back down.)

Then you need your graphics card to be capable of rendering a screen with twice the virtual resolution you want, and scaling it down to your native resolution in realtime. That's a serious amount of processing, and apparently it's not supported so well yet for external 4K monitors. But importantly, it's something that only has to be solved at the driver level -- the OS and apps are already there.
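Here's a minimal sketch of that backing-store arithmetic (my own illustration of the approach described above, not Apple's actual code; the function name is made up):

  # Scaled HiDPI as described: render a backing store at 2x the chosen
  # virtual resolution, then downscale it to the panel's native resolution.
  def hidpi_plan(virtual, native):
      backing = (virtual[0] * 2, virtual[1] * 2)
      downscale = native[0] / backing[0]   # horizontal factor; vertical is the same when aspect ratios match
      return backing, downscale

  # rMBP example from above: a virtual 1920x1200 on a 2880x1800 panel
  print(hidpi_plan((1920, 1200), (2880, 1800)))  # ((3840, 2400), 0.75)
  # The no-scaling case: virtual 1440x900 -> backing 2880x1800 exactly
  print(hidpi_plan((1440, 900), (2880, 1800)))   # ((2880, 1800), 1.0)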

[1] See http://www.anandtech.com/show/7603/mac-pro-review-late-2013/... and search for "HiDPI."


Yep. It would be great if Apple added full Retina scaling support to the next version of OS X, but I'd imagine the hardware might start to creak scaling 5120 x 2880 to 3840 x 2160.


Can someone explain why this isn't just a continuous dial?

That is, why are there discrete "steps" in monitor scaling for these elements that you are talking about? I would think that if I have 157 PPI or 220 PPI or 400 PPI that I would just adjust the dial to fit the element size I want.

What's up?


You can't just take a rasterized image and scale it up or down; the result would look horrible. You need to scale things while they're rendered, and as long as you only do vector graphics like drawing lines and rectangles, that's fine. However, real-world UIs traditionally use lots of bitmap graphics, and those, again, simply can't be reasonably scaled. And PPI scaling even the vector graphics stuff requires major changes to the software and graphics stacks involved, given that historically everybody has simply thought in terms of pixels, not the angular size of screen elements, which is what's actually relevant to the end user.


"You need to scale things while they're rendered, and as long as you only do vector graphics like drawing lines and rectangles, that's fine."

Yes, I'm with you so far ... my confusion is, I thought that is what the "ready for 4k blah blah" improvements in iOS and Mavericks, etc., were.

I thought that was the whole problem they solved, and I am surprised to see it is not. I can see various apps being broken in this way, but the OS elements should all be scalable arbitrarily...


They are. http://arstechnica.com/apple/2007/10/mac-os-x-10-5/10/

This is not a general solution however. Even the new and shiny Final Cut Pro X uses rasters throughout.

There are only three ways to support higher resolution displays:

1. Invest in better tools, like Core UI or PaintCode.

2. Exactly double the density and use @2x resources.

3. Switch to "Flat UI" design language.

For the system UI elements, Apple chose option one. For the 3rd party UI elements, Apple chose option two. For iOS, Apple eventually chose option three.


What about 4: ship the actual Illustrator files used to make those raster resources, and just render+cache them for each size the user actually switches to? (That is, treat them like JITable bytecode.)


Yeah, I would've also thought that they'd have the issue somewhat tackled by now. OTOH, I suppose it doesn't help much for the OS to draw itself correctly if the apps are all wonky.


GTK on Linux uses Cairo and other vector rendering libraries to draw widgets and icons. You can set the font size or resolution as low or high as you like. This has been true on that system since before 2008 AFAIK.


App developers cannot write code that correctly renders at an arbitrary multiplier. It's been tried and it never works. Having 1x mode and 2x mode does work, which is why Apple uses that approach.


Considering that every 3D game qualifies as an app that renders correctly under arbitrary multipliers, that argument seems a little suspect.

The real answer is that there is an industry of designers accustomed to doing per-pixel tuning of their art assets and this work is worthless if the assets are going to be scaled anyway. Selecting integer multipliers (so far: "2") allows those assets to continue to render as designed on retina displays.

Surely there are aesthetic reasons to prefer this in many cases, but it's not nearly as cut and dry a point as some people tend to think. Most recent Linux desktop themes are using vector assets, for example, with quite good results.

Whether the Mac/iOS world is a community of pixel-exact perfectionist artistes or a bunch of dinosaurs who refuse to use modern tools is a flame war for a different thread.


  The real answer is that there is an industry of designers 
  accustomed to doing per-pixel tuning of their art assets and 
  this work is worthless if the assets are going to be scaled 
  anyway.
That's an effect, but not the cause – the roots of the raster-based paradigm of modern GUIs are far more technical. I would point to the tradeoffs between speed and memory limitations in early GUI systems: If you look at the original Macintosh UI, you'll see a system more constrained by rendering speeds than by RAM, and thus a high emphasis on bitmaps versus vector rendering.

Look at this Folklore.org story on the origin of the roundrect in System 1: http://www.folklore.org/StoryView.py?story=Round_Rects_Are_E... It was only through a combination of Jobs' perfectionism and Atkinson's ingenuity that such a simple thing as rounded rectangles could be accomplished on the original Macintosh; can you imagine how long it would have taken to render a Finder window full of vector icons?


And yet Linux machines have been drawing SVG icons and SVG widgets for years. This limitation doesn't exist any more.

Look at Display Postscript, which was created by NeXT in the late 80s and carries through to much of Apple's Quartz rendering system today. Why are we still limiting ourselves to pixel graphics?


Let's be reasonable... 3D games and traditional desktop environments have a very different set of HCI expectations. You aren't often clicking an icon in a 3D game.

While I'd probably prefer to have vector graphics for desktop assets just for the flexibility, hand-tuned scaled images usually look better than their vector-scaled siblings.


> You aren't often clicking an icon in a 3D game.

Maybe I'm misunderstanding what you're saying. In case I'm not, here are some counterexamples.

Check out the first two pictures in this thread (most things on that UI are clickable):

https://forums.station.sony.com/ps2/index.php?threads/leaked...

Clickable chat UI (Planetside 2):

http://img138.imageshack.us/img138/6746/planetside2201302131...

Clickable UI within a 3D world (Doom 3 had a bunch of these):

http://static.gamespot.com/uploads/scale_super/gamespot/imag...


Despite being told for fucking YEARS to use dialog-box scale units (twips) instead of hard-coding things, pretty much everyone on Windows took the "pragmatic" approach. Can't blame Microsoft for not trying, but they gave up around VB.Net.


Wait, what? Windows has got full custom DPI control and has had for years. Under Windows 7 there's a dialog box to set it under the Appearance & Personalisation / Display control panel. You can toggle XP scale mode for compatibility with older apps.

Seems rather more flexible than pixel-doubling - I thought that went out with underpowered games consoles, but apparently not.

Where there is an issue is with icon scaling - since these are just multi-resolution bitmaps, if the person who made them didn't include the larger sizes you're going to have an ugly time at non-standard DPI settings.


Yeah, except app devs screw it up and few things work right. Pixel doubling can be done entirely transparently to an app.

And yes, so can other scaling, except a lot of apps opt-in to the "no I know what I'm doing, lemme scale myself" and then screw it up.


> At 28", a 3840 x 2160 panel has a PPI of 157, which sits right between Retina and non-Retina densities. This means that on a Mac, you’ll have to use it one of two ways: Either at 1x, where the higher PPI means everything will be much smaller than it is on a normal monitor, or at 2x, where the lower PPI means everything will be much bigger than normal.

... so? Who cares if the icons are 1.5cm instead of 1.0cm? For writing code, browsing the web, watching movies, having multiple terminals open, what does it matter?


It's unfortunate to characterize a monitor that Mac OS doesn't support very well right now as Windows-only.

At any rate, UHD looks like it's well on its way to becoming a very widespread standard, so it's not unreasonable to expect that Apple will improve support for it soon.


> It's unfortunate to characterize a monitor that Mac OS doesn't support very well right now as Windows-only.

Especially since changing the DPI has been trivial on pretty much any X server since the mid '80s.


While certainly not ideal, you can probably use this with a Mac with no issues and turn on scaling. It won't be a native resolution of the monitor and wouldn't be great for day-to-day use, but if you hook your Macbook up to a monitor every once in a while it should be fine.


isn't it expected that things are smaller on a higher-res monitor?


That's what I thought. I'd rather have more screen real-estate than smoother fonts. Which is one thing I'm disappointed about with my Nexus 5. All these pixels and I still have the same amount of space on the homescreens? Hell!


Note: according to the article the monitor is actually UHDTV (3840x2160), not 4K (4096x2160). Content encoded at 4K will need to be either cropped or scaled down, similar to 1080p content shown on a 1280x1024 monitor.


The marketing droids won, you'll just have to get over it. Even the $3,600 Sharp monitor Apple sells that's labeled "4K" has the same resolution:

  With its 32-inch class (31.5-inch diagonal) screen size, 3840 by 2160 Ultra-HD resolution...
http://store.apple.com/us/product/HD971LL/A/sharp-32-pn-k321...

You and I can be bitter about it all we want, but there's no point.


They also finally cashed in the 'big side vs. small side' credit they've been holding onto since they first branded HD (720 and 1080 are the (exact) pixel-lengths of the small sides, while 4k is (less than) the length of the big side).

Edit: Not only that, but they can (defensibly) make the claim that 4k is "4 times the size of 1080p" while (deniably) knowing that some people will misinterpret.


Using the vertical line count is a legacy of CRT displays and analogue connectivity/broadcasting, where there was no horizontal pixel count, only a maximum frequency, due to the nature of the technology. There were 1080i CRTs, so it is only with the next generation that it becomes possible to switch to the terminology cinema and professional video have used for some time.

Every pixel on a 1920x1080 display is replaced by four pixels on a 4K display so I'm not sure a "4 times the detail" type statement is particularly wrong or misleading. It just doesn't derive from 4000 = 4 x 1080 type logic.
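The "4 times" figure is just the pixel count:

  print(3840 * 2160)        # 8,294,400
  print(4 * 1920 * 1080)    # 8,294,400 -- the same number, arranged 2x2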


Misinterpret how? By expecting that the sides will be four times larger? That exists, and it's called 8K.

"Megapixels" already exists as an entrenched concept, and I don't think it's deceptive to call 4K 4x larger than 1080p or 8K 16x larger than 1080p.


720 = 720 vertical pixels

1080 = 1080 vertical pixels

4k = 4000 = 4000 vertical pixels? Nope.

Shit's confusing. 2k or 4x would be less confusing.


720p -> 720 vertical pixels

1080i/1080p -> 1080 vertical pixels

4k -> 2160p -> 2160 vertical pixels

The 4k confusion comes from the decision to simply double 1080p for consumer equipment instead of going the route cinema is going with true 4k horizontal resolution and still co-opt the easy 4k name.

And really, who wants to walk around saying 2160p? Makes me want to follow up with "1.21 gigawatts!"

I do agree with a statement in an article posted here about 4k a few days back. At this point the cinema guys should toss in the towel and just use the term 4K PRO and let consumer equipment use 4k.


I don't know, 3840 is fairly close.


Yeah, but they're not vertical pixels like the other numbers are.


> "You and I can be bitter about it all we want, but there's no point."

How about "informing other potential consumers" as a point? I certainly appreciated his comment.


4K means multiple things and we just have to get used to it. There is the DCI 4K native resolution of 4096x2160, 4K UHD of 3840x2160, and then there is the 4K in 16:10 at 4096x2560. Of these the UHD is likely to be the thing anyone cares about because it's what TVs will use. The native DCI resolution is probably the least interesting, because actual content will have different resolutions in practice depending on the actual aspect ratio being used. In a computer monitor however I'd prefer the 16:10 version instead of the 16:9 of UHD.


> Of these the UHD is likely to be the thing anyone cares about because it's what TVs will use.

I really hate this logic. 16:9 is a horrible format for 90% of what you do on a work monitor (which is what this is targeted towards).

Sure 4K (UHD) is exciting, but I'm really hoping for the 16:10 or taller formats.


I agree. I mean that UHD is what most people will care about, because most people will be buying TVs, not monitors, and content for their TVs, etc. So there is a good chance that 4K as a term will be most commonly used for TVs, not monitors.


That's not correct. The resolution you are referring to is the native running resolution of 4K projectors in cinemas established by DCI. If you were being legalistic about it, which your comment is a little bit, "actual" 4K is 4096x2560.

A "proper" 4K film shown at DCI resolution is cropped by removing lines due to the combination of aspect ratio between film and projector. The trend in film photography, however, is towards the DCI standard; the RED EPIC shoots at DCI 4K while its predecessor, RED ONE, shoots more lines (I cannot recall off the top of my head, but it is definitely larger than DCI because it didn't fit in my Media Composer DCI workspace the first time I worked with it; I'm really reaching in long-term memory so I might be wrong about this).

All of that, however, is in the cinema world. Since NHK demonstrated 4K TV a couple years ago, everybody in the consumer electronics world has understood 4K to mean 4x 1080p, arranged 2x2, and that is what ITU standardized in recommendation 2020. That's intentional.

It's much easier to evolve 1080p equipment to 4K TV than cinema 4K; it's multiplicative and most of the same equipment can be overhauled. Other resolutions affect aspect ratio, and lenses are a significant portion of the expense of a television camera. You're going to have a hard time selling a standard that makes everybody buy lenses again. Don't forget that everybody just did to go HD. (The last lens I shot on in news was $20,000 by itself, which is part of the reason the average photog will get really mad at you if you grab it, as people who don't want to be videotaped are wont to do.)

If a consumer is buying 4K equipment, its resolution is 3840x2160. This will also be the resolution at which 4K content targeting consumers should be encoded, so I'm not sure what cropping you are referring to. If a DP is buying 4K equipment, its resolution is higher. This isn't marketing droids winning, it's completely different industries having a different history of resolutions: DCI 4K builds upon DCI 2K as 4K builds upon 2K builds upon 1080p; cinema has always fought for 16:10 while consumer electronics stubbornly fight for 16:9, as well.

Wikipedia doesn't help this situation by blurring the history of the two, leading to comments such as this one. Cinema 4K and consumer 4K are not in competition with each other, and there was never a chance of 4K content coming out of the camera at 4096x2560 and making it to a consumer display. They're for different purposes. It's also worth noting that ATSC v3 is still in early stages and is not finalized, which will dictate the broadcast of OTA 4K TV content and pull the industry accordingly.

I am aware some 4K monitors have been released to market at cinema resolution. Those are targeted toward film editors and as monitors for shoot work. There's also some other crazy markets; an engineer I worked with in television installed a bunch of early 4K monitors, the ~$20k ones, in an air traffic control tower. They like the resolution because screens get busy, he said. I don't know if that's true, I merely remember the story.


Yes, but you're talking about TVs, where it's fine if it has a 16:9 aspect because all of the content is in 16:9. If I buy a monitor, I want it to have a proper 16:10 aspect because it matches the ratio of the W*XGA displays it would replace/complement. More specifically, I don't want to miss out on those extra lines and would feel cheated by Dell in this regard if I either bought it or had to use one due to a budget-stricken IT department. A TV is a TV and a monitor is a monitor. The thing they're selling is a TV without a tuner and remote control, marketed as a monitor.


If you want separate resolutions for computer monitors and televisions you are going to pay a premium for both, since different supply chains are focusing on two different kinds of panel. You want the two industries working on the same thing, and that's already happening. I've been 1080p on my workstation for a long time. I can't wait for 4K TV monitors and they will be my next upgrade for my workstation once the price is right.

My comment up-thread discusses why television shifting to 16:10 is unlikely, and it mainly has to do with lenses. Another concern is all of the monitors in studios, trucks, and so forth that are 16:9. Given that, it seems fair to me to have computing move toward 16:9 so the toolchains can be shared between computer monitors and televisions. My 55" 1080p TV was $2,000 when I bought it. You can now buy it under $1,000. What's wrong with taking that experience and working on computer monitors?

Computers are increasingly being used to deliver television and cinema content, as well. It's fairly ideal for me right now to consume 1080p content at its native resolution with no boxing or cropping.


> My comment up-thread discusses why television shifting to 16:10 is unlikely, and it mainly has to do with lenses.

It's not lenses as much as sensors. Most lenses have a circular image circle, which can cover a lot of aspect ratios. (The big exception is anamorphic, but I'm not too familiar with that.) Canon's new HD video lens system, for example, can be stuck on cameras with Super 35 sensors, or standard 3:2 photography 35mm film.


During the digital transition, we had to buy new bodies and new lenses. My understanding is that the transition to 4K will allow the same exact lenses to be used whether on a new body or the same body with a swapped-out sensor train. Going to 16:10, as I understand, would require new lenses, which is part of the reason NHK's demonstration of 4K equipment as a straight multiple of 1080p was desirable.

I'm not an expert on the mechanics of how a lens works; I merely know enough to operate the camera and shoot a decent scene, and part of that is lens selection. If you can operate a 16:10 sensor behind, say, a HA18x7.6BERD with square pixels and no distortion or image loss, I stand corrected, but I don't think that's the case from my experience. Again, though, I just pushed buttons (in a single industry application of "transmit picture through lens and apply to persistent storage", television ENG/production), and I didn't take gear apart.

I know there are lenses today that can operate at 4:3 and 16:9. Maybe the same concept?


It's more to do with director taste than sensors. My camera shoots 16:9 onto a 3:2 sensor, so it could easily go up to 16:10, but 16:10 looks worse than 16:9. Some directors shoot even tighter.


Kubrick (and many others) shot stuff in 4:3!


Anamorphics produce a circular image as well; it's just that it represents an elliptical selection of what's in front of the lens.


"The last lens I shot on in news was $20,000 by itself, which is part of the reason the average photog will get really mad at you if you grab it, as people who don't want to be videotaped are wont to do."

Grabbing the lens is effective at dissuasion? Good to know.


Just lick your thumb and put it right on the glass.


It's effective at picking up an assault charge. A friend of mine still in the industry lost her lens to an aggressive OWS protester and said protester was immediately arrested and charged. People do crazy shit to photogs, and that you see paparazzi being quick to press assault charges is just the TV news situation magnified a lot.

If you touch a photog or their camera, you're in for bad news. Don't forget they have it on tape...


Thank you for this. Since I responded to a similar thread a week or so ago I have been looking up the details on 3840 v 4096. Not so easy to find a centralized description of the evolution of the term "4k".


While I agree the term is being misused and UHD would be better, I don't think enough people would be confused, as the term 4k is being used almost exclusively for UHD at the minute.

What I'm trying to say is that your comment would be better phrased as a complaint against misuse of the term rather than as a note of explanation.

Apologies if there are a large number of people who would be confused and this is a useful note though.


It's sort of like how only software engineers care about GB vs GiB. The average consumer will not care about some cropping here and there, but someone working with graphics or video will. And I suspect that anyone looking to drop nearly a thousand dollars on a monitor is going to care deeply about its actual specs (ie "can this monitor render my project at native resolution?").


I'm not sure why anyone thinks 4096x2160 (256:135) will ever become a standard for consumer monitors? 16:9 has already won TV and computers. Are we really going to want to watch all of the 1080p content fractionally stretched or with bars on the side?

4096x2160 is fine for expensive cameras because it is designed to be cropped to 3840x2160, but I doubt there will ever be consumer content made in that format.


Much closer to 1080p content shown on an 1800x1080 screen.

Your hyperbole is dishonest.
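The analogy falls out of the crop ratio (simple arithmetic, no claim about any particular monitor):

  # DCI 4K (4096 px wide) on a 3840-wide UHD panel loses the same fraction of
  # width as 1920-wide 1080p content shown on an 1800-wide screen.
  ratio = 3840 / 4096    # 0.9375
  print(1920 * ratio)    # 1800.0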


Related reading:

"Your new '4K' TV isn't true 4K"

http://www.avsforum.com/a/the-not-so-subtle-distinction-betw...


Damn. At least it's 1920 times 2.


Such high resolution displays may sound like overkill for many people, but improved resolution is a big deal to those who care about screen typography—specifically, in layman's terms, about how fonts look on screen.

When displaying letters in small sizes, the pixel grid becomes increasingly coarse and the result often becomes ugly and less readable. To prevent this, "hinting" instructions need to be built into the fonts to help the type rendering, by fitting the horizontal and vertical strokes to the pixel grid, aligning heights, and so forth.
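A toy illustration of the grid-fitting problem hinting addresses (pure arithmetic, not how any real rasterizer is implemented): a one-pixel-wide vertical stem that lands between pixel columns smears into two grey columns, while a stem snapped to the grid fills one column solidly.

  # Box-filter coverage of a 1-pixel-wide vertical stem across 8 pixel columns
  def stem_coverage(left_edge, width=1.0, columns=8):
      right_edge = left_edge + width
      return [round(max(0.0, min(right_edge, col + 1) - max(left_edge, col)), 2)
              for col in range(columns)]

  print(stem_coverage(3.4))  # [0.0, 0.0, 0.0, 0.6, 0.4, 0.0, 0.0, 0.0] -> two blurry grey columns
  print(stem_coverage(3.0))  # [0.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0, 0.0] -> one crisp black column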

System fonts provided with Windows are professionally hinted, as are many professional webfonts. In the type-rendering environments on Windows, hinting makes a huge difference.

The problem is, TrueType hinting—indispensable for TrueType fonts on Windows—is a complex, tedious process that only a very few specialists are able to do. I heard a few years ago that there was only one person who had the expertise to perform TrueType hinting on hangul (Korean alphabet) fonts.

The result is that many if not most fonts are either unhinted or more or less automatically hinted (which is recognizably below the quality of manually hinted fonts) as many typeface designers find the prospect of TrueType hinting daunting. Manually hinted fonts are more expensive.

If screen resolution improved so that unhinted fonts would still look good enough, that would vastly increase the choice of fonts that we can use for the screen. It would also free typeface designers from having to devote a significant chunk of font development to hinting.

These types of issues are even more important for some other writing systems. Chinese characters, for example, can be very dense with dozens of strokes packed into a single glyph, and would be illegible at a modest size on the screen if the resolution was not high enough.

Apple's approach to type rendering is less dependent on hinting, and it is not a coincidence that Apple has been leading the pack on high resolution displays as a way around the problems of rasterization.

Further reading: http://www.rastertragedy.com/


To every designer out there who really cares about typography in their designs and has expertly calibrated high-end monitors: most of your audience probably doesn't.

Also check your work on a normal monitor.


If your audience is on a smartphone or tablet released within the past two years, they probably are.


Well, if your entire audience happens to be on smartphones and/or tablets released in the past two years and has excellent eyesight, then sure. But those people still probably have crappy panels on their desktops and/or laptops.


I think everybody cares about high-DPI typography; they just rarely have a choice in the matter. If you ask Joe Bloggs on the street to choose the more readable text that they'd prefer between a display from 2001 and 2010, they'll choose the 2010 one every time. Nobody sells a cutting-edge laptop with a lower screen resolution, so it's never a choice somebody has to make, so nobody really knows it's a thing, explicitly.


I'm only relying on memory, but I don't think screen resolutions on high-end personal computers improved significantly between 2001 and 2010, even as drastic improvements were seen in smaller devices such as phones. The reason is that large screens with higher resolutions mean even more pixels, and a higher likelihood that one of them will fail during the manufacturing process, resulting in significant numbers of faulty screens that have to be discarded. So it wasn't seen as economically viable to increase screen resolutions much beyond 150 ppi or so.


We are only just starting to see higher ppis in bigger devices (retina macbooks, high ppi tablets, limited ultrabooks).

But even a 13" laptop is noticeably crisper than a 15" one when directly compared at the same resolution (1366x768 in recent years).

Even going from an older 1680x1050 22" monitor to a 16:9 1920x1080 is a somewhat noticeable difference.

However when you aren't comparing directly, or switching from one to the other, it is hard to see the clarity improvement.

Looking at a 22" monitor from a retina display makes the fonts look blurry and off but if you start working on that monitor at the beginning of the day it isn't quite so bad.


Yes, as far as ppis on bigger devices, it feels like there has been much more progress in the past couple of years than in the decade preceding them. The difference is certainly noticeable in direct comparison, though most consumers are not likely to notice except in cases such as poorly hinted text.


Even better, you don't have to do subpixel hinting at 'retina' resolutions.


I know, thank god!


Does this monitor have a built-in scaler, or do you get a small picture if you run at less than the full native resolution? I saw some earlier articles complaining about the lack of a scaler on some of the new "4k" monitors, but I couldn't find anything right now about this one.


It's worse than just a tiny picture: my ZR30w cannot display any source format other than 2560x1600. So I cannot connect it to any game consoles, a Roku box, etc. Only a PC video card.

But it does result in an astonishingly low latency, at least.


So I guess entering any kind of BIOS setup is out of the question then! :O


I remember shopping for large format LCD panels in 2011. I vacillated for a while between a 30" HP for around $800 and a 30" Dell, and eventually ended up buying Dell's u3011 for around $1100. I also remember thinking how unreasonably high the prices were for large LCD panels and hoping that they wouldn't stay that way forever.

It was a time when my monitor cost more than my computer. I think it's safe to say that that era has come to an end.

Now I'm just hoping that the price of good desktop speakers comes down next....


I'd suggest dropping traditional speakers altogether and switching to headphones. The industry-standard Sennheiser HD650 or Audio Technica M50 will bring more sound quality for the buck than any loudspeaker I can think of. Of course, this is just an uneducated suggestion. Maybe you love to share your music.


He might work from home or in a private office, or these might just be for a home setup. I have AKG K550s for the office (I know I'm paying for the fact that they're closed) but bookshelves, a small amp, and a DAC for home.

As comfortable and great as these headphones are, wearing them still fatigues me and I just don't enjoy having them on. I really enjoy speakers.

In my college dorm, I had TCA WAF-1s with a TCA Gizmo. I've upgraded my amp and purchased an inexpensive DAC + Energy 12" sub since then, but I'm very happy. You can spend a few hundred and have a really nice setup; after that point, you'll see marginal improvements but with huge leaps in prices.


Wearing AKG K550s does not cause any fatigue for me. Well, I guess one has to try before going all in on headphones. I personally don't like the sound from loudspeakers. Well, there are some setups that I like, but I have neither the budget nor the suitable place =)

By the way, don't your K550s also leak? However I adjust the pads, it still feels like everyone could hear what I'm listening to. I've never tested them by making someone else wear them, but they don't feel tight enough, so I never bothered to bring them to the workplace. I'm working in a shared office, so it really matters. Currently I'm using my M50s, but the K550s are so much better, and if the leakage isn't as bad as I thought, I'd definitely use them instead.


I have a truly large head, so it's not really a problem for me. When I was reading reviews, there were plenty of warnings about improper sealing if you didn't have a large / certain shape head.

There's a decent noise floor in my office, and most of the other people use headphones. Even for the ones that don't, there's enough that my music isn't really audible. I'm also pretty vigilant about keeping my volume low, mostly for my own health.


I have HD 485s, but even at the loosest setting I can't wear them for more than 30 minutes because of the way they feel on the tips of my ears.


Well, I don't want to sound condescending but the headphones I'm recommending are both priced well above HD485, apart from being highly appreciated (mostly) in the audiophile community. Also, HD650 is known to be very comfortable and having 15 headphones to compare, I can tell you that you can go for it without a second thought. It's a very nice investment for one's ears =)


I have Tritton 720+ and they also hurt my ears after a while. It is because they put the sliding part too low, so it minimally adjusts the width (probably to maximise the ear/top-of-head proportion). Or do I have sensitive parts near my ears?


Tritton 700 series are pretty decent for gaming. However, gaming headsets aren't designed for long term listening. Maybe you can consider getting something like Beyerdynamics DT990 Pros as an all-round headphone for daily use.


AudioEngine A2+ - $249, active, with integrated USB DAC. Really nice. http://www.amazon.com/Audioengine-A2-Premium-Powered-Speaker...

I usually pair them with DS1 stands (just enough to tilt them off the desk and towards your ear), $29 http://www.amazon.com/Audioengine-DS1-Desktop-Stand-Pair/dp/...

and things will sound good.


What is your definition of good desktop speakers and price? In the UK we can get this sort of thing:

http://www.richersounds.com/product/standmount-speakers/camb...

Which are great desktop speakers at £80GBP

Nothing like this in your neck of the woods?


It should be pointed out that these are active speakers with a builtin USB DAC. There are a variety of these things out there at this price range atm and they sound surprisingly good. Recommended.

Alternatives:

http://www.thomann.de/gb/alesis_m1_active_320_usb.htm

http://www.sweetwater.com/store/detail/StuDock3i/?adpos=1t1

(There's a decent chance all 3 are the same product rebranded tbh.)


For price-worthy PC Speakers I recommend Teufel: http://www.teufel.de/pc/pc-stereo21.html

I am also very satisfied with these headphones: http://www.thomann.de/de/beyerdynamic_dt990pro.htm The sound is crystal-clear and due to the big and soft earpieces you can wear them for hours without noticing. Perfect for home use.


If I had my time again with headphones, I'd be buying DT770 (work) / DT990 (home)/ Bose QC15 (travel)

The Beyer's are very comfortable which I think is the #1 feature for long periods of use.

The Bose make my daily commute enjoyable now as I can listen to podcasts with no distraction.



I'm using these at home too.

Previously I went through Logitech's every 2-3 years, I've never even considered changing to something else since owning them.


I've had those speakers since 2005, amazingly good hardware.


I concur. Not only is the sound surprisingly good for the price, but they look pretty cool too.


So we're finally starting to move past the vertical resolution of '90s CRTs. It's starting to look like the shift to widescreen is working out ok after all!


Lenovo is also launching a 28" 4K monitor this April: "Along with the resolution, we have a 5ms response time, 72% color gamut, DisplayPort, mDP, HDMI and MHL connectivity, three USB 3.0 ports and dual 3W speakers. Lenovo is promoting a true 10-bit color, and streaming capabilities via other digital devices." http://www.anandtech.com/show/7635/lenovo-at-ces-2014-thinkv...


Hmm, the Forbes article has been updated with specs:

UPDATE: I now have confirmation of the P2815Q's full specs, and have listed them below. Unfortunately, it tops out at 30Hz for 3840 x 2160 and 60Hz for 1920 x 1080. This should prove a deal breaker for gamers, but the monitor still has a solid feature set for the asking price.

Panel tech: Anti-glare TN (not IPS, which was previously rumored)

Connections: DisplayPort (v 1.2)/Mini-DisplayPort, HDMI 1.4 (MHL 2.0), DisplayPort out (MST), 1 USB upstream, 4 x USB 3.0 downstream (including 1 USB charging port with BC1.2 compliance devices on back)

Color Depth: 1.073 billion colors

Viewing angle: 170 degrees

Response time: 5ms

Brightness: 300 cd/m2

Power Consumption: 75W

....

That's a bit disappointing that it only does 30Hz at the full resolution (3840 x 2160)...


Yuck. That negates the excitement I felt about this monitor this morning.

We already have a 30 Hz option (the Seiki). Let's see some forward momentum please, Dell.


Am I the only person for whom that many pixels in a 28" display sounds like overkill?

Maybe I'm just getting old, but the resolution on a cinema display is more than sufficient to make things unreasonably small. And getting closer to my desktop screen isn't really appealing either (again, maybe I'm getting old, but to see the pixels on my current screen I have to get my nose almost right up to the display, which I'm never going to do).

Retina displays on mobile devices are more understandable because you tend to be closer to them, or because it's useful to render really tiny text on a small screen, sometimes. But on a desktop display? Seems like the display version of clock-speed fetish.


> again, maybe I'm getting old, but to see the pixels on my current screen I have to get my nose almost right up to the display, which I'm never going to do

The point of going to higher and higher resolutions is that eventually, you don't see the pixels. Pixels are an implementation detail, what you're really wanting to see is the image they represent.


Exactly.

And for anyone who struggles to see the pixels on a typical desktop monitor, that's probably because a lot of tricks are used to disguise them. Try turning off anti-aliasing or sub-pixel font rendering and then tell me you can't see the pixels.


So the point of more pixels is that you can't see the pixels, and to prove this point you tell people to turn off other (cheaper) technologies that already hide the pixels.

There seems to be some faulty logic here. What is the point of the higher resolution displays if the pixels are already hidden with other technologies?


The tricks aren't perfect--antialiasing is blurry for example. A properly high DPI monitor is a thing of beauty--crisp images and text.


Actually, in video/photography these tricks don't work. So BIG DEAL :)

It's like mixing music with $5 ear buds.


More pixels = sharper image. Try putting a printed high-DPI magazine next to your monitor and compare text and images.


Antialiasing typically produces artifacts and somewhat blurry letters.


Gee, thanks. All these years, and I've been looking at the pixels and not noticing the image! You've changed my life!

ahem...

Condescending explanations aside, you know that there's a limit to the human eye's ability to resolve detail, right? We can resolve up to about 150ppi at 2ft. The Apple Cinema Display is at 109ppi. There's room for improvement, but not 60% more...


Yes, but when you're actually working on a hi-PPI display, you can then lean in to view more detail, rather than zooming in. Much like we inspect things in the real world.

As a photographer, this means a great deal. I can verify the sharpness of an image (a key component in deciding whether to keep it or chuck it) at a glance. Saves a lot of time.


Yeah, maybe...but there's still a practical limit. In order to see the pixels on a thunderbolt display (again, 109ppi) I have to get my face about 7 inches away from the screen.

Closer than about 5 inches, and I lose the ability to focus because the screen is too close -- so there's a band of about 2 inches where I can gain from a higher pixel density than 109ppi, without losing due to eyestrain. And in any case, I'm not going to spend much time in that zone. It's hard to work with your nose in the screen.

YMMV, but I think I'm fairly typical. Most people dramatically overestimate the precision of their eyes.


You can see image degradation from pixelation long before you can make out individual pixels. I can't really make out individual pixels on my MBA (130 ppi) at one foot, but looking at a MBP Retina at the same distance looks dramatically better. On the MBA, the fuzziness from the heavy anti-aliasing used to hide the pixelation is quite apparent, but on the MBP Retina pixels look like sharp-edged solid shapes.


I don't know whether it's due solely to the resolution, but I was pretty shocked to realise I can tell the difference between 300 and 600 dpi photographic prints (assuming there's enough detail in the image to do so, you need to print a 23 MP DSLR shot with high detail at a 7x10" print size to get there). I have had other photographers tell me that they don't see any benefit to retina screens at all... you're giving me the reason why here. :)


109ppi is good enough, but there is a significant difference. It's true that beyond a certain point it doesn't make a difference (1080p phones, I'm looking at you!) but 109ppi is not that point, for most people. Retina web content and applications look decisively better.


Bear in mind that higher resolution display != smaller UI elements. It gives you that option of course, but for most people the benefit is that fonts, icons and images look so much crisper and more readable.

You're definitely not the only one who thinks it's overkill. I've shown my retina iPad and MacBook to a few people who don't get the fuss at all. I guess this reaction could be due to a) not caring much about aesthetics, b) bad eyesight, c) not spending the time to get familiar with it and use it for some actual tasks. Myself, I think the screens are amazing, and each time I go back to my 23" 1920x1080 monitor is a little painful.


> "I've shown my retina iPad and MacBook to a few people who don't get the fuss at all... Myself, I think the screens are amazing"

Over a decade ago, I bought one of those Sharper Image ion air purifiers. I told all my friends how great it was - totally silent but really cleaned the air and left it with a fresh scent.

About a year later I started reading scientific assessments and reviews of it. It in fact did nothing to clean the air and the "fresh clean scent" it put out was potentially harmful ozone.

It was then I realized that I was as able to fall for marketing hype as much as anyone else. That made me much more wary.


I know what you mean, that it's easy to fool yourself and fall into marketing traps. But in this case of monitor resolution, there is a very noticeable difference (to me at least) between current desktop monitors and retina ones. If in 5 years some company tries to push the trend even further with 8K extreme retina or something like that I'm sure I'll be on your side, but for now, please, bring on the 4K desktop displays!

(Actually, when it comes to TVs I do think we're getting into overkill territory with 50" 4K displays, when for typical TV viewing distances 1080p is fine)


With regards to TV, one of the big effects of HD was that TVs got larger, but viewing distance remained constant. With 4k, at standard TV-viewing distances, it doesn't really make much difference below 100 inches or so. On the other hand, I want one on my desk.


The air purifiers were misleading. Better resolution is not. That said, applications/OS should be able to use the resolution. UI elements should become crisper not smaller.


Completely agree; we should remember that resolution is one thing and definition another, although they are clearly related.

I think that anybody should be able to tell the difference between Retina and a non-Retina device side by side. At home I have a set-up with a Macbook Retina and an external 1080p monitor as you do, and the difference is quite noticeable when I visually switch between the two.

I agree that there is a physical limit to improving the definition of monitors, but I don't think we have quite reached it yet.


>> Maybe I'm just getting old, but the resolution on a cinema display is more than sufficient to make things unreasonably small.

You're either getting old, sit very far away, or have bad vision - the Cinema display has a laughably low ppi for very large UI elements, and the pixels are incredibly visible.


In theory stuff should be the same size but a lot sharper. That being said, I don't really feel a lot of difference when looking at my 24" monitor versus my 5-inch phone, which has almost the same resolution.

For me a huge 40-inch 4K monitor would be more useful, as the DPI would be roughly the same as my setup now (2x 1920x1200), but with more vertical space and no bezels.


Look at the Seiki 39-inch with 4K resolution. It is precisely what you describe.


Yep, that looks about right, but it's not sold in Europe and only does 30Hz AFAIK; other than that it's a pretty good deal.


This[1] one is 500 US dollars and is listed at 120Hz. I don't have access to non-U.S. sites here at work, apparently, so I can't be certain they have it over there or not.

[1]- http://www.amazon.com/Seiki-Digital-SE39UY04-39-Inch-Ultra/d...

Edit: Sorry, didn't catch that it only does 30Hz through HDMI on the specs, and didn't realize it until I read the limitation downthread. Dang...


For coding, you might not notice it as much. I run my 27" monitor at about 30Hz, and it seems Just Fine. Youtube looks okay when not full screen, but frankly I use it for editing code, and I happily trade the refresh rate for the increased resolution.


Yup, this is exactly what I'm hoping for too. 39", 4k, 60hz or more - and that's my next upgrade sorted.


No way. Reading text on a desktop display sucks donkey balls compared to my GS4. You can't get nice crisp fonts on PCs like you can on phones these days and it's horrible. I've been saving articles from my desktop to read on my phone ever since I got one with a hi-res display. Now I won't have to.


I think there are two really great things about retina screens on desktops/laptops:

- Firstly, I definitely notice the difference. The higher res screen is noticeably nicer to read. Sure, you don't need to go to the crazy extremes of some newer high res phones, but standard desktop monitors are noticeably ugly by comparison.

- Secondly, the res is high enough that you can finally change (apparent) resolution. My parents are in their late 60s and have bad-ish eyesight, so they lower the resolution on their screen. The result is incredibly ugly. Unfortunately resolution independent display seems to be a forgotten goal in modern OSes, so having screens that can change res is definitely worthwhile.


A human eye with 20/20 vision can distinguish ~150ppi at 2 feet (a typical screen-to-eyeball distance). Beyond that, it's overkill. The apple thunderbolt display is at 109ppi, so there's a bit of room for improvement, but not 60% improvement.

Beyond this limit, I suppose there are people with 20/1 vision or something, but they're pretty rare.
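A rough sketch of where a figure like that comes from, assuming the usual ~1 arcminute of angular resolution for 20/20 vision (the exact threshold is debatable, as the replies point out):

  import math

  def max_ppi(viewing_distance_in, arcmin_per_pixel=1.0):
      # smallest resolvable feature at this distance, in inches
      feature = viewing_distance_in * math.tan(math.radians(arcmin_per_pixel / 60))
      return 1 / feature

  print(round(max_ppi(24)))                        # ~143 ppi at 2 feet with a 1-arcminute threshold
  print(round(max_ppi(20, arcmin_per_pixel=0.3)))  # ~573, the ballpark of the ~530 ppi figure cited elsewhere in the thread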


That is, you can distinguish lines that are about 1/150 inch apart.

But for a display, you want to be able to have diagonal (or curved) lines at 1/150 inch apart, with pixellation artifacts that are small relative to those lines.

That probably translates to something like 450ppi (line width of 3px, so the artifacts should probably be about 1/3 of the line size).


Humans can distinguish a lot more than that. See: http://clarkvision.com/imagedetail/eye-resolution.html.

His calculation gives 530 ppi at 20 inches.

Explanation: "The eye is not a single frame snapshot camera. It is more like a video stream. The eye moves rapidly in small angular amounts and continually updates the image in one's brain to "paint" the detail. We also have two eyes, and our brains combine the signals to increase the resolution further. We also typically move our eyes around the scene to gather more information. Because of these factors, the eye plus brain assembles a higher resolution image than possible with the number of photoreceptors in the retina."


I've never gotten this argument. I mean, I have totally normal (nearly 20/20, but not quite... thanks to aging) vision, yet I can see the blurred edges of diagonal and rounded images perfectly fine on an iphone 5s at 2 feet. I think people just aren't really looking. I tried a thunderbolt display recently, and it was painfully low resolution. I haven't seen the math recently, but I have hard time believing that any phone today, let alone any monitor or television, really reaches the resolution of print.


I have my monitors mounted on arms, so they can be anywhere from 1.5 meters away with me leaning back, to almost right in my face. The ability to physically "zoom" the whole monitor adds another dimension to use, and it's one where you really appreciate higher pixel density.

Aliasing is another big deal. Jaggies are very visible until you get into quite high pixel densities.



The article has been updated. No 60Hz at 4K is a deal breaker.

"UPDATE: I now have confirmation of the P2815Q’s full specs, and have listed them below. Unfortunately, it tops out at 30Hz 3840 x 2160 and 60Hz for 1920 x 1080. This should prove a deal breaker for gamers, but the monitor still has a solid feature set for the asking price.

Panel tech: Anti-glare TN (not IPS which was previously rumored)

Connections: DisplayPort (v 1.2)/Mini-DisplayPort, HDMI 1.4 (MHL 2.0), DisplayPort out (MST), 1 USB upstream, 4 x USB 3.0 downstream (including 1 USB charging port for BC 1.2-compliant devices, on the back)

Color Depth: 1.073 billion colors

Viewing angle: 170 degrees

Response time: 5ms

Brightness: 300 cd/m2

Power Consumption: 75W"


I was just about to purchase a Dell 27" monitor (1080p) as a second screen for my laptop. Glad I didn't as prices of UHD monitors seem to have become reasonable REALLY fast. I know nothing about displays. Can someone explain to me the difference between this monitor and (for example) the Sharp 4K display that is an option with the new Mac Pro and costs several thousand $?

NB: I will be programming, working in Logic Pro, and watching YouTube. No photo or video editing.


Buying a 27" with 1080p resolution is a bad choice imo, my 7 year old Dell 24" has a higher resolution than that (1920x1200). For 27" you need to go for 2560x1440 resolution or you will just waste space. 4K is even better obviously :)

In terms of differences from the Sharp display, I guess there aren't many from a consumer point of view, but I'd wait for reviews and see if it has any weak points.


I agree, provided that your eyesight is good. I have a 24" 1080p screen and higher resolution would definitely be better, but without my glasses the 27" might even be preferable.

And this is assuming the distance to the monitor is a little over an arm's length, if the monitor is further away you would want bigger for a given resolution.


Your 7yo Dell is probably a different screen ratio, 16:10, than his which is probably 16:9. You probably have very similar DPIs, but your 1920x1200 monitor just has more vertical screen space.


I have a 39in 4k monitor that I bought for $500. The only problem is that 4k monitors through HDMI can only run at 30Hz, which is extremely noticeable. There might be a way to run at 60Hz but I haven't got it to work yet. Here's the monitor: http://www.amazon.com/Seiki-Digital-SE39UY04-39-Inch-Ultra/d...


Curious: what's the 30Hz effect? Flickering on a static desktop? Or just less-smooth animation?


Jittery animation.


Ah, I have this TV on my wishlist for just this use; I wasn't aware of the HDMI limitation. Is this an HDMI <= 1.4 issue, and would it be better with HDMI 2.0?


Even DisplayPort 1.2 doesn't have enough bandwidth to push 60Hz. You're going to have to wait for a newer spec.


DisplayPort 1.2a does support these monitors AFAIK, through treating them as two panels daisy-chained together.
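
Rough bandwidth math, if it helps (a sketch: 24-bit color, ignoring blanking overhead; the link rates are the commonly quoted payload figures for HDMI 1.4 and DP 1.2):

    # Uncompressed pixel data rate for 3840x2160 at a given refresh rate
    def gbps(width, height, hz, bits_per_pixel=24):
        return width * height * hz * bits_per_pixel / 1e9

    print(gbps(3840, 2160, 30))  # ~6.0 Gbit/s  -> fits in HDMI 1.4's ~8.2 Gbit/s
    print(gbps(3840, 2160, 60))  # ~11.9 Gbit/s -> too much for HDMI 1.4, fine for DP 1.2's ~17.3 Gbit/s

So as I understand it the link itself isn't the bottleneck at 60Hz over DP 1.2; it's the first-generation panel controllers, which is why these monitors get driven as two tiles over MST.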


It shouldn't be long before some Korean/Chinese factory starts selling these for half price on ebay.


If I could buy one of these for even 6000 RMB in China I would get it in a heartbeat. Alas, it will take a long time for that to happen.


That's $1000 USD... Can't you just get the Dell version imported?


Try getting it through customs without paying ~$1000 customs duty.

If it can't fit in my carry on, it is very difficult to just ship to China.


Fly into Guangzhou (or train in from Hong Kong).

Flying, I've never had my bag even run through a scanner (8-9 times flying in), unless they are doing it between the plane and baggage claim, in which case they are really fast!

Taking the train, you run your bags through scanners, but no one seems to care (I've brought HEAPS of electronics in from Hong Kong, $2-4k worth at different times).

Beijing and Shanghai - I would say you are right, they seem to be a bit more uptight there.


In Beijing, they scan bags right before you get out to meet your loved ones. They appear to select people at random, and from my observation you have 5%-10% chance of being selected.

I'm not sure what they're looking for, though, as I've never seen them open a bag after it went through the scanner.

I was quizzed about my road bike (which was obvious as it was packed separately) but after I explained it was old, and 'just a normal bicycle' and not 'for competitions', they let me through without a fuss.


They are scanning all bags in Beijing these days, at least when coming out of T2 customs. I'm more worried about transporting it as baggage in the first place. I'm just not a good yellow cow.


Add oil!

You're right, though. Definitely not worth the risk to check it in.


What happens if they find you have stuff that you haven't declared? I know for cheap items they charge you the tax or trash them. For several thousand dollars' worth of equipment, though, they might not be so nice...


Please yes, someone do this. Pretty much every programmer I know would get one (or two ^^)


I could do it if anyone wants to help me financially or with marketing. I've got many years of direct factory sourcing experience in China. I first came here a decade ago and have been living here for most of the past 3 years.

Is there a way to be contacted through HN?


Sure, just put a contact email in your profile.


Any idea how new a MacBook you would need to power this? I'm guessing my 2012 13" Air can't run it, right?


Looking at another 4K monitor in the Apple Store (http://store.apple.com/us/product/HD971LL/A/sharp-32-pn-k321...), doesn't look like it:

"Note that 4K DisplayPort operation is only compatible with the new Mac Pro (Late 2013). 4K HDMI operation is compatible with MacBook Pro (Late 2013) and the new Mac Pro (Late 2013). HDMI input is not available in the European-market Sharp PN-K321."


"MacBook Pro (Retina, Late 2013) and Mac Pro (Late 2013) computers can use 4K displays and Ultra HD TVs with OS X or Windows (via Boot Camp)."

http://support.apple.com/kb/HT6008?viewlocale=en_US&locale=e...


(Note that according to that page, the MacBook will only do 30 Hz.)


60Hz if DisplayPort is available (which it is on this Dell).


You can drive 4k with a recent macbook, but it's laggy as sin. My Amazon review: http://www.amazon.com/review/R4OJ5D5RPILCH/


Thanks for that note. I was at the Apple Store in Pasadena this week checking out a new rMBP (OS 10.9.1) and the Sharp 4K monitor. They connect by HDMI, not DP, at the moment. This limits refresh to 30Hz.

I did not notice the lag on text input that you did (but, I did not enter much text; I didn't think of this as a possible issue). I also did not notice any significant lag with the mouse. This may be user-dependent.

The thing I did notice is poor rendering of some fonts. There were colored fringes around the system font that both the Apple rep and I noticed. Other common fonts were affected as well, but not all of them. It seemed like a problem with subpixel rendering to me. Changing the display mode to mirroring did not help.

The Mac Pro did not have these problems. It's quite smooth (60Hz) from the Mac Pro.

It was bad enough that it disqualified the monitor for me. I assume it's an artifact of the HDMI driver they're using, and "probably" fixable with a better driver. But you never know.


Interesting.

FWIW, this isn't 30 hz vs 60 hz. My windows box is limited to the same 30 hz and has no issues.


Even a brand-new Macbook would be unable to power this display. All currently sold Macbooks can only power displays up to 2560 by 1600; the Macbook Air can power one external display at this resolution, and the Macbook Pro can power two. (Edit: The iMac can also only power one additional display at the same resolution.)

If you want to use a Mac with a 4K display, consider getting a Mac Pro, which can power up to three of them.


The MacBook Air from 2012 or later can drive two 2560x1600 displays at 60fps: https://twitter.com/Antagonist/status/252936243606859777/pho... (It only has one Thunderbolt port, but Thunderbolt displays can be chained.)

While the integrated graphics on the 2013 models can drive 4k displays at 60fps, only Thunderbolt 2 has the necessary bandwidth. The latest rMBPs have Thunderbolt 2, so they should be fine once OS X's drivers are updated. Everything else is stuck at 30fps.


This article claims the hardware in the latest macbooks can support it, but the current osx drivers max out at 30hz: http://9to5mac.com/2013/12/23/new-retina-macbook-pros-can-dr...


I'm using a 4K monitor with a late-2013 MacBook Pro.


I think you are wrong here.

I've used a late-2013 rMBP with a 4K monitor (OS 10.9.1) at 30Hz. It's a supported configuration (http://support.apple.com/kb/HT6008?viewlocale=en_US&locale=e...).


Looks like I'll be finally buying dedicated graphics card for my Hackintosh.


I can't believe we might finally start seeing some movement in the resolution world outside of the Macbook realm. Think this will ever make it to laptops? Dell announces the year 2025 Ubuntu Programmer Edition ;)

On an unrelated note, I have a last-gen Intel Haswell CPU with the HD4600. There's no way I can power this, can I?


Apple aren't the only ones that have been innovating in this space. For example, the terribly named Samsung Ativ Book 9 Plus has a ~20% higher pixel density than MBPs.

It looks as though the HD4600 should be able to push UHD over DisplayPort.


It already made it to laptops: http://www.slashgear.com/toshiba-tecra-w50-and-satellite-p50... Toshiba Tecra W50 and Satellite P50t get 4K display updates


Saying it's already made its way to laptops is a bit of a stretch - the article you linked to said they'll be available mid-2014. This Dell monitor is available in a couple of weeks.


I'm successfully running a 50" Seiki 4K monitor off the HDMI port on my motherboard. It's only 30Hz over HDMI, but since I also have DisplayPort, I might be able to upgrade to a 60Hz monitor without having to get a dedicated graphics card.


I wonder what's the better option for replacing my 2x 24-inch Dells (total resolution 3840x1200):

1) Get one 40 inch 4k monitor to have one giant display with about the same DPI but more screen real estate and no bezels

2) Get two 28-inch 4K displays for super high DPI but less screen real estate and bezels.


Keep in mind that when the resolution is the same, your screen real estate is the same across different-sized monitors. If a window takes up 25% of a 4K screen, it'll take up 25% of the screen whether that's a 28" or a 40" monitor; its physical size would just be larger on the 40". You still can't fit more things on the 40". Go for the dual 28" monitors; then you really get 2x the screen real estate, plus the benefits of dual monitors.


How big a window is on a certain resolution depends on the DPI scaling. Screen real estate would be roughly the same, but everything would be much more crisp with higher DPI.

So with 2x 28-inch 4K screens the screen real estate would be about the same as my current setup, but everything would be sharp. Whereas with a single 40-inch 4K screen I would get roughly double the vertical space of my current setup at about the same DPI, all on a single monitor. It's like 4 of my current Dells melted together into one giant monitor.
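
For comparison, rough pixel densities (diagonal pixels over diagonal inches; quick sketch):

    def ppi(width, height, diagonal_in):
        return (width**2 + height**2) ** 0.5 / diagonal_in

    print(round(ppi(1920, 1200, 24)))  # ~94 ppi  - the current 2x 24" setup
    print(round(ppi(3840, 2160, 40)))  # ~110 ppi - option 1: single 40" 4K
    print(round(ppi(3840, 2160, 28)))  # ~157 ppi - option 2: 2x 28" 4K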


How do you separate tasks? If your current setup is 2x, then I would suggest staying with it unless you have identified why one display is better, other than the obvious size.

The only issue with two-or-more-monitor setups, I have found, is that unless they are identical it's never possible to color-match them, which can be disconcerting.


That should work with a decent window manager. Also, I don't like bezels, and more vertical space to put console/logs at the top would be good. That, and the total sci-fi looks of it :D


How far away will they be? How good is your eyesight? How big is your desk?

I would probably go for option (2), provided your eyesight is good and the screens are close enough to get some benefit from the higher density.


About an arm's length. Eyesight is pretty good, though not perfect. The thing I like most about it would be the extra vertical space for not-so-frequently-used stuff, and no bezels.


I would go for option (2), but then I'm less worried about the vertical height, and having separate screens for different data works for me, so the bezel isn't critical.

At the cost of bezels you could go for option (2) with the screens rotated to portrait.


I have the same setup and would go with (2), but that's because one of my screens is vertically oriented.


Or maybe a third alternative: (1) plus keeping one of your old screens.


If this gives me (personally) a significantly better viewing experience at normal distance than the 1680x1050 I have now, I won't just buy it, I'll eat it.


Of course, you'll have to buy another one then.


I'll eat that one, too.


Hopefully it will have a DisplayPort out like the U2713H or a Cinema Display.

The fact that the U2713HM doesn't have one makes high-res dual displays a pain for PC laptops.


You mean mini DisplayPort?

Because the U2713HM has a DisplayPort.

I actually have the U2713HM and in my experience DisplayPort is a pain in the behind. I don't know if it's the monitor or the connection in general but whichever it is, it seems to suck.

If I move my desktop an inch it will drop the signal completely; I'm not exaggerating one bit. It seems like if I even shake the cable a bit I lose the connection, and when that happens it doesn't reconnect at all anymore, at least on Linux, without replugging the cable. Because it's a desktop I have to crawl under the desk to replug the connection.

At first I was using it on Intel graphics (i7 2600k) and sometimes I didn't get video at all on boot, dmesg had messages about "DisplayPort link training failed" or whatever. I had to reboot to get signal or replug both monitors in specific order and play with xrandr. I also started experiencing massive GPU hangs on Intel, don't know if it's related, but eventually I gave up and bought a passively cooled ATI. I have the same exact problems with that. I even bought a second DisplayPort cable, thinking that might be the problem. Nope. I even keep getting the similar "link training failed" messages with ATI.

The connection seems to work fine if you get it working once and treat the cable like it's cursed and avoid touching anything ever.

Don't know if it's the monitor or the connection in general, though. What I do know is that the monitor also buzzes when there's lots of text on the screen (e.g. just open Wikipedia in a browser), which drives me insane at night when it's otherwise silent. Googling seems to indicate it's a problem with this model. Some "premium" display, huh?

If the future of display connectors is DisplayPort I will cry myself to sleep at night.
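
For what it's worth, this is the xrandr dance I try before crawling under the desk; it doesn't always work, and sometimes only a replug does. The output name DP-1 is just an example, so check what xrandr actually reports on your box:

    import subprocess

    OUTPUT = "DP-1"  # example name; yours may be DP-0, DP1-1, etc.

    # Turn the output off and back on to force a fresh link-training attempt
    subprocess.run(["xrandr", "--output", OUTPUT, "--off"], check=True)
    subprocess.run(["xrandr", "--output", OUTPUT, "--auto"], check=True)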


I've got two U2713HMs. One at work, connected to a Mac Mini (HD4000) via DisplayPort. The other at home using HDMI (so not up to full native res). The only problems have been caused by the Mac Mini (and have happened no matter which connection was being used). We're also using it to connect a few other large monitors to various Macs.

I don't think your problem is anything to do with displayport. In fact, I'd recommend using it to anyone connecting a large non-Apple monitor to anything with a miniDP or DP connector to drive it. Shame that miniDP to DP cables are so hard to get hold of (down here in NZ, at least).

Edit: I'm running Linux, most of the others are on OS X. As I said, only problems have been the Mac Mini's fault (happens with all connectors and OSes).


Seems more like a Linux driver/DisplayPort disagreement (or maybe your hardware) than a problem with DisplayPort as a spec. DisplayPort has been spotless on my 2012 MBPr under OS X and Windows. Ditto at work for numerous Lenovo, Dell, and Asus laptops running Windows and various flavors of Ubuntu.

DVI, on the other hand, on my MBPr will occasionally spaz out and display static. I've tried multiple monitors of the same model with the same effect.

(For what it's worth we pretty much use only Dell 2412 monitors at work).


I have no issues with the following hardware: HP LP2475w monitor (came with a DP cable), Intel DZ77BH-55K motherboard and i7 3770 CPU (HD4000). I use Fedora (Linux).

What sucks is that the DP output is only version 1.1, even though DisplayPort 1.2 was published in 2009.


You have to use MST with DisplayPort 1.2 - sort of like Dual Link with DVI - to drive it at full resolution at 60hz.

The connector is already maxed out, so chaining is not possible.

You'll probably have to wait until HDMI is updated, and use one DP and one HDMI.


Keep in mind that this appears to be a Dell "Professional" series monitor and not one of their UltraSharp lineup. So something somewhere has been sacrificed to keep it out of the UltraSharp class, and it shows in the price.

I also believe this specific monitor only does 30Hz refresh at 4K, so...that's a big part of it.


Unless it can do at least 60hz at 4k then no one should buy this. 30hz is ridiculously bad, even for desktop usage.


Just about all of these screens, with the exception of the first gen Seiki, can do 60Hz just fine. It's the computer, and interface, that struggle.


Not at 4K resolution. They all advertise themselves as 120Hz, but that's only at 1080p. Show me a 4K monitor using DisplayPort 1.2 that isn't at a ridiculous price.

A computer isn't going to struggle with that resolution for non-gaming usage. A $150 graphics card can easily handle 2x 2560x1440 monitors for desktop usage without breaking a sweat.

You're going to end up having to spend a lot more to run a single 4K monitor too, because your graphics card absolutely must have DisplayPort 1.2 or HDMI 2.0 (which isn't even out AFAIK).


I assume that it being an Ultrasharp means it will be IPS. If so, pretty sweet. And it might help drive down prices for the Korean 27" 1440p monitors on eBay, which are much easier on the eyes (especially with text) than other panel types.


The $700 monitor they are talking about is not an UltraSharp, but a P-series monitor. In the past the P-series label has been used for both TN and cheaper IPS panels. Dell already has a couple of 4K UltraSharp monitors, but they cost a lot more.


My understanding is that this is a TN panel, not IPS.


It's not Ultrasharp, where did you see that?


You're correct. I mixed up the first sentence with the next.

But it seems to be an IPS, of some sort:

"The P2815Q packs an IPS LED display will have a full 3840 x 2160 4K resolution. It launches globally on January 23. Dell hasn’t yet discussed things like refresh rate or range of inputs (I’m sure DisplayPort is a given), but they do promise the same “screen performance” as the new UltraSharp 32 and UltraSharp 24 Ultra HD monitors. That’s certainly encouraging since their UltraSharp line is normally a cut above when it comes to professional displays."


I'd have a hard time deciding between this for $699 (maybe $500 with discounts) and the direct successor to my standard U2410 monitor, the 24" UP2414Q ($1,399 list, and probably $1k or so with discounts).


I've had my displays for nearly a decade, because I refused to downgrade to new HD displays. Finally something is happening.


Can't wait to see their pricing in Australia. If it's under $1k, I'd like a handful!


You're kidding. They'll inflate the price by 20%, just like everyone else does. Hi Apple and Adobe!


I know nothing about Australia, but $699 plus 20% is $838.80, which is $940 in AUD, which is under $1k.
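
Spelled out (assuming the ~20% markup mentioned above and roughly 1.12 AUD per USD):

    usd = 699 * 1.20   # 838.80 USD with the 20% markup
    aud = usd * 1.12   # ~939 AUD
    print(usd, round(aud))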


Really? 1 USD is currently 1.12 AUD. How did you get that conversion?


940.00 AUD / 838.80 USD ~= 1.12

What's the problem?


(Thanks for saving me from questioning my sanity)


I'm an idiot.


If it's above $1k it'll be worthwhile just importing them.


Shut up and take my money!


Darn it, just bought their 27 inch 2560x1440 for $600 a year ago.


Who ever thought they were joking?


I think the "wasn't joking" is in reference to the "less than $1000". I think many people assumed the usual marketing phrasing here indicated that the display would likely run $999 or similar. $699 is dramatically less, which is surprising to me.


my gut reaction: god damnit.



