The 5K Retina iMac’s screen runs at 60Hz at 5K resolution (arstechnica.com)
193 points by lelf on Nov 8, 2014 | 54 comments



I'm very glad to see arstechnica using the blurbusters tool for testing monitor rate!

It was initially a Win32 exe that I created 5 years ago [1], with the simple idea of displaying very distinct frames, back when people were using all kinds of silly, less accurate methods like timers for this. Blurbusters did a great job of making the tool more accessible to more people (visiting a website vs. downloading and running an unknown exe). They were even respectful enough to ask for my permission before adding it to their website (which I was of course happy to give, and not that I would have minded too much if they hadn't asked; it's a very simple idea after all).

[1] http://hardforum.com/showthread.php?t=1423433
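The core of that idea is small enough to sketch in a few lines of browser code. Here is a minimal, hypothetical TypeScript/canvas sketch of the concept (it is not the original exe and not TestUFO's actual source):

    // Light exactly one box per animation frame. With vsync, requestAnimationFrame
    // fires once per display refresh, so at 60Hz a camera exposure of 1/60s spans
    // one refresh and a photo should show a single lit box; gaps in the lit
    // sequence over longer exposures reveal skipped frames.
    const canvas = document.createElement("canvas");
    canvas.width = 600;
    canvas.height = 60;
    document.body.appendChild(canvas);
    const ctx = canvas.getContext("2d")!;

    const BOXES = 10;   // frame n lights up box n % BOXES
    let frame = 0;

    function draw(): void {
      ctx.fillStyle = "black";
      ctx.fillRect(0, 0, canvas.width, canvas.height);
      const w = canvas.width / BOXES;
      ctx.fillStyle = "white";
      ctx.fillRect((frame % BOXES) * w, 0, w, canvas.height);
      frame++;
      requestAnimationFrame(draw);
    }
    requestAnimationFrame(draw);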


Except that they didn't really do so! They just took a screenshot, and the screenshot itself contains the instructions pointing out that a screenshot is meaningless.


Hah, I just noticed that those 2 really nice photos at the bottom are a comment from blurbusters, not a part of the article.

One step at a time, hehe.


Well, it's kind of part of the article -- it's a promoted comment (and the only one).


I don't quite understand how it works. My guess is that it doesn't actually perform any test itself; you're the one deciding whether the refresh rate is actually 60Hz. The big green "VALID" doesn't say your monitor is at 60Hz, just that the page is drawing to the frame buffer at 60Hz, and you can then observe the motion to decide for yourself whether it is smooth enough that the refresh is really running at 60Hz?


The comment at the bottom adds the key piece of information: you need a camera whose shutter speed you can set to 1/60th of a second.


That's exactly right. See the comment at the bottom of the article with two photos, it describes it in detail (confirming what you said).


I'm very glad to see arstechnica using the blurbusters tool for testing monitor rate!

You should be; it's a good trick, and at least it uses external hardware, which makes it more trustworthy than trying to measure what the system is doing from within the system (e.g. 'it can be confirmed with apps that measure your refresh rate'). Usually that's fine, but a 60Hz refresh still doesn't mean that whatever program is rendering onto the screen actually gets 60 frames per second out of the display without dropping any. That's okay for most everyday usage, but e.g. for neuroscience research on the visual system it is hardly sufficient. You really want to be 100% sure that whatever you tell a program to display is shown correctly, frame by frame, and the only way to verify that is to use external, continuously recording hardware (a photodiode or a high-speed camera).
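To illustrate that last point, the offline analysis of such an external recording can be as simple as this hypothetical TypeScript sketch, for a stimulus that alternates full black and full white every frame (the sample format, threshold, and 1.5x tolerance are assumptions, not any particular lab's setup):

    // Given a photodiode trace recorded while the screen alternated black/white
    // each frame, find the frame transitions and flag gaps that are much longer
    // than one refresh interval (i.e. frames the display never showed).
    interface Sample { t: number; v: number }   // t in ms, v = measured brightness

    function findDroppedFrames(samples: Sample[], refreshHz = 60): number[] {
      const period = 1000 / refreshHz;   // ~16.7ms per frame at 60Hz
      const values = samples.map(s => s.v);
      const threshold = (Math.min(...values) + Math.max(...values)) / 2;

      // Threshold crossings = frame transitions.
      const transitions: number[] = [];
      for (let i = 1; i < samples.length; i++) {
        const wasHigh = samples[i - 1].v >= threshold;
        const isHigh = samples[i].v >= threshold;
        if (wasHigh !== isHigh) transitions.push(samples[i].t);
      }

      // A gap much longer than one refresh means at least one missed update.
      const dropped: number[] = [];
      for (let i = 1; i < transitions.length; i++) {
        if (transitions[i] - transitions[i - 1] > 1.5 * period) {
          dropped.push(transitions[i - 1]);
        }
      }
      return dropped;   // timestamps (ms) after which the display missed a frame
    }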


"UNSUPPORTED: VSYNC is not available on the Linux platform".

What feature exactly is supported on Windows and OSX but not Linux that provides correct vsync?


Linux supports a variety of compositing managers (programs that receive pixels from the application and are responsible for actually making the video hardware draw them), and these each have various settings that control how they interact with blanking. I have mine set to sync to vblank, and so Chrome animations seem to sync to vblank too. Changing the settings on the fly, though, is implementation-dependent, and whatever implements the feature for this website probably doesn't want to bother.

It does work fine on ChromeOS, so it's a "generic Linux" thing, not some specific problem with the Linux kernel itself.
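For what it's worth, measuring from inside the browser usually just means timing requestAnimationFrame callbacks, something like the hypothetical TypeScript sketch below (presumably similar in spirit to what testufo.com does, though this is not its code). Whether those callbacks are actually locked to vblank is exactly the part that depends on the compositor:

    // Estimate the refresh rate by timing requestAnimationFrame callbacks and
    // taking the median interval; only meaningful if rAF is vsync-locked.
    function estimateRefreshRate(frames = 120): Promise<number> {
      return new Promise(resolve => {
        const stamps: number[] = [];
        function tick(now: number): void {
          stamps.push(now);
          if (stamps.length < frames) {
            requestAnimationFrame(tick);
            return;
          }
          const intervals = stamps.slice(1).map((t, i) => t - stamps[i]);
          intervals.sort((a, b) => a - b);
          const median = intervals[Math.floor(intervals.length / 2)];
          resolve(1000 / median);   // ~60 on a vsynced 60Hz display
        }
        requestAnimationFrame(tick);
      });
    }

    estimateRefreshRate().then(hz => console.log(`~${hz.toFixed(1)}Hz`));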


> Even when using SwitchResX to force the display out of HiDPI mode and into a non-scaled 1:1 5120x2880 resolution, SwitchResX continues to show 60Hz (I’d include a screenshot, but it looks identical to the previous one).

So they used Blur Busters multiple times with multiple display configurations and not a single time did they actually think to read any of the bright red instructions.

This is not what I've come to expect from Ars Technica.


Very bad UI design. Instead of "VALID" it should say "NOW TAKE A PHOTO".


Where are the bright red instructions? I'm interested in this, but I'm having trouble finding an explanation of how to use http://www.testufo.com, or of what it's measuring.


See this for the specific test used in the article: http://www.testufo.com/#test=frameskipping

The instructions are at the top.


Thanks!


Yes. The reason why laptops and all-in-ones get the panels with high resolutions is because the manufacturer gets to build the source, sink, and transport layer, and isn't stuck with what a standards committee can make for them. (Details here: https://news.ycombinator.com/item?id=8549629)

But hey, at least the standards committees are working on new forms of DRM so the NSA can't tap your video cable and see your screen. Or something.


Lost me at the end, but good point in general, I think?


> But hey, at least the standards committees are working on new forms of DRM so the NSA can't tap your video cable and see your screen. Or something.

I think he was referring to the fact that the industry fails to provide end users with good, ubiquitous, easy-to-use cryptography for privacy, while on the other hand video standards have repeatedly been delayed by the industry mandating ubiquitous cryptography for digital rights management.

e.g. one of the (likely many) topics being discussed in the committees designing the next generation of DVI/HDMI/DisplayPort standards is how to encrypt the video signal so that a customer is barred from recording a movie. Which delays the availability of future standards describing how to drive your 5K monitor at 60Hz over then-improved HDMI/DisplayPort connections.


I think he's referring to HDCP, which is a standard meant to prevent tapping into an HD video output for the purposes of recording.

It's of pretty dubious effectiveness, and quite honestly has screwed me in the past more than a few times even when I was doing something 100% legitimate (like renting a movie on iTunes and trying to use my MacBook's HDMI out to the TV... and it refusing to play).

Not to mention there are many, many more ways to record HD content from a source than tapping the video output.


Well, it's certainly of no effectiveness whatsoever: ever since the HDCP master key leaked, anyone can trivially decrypt an HDCP stream. The NeTV (http://www.kosagi.com/w/index.php?title=NeTV_Main_Page) does this in realtime.

That doesn't mean they have stopped preventing people from watching their just-bought content on a projector or display without HDCP support. Oh no, that stuff continues right now.

In the history of DRM, this is probably one of the most bizarre failures. You can reasonably assume it never stopped anyone from making illegitimate copies (capturing very high-bandwidth interfaces like HDMI is decidedly non-trivial and simply not worth the time investment when there are much easier sources), while it denies people who just seconds ago shelled out cash for your product access to it, for a reason they will not understand and will certainly not appreciate.


> The NeTV does this in realtime.

No, the NeTV does NOT do any kind of decryption. What it does do is encrypt its own image using the same key in parallel, so that it can overlay its own display on top of the incoming display stream.

It does include an implementation of HDCP that could be used for decryption if you work hard enough at it (and I would guess someone has worked hard enough at it), but as it comes out of the box, the NeTV cannot be used to strip off HDCP.


Is there anything that does decrypt HDCP in real time? Something where you plug an encrypted cable into one end and it outputs an unencrypted source on the other? One that maintains things like audio?


Sure: all SiI9187 based HDMI splitters do, e.g. amazon.com/dp/B005HXFARS


The amazon pages say they don't decrypt anymore.

That's a fragile attack anyway, as I imagine the key they're using would be revoked. The only attack that works is one that uses the master key to generate keys that haven't been revoked, which I doubt their hardware does. (Illegal and all that. Just being an HDCP endpoint is easy; take the chips from your TV product and stick them in the splitter. "Oops, sorry." But then you get revoked.)


My complaint is that the standards committees are wasting their time on a replacement for HDCP: something nobody wants or needs. Meanwhile, where's my 120Hz 4K monitor?


Don't worry!! Spread spectrum interference patterns generated with MASERs allow the NSA to reflect back what you see through your own eyes... probably from that white van parked out front.


The test used:

http://www.testufo.com/

Created by the person running Blur Busters, who is doing really neat stuff.

http://www.blurbusters.com/

The forums have some fascinating discussions:

http://forums.blurbusters.com/

I think a lot of other HN'ers would like it.


I don't know why this is news. Did anyone really think Apple would release a display at 30Hz? I very much doubt it. The real question is what PWM frequency the backlight runs at! (If any.)


Most 4K screens on the market have limited refresh rates compared to lower-resolution ones.

That's why it's news.


Right, but those are standalone screens limited by available video cables/ports and their capabilities. This is integrated without concern for any of that.


Only over HDMI. DP works at 4K 60Hz.


Which 4K monitor that is actually a monitor, rather than a TV, runs at less than 60Hz?


You're mistaken. There probably isn't a single 4K monitor that doesn't support 60Hz. Unfortunately some people buy 4K TVs like the infamous Seiki to use as monitors.


The one I've seen talked about the most is the Dell 28 Ultra HD 4K (1), and it is indeed 30Hz.

1 - http://accessories.us.dell.com/sna/productdetail.aspx?c=us&c....


Wow, you're right. That's a terrible monitor, with a TN panel to match. From a reputable(?) brand, no less. Anyway, I think my original point still stands. Apple is no Dell.


So that's a budget 4K monitor. What did you expect for just over $400?

Dell makes plenty of high end monitors, they just don't give them away at loss-generating prices.

The 3008WFP-HC (not 4K) is a 30", 2560x1600 monitor, 30 bit, S-IPS panel.

I'm not sure why anyone is surprised that a $400 4K monitor is not running at 60Hz. It's not like Apple would offer that? You're right. But their solution would also not be anywhere near the same.

Hell, they haven't even updated the Thunderbolt Display. Even today, it requires a MagSafe to MagSafe 2 adapter to work with ANY MacBook currently on sale (seriously, Apple? You can't replace the connector with MagSafe 2?), and it doesn't support USB 3.


Recent Thunderbolt Displays at least ship with the adaptor in the box. If you're going to be compatible with both, and you've already engineered a MagSafe 1 connector and a MagSafe 2 adaptor, that's a reasonable compromise.


Eh. If you're manufacturing units now (which they are - my TB Display has a March 2014 date of manufacture), perhaps the "reasonable compromise" should be to ship with the connector that has been standard for over two years now and an adapter for the old standard.


And his point still stands. This is news.


The Seiki, a.k.a. the first cheap, readily available 4K TV, only does 4K at 30Hz due to the limitations of HDMI:

http://www.amazon.com/Seiki-SE39UY04-39-Inch-Ultra-Discontin...


Ah, you're right. I was thinking about the 120Hz vs. 60Hz issue.


We got several of these at work and the screens are just phenomenal. I want one at home too.


It's really quite incredible. I also just got a new - plain, boring - Dell monitor, and the backlighting (or whatever it is) really makes the difference between it and older, dimmer screens.


That frame-rate analysis trick of taking a photo with a phone camera is just amazing!


Nice to see an independent review to back this up! Keep meaning to nip down to the Apple store and have a look, but I'm a little concerned I'll be coming home with one!

But until we have DP 1.3 and a matching external 5K Thunderbolt display, this really isn't for me. The author's comment regarding the previous Thunderbolt display ("However, the Retina display does make things on the other 2560x1440 displays look… a little grody.") really puts me off.


> Keep meaning to nip down to the Apple store

If you're due for an upgrade anyway, then I'd skip the Apple store. Instead, I'd simply order it and get that "OMG" moment when you unpack it :-)


How has burn-in been the past 2 years? I've got a two-year-old MacBook and I can read my mail after I log out.


That’s image retention, not burn in. It’s not permanent, though still very annoying.

Do you have the 2012 15" Retina MacBook Pro? Some of those (those with LG screens) were suffering horribly from image retention. I got the screen exchanged twice on mine. Now I luckily have the Samsung screen.

There are currently no known cases of image retention with this iMac. It looks as though they got the issue under control and this was really a first-gen (large screen) retina tech problem. (Retina iPads were, or maybe still are, suffering from image retention too, though strangely, with my iPad 3 I care much less about it, for whatever reason. It's also a bit milder.)


If you get the Samsung screen replacement, you won't get any image retention. There is a trade-off, though: the Samsung screens have yellower whites and less contrast (I remember black being as black as the bezel with the LG screen). I wish I could've won the lottery and gotten one of those LGs without any retention.


I took my laptop in to the Apple store and they replaced it without question due to image retention.


Image persistence =/= burn in


Uses SwitchResX to get the iMac to actually display in 5K?


No, they used it to turn HiDPI mode off, so that each physical pixel is addressed as a logical pixel, instead of the normal 4-physical-to-1-logical mapping.
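The same mapping is visible from a web page via devicePixelRatio; here is a small, hypothetical TypeScript illustration (the numbers assume the default "looks like 2560x1440" HiDPI mode):

    const dpr = window.devicePixelRatio;   // 2 in the default HiDPI mode
    console.log(`logical  : ${screen.width} x ${screen.height}`);              // 2560 x 1440
    console.log(`physical : ${screen.width * dpr} x ${screen.height * dpr}`);  // 5120 x 2880

    // To draw crisply at native resolution, a canvas sizes its backing store by dpr
    // while keeping its CSS size in logical pixels:
    const c = document.createElement("canvas");
    c.style.width = "400px";
    c.style.height = "300px";
    c.width = 400 * dpr;
    c.height = 300 * dpr;
    c.getContext("2d")!.scale(dpr, dpr);   // continue drawing in logical units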


Some people might read that and misunderstand. It does display in 5K whenever possible, but it appears to some software as a half-resolution screen for compatibility purposes. You are still getting the benefit of the higher resolution, though, unless you're viewing bitmaps that are lower resolution, or using badly written or non-native software, for example.

On that last point, how do things like Xwindows or Qt apps work out? Qt should be ok because it tends to use native widgets, as I understand it, but I imagine it depends on how the app is written.

I noticed an odd quirk when I got my first retina iPad. Some iPhone apps did render some content at full resolution. I first noticed with iPhone camera apps. The app interface would render at iPhone resolution, but the photos were at full native display resolution. Has anyone noticed any quirks like that with the 5K iMac?



