I'm very glad to see arstechnica using the blurbusters tool for testing monitor rate!
It was initially a Win32 exe that I created 5 years ago [1], with a simple idea: display very distinct frames, back when people were using all kinds of sillier, less accurate methods (like timers) to do these things. Blurbusters did a great job of making the tool more accessible to more people (visit a website vs. download and run an unknown exe). They were even respectful enough to ask for my permission before adding it to their website (which I was of course happy to give; not that I would have minded much if they hadn't asked, it's a very simple idea after all).
I don't quite understand how it works. My guess is that it doesn't actually perform any test itself; you're the one deciding whether the refresh rate is actually 60Hz. The big green "VALID" doesn't say your monitor is at 60Hz, but that the page is drawing to the frame buffer at 60Hz, and you can then observe the motion to decide for yourself whether it is smooth enough that the refresh really is running at 60Hz?
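If that guess is right, the page can at least verify its own side of the bargain: time its animation frames and check that the browser is being driven at 60Hz, leaving the visual smoothness judgment to you. A rough sketch of that timing logic, in Python for illustration (this is my own guess at the mechanism, not TestUFO's actual code):

```python
# Estimate the refresh rate from per-frame timestamps, the way a page could
# from the timestamps requestAnimationFrame delivers. Illustration only.
from statistics import median

def estimate_refresh_hz(frame_timestamps_ms):
    """Estimate the display refresh rate from consecutive frame timestamps."""
    deltas = [b - a for a, b in zip(frame_timestamps_ms, frame_timestamps_ms[1:])]
    return 1000.0 / median(deltas)  # median is robust to a few hiccups

# Simulated timestamps for a 60 Hz display (one frame every ~16.67 ms)
timestamps = [i * 1000.0 / 60 for i in range(120)]
print(round(estimate_refresh_hz(timestamps)))  # 60
```

The median makes the estimate tolerate the occasional stutter; a sustained deviation from ~16.7 ms per frame would be what fails the "VALID" check.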
> I'm very glad to see arstechnica using the blurbusters tool for testing monitor rate!
You should be, it's a good trick and at least uses external hardware which makes it more trustworthy than trying to measure what the system is doing from within the system. (e.g. 'it can be confirmed with apps that measure your refresh rate')
Usually that's fine, but a 60Hz refresh still doesn't mean that whatever program is rendering to the screen can actually get 60 frames per second onto the panel without dropping any of them. That's OK for most everyday usage, but for, say, neuroscience research on the visual system it is hardly sufficient. There you really want to be 100% sure that whatever you tell a program to display, it displays correctly, frame by frame. And the only way to verify that is with external, continuously recording hardware (a photodiode or a high-speed camera).
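The analysis of such a recording is straightforward: look at the gaps between measured frame onsets and flag any gap too long for a single refresh. A hedged sketch (the function name, input format, and 1.5x threshold are all my own choices, not from any particular lab's pipeline):

```python
# Given timestamps (in ms) of frame onsets recorded by external hardware
# (e.g. a photodiode watching a flashing patch), flag any gap spanning more
# than ~1.5 nominal frame periods as a dropped frame.
def dropped_frames(onsets_ms, refresh_hz=60):
    period = 1000.0 / refresh_hz
    drops = []
    for a, b in zip(onsets_ms, onsets_ms[1:]):
        if (b - a) > 1.5 * period:
            drops.append((a, b))  # gap too long for a single refresh
    return drops

# One frame missing between 33.3 ms and 66.7 ms:
onsets = [0.0, 16.7, 33.3, 66.7, 83.3]
print(len(dropped_frames(onsets)))  # 1
```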
Linux supports a variety of compositing managers (programs that receive pixels from the application and are responsible for actually making the video hardware draw them), and these each have various settings that control how they interact with blanking. I have mine set to sync to vblank and so Chrome animations seem to sync to vblank too. Changing the settings on the fly, though, is implementation dependent, and whatever implements the feature for this website probably doesn't want to bother.
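As one concrete example (assuming the picom compositor; other compositors and older forks like compton spell this differently), syncing drawing to vblank is a single line in its configuration file:

```
# ~/.config/picom.conf -- ask the compositor to sync drawing to vblank
vsync = true;
```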
It does work fine on ChromeOS, so it's a "generic Linux" thing, not some specific problem with the Linux kernel itself.
> Even when using SwitchResX to force the display out of HiDPI mode and into a non-scaled 1:1 5120x2880 resolution, SwitchResX continues to show 60Hz (I’d include a screenshot, but it looks identical to the previous one).
So they used Blur Busters multiple times with multiple display configurations and not a single time did they actually think to read any of the bright red instructions.
This is not what I've come to expect from Ars Technica.
Where are the bright red instructions? I'm interested in this, but I'm having trouble finding an explanation of how to use http://www.testufo.com, or of what it's measuring.
Yes. The reason why laptops and all-in-ones get the panels with high resolutions is because the manufacturer gets to build the source, sink, and transport layer, and isn't stuck with what a standards committee can make for them. (Details here: https://news.ycombinator.com/item?id=8549629)
But hey, at least the standards committees are working on new forms of DRM so the NSA can't tap your video cable and see your screen. Or something.
> But hey, at least the standards committees are working on new forms of DRM so the NSA can't tap your video cable and see your screen. Or something.
I think he was referring to the fact that industry fails to provide end-users with good, ubiquitous and easy to use cryptography for privacy, while on the other hand video standards have repeatedly been delayed through industry mandating ubiquitous cryptography for digital rights management.
e.g. one of the (likely many) topics being discussed in the committees designing the next generation of DVI/HDMI/DisplayPort standards is how to encrypt the video signal so that a customer is barred from recording a movie. Which delays the availability of future standards describing how to drive your 5K monitor at 60 Hz over then-improved HDMI/DisplayPort connections.
I think he's referring to HDCP, which is a standard meant to prevent tapping into an HD video output for the purposes of recording.
It's of pretty dubious effectiveness, and quite honestly has screwed me in the past more than a few times even when I was doing something 100% legitimate (like renting a movie on iTunes and trying to use my MacBook's HDMI out to the TV... and it refusing to play).
Not to mention there are many, many more ways to record HD content from a source than tapping the video output.
Well, it's certainly of no effectiveness whatsoever: since the HDCP master key leaked, anyone has been able to trivially decrypt an HDCP stream. The NeTV (http://www.kosagi.com/w/index.php?title=NeTV_Main_Page) does this in realtime.
That doesn't mean they have stopped preventing people from watching content they just bought on a projector or display without HDCP support. Oh no, that stuff continues right now.
In the history of DRM, this is probably one of the most bizarre failures. You can reasonably assume it never stopped anyone from making illegitimate copies (capturing a very high bandwidth interface like HDMI is decidedly non-trivial and simply not worth the time investment when there are much easier sources), while denying people who just seconds ago shelled out cash for your product access, for a reason they will not understand and will certainly not appreciate.
No, the NeTV does NOT do any kind of decryption. What it does do is encrypt its own image using the same key in parallel, so that it can overlay its own display on top of the incoming display stream.
It does include an implementation of HDCP that could be used for decryption if you work hard enough at it (and I would guess someone has worked hard enough at it), but as it comes out of the box, the NeTV cannot be used to strip off HDCP.
Is there anything that does decrypt HDCP in real time? Something where you plug an encrypted cable into one end and it outputs an unencrypted source on the other? One that maintains things like audio?
That's a fragile attack anyway, as I imagine the key they're using would be revoked. The only attack that works is one that uses the master key to generate keys that haven't been revoked, which I doubt their hardware does. (Illegal and all that. Just being an HDCP endpoint is easy; take the chips from your TV product and stick them in the splitter. "Oops, sorry." But then you get revoked.)
My complaint is that the standards committees are wasting their time on a replacement for HDCP: something nobody wants or needs. Meanwhile, where's my 120Hz 4K monitor?
Don't worry!! Spread spectrum interference patterns generated with MASERs allow the NSA to reflect back what you see through your own eyes... probably from that white van parked out front.
I don't know why this is news. Did anyone really think Apple would release a display at 30Hz? I very much doubt it. The real question is what PWM frequency the backlight runs at! (If any.)
Right, but those are standalone screens limited by available video cables/ports and their capabilities. This is integrated without concern for any of that.
You're mistaken. There probably isn't a single 4K monitor that doesn't support 60Hz. Unfortunately some people buy 4K TVs like the infamous Seiki to use as monitors.
Wow, you're right. That's a terrible monitor, with a TN panel to match. From a reputable(?) brand, no less. Anyway, I think my original point still stands. Apple is no Dell.
So that's a budget 4K monitor. What did you expect for just over $400?
Dell makes plenty of high end monitors, they just don't give them away at loss-generating prices.
The 3008WFP-HC (not 4K) is a 30", 2560x1600 monitor, 30 bit, S-IPS panel.
I'm not sure why anyone is surprised that a $400 4K monitor is not running at 60Hz. It's not like Apple would offer that? You're right. But their solution would also not be anywhere near the same.
Hell, they haven't even updated the Thunderbolt Display. Even today, it requires a MagSafe to MagSafe 2 adapter to work with ANY MacBook currently on sale (seriously, Apple? You can't replace the connector with MagSafe 2?) and doesn't support USB 3.
Recent Thunderbolt Displays at least ship with the adapter in the box. If you're going to be compatible with both, and you've already engineered a MagSafe 1 connector and a MagSafe 2 adapter, that's a reasonable compromise.
Eh. If you're manufacturing units now (which they are: my TB Display has a March 2014 date of manufacture), perhaps the "reasonable compromise" should be to ship with the connector that has been standard for over two years now, and an adapter for the old standard.
It's really quite incredible. I also just got a new - plain, boring - Dell monitor, and the background lighting (or whatever it is) really makes the difference between that and older, dimmer screens.
Nice to see an independent review to back this up! I keep meaning to nip down to the Apple store and have a look, but I'm a little concerned I'll be coming home with one!
But until we have DP 1.3 and a matching external 5K Thunderbolt display, this really isn't for me. The author's comments regarding the previous Thunderbolt Display ("However, the Retina display does make things on the other 2560x1440 displays look… a little grody.") really do put me off.
That’s image retention, not burn in. It’s not permanent, though still very annoying.
Do you have the 2012 15" Retina MacBook Pro? Some of those (those with LG screens) were suffering horribly from image retention. I got the screen exchanged twice on mine. Now I luckily have the Samsung screen.
There are currently no known cases of image retention with this iMac. It looks as though they got the issue under control and this was really a first gen (large screen) retina tech problem. (Retina iPads were or maybe still are suffering from image retention, too, though, strangely, with my iPad 3 I care much less about that, for whatever reason. It’s also a bit more mild.)
If you get the Samsung screen replacement, you won't get any image retention. There is a trade-off, though: the Samsung screens have yellower whites and lower contrast (I remember black being as black as the bezel with the LG screen). I wish I could've won the lottery and gotten one of those LGs without any retention.
No, they used it to turn HiDPI mode off, so that each physical pixel is addressed as a logical pixel, instead of the normal 4-physical-to-1-logical mapping.
Some people might read that and misunderstand. It does display in 5K whenever possible, but appears to be a half-resolution screen to some software for compatibility purposes. You still get the benefit of the higher resolution, unless you're viewing bitmaps that are lower resolution, or using badly written or non-native software, for example.
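The arithmetic behind that 4-to-1 mapping is simple enough to sketch (my own illustration of the general HiDPI scheme, not Apple's code):

```python
# With a 2x backing scale factor, one logical point covers a 2x2 block of
# physical pixels, so a 2560x1440 logical desktop is backed by the full
# 5120x2880 panel.
SCALE = 2  # Retina "@2x" backing scale factor

def logical_to_physical(x, y, scale=SCALE):
    """Top-left physical pixel of the block backing logical point (x, y)."""
    return (x * scale, y * scale)

print(logical_to_physical(2559, 1439))  # (5118, 2878) -- last logical point
print((2560 * SCALE, 1440 * SCALE))     # (5120, 2880) -- full panel resolution
```

Native apps draw text and vectors at the full physical resolution; only software that hands the system pre-rendered logical-resolution bitmaps loses the benefit.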
On that last point, how do things like Xwindows or Qt apps work out? Qt should be ok because it tends to use native widgets, as I understand it, but I imagine it depends on how the app is written.
I noticed an odd quirk when I got my first retina iPad. Some iPhone apps did render some content at full resolution. I first noticed with iPhone camera apps. The app interface would render at iPhone resolution, but the photos were at full native display resolution. Has anyone noticed any quirks like that with the 5K iMac?
[1] http://hardforum.com/showthread.php?t=1423433