Now that Windows 8 and Mountain Lion both credibly handle HiDPI for the first time, I agree it's about time we switched. But laptop makers are hardly to be roundly chastised for not going super-density until now (though higher than 1366x was always workable), since the Windows experience of HiDPI was pretty broken: fonts would get clipped inside too-small bounding boxes, things wouldn't line up, window chrome would be too small, and so on.
Now all we need is for Linux to credibly support it as well, or at least a Linux built for mouse use. There are still many, many usability issues in GNOME with a high-PPI screen.
Also, let's not forget how far mobile GPUs have come in the last few years; it would have been impossible to push that many pixels with anything but the most minimal of 3D use cases.
It's a much more complex problem than Linus suggests when he talks of OEMs simply switching panels. Witness how relatively complicated Apple's solution is, and it came after years of supposedly "somewhat" supporting HiDPI. Use the HiDPI MacBook Pro at 1920x on the Ivy Bridge GPU and it's still noticeably laggy in some 3D operations.
Care to share why you think Windows 8 handles DPI differently than Windows 7?
At least in terms of native applications I've seen literally no differences; maybe Metro handles it better but Metro on the desktop is, well, Metro on the desktop.
PS - I know it is now called "Modern UI" but that name sucks.
I do mean Metro, mostly. Aside from engine improvements, Microsoft has been really beating the drum with 3rd-party developers to support high-PPI screens out of the gate. Booting the Retina MBP into Windows 8 works surprisingly well from the start; I have it installed with Boot Camp.
MS has been trying to get 3rd-party developers to make resolution-independent apps for a very long time now. Last time it was WPF that was supposed to solve all the problems; now it's Metro.
WPF and Metro are of course both XAML, so the same push from the WPF days applies. I've built apps in Silverlight and WPF that scaled just fine. You just need to make sure you start with a proportional design and stay that way, using a few key min and max height/width properties where necessary. Most desktop app devs are used to fixed-size GUI widgets, which is where the problems start.
Once you get into a proportional mindset, the hardest problems tend to be related to text and images. One trick I learned for the latter was using vector art as much as possible and converting SVG images to XAML (care of XamlExport[1]). Then scaling is no longer an issue. For raster art, I would hope Metro has something similar to iOS's Retina graphic-substitution magic. Silverlight 4 didn't, which was a bit of a drag.
Text was always a little trickier, but when dealing with full-screen apps I would usually just have some code that changed the text size based on the delta between the new and old window sizes.
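The idea is simple enough to sketch. This is not the original WPF/Silverlight code, just the arithmetic in Python with made-up names and clamp values for illustration:

```python
def rescale_font(old_size: float, old_width: int, new_width: int,
                 min_size: float = 10.0, max_size: float = 72.0) -> float:
    """Scale a font size by the ratio of new to old window width,
    clamped to a sensible range (the moral equivalent of setting
    min/max width and height properties on XAML elements)."""
    if old_width <= 0:
        return old_size  # nothing sensible to scale against
    scaled = old_size * (new_width / old_width)
    return max(min_size, min(max_size, scaled))

# A 16pt font in a 1280px-wide window doubles when the window doubles:
print(rescale_font(16.0, 1280, 2560))  # 32.0
```

In practice you would hook this into the window's resize event and round to whole point sizes, but the proportional scaling plus clamping is the whole trick.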
Vector art doesn't scale very well. Most icons are designed in Illustrator (as vector art) and then touched up in Photoshop for each target size (as pixel art), mainly because real-time rendering of vector art just isn't at professional quality yet. So the only solution, which has been adopted by iOS and Metro (and probably Android), is to include multiple icons for each DPI class.
In my experience with converting to XAML you don't see the same degradation, since your art is being actively redrawn. You can add gradient brushes and other embellishments that scale well. Perhaps I will dust off my Windows laptop and do a blog post about it.
That said, you mentioned that Metro will do image substitution a la iOS. That's fantastic, and something I sorely missed during my Silverlight dev days.
It seems more baked in this time, with Metro automatically engaging two different scaling targets (full HD and 2560x) as well as the baseline ~100 DPI. So people will be in HiDPI modes without ever enabling them, and a good number of them at that, since 10" full HD is looking like a high-volume form factor. All of the UI elements are vector, fonts obviously are, and there is built-in support for switching out bitmaps for the three stated targets, even in CSS from the internet. Visual Studio 2012 has these DPIs built into its UI builder's previewer along with the typical form-factor previews. I guess time will tell, but it certainly feels like the most credible attempt to date to me.
> Care to share why you think Windows 8 handles DPI differently than Windows 7?
I can't speak to the specifics of how rendering differs between 7 and 8, but the frameworks released with 8 have, or are moving toward, vector and resolution-independent rendering.
I used Win7 at 1920x1080 and had zero issues with the DPI handling. Oh yeah, except for Chrome, but I don't use it then. In Firefox I set devPixelsPerPx to 1.2. Other apps all work with the system DPI setting.
I didn't spend a whole lot of time looking, but the High-DPI settings I played with seemed to provide half-assed results: fonts and some components of the window were scaled appropriately, but other bits (spacing between icons comes to mind; a bunch of text was prematurely truncated or something) were just not quite there. It's a big improvement from previous versions of the OS, but the implementation seemed a bit off compared to iOS and what I've seen on Retina MBPs.
You turn the UI scaling up to 150% in Windows 7 or previous and don't experience visual and alignment issues? Or do you mean that you don't find the elements too small when used at the default size? There is no argument that the latter works fine; obviously 1680x isn't even particularly higher density than 1366x. For most people, though, once you get to say 11" 1920x or 15" 2880x you really need the scaling, and my experience is that it used to be pretty half-baked.
Look at this Skype screenshot at 125%, for example:
I've used DPI scaling in Windows 7 for years now, since I use Win7 as the OS on my HTPC. My television is the only surface large enough that, even with the higher resolution, I've needed larger text.
That screenshot you link to is of Skype, and I would not use Skype as a standard example for anything. Even if MS owns it now, it is a poorly designed program by today's standards.
The only UI problems I ever had with scaling was with iTunes, and that was at >150%. I write it off as Apple not optimizing the UI for Windows.
> That screenshot you link to is of Skype, and I would not use Skype as a standard example for anything. Even if MS owns it now, it is a poorly designed program by today's standards.
How does this in any way negate its worth as an example of the problem? It's effectively saying, "That problem? It's an invalid problem because I said so."
Because the cause of the problem is Skype's (and iTunes') poor development, not Windows.
I didn't state the problem is invalid, I merely provided anecdotal evidence from my years of experience in using DPI scaling in Windows to demonstrate that it's a non-issue.
> Because the cause of the problem is Skype's (and iTunes) poor development, not Windows.
Which is the key factor here. It's not Windows' fault specifically, but it's a problem that occurs when using (very common software on) Windows. To the end user, what's the difference?
If the problem is with third-party software, then end users have a decent chance of finding acceptable alternative software without the problem. The problem is also likely to be limited to some specific applications they use, rather than coming up generally.
If the problem is with Windows, finding acceptable alternative software is unlikely, if it is possible at all. Even worse, users are likely to see the problem affecting most of the applications they use rather than just some specific applications.
I'd say end users would notice and care about the difference between those two scenarios.
If the commonly-available applications for an OS are ugly, that OS is going to be perceived as ugly. And vice versa.
If Skype and iTunes -- which together are probably installed (one or the other) on a huge percentage of home systems, and a nontrivial number of corporate ones -- are ugly at higher resolutions, then there's a serious problem.
Sure, they shouldn't write shitty software, but it's now Microsoft's problem due to how it impacts the platform as a whole. At least in the case of Skype, one assumes they could fix it (since they now own the product); iTunes is ... probably more complicated.
I'm not saying there isn't a problem. I'm just saying the problem can be perceived differently even by people who don't understand what's going on under-the-covers.
If people generally spend most of their time in IE/Chrome/Firefox, Word, Excel, PowerPoint, and so on, and they all look great while Skype and iTunes look bad, I'd expect people to complain about Skype and iTunes.
If, on the other hand, people spend a lot of their time in applications that look bad I'd expect them to have a broader complaint (which may or may not be about Windows since, by assumption, we are talking about people who don't understand what's going on under-the-covers).
I think you're all missing a key part of the equation here. It was the OEM who configured the OS, the OEM who selected the panel, and the OEM who advertised the feature that presumably went into my buying decision. If my Windows+Skype laptop looks bad and my friend's otherwise identical Windows+Skype laptop from another brand looks fine, I'm blaming the vendor. Who, coincidentally enough, is the party that takes the economic hit if I return it. OEMs know this deep in their core, and this is one of the significant reasons you haven't seen them pushing devices that require scaling for normal eyesight.
Windows XP at 2560x1600 is OK on my 30" monitor, but it would be next to unusable on a 15" display. The icons didn't scale to the resolution, so they would have been tiny.
2560x1440 on my 27" iMac is OK, but I have to use Cmd-+ on almost all web pages. Only in Safari is Magic Trackpad zooming smooth and iOS-like, but I'd like to use Chrome.
NeXT had bitmap icons (TIFFs) displayed using display postscript. The problem with vector graphics in icons is that you'll still need multiple versions for different resolutions or the icons will look terrible at most resolutions, and then it's horribly inefficient to render them on-the-fly all the time so you'll end up caching them in memory and rescaling them... and -- oh look -- bitmaps.
Well, eventually it's going to come out as a bitmap—they'll still end up as pixels on the screen. Vector icons just give you the flexibility to do it really really well. Cache them, prerender them, I don't care what works best, just match the icon DPI to the screen DPI and we're good.
His point is that it's not good enough to simply match the DPIs and render. Small icons need actual visual differences, not just scaling, in order to really look good. Some icons completely change their character at small sizes, because what looks good when small is not the same as what looks good when large, regardless of DPI. So going pure-vector doesn't completely solve the problem, although it certainly can make it easier.
There is that, and there is also the question of how aliasing is handled by real-time vector-art renderers. We just aren't there yet to be relying on vector art for icons (even though they start out that way in Illustrator).
That's true at low-DPI, but not so much at high-DPI. I wonder if a bundle of low-DPI bitmap + a vector for high-DPI would work better than the standard practice of bundling multiple bitmaps...
I think it's still true, although less so. Certainly a certain amount of fiddling takes place in an attempt to align to pixels etc., but sometimes icons just need to change at different physical sizes. Small elements may need to be enlarged to remain visible and such.
Quartz 2D is effectively "Display PDF". Bitmap icons are still necessary (at least in the icon portfolio) as they can degrade more gracefully at lower pixel counts.
It seems like a chicken and the egg problem but it really isn't: there's far more support for HD res than there is HD hardware out there.
Go to any e-tailer and check the 1080p laptop options: out of 300 models you get 50 at best. There's very little choice too; AFAIK the Zenbook is the only 1080p ultrabook out there, and I could only find 2 AMD laptops with HD screens.
Like I said before I don't know what's crazier: that a 10" tablet has a 2560x1600 screen for $399 or that a 15" laptop for $800 might not have a 1080p screen.
The irony is that it's been years since I saw a sub-1080p monitor for sale, but XGA/720p laptops are still the norm.
It uses the DPI setting from X, and you can then adjust the font size to your preference with the gsettings option 'org.gnome.desktop.interface text-scaling-factor'.
I believe the idea is to set the DPI in X to the physical DPI of your monitor, and then use this setting to adjust the scaling of elements to your personal preference. That this setting is not exposed in the default config dialogs is a drawback, and an unfortunate result of GNOME's current philosophy. I think you can access it via gnome-tweak-tool, though.
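For reference, this is what it looks like from a terminal (1.25 is just an example factor, not a recommendation):

```shell
# Read the current text scaling factor (default is 1.0)
gsettings get org.gnome.desktop.interface text-scaling-factor

# Bump UI text up by 25%; takes effect immediately in GNOME
gsettings set org.gnome.desktop.interface text-scaling-factor 1.25
```

Note this only scales fonts, not fixed-pixel widgets, which is why it's a preference knob rather than true HiDPI support.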
> Now all we need is linux to credibly support it as well, or at least a linux built for mouse use. There are still many, many usability issues in gnome with a high ppi screen.
I've never had a problem using high DPI screens on Linux. There's a setting in xorg.conf, but I haven't had to touch it in years because the correct settings are usually auto-detected.
I don't use Gnome, though, so maybe it's causing problems?
The hiDPI thing should only be meant as a transition. What we ultimately want, is for pages to look the same on the web, and the UI to look the same on the apps, but just have higher PPI. The hiDPI thing just enlarges them because Windows 8 and Mac OS are not resolution independent. What I want is for everything to work on 2560x1600 by default, with the same sizes we have today - not enlarged, and not smaller either.
The HD 4000 actually runs surprisingly well at HiDPI resolutions. I haven't experienced any lag myself, and when needed the discrete Nvidia GPU kicks in automatically without a hitch.
Well, Sandy Bridge and Ivy Bridge integrated graphics are OK for most users (i.e. not for gamers and engineers, but for most people they're more than enough).
What usability problems do you see with DPI in Gnome? I can freely adjust the font-scaling across Gnome and... well for the past 2 weeks straight I can't remember having any issues.
I'm so damn happy people are starting to realize how awesome resolution is. I've been buying 1920x1200 15" Dell laptops for 10 years now, and never bought a Mac because they've always had terrible resolution. I run Linux anyway, but I'm going buy Mac hardware next, unless a PC maker creates a competitive display (which I assume they will).
For years, the things I have cared about in terms of computing have been, roughly:
1) Internet connectivity. Without it I'm mostly dead in the water.
2) Screen resolution. I want as much code/information on the screen as possible. I currently have a 1920x1200 Dell myself, and am a bit scared/saddened that I won't be able to replace it.
3) Memory. Always good.
After that come processor speed and storage space. I suspect my next machine will have SSD, because I've heard that it's such a dramatic improvement.
I completely agree with Linus though - I very much want a high resolution laptop screen.
Once you've gone SSD you'll be amazed you got by without it. It's the biggest single-part speed upgrade I've seen since the 90s. Even with just a small one for a system partition, and data kept on a traditional HDD, you'll see great gains.
(And yes, also agree, it's a shame that monitor and laptop resolution seemed to stall at 'HD' for ages)
Any laptop I buy for anyone, I always replace the drive with an SSD. Higher DPI does not matter much for a programmer, but more screen real estate, i.e. resolution, matters a lot.
Obviously if you're working with massive datasets or many, many VMs this may not be true, but everyone I know who has 16GB of RAM has claimed that it's more than enough. I've got 8GB and haven't had any complaints. I even know people with 32GB who say that, with hindsight, they'd have been perfectly happy with only 16GB.
An SSD however, is incredible. I originally wrote "essential", though I guess it's not... but it certainly feels like it is once you have one. The downside is that it will make working with non-SSD desktop machines irritating-as-hell.
RAM is so cheap today that I just maxed it out on my laptop. Granted, I've probably only seen 22GB of the 32GB used, when I had YaCy (a P2P search network written in Java) running, but I still expect the system to use the rest of the RAM for caches, which in theory should speed up access to the file system (on my SSD).
Hell, I had 2 GB of RAM before and it seemed OK; 8 GB now is more than I need. I had to send the SSD in for replacement, though, and that was easily the most frustrating week in recent memory.
FWIW Windows 7 Home Premium (64-bit) has a maximum memory limit of 16 GB.
Professional, Enterprise, and Ultimate can go up to 192 GB. Windows 8 is limited to 128 GB, but the Professional and Enterprise editions can do 512 GB.
SSD on my MacBook Air is the one thing that has made more of a difference than anything else.
I can live with only running four or five major apps at a time in the 4 Gigabytes that came with my 2010 MacBook Air - but the SSD changed my life. I'd put that on your list above memory.
After having bought an SSD a couple of years ago I can honestly say it is at the top of my requirement list (after internet connectivity, but it's basically impossible to buy a computer that can't connect to the internet) for all my computers, easily above memory and screen resolution.
One of the reasons that I finally jumped over to the Mac was the fact that it was one of the few laptops that provided 1680x1050 at around 15".
My previous 2 Windows laptops had been that resolution, but when I came to upgrade about 2 or 3 years ago, it seemed to have been abandoned for 15" laptops by pretty much everyone apart from Apple. Most had gone to 1440 or even less. A handful of pretty expensive (at least Apple-priced), and usually quite bulky, models offered 1920x1200, but given that scaling didn't look too great on them at the time, I thought that might be just a bit too small for me. If I was going to be paying about the same price for a Windows PC as for a Mac, and not get the resolution I wanted anyway on Windows, I figured I'd bite the bullet and see what all of the Mac fuss was about.
I am a little surprised by this - both Sony and Dell have had credible 1920x1080 options at 15" for several years (admittedly I prefer x1200, which has also been available) - both have also had 1920 13" options for at least a year.
Like I say, the 1920 seemed too small to me, given the not particularly great scaling in Windows at the time if I did want to use other resolutions, and the ones that I did find were usually around Apple prices anyway.
I was recently asked for laptop-buying advice. I told them that if they're going to use it more than casually, they should prioritise the screen over everything else: minimum 1080p IPS. I was shocked that there were fewer than 5 models (that you can buy new) out there that fit the bill. Such a sad state; I expected that to be the starting-off point.
1680 by 1050, which is the best you could get on a 15" Mac previously, isn't exactly "terrible resolution" (by historical standards anyway). But otherwise agreed. My main machine is a 15" Retina, and I don't think I could ever go back to a lower-PPI laptop.
I bought the first-generation MacBook Pro, a 17", for that reason. Other resolutions were just too small for what I was doing, and 1680 seemed like just enough. Eventually, when they started offering the resolution upgrades on 15" models, I upgraded to my current machine. The way I see it, there's no way I can be disappointed by the resolution of my next machine. No more pains over spending $100-300 for just a small bump to make things usable. Of course, the downside is that I can't let myself even try out the Retina MacBook in-store, for fear that I won't want to look back; I'll wait until I can afford it.
Last time I went shopping for a machine, 1680x1050 was my target resolution for a 17" screen with dual hard drives. I'm still using it four years later (it has had an HD upgrade to 2x 500GB 7200RPM drives). It happens to have a couple of GeForce 8600M GPUs in there, so it's just about usable for gaming too.
After I dropped it (the laptop bag strap broke) and smashed the corner, I looked around for a replacement (the MacBook Pro was tempting, but oh, the $$$) and couldn't find one for sensible money. So I bought some new plastics, redid the thermal paste on the CPU (it had started thermal-throttling during long compiles), and bought a second-hand screen off eBay (plus a new 4-port USB PCB that got damaged in the fall) to replace the broken clips that hold it together when it's closed. So I also have a spare screen now.
I'm dreading replacing it but I reckon I'm probably still good for a year or two yet.
20 years ago notebooks had 640x400 or 640x480 screens...
If you include the 15" retina MBP then that's 17x more pixels in 20 years. And that includes greyscale to 8bit color to 16 bit color to 24/32 bit color, passive-matrix to active-matrix, and dozens of other panel technology improvements not directly related to resolution.
The XGA standard (1024x768) was introduced in the early 90s. Maybe you missed the earlier discussion, but we've all been lamenting the crappy default resolutions since then. We were not discussing bit depth or black-and-white vs. color.
The problem is there are just as many bad IPS LCDs out there as there are bad TN LCDs. The main problems seem to be uneven backlighting (especially on LED-backlit LCDs) and really bad anti-glare coatings that make grey colors sparkle. I just returned an Asus IPS LCD because of the latter.
There's also the problematic panel used in the Sony S15 and others, which is 15" 1920x1080 IPS but has poor colour accuracy that makes reds look orange.
Dell has had a number of them over the years, one I remember was the Latitude D820 (15.4", with a 1920x1200 option) back in 2006 but I know they had such offers before as a colleague had one in 2005.
I'm not sure they offer WXUGA screens anymore though, it seems everyone has nerfed back to "full HD" (which offers 120 vertical pixels less) these days, which is bullshit. Since 2010 (when this trend started), the interwebs have been full of people trying to find out manufacturers who still provided WXUGA screens.
You mean WUXGA.
I also detest the shrinking of vertical pixels and the manufacturers pushing it on consumers. Even on desktops, non-widescreen monitors (more vertical space) seem to be priced excessively, to drive people to widescreens.
>> Since 2010 (when this trend started), the interwebs have been full of people trying to find out manufacturers who still provided WXUGA screens.
Exactly, it's like laptop vertical resolution in particular has gone in the wrong direction the past couple of years. What happened? Trying to view web sites on one of these resolution challenged laptops means a lot of scrolling up and down, feels like the 90s all over again. Glad I'm not the only one and glad Linux raised the issue.
> Exactly, it's like laptop vertical resolution in particular has gone in the wrong direction the past couple of years.
Indeed it has; formerly 1920x1200 screens were dialed back to HD at best (so 120 vertical pixels lost), and where the standard laptop resolutions used to be WXGA (1280x800), WXGA+ (1440x900), and WSXGA+ (1680x1050), all of that has been rolled back to the completely crazy 1366x768 garbage. TV's "HD" has been a blasted plague on computer displays.
Agreed. Trying to find a decent high-resolution screen is a real pain today. Dial back 5 years, and it wasn't such an issue! Everyone makes "HD" up until 24"... 24" HD is nasty.
I would hope for something approaching 3000 x 1500 in a 24" screen :-(
I have the D830 with the WUXGA screen, and even though it's an old 1.8GHz Core 2 Duo, I still refuse to buy a new laptop because I can't find anything with a 16:10 ratio; everything is 16:9, and it's difficult to find even 1920x1080 screens at a reasonable price.
My Toshiba Portege ultralight back around 2001 had 1920x1200 on a screen about 12". After that I got 15.6" wuxga Dell Latitudes and now a Thinkpad W500 from a Lenovo outlet because their current offerings dropped down to short-screen rez.
The 14.1 inch Thinkpad T21 released in 2000 had a resolution of 1400x1050. Not as good as you were asking, but an awful lot better than nearly all 14 inch laptops you've been able to get for the last 5 years.
One of his comments far down the page just struck me:
"It's ignorant people like you that hold back the rest of the world. Please just disconnect yourself, move to Pennsylvania, and become Amish.
The world does not need another ignorant web developer that makes some fixed-pixel designs. But maybe you'd be a wonder at woodcarving or churning the butter?"
The 13" MacBook Pro is now 2560x1600. Give it 18 months and the entire Apple line will be a minimum of 2560x1600.
That's fine if you're willing to drop $1000 on a laptop; I don't expect we'll see $400 laptops at 2560x1600 for several years. Tablets have the advantage of a free operating system, lower computing requirements, smaller physical screens, and, in the case of Amazon/Google, a willingness to subsidize the hardware in order to capture downstream content/search revenue.
While I have sympathy with your point, I haven't seen evidence that Google is subsidizing (as I think Amazon is) rather than just selling at cost. The reason this is an important distinction is that it highlights how much the other handset makers, especially Apple, have been gouging their customers.
Google's disconnect from the hardware also shows that you actually can have performance on older hardware, if you don't have an interest in the upgrade cycle: apparently the new real-time Google voice search runs on older iOS devices that Apple said couldn't handle Siri.
I wince a little at calling it price gouging. It's true that over the long term prices should be near the marginal cost of production, but these are short-term products; they need to cover their R&D costs, and the risk that comes with them needs to be covered too. I feel a little milked when I buy a Mac, and kind of annoyed because I don't feel like I have an alternative (this is obviously an illusion; I don't like Windows or Linux personally, but they are both very reasonable alternatives). But Macs are a much older product line and they are still getting better (but not cheaper, aargh!) every year.
On tablets and phones, these high profits have been driving wonderful innovation, bringing us better devices at lower prices every year. It's not like we're getting overcharged for stagnant products; this December's devices would have been incredible last year.
Price gouging implies overcharging on essential items because the consumer has no choice. The only power Apple have over their consumers is that they really really want the stuff Apple makes, right now.
Any time Google (or any other major corporation) sells hardware at under a 20% margin, they are subsidizing those sales with profit from other divisions, or with the future potential of profit. It's important to note that you need to cover R&D, G&A, marketing, etc. beyond that marginal cost. The exception would be situations where you could make up the lost margin in huge volume (commodity sales, wholesaling, retailing other people's product). Google's average margin from the other units of its business is 30%+, so it would make sense for them to invest in those business lines, unless there are strategic reasons to focus on the tablet market, which, I'm sure, we all recognize there are.
To put it more clearly - Google is not selling tablets to make a profit on hardware. They are selling tablets so they can profit from search/advertising through those tablets.
Re: Google voice search on older iOS equipment. I installed it on my iPhone 4 today, and it runs significantly faster than Siri on an iPhone 4S. Search results are instant as well. I completely agree with you that this is an example of where the disconnect from hardware greatly benefits the consumer.
"Tablets have the advantage of free operating system"
Which tablets do you mean? Certainly not those running Android, iOS or Windows.
In the case of Android, manufacturers pay Microsoft for the privilege. Android users pay with their personal information and by looking at ads. iOS isn't free either, development of the OS is included in the hardware price. And Windows RT is certainly not free, manufacturers license it from Microsoft.
The tablet that Linus is referring to, the Nexus 10, has no marginal license fee per device. Apple, likewise, doesn't pay marginal license fees, nor does Microsoft.
Laptop makers end up paying fees per device, typically to Microsoft. OEM fees are around $50 for Windows XP/Windows 7.
There are quite a few free apps that use ads as a way of supporting themselves but the OS has no ads itself. And you can almost always purchase premium versions of the apps that remove the ads as well.
Not to mention that with Android there is the possibility of using something like AdAway.
Using Google products with the idea that you can avoid giving them personal information is not wise. I was also considering the Nexus, but why bother fighting against the product's design?
Yep, Ads in Google search, ads in Google Maps. Honestly, I consider Google Play to be full of ads now as well since I only ever open it for apps and it shows a bunch of music, movies, and hardware I don't want instead until I drill down/search.
Someone else in the thread is arguing you can avoid these things if you try and install alternatives, but it is still true that Google makes money after the sale, and that they pay Mozilla a ton of money to be the default search on that platform, so it is clearly very valuable to them.
Very few devices run stock Android, the Nexus series being the exception. I was referring to one of the most popular Android tablets, Amazon's Kindle Fire.
However, since Android comes with Flash, you're likely to see more ads than on other platforms, even if you use a Nexus device.
Weird; we have a Kindle Fire and it doesn't show us any special ads, and I've never noticed Flash ads being especially present on the Galaxy Tab. It sounds to me like you're trying to talk with authority about things you haven't used and don't really have much perspective on.
You've never noticed Flash ads on a Galaxy Tab? Unless you turned off Flash, use a pop up blocker, or if you use a text browser like Lynx, I seriously doubt that: they're pretty hard to miss.
Nope, you've forgotten about the original Kindle Fire, you know, the one that's been on sale for a year and not a month or two. No ad-subsidy program there.
Look, you obviously don't use an Android tablet, so please stop telling people who do use them what their experience is like. We understand that you find your Kool-Aid delicious; let's move on.
The Silk browser used on the Fire processes browsed pages on Amazon's servers. In a perfect world, Amazon would store your browsing activity and manipulate the content it passed along to you for its own profit. But fortunately, they have placed a Santa clause in the user agreement which prohibits them from doing so.
Android has not come with Flash since Jelly Bean; Adobe long since stopped supporting it, and it's absolutely not available on the Nexus 7 unless you sideload some legacy APK.
Jelly Bean is the latest version of Android. Only 1.8% of Android devices run Jelly Bean. Most new Android devices being sold today do not run Jelly Bean and will very likely come with Flash.
The Nexus One, Nexus S and Galaxy Nexus all come with Flash.
And yes, Adobe made their intentions known a year ago, but that's not quite the same as ending support, especially since many new Android devices will still ship with Flash.
Yes, but the Galaxy Nexus both preceded a working version of Flash for ICS and never thereafter shipped with it anyway (it had to be manually installed from the Store).
Except that in the case of Windows, OEMs do have to pay licensing fees to put the OS on their tablets, and considering the state of Windows right now I can't imagine even those fees are prohibitive.
What a load of malarkey. You have full control over how much information you give Google when you use the device. You can even get apps with a throwaway Gmail account and give them no data at all. Further, there are no ads built into the OS and I never see any, period. It's just like on iOS; I don't know what on earth he's on about.
The Nexus comes, by default, with Google as the search engine, Gmail as the mail client, and Google Maps as the map engine.
This default placement is something Google pays organizations like Mozilla and Apple hundreds of millions of dollars for on their browsers.
The vast majority (95%+) of people keep the default search engine and map engine that comes with their tablet or smartphone. Google is willing to take a loss on the 5% who might decide to use some other system (though, given that Google makes pretty good maps and search, there is a better-than-average chance those 5% will stick with Google anyway).
That's why Google is willing to sell a tablet for $200 that others might have to sell for $250 or $275 in order to make a profit - the others don't have the search engine revenue.
Bias against Google? I love their products! I use their search engine. I buy products from their ads. I use their Gmail client all day long - happy to get targeted ads. I use their Maps client.
I love all the free stuff I get from Google.
Their latest Google Voice app is amazing; if anything, I have a huge bias for Google.
It's okay to be analytical about something you love. Apple makes money on hardware. Google makes money on services. Each have different business models, and will price their products accordingly.
My objective was to emphasize that the entire business objective of Android, an operating environment that Google has spent billions of dollars developing and protecting, is to provide a platform for Google's advertising in search, in maps, and, soon, in voice.
Because Google gets little, if any, profit from hardware sales, and (to my knowledge) no license fees from third-party vendors, all of their profit has to come in the form of advertising.
What this means is that when comparing a hardware product from Google and one from Apple, we need to understand that Google's profit comes downstream from advertising revenue, whereas the bulk of Apple's comes front-loaded, from the hardware sale.
This will then have an impact on the margins that each of the organizations will be required to pursue at various stages of the product lifecycle.
Apple will start off with a high (30-40%) margin up front, but has less pressure to monetize the user eyeballs in its services.
Google will start off with a lower (approaching 0%) margin up front, but then has much more pressure to monetize user eyeballs in its services.
Neither business model is inherently good/bad/otherwise, but you can see how the incentives for Apple and Google are differently aligned - we've already identified one difference: Apple did not release Siri on the iPhone 4, even though the phone was more than adequate to run Google Voice, which was released for the iPhone 4. Google's got no skin in the game selling you more hardware. Apple isn't really incentivized to release its premium services on old hardware...
Again, it is more than easy to use an Android phone without using those. I know tons of people who use Android phones exclusively with personal accounts because they don't use Google for mail/calendar. You can even use Maps without giving up your location; it just makes it damn near useless... but again, it's an unavoidable issue. For it to be useful, you're going to give up something, and if it's not to Google, it's to someone else.
"it's an unavoidable issue. For it to be useful, you're going to give up something, and if it's not to Google, it's to someone else."
One can choose to give one's personal information to a company whose core business isn't to sell personal information to other companies. That excludes Google and Facebook.
Right, like I've said 4 times now, it's incredibly easy to use an Android phone without sending a blip of data to Google.
Also, you're absolutely insane and bordering on fanboi territory to act like Microsoft and Apple aren't collecting the exact same usage data from location services etc. You have some bone to pick with Google over advertising yet ignore the fact that it's completely a choice to use it. You people act like Google is breaking into your house and installing a GPS chip in you. It's dishonest.
"it's incredibly easy to use an Android phone without sending a blip of data to Google."
Assuming that's true, that doesn't change the fact that most users will send loads of personal data to Google without knowing it, let alone knowing how to turn it off.
"like Google is breaking into your house and installing a GPS chip in you."
They don't need to. They know that most users don't understand computers and don't read end-user agreements.
But even if you don't use Android, they'll indeed drive by your home, take pictures and collect information about your wifi network, which enables them to link your digital life to a physical one.
I'm very happy to tell you that, like most people, I was upset to find that Google collected that data. I was even less impressed with how they handled the deletion and disclosure of their mistake. I'm happy to discuss that, but it's pretty desperate to reach for that when it started with more or less: "Android has ads and Google watches everything you do on it"
"it's pretty desperate to reach for that when it started with more or less: "Android has ads and Google watches everything you do on it""
That might've been your interpretation, but I wrote:
"Android users pay with their personal information and by looking at ads."
By which I mean a typical Android experience. Not the Android OS, or the official Google Android experience. Most Android users are subjected to lots of ads because they don't buy apps; they download ad-supported apps. They don't buy third party navigation apps, they use the ad-supported Google Maps app. Etc etc.
"You have full control over how much information you give Google when you use the device."
You don't even have control over how much information you give Google when you use their search engine on a desktop computer. Even if you sign out of Google, turn off cookies, and use a browser without geolocation, Google will continue gathering information about you and providing you with customized search results.
But even if you could turn off all data collection, the vast majority of users won't know about it. As a result, they pay Google by sharing their personal information, their interests, their location, their contacts, etc. Google's core business is advertising, they can't help themselves.
"there are no ads built in the OS and I never see any, period. It's just like on iOS, I don't know what on earth he's on about."
I was referring to Amazon's Kindle Fire, but the Android experience in general is filled with ads. Not only does a core app like Google Maps include sponsored links, Google pushes consumers and developers towards ad-supported apps by offering limited payment options in the Google Play store. Three quarters of all apps in the Google Play store are free, and many of those have ads supplied by Google or its subsidiaries.
" Google pushes consumers and developers towards ad-supported apps by offering limited payment options in the Google Play store."
Ok, and so does every single marketplace ever that charges a brokerage fee. I think it's completely acceptable to whine about an application that can give you the shortest path between any two spots in the world, 3D buildings, 3D perspective, vector tiles, the best business listings outside of the old white pages, etc., etc., for giving me RELEVANT sponsored results when I look for something. Yeesh.
"so does every single marketplace ever that charges a brokerage fee."
That's true to some extent, but Google does both. Google charges 30%, just like the other app marketplaces. On top of that, it even makes developers pay for chargebacks.
I used to have a "huge" 22" Iiyama CRT (20" effective) boasting 2048x1152. I have forever been confused about why current desktop screens try to satisfy you with 1080p. This crap was getting hyped up on screens years after I was already enjoying higher resolutions at good refresh rates. And guess what? My two screens at work are 27" Iiyama LCDs, and they don't go over 1080p... shouldn't it be even easier to get a better pixel density with LCDs in bigger screens?
I agree with the "tiny font" bit. As someone with severe myopia, my MacBook Air is a lot more comfortable on the eyes than my 16" widescreen Acer.
Well there's at least 27" TFTs with 2560x1440 and 30" TFTs with 2560x1600.
I guess part of it must be that single-link DVI and older HDMI don't allow for more than 1920x1200, and cheaper graphics units would be overwhelmed by anything over 2560x1600.
This was back in 2001; the screen ran off a G200, which was bundled with the T220 (http://en.wikipedia.org/wiki/Matrox_G200). A "modern" IGP is perfectly able to drive a desktop and desktop applications at that resolution (hell, an Eyefinity card supports 6 screens at up to at least 1920x1200 in a 3x2 configuration, a total resolution of 5760x2400... and you can play games on that).
A card with 6 outputs costs a few bucks. The Intel HD 4000, I believe, supports two 2560x1600 outputs.
Note that you're also comparing components with vastly differing prices. Maybe the market for $2000 monitors just isn't big enough.
EDIT: Eizo offers 30" displays with 4096 x 2560 resolution. However, they are not exactly cheap. About $20-$30K. Meanwhile a 27" 2560x1440 display goes for $600, or as an import from Korea even for $300.
There were two components to my comment: the first one was that there were high-DPI desktop screens back in 2001 which could be driven with 2D accelerators released in 1998, and thus a modern IGP would have no trouble driving these, and thus that "cheaper graphic units" wouldn't be overwhelmed with "anything over 2560x1600".
And the second part was modern mid- and high-end dedicated (but still consumer) GPUs being able to drive much, much higher resolutions than that without breaking a sweat.
I have to confess I am flabbergasted by your ability to misunderstand (willfully?) what I think is plain and clear English.
1. I never said anything about people having to accept 2D-only (and the G200 was not 2D-only, by the way); I pointed out that GPUs nearing 15 years old were already able to produce resolutions you consider "overwhelming", and that a modern IGP would thus have more than enough power to handle them
2. The dedicated-GPU note was to point out just how far beyond those "overwhelming" resolutions a "modern" dedicated GPU goes (Evergreen, the first Eyefinity release, is 3 years old)
I understand full well that it is not _difficult_ to drive a lot of pixels if you want. Many modern cheap cards and IGPs, however, cannot, for whatever reason: they physically have no support for dual-link DVI. Maybe they saved 50 cents on some connector that way, or whatever.
Note that I never said anything about overwhelming a dedicated GPU. That point was about cheap IGPs, specifically from Intel. (EDIT: OK I didn't explicitly say it, my fault.)
Ultimately I guess it boils down to priorities. PCs have gotten a lot cheaper, and some of that has been done by making things worse.
Personally I would like high-dpi as much as anybody here - although for now I think I'd first want a lot of screen estate (13" is a bit limiting at times), but I probably use my computer for reading a lot more than other people.
A $50 AMD card will get you at least three 1920x1200 outputs. I am not sure if the HDMI port on them can be driven any higher; it possibly can, though.
I am not sure what the max resolution on AMD's Trinity line is over DisplayPort or HDMI. Again, it may very well be whatever DisplayPort or HDMI supports, which is pretty damn high.
Edit:
Turns out Intel also supports 2560x1600 over DisplayPort.
I've never seen the point of early adoption of HDMI. The first and last HDMI cable I bought constantly caused flickering and weird artifacts. I only kept using it for a while out of aesthetics and laziness: no extra audio cables needed. A really old VGA cable provided superior video quality on the same setup. And HD+ resolutions, no problemo.
I think it's mostly because the interfacing controllers are so ridiculously cheap, "thanks to" the flat-screen revolution. The same controllers can be used everywhere, costing next to nothing.
TBH I'm not sure if I want 2560x1600 yet if there is going to be a significant drop in battery life. I have an IPS 1366x768 on a 12.5" screen and it looks great. Fonts already go smaller than I can reasonably see for programming - I can't imagine a higher resolution would materially improve my workflow.
Personally I'd like to see a higher refresh rate. Even with triple buffering I don't think that horizontal scrolling is smooth enough. I'd love to have a 120hz laptop screen. The new Windows 8 start screen, and switching between workspaces in Linux would be so much nicer. Still it's not exactly necessary, just a nice to have.
Battery life seems to have held steady for the HiDPI/Retina Macbooks and for all the tablets we've seen. So the real sacrifice is in weight and thickness.
Linus's rant is compelling, but as the owner of a 2011 MacBook Air, I'm not completely sure I want to add 1.5 pounds (the difference between the MBA and the 13" Retina MBP). Granted, complaining about 4 pounds sounds ridiculous, but having something you can effortlessly carry from room to room with one hand is pretty great.
Do you think that's just the resolution, or also the quality of the display? I think the iPad 3 is a better-quality screen as well as higher resolution. I tried a lot of laptops when I was in the market, and I found the ones with a higher-resolution TN panel were worse (to my eyes) than a lower resolution on a high-quality IPS.
Yeah, the colour range is much better on the iPad 3 than on the iPad 1.
That is a bit irrelevant. I'd rather have 256 levels of grey on a retina display than good colour on a non-retina one. I can't otherwise get good readability of text (both web and A4 documentation) together with quick page turns (which rules out eInk).
Storage - it affects the size disk you need/want because high resolution images / video take up much more space.
Memory - You need more memory to build up screens for a higher density display. Further you need the bandwidth to shove that data around.
Compute - If you want to 'render' to the display rather than just copy bitmaps around, or composite complex bitmaps, you need to spend a lot of time computing, which can make other things slow.
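The memory and bandwidth costs are easy to quantify. A rough sketch, assuming a 32-bit RGBA framebuffer and a 60 Hz refresh (both assumptions of mine, not figures from any particular device):

```python
def framebuffer_cost(width, height, bytes_per_pixel=4, refresh_hz=60):
    """Return (bytes per frame, bytes per second) for one full-screen surface."""
    frame = width * height * bytes_per_pixel
    return frame, frame * refresh_hz

# A common laptop panel vs a high-density one:
lo = framebuffer_cost(1366, 768)    # ~4.2 MB/frame, ~252 MB/s
hi = framebuffer_cost(2560, 1600)   # ~16.4 MB/frame, ~983 MB/s
```

Quadrupling the pixel count quadruples both figures, and that's before compositing multiple layers or rendering at all.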
So when you look closely at tablets you will see interesting places where they have been adapted to support these densities.
But more importantly there is 'change' in the systems where there is new money being invested. So tablets are getting all of the 'change' now, less so with laptops, and hardly at all with desktops.
The reason this will change, though, is that I expect we've convinced display manufacturers that 'regular users' (the bulk of the purchasers) want 'high dpi' displays. It's not easy communicating with an entire industry, but the success Apple has been having with 'retina' displays, and the more recent Android tablets with higher resolutions, means more people will jump in to support them. And more importantly, when the choice is available, folks reject lower-density displays. So in the great 'tuning' these guys do, where they calculate how to get the most money out of each hour of running their factories, the equation is tipping in favor of high-dpi displays.
That said, I'd love to have a couple of 32" 2560 x 1600 displays for my desktop, but I think that is still a couple of years off from being 'mainstream'
"That said, I'd love to have a couple of 32" 2560 x 1600 displays for my desktop, but I think that is still a couple of years off from being 'mainstream'"
Overclockers have a sale on 27" 2560x1440 displays today. They're selling for £311. I bought one about a year ago for £450 - worth every penny.
I've been watching the Korean ones on eBay after reading the Coding Horror article. An acquaintance of mine has a 2560x1600 native-resolution video projector. He was showing it off by projecting onto a 120" screen. One of the things that struck me about that setup is that 'the wall as monitor' had some very interesting qualities at that density. It felt like it could be a very productive environment.
I believe the issue with using it as your primary monitor is that projectors have limited bulb life, and the bulbs are often almost as expensive as the projectors themselves.
I have a Korean no-brand 27" 2560x1440 IPS screen that I picked up for 250,000KRW (that should be about £145). I'm in Korea so that makes the big difference on cost plus lower taxes. They will come down in price far more yet.
In an ideal world I would agree completely, a better DPI is amazing both in terms of font readability AND for watching full screen media (movies, TV shows, games, etc).
But in the real world, higher resolution means smaller screen elements. At 1600x900, fonts are readable at 125%; at 1920x1080, even at 125%, fonts and some elements are literally too small to be comfortably read (you get eye strain after less than an hour).
Now I would turn it up to 150% "text size" but that breaks SO many native Windows applications (e.g. pushing text off the viewable area) and does the same on Linux too (Ubuntu).
Ideally everything should remain the same size no matter what the resolution, and the DPI should just grow upwards. This is how it works on platforms like the iPad.
So, I disagree with him, I don't want higher resolution displays because Windows, Linux, and OS X still suck at handling resolution (and if you use a non-native resolution it hurts the performance since the GPU has to re-scale constantly).
So you just want to stop evolution forever. That's just silly. I use a 47" TV with KDE, I set the DPI to something like 200, and the resulting fonts and interfaces are very nice when viewed from the couch.
I have to ctrl+ every site a few times, but it works flawlessly after that.
How does making things smaller and smaller with each resolution increase equal "evolution?"
I would argue that making everything a fixed consistent size regardless of resolution and then increasing DPI as the resolution is increased would be a better way to "evolve" things.
"Zoom" should be an OS function (e.g. 125%, 150%, 200%, etc).
"Resolution" should be hidden from the end user entirely.
"DPI" should automatically scale with the supported resolution.
That way if the user is hard of seeing they can "zoom" things but aside from that everything would look identical no matter what resolution you had (e.g. identical on 1600x900 or 1080p, or higher).
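A minimal sketch of that model (the function names and the 96 DPI baseline are my own assumptions, not any OS's actual API): widgets are declared in fixed logical units, and only the scale factor changes with panel density, so everything stays the same physical size.

```python
BASELINE_DPI = 96  # assumed reference density, as on classic desktop panels

def scale_factor(panel_dpi):
    """OS-chosen scale: physical pixels per logical pixel."""
    return panel_dpi / BASELINE_DPI

def physical_px(logical_px, panel_dpi):
    """A widget declared at a logical size gets more physical pixels on a
    denser panel, so it renders sharper but stays the same apparent size."""
    return round(logical_px * scale_factor(panel_dpi))

# A 100-logical-pixel button:
physical_px(100, 96)    # 100 physical px on a standard panel
physical_px(100, 220)   # 229 physical px on a retina-class panel
```

A user-facing "zoom" setting would then just multiply the scale factor, independently of the panel's native resolution.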
Why? I argued that big resolutions are good and interfaces will soon catch up. Browsers are ready, operating systems are ready. Some legacy software might become tiny, but you can live with it by changing resolution when you need to use it.
I disagree, desktop Operating Systems cannot handle high resolutions at all. They just keep making things smaller and smaller instead of sharper and sharper. OS X has JUST added basic support for a sharper lower resolution mode, but even that is highly limited to their new displays.
Resolution from a OS perspective needs to be scrapped and re-invented.
That's simply not true, OS level support has been improving for years and years.
There is a chicken-before-the-egg problem with getting ISVs to support high resolution modes (ie, for third party developers to use the support the OS offers) since there isn't much incentive until their users have hardware to use the high resolution modes. Apple has started shipping high resolution screens, once others do too software support will be more likely to catch up too.
> Now I would turn it up to 150% "text size" but that breaks SO many native Windows applications (e.g. pushing text off the viewable area) and does the same on Linux too (Ubuntu).
On the web too. For too many years it was standard to use hard-coded font sizes and to specify measurements in pixels in CSS.
It's about time web designers start thinking about accessibility and using flexible font sizes and ems (a unit of length relative to text size) in their designs, even when it means making small compromises and you can't make the web site pixel-perfect with regard to the photoshopped design mockups.
Pixels on the web (in CSS) are resolution-independent. They are basically defined to be equivalent to a pixel on a 96 PPI desktop display. So using pixels in CSS is fine, if a bit confusing. And of course browsers do all sorts of silly things; I can't recall if e.g. Gecko has PPI scaling enabled by default.
Pixel dimensions and font sizes in CSS are fine because the browser applies a multiplication factor to the CSS values, so that the page and fonts display correctly on a high-res screen. This is how Mobile Safari works.
The problem with websites is the image files: JPG, GIF, and PNG. The resolution is locked in when the graphic is made, so files made for the low-res web will look blurry and pixelated when the browser scales them up for a hi-res display.
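The browser's multiplication factor is what gets exposed as devicePixelRatio; the image problem falls out of the same arithmetic. A sketch of the rule of thumb (my simplification, not any browser's actual code):

```python
def device_px(css_px, device_pixel_ratio):
    """CSS pixels are abstract units; the browser multiplies them by the
    devicePixelRatio to get physical pixels on the panel."""
    return css_px * device_pixel_ratio

def image_is_sharp(image_width_px, css_width_px, device_pixel_ratio):
    """A bitmap only looks sharp if it carries at least as many source
    pixels as the physical pixels it is drawn into."""
    return image_width_px >= device_px(css_width_px, device_pixel_ratio)

# A 400px-wide JPEG displayed at 400 CSS px:
image_is_sharp(400, 400, 1)  # True on a standard display
image_is_sharp(400, 400, 2)  # False on a 2x display: upscaled, blurry
image_is_sharp(800, 400, 2)  # True: shipping a 2x asset fixes it
```

Text and CSS shapes scale for free because they are re-rasterized at the device resolution; bitmaps can't be, which is why hi-res variants have to be shipped explicitly.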
> The problem with websites is the image files: JPG, GIF, and PNG. The resolution is locked in when the graphic is made, so files made for the low-res web will look blurry and pixelated when the browser scales them up for a hi-res display.
Such sites aren't blurry in absolute terms, just in relation to sites that have been optimized. That is, they look the same as they would on a standard screen.
>Ideally everything should remain the same size no matter what the resolution, and the DPI should just grow upwards. This is how it works on platforms like the iPad.
And on the "Retina" MacBook Pros. I'm not sure why Linus goes batshit over an Apple marketing name; "retina" is a term Apple uses to market their HiDPI screens (iPhone, iPad, and now Macs), and nobody else uses it to describe their high-resolution screens.
The true HiDPI effect is only apparent in the "same size as before but 4 times the pixel density" default configuration.
Changing it to run one-to-one with the 2880x1800 panel achieves nothing besides really tiny, unreadable font sizes.
One would hope that if screens of this resolution became ubiquitous, people would start to notice which apps behave poorly and fix them. It might also push forward an ecosystem for supporting such displays. E.g., tools to allow per app dpi settings to isolate badly behaved legacy apps for which the source is not available and on which no active maintenance is done.
>Ideally everything should remain the same size no matter what the resolution, and the DPI should just grow upwards. This is how it works on platforms like the iPad.
No that's not how it works on the iPad. Apple would not have increased the number of pixels by a factor of exactly 4 in one iPad update if the software gave them the freedom to choose any display resolution.
I know people who tried Ubuntu on the new Macbook Pros (2880x1800) and everything was so tiny they were literally unable to alter the resolution using the GUI. They could not even click individual elements. They had to manually do it via the shell and even then they weren't able to see what they were doing...
I wouldn't call that "pretty good." In fact I would call that shockingly bad.
Linux still falls into the trap of resizing things to match the resolution. As you increase the resolution things get smaller, and as you decrease the resolution things get bigger. This is a broken design.
Zooming should have no relationship to resolution. Resolution should increase DPI, not zoom or scaling (or however you wish to word it). A font should be the same size at 1600x900 as it is at 1080p.
It's true the default text size is too small even with a low-resolution monitor, but I've been dealing with that for years just by selecting a suitable font size within each of the half-dozen or so programs in which I actually read a significant amount of text. I find that works fine.
> a better DPI is amazing ... in terms of font readability
I haven't really found this myself: I typically prefer a nice pixel font at a lower DPI than a TrueType font at a higher DPI, especially for coding. The manual effort that goes into a good pixel font just makes for better readability imo.
I've always been a sucker for high-quality displays. Apple's Retina MBP is what finally converted me from being a Windows user. I have to say, I really love it.
And... I may be the only one here... but I think they should go a little higher than 2880x1800. I know normal viewing distance is something like 15 inches, but I like to sit closer to my screen when coding, and it sure would be nice to have all semblance of "pixels" completely disappear. How cool would that be?
And if they started using AMOLED screens instead of IPS, then that would really be the perfect screen.
It depends on what primaries you choose for your OLEDs. AMOLED has perfect black levels (LCD technology does not) and near-perfect viewing angles.
Unrealistic colors are a problem of reproducing the information you acquire correctly. All of the colors humans can see are represented on the CIE 1931 chromaticity diagram. Choose any three points within that color space and draw a triangle between them. The colors within the triangle are those that the primaries at its vertices can reproduce. The rest is just a software issue (driving the correct voltages to each LED).
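That gamut-triangle test can be written down directly. A sketch using a barycentric sign test on CIE 1931 (x, y) chromaticities; the sRGB primaries below are the standard published values, the rest is my own illustration:

```python
def in_gamut(p, r, g, b):
    """True if chromaticity p = (x, y) lies inside the triangle spanned by
    the red, green, and blue primaries (sign-of-cross-product test)."""
    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
    d1, d2, d3 = cross(p, r, g), cross(p, g, b), cross(p, b, r)
    has_neg = d1 < 0 or d2 < 0 or d3 < 0
    has_pos = d1 > 0 or d2 > 0 or d3 > 0
    return not (has_neg and has_pos)  # all same sign => inside

# sRGB primaries (CIE 1931 xy chromaticities):
R, G, B = (0.64, 0.33), (0.30, 0.60), (0.15, 0.06)
in_gamut((0.3127, 0.3290), R, G, B)  # D65 white point: inside sRGB
in_gamut((0.08, 0.84), R, G, B)      # saturated spectral green: outside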
Interesting to note that Jeff was so very wrong with his prediction: only half a decade later, we're already past 2560x1600 on the MBP Retina displays :). Pretty soon PC manufacturers will also catch up.
Technology generally improves faster than one presumes (accelerating returns and all).
Resolution matters when you need to work with text. For me, the MacBook Pro 15" Retina is the best thing that ever happened, and it is impossible to look back now.
Seeing as how Apple's "retina" displays are actually made by other folks (Samsung and LG for the 15" rMBP), I suspect we will see high-res panels on other laptops very soon.
I don't get what point he's trying to make at all. And I don't know why it merits either a blog post or the top story on HN; who cares what screen resolution Linus has his laptop set to?
I can work just fine on my 2009 MBP at 1280x800 (or whatever it is); the text is perfectly readable, there's no noticeable pixelation at distances past a few inches from the screen, and having everything shrink as a result of increasing the res would make it unusable.
He's probably exaggerating for effect, but it's not even remotely true that laptop resolutions have stagnated. They have steadily increased to the point where we now have retina screens on regular work laptops.
I think if Linus created a blog post saying he'd just set his background color to blue, it would make the top spot here!
It's a shame to see Linus stooping to mock Apple's use of the term retina. They (Apple) name everything, like the Fusion Drive or any number of previous technologies. It's to humanize the tech, so the average person walking into the Apple store doesn't have to talk in tech-speak. It's just a marketing term, and every company has them.
The definition of "reasonable resolution" changes over the years, VGA seemed reasonable compared to EGA.
Like most things you don't realize the difference until you use it. If you spent a week working on a retina MBP at 1920x1200 you'd never want to use 1280x800 again. 4 vim splits at useable width, 1000px+ browser viewport width with inspector open to the right, etc.
I'm really happy with the 1440x900 quadrupled. Text is unbelievably gorgeous, and when I pop open QuickTime Player to check out video, I see my 1920x1080 data unscaled. STUPENDOUS.
One nice thing about the choice of 2560x1600 as a standard pick is that, to my knowledge, it is the largest resolution supported by DVI, and specifically dual-link DVI at that. This might start pushing the market towards DisplayPort, which used to be limited to 2560x1600 but is allegedly being expanded upwards.
(DisplayPort is my favorite display connector to date, and I hope to see adoption grow)
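The arithmetic behind that limit, roughly: single-link DVI tops out at a 165 MHz pixel clock and dual-link at twice that, and the required clock is pixels times refresh plus blanking overhead (the flat 12% blanking allowance below is my simplification; real timings vary):

```python
SINGLE_LINK_MHZ = 165  # single-link DVI pixel clock ceiling
DUAL_LINK_MHZ = 330    # dual-link: two TMDS links in parallel

def pixel_clock_mhz(width, height, refresh_hz=60, blanking_overhead=0.12):
    """Rough pixel clock estimate; blanking_overhead is a hand-wavy
    allowance for horizontal/vertical blanking intervals."""
    return width * height * refresh_hz * (1 + blanking_overhead) / 1e6

pixel_clock_mhz(1920, 1200)   # ~155 MHz: fits single-link DVI
pixel_clock_mhz(2560, 1600)   # ~275 MHz: needs dual-link
```

Anything much past 2560x1600 at 60 Hz blows through even the dual-link budget, which is why denser panels push toward DisplayPort.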
Seems like this guy is not bothered about battery. I have a 1366x768 display, which works just fine. Most of the time I'm just running a terminal, and this resolution saves quite some battery.
The brand new 1700€+ 1080p 17" Dell XPSes around me at work barely manage to get 2h idling. For my colleagues it's basically an embedded UPS, while my MacBook Pro still handles 5+ hours after three years.
Like my MBP, Retina Macs still have 7 hours of battery, so I don't see the landscape changing much: many laptops, retina-class or not, cheap or not, will get crappy battery life, and a few will get it right.
Yes please!!!
I can't say how disappointed I am by that 1366x768 crapsolution. Before my current laptop I had a Full HD laptop, and that was far better.
Even my first laptop from over 10 years ago had a better resolution than most standard laptops have now; how is that?!
(I had a 15" 1400x1050 display, then a 16" Full HD one, and now a 13" 1366x768.)
It's fine if you're okay with gigantic fonts that are usually seen on laptops. A lot of us prefer much smaller fonts to fit more code on the screen.
For example, the equivalent of 9pt on a 900p display allows actually reasonable split-screen editing. On an actual 900p display, though, it's much too blocky to be easily read without straining the eyes. At 1600p, it's crystal clear.
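The arithmetic behind that: a point is 1/72 of an inch, so a glyph of the same physical size gets far more pixels to work with on a denser panel (the PPI figures below are illustrative, not measurements of any specific laptop):

```python
def glyph_px(point_size, ppi):
    """Pixels available to render a glyph of a given point size
    (1 pt = 1/72 inch)."""
    return point_size * ppi / 72

glyph_px(9, 106)   # ~13 px tall at a 900p-laptop-class density: blocky
glyph_px(9, 212)   # ~26 px at doubled density: crisp
```

Twice the linear density means four times the pixels per glyph, which is the whole difference between straining and comfortable at small font sizes.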
I sure wouldn't want to spend my whole waking life wearing glasses that slightly-pixelized everything I saw. And since I spend a large amount of my waking life looking at "glowing rectangles", I'd rather they not be slightly-pixelized either.
Fonts are clearer, it stops me from getting eyestrain (because I don't register everything as slightly blurred), and it is a great help when working on photos.
Not just laptops - I'm sick of the standard PC monitor being such crappy resolutions as well. Even with dual screens on my work PC, there's not enough room for the xterms I need at a decent font size.
I'm talking about the actual physical size of the font, which is about 5mm. If I make my font size any smaller, because the resolution is crap, the font is unreadable.
Higher pixel density would mean I could make the font size smaller and it would still be clearly legible.
I doubt it. The average resolution today is probably 1920x1080. I guess you're referring to the fact that 4 years ago, 24" displays often came with 1920x1200 instead. But I doubt enough people bought them back then to make it the average resolution (don't blame me, I bought two). Most people probably bought craptastic 22" TN displays with resolutions worse than 1920x1080; I know I talked my share of acquaintances out of it.
1920x1200 displays are still available, incidentally. And overall, the desktop TFT market is much higher quality these days. TN panels are going out of style, for one thing, thank god. You can now get a 22" Full-HD display with an MVA panel for around 150 USD. Still pathetic compared to tablets, but we're getting there.
Agreed, screens need to gain higher resolution across all form factors. I would be ok with terrible frame rate for 1-2 years while the hardware catches up with 4x the pixels.
Nexus 10 with a keyboard (and battery) casing and some desktop Linux on it sounds like a good cheap high-res laptop. Performance-wise it would be fine for the stuff I personally do; I guess that would scare other coders off.
Yes, for some an option, but usually cloud doesn't fly here (I live in the mountains) and the languages are usually not the problem, the environment is. I like my tools :)
On my Pandora I do Scala/Clojure/Java (yep, Swing) coding (the new JVM for ARM by Oracle is really good; I never thought I would say that, but it's an amazing piece of work for such a 'small' memory and performance footprint as the OpenPandora). Haskell works well, the whole Linux build chain works well, the LAMP stack works well, and I can run Rails, Django, Apache, Node.js. And amazing battery life, with the option to swap out batteries on the go. Only, the screen is way too small and is unreadable in anything brighter than near-darkness :)
The iPad 3 works fine everywhere, but I cannot do much work on it, and I haven't found an Android pad which works well enough for full-time coding, nor have I found a good enough case to work with for any of those. Most cases are really bad Chinese things whose power supply (the batteries or something between the charger and the batteries) breaks after a few weeks of usage. Anyone? :)
If the Nexus 10 sells anywhere near the amount the Nexus 7 does, there might finally be some good accessories. I would almost do a Kickstarter to make different form factor clamshell keyboard/battery/connection docks for android phones and pads. Almost...
No, that's what I meant (reading back, I didn't actually say that; I just thought it while writing); they haven't delivered cables for the monitors yet :) I have the components to make one, but haven't gotten to it yet (it's non-standard, and I'll probably just get theirs when they finally make it). When I'm behind an actual desk I just work on it via another computer or via github (depending on where I am). But I'm often 'on the road' (which can be actually on the road OR walking in the mountains) and then I just program straight on the little thing. It's actually quite comfortable if there is not a lot of sun.
It is, though, very much the opposite of high resolution, so it takes more planning of what you are going to do; but for algorithmic work and hard problems nothing beats mountain walks followed by typing/testing on that thing. And the occasional Crash Bandicoot III session, of course.
For me the optimum config would be something like an OpenPandora (before it I had (have, actually) the Zaurus c860, which did the same minus the games) with a docking 'station', or (but now I'm dreaming, as nothing seems to come close to this yet) AR glasses with a sufficiently high resolution connected to it.
I have been experimenting with a Twiddler 2 (http://www.handykey.com/); everyone complains that it's too slow for typing, but again, when you are thinking stuff up rather than churning out kilometers of boring (CRUD) code, typing speed, imho, doesn't matter too much. And I can type while walking with it, but no AR (or actually VR) yet.
I think a good solution could come from review websites. If they all agreed that any screen size below 2560x1600 (or maybe a little less) would only score a maximum of 5/10 it would certainly rock the boat.
Before this happens, it would be really nice if Windows supported different DPIs on different screens. IIRC Linux can already be hacked to do this; Windows cannot. The result is that I have a 21" 1080p desktop screen sitting next to my 14" 1080p laptop screen and I cannot read text on my laptop screen! (I'd move the laptop dock closer, but then it'd be sitting on top of my working space.)
As for Windows 8 doing high DPI, Tech Report has a decent article (http://techreport.com/review/23631/how-windows-8-scaling-fai...) on the shortcomings of Win8's high-DPI settings even in Metro. Though feel free to ignore their complaints about browser scaling; each browser takes a different approach to how it breaks web pages when scaling. (Suffice to say 1-pixel borders and non-integer scaling don't go together well!)
They go into detail about the different scaling options, but the scaling options are all things that the user has to manually enable! Hardly auto-DPI. There is a balance to be struck between "more information on screen" and "better displaying information on screen" that Microsoft apparently decided to not even attempt, instead giving the user a blunt instrument with which to toggle between "way too big" and "way too small".
I still have no idea why people bought into shortscreen laptops hook, line, and sinker - I guess that's marketing for you. Maybe we're just in a minority of people that want laptops, and not expensive DVD players? I'll never buy a laptop with that bottom seventh of screen real estate missing.
Where do you even look to find something otherwise? As far as I could tell the last time I was able to buy a laptop with a 4:3 ratio screen was in 2006, and they were significantly more expensive than 16:10 models. Since then I haven't ever even seen the option.
Well the last time I bought a new laptop was 2007. I figured there had to still be some Thinkpads Ts with 4:3 screens (given that they're used by people who are in touch with the realities of traveling and all), but alas I see that's one more thing Lenovo has crapped up. Yet another reason to keep this IBM-branded T60 going as long as possible (despite the periodic keyboard swaps).
How do those spacers on the sides of keyboards not just scream waste needing optimization? I guess that 7% area savings for LCD manufacturers overrides common sense. Well okay, I see the writing on the wall - time to learn to use ed(1).
The one issue that cannot be ignored is that a higher resolution display will consume significantly more power and cause the GPU to consume more power and generate more heat. The relationship is roughly linear to pixel count, in other words 2x more pixels is equal to 2x more power and heat. From 1366 x 768 to 2560 x 1600 it's roughly 4x more power/heat.
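The rough arithmetic behind that 4x figure, as a back-of-the-envelope check assuming the roughly linear relationship described above (the function name is mine):

```python
# Quick sanity check of the "roughly linear in pixel count" claim:
# compare total pixel counts at the two resolutions mentioned above.
def pixel_ratio(w1, h1, w2, h2):
    """Ratio of total pixel counts between two resolutions."""
    return (w2 * h2) / (w1 * h1)

ratio = pixel_ratio(1366, 768, 2560, 1600)
print(round(ratio, 2))  # ~3.9, i.e. roughly 4x the pixels to light and drive
```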
It WILL indeed be the new standard resolution, very soon, especially considering the "push" the PC industry is getting from Apple's Retina Macbook Pros. The demand for high-res panels is just too darn high and it has to be met. As in Apple's situation, the cost can be covered by charging a premium and then using economies of scale to bring down production costs, making these kinds of displays the new norm.
However, I believe we're going to see a huge surge in these super high-res panels right after Intel Haswell is released. It'll give Ultrabooks graphics performance good enough to drive them.
Nice to see; but obviously it is just a small step where a larger one is needed. They'll never scale it bigger to compete with their own 1080p hypnotoads.
> In fact, if you have bad vision, sharp good high-quality fonts will help.
As someone with absolutely terrible vision, I'll have to disagree with this point. I've been keeping my screen res around 1024x768 for years because moving it higher just makes it so darn hard to see. I've now come to the point where some monitors and video cards won't even go that low. I'm probably a unique case. Still, I do wish accessibility was better for visually impaired users.
FWIW, Macs do magnification the best out of the box. ZoomText on Windows costs a bit, and I'm not sure anything exists for Linux that's even comparable to Mac OS's magnifier. Even with a Mac, there's too much mouse movement involved for my tastes.
Totally, BUT, it will trash an admittedly imperfect economic sector. I've been following this for a few months, and the fact that the market hasn't moved is incredibly harmful. The laptop market has basically stagnated, meaning a lot of pent-up demand from people who are not likely to commit that money to other personal productivity improvements instead. So, in my opinion, in order to preserve inventory values, these guys fight as hard as possible to hold back technical development. I.O.W. if this valuation structure were ever to break, the corporates involved could take huge hits. Someone with more knowledge will have to take it from there.
this may not be a constructive comment, but oh god yes please. integrated graphics are good enough these days and you can always drop down to 1280x800 without scaling problems if you want framerate.
I was going to post the same thing, but I'm just going to hook onto your post. I've been roaming the web for a good high res desktop monitor for development purposes for a couple of months and there's simply nothing out there. Based on my research, there hasn't been enough motivation for monitor makers to increase the resolution.
I think, and I might be missing a very obvious demographic, that developers trying to cram 12 vim buffers onto one screen would be the main use-case, and that's just not a large enough market to justify developing ultra-high-res panels at 24" or 27" at an affordable cost. I'm sure you can make an outrageously expensive monitor of that quality, similar to the 28" that John Carmack paid 10k for in 1995, but not that many people would be able to justify the purchase.
Color accuracy is infinitely more important than the number of pixels I can push on a screen for photography. My color-accurate monitor is 1440x900. If I need to check for sharpness, I zoom in anyway (and I would zoom in on a retina display, too, practically until I could see the pixels).
I'd assume because there's little need for higher framerate displays. Videos are usually 30fps at the max, your cursor doesn't need to blink that fast, and games are perfectly playable at 60fps.
Because LCDs have much better persistence of vision than old CRTs? On CRTs, I would sacrifice resolution to get to a minimum 85 Hz refresh, because 60 was unusable and gave me headaches. But 60 updates per second on an evenly-lit LCD screen is pretty tolerable. That said, more precise motion would be awfully nice!
I can vouch for the retina display making text significantly more readable to those who need reading glasses, even if the text is exactly the same size.
Except that your Retina "aware" apps can display at a native 2560x1600. Aperture (the latest version), for instance.
But yes, it's sad how the Retina MacBook Pro has the capability, but other than doing really, really sharp fonts (when it's using the built in font renderer) - it goes unused in most situations.
That will change. Somebody needs to write a Retina aware Terminal App.
Not that I could tell - it did the typical screen doubling trick, and presumably renders the fonts nicely - but I couldn't get extra screen real-estate the way I would if the screen was 2560x1600. The 15" MacBook Pro Retina actually felt _smaller_ than my 2010 MacBook Air. Its screen shrinks back to 1280x800, and my 2010 MacBook Air runs at 1440x900.
Or maybe it was the Font Handling - I was hoping that it would look okay (just small) at Andale Mono 4 - but it pixelated.
Regardless - in my 5 minutes of futzing at the Apple store I couldn't get it to do what I wanted (basically, make the screen look 2560x1600 to Terminal.app, but have the OS treat it as 1280x800).
I'm sorry, but I just don't get what you're saying.
> I couldn't get extra screen real-estate the way I would if the screen was 2560x1600
The resolution of the 15" is 2880x1800, not 2560x1600 (that's the 13", is that the one you're actually talking about? I'll go with that for the rest as what you say doesn't seem to make sense for the 15"). You don't get access to the native resolution of the screen with the built-in tools as far as I know, there always is some scaling applied.
> Its screen shrinks back to 1280x800, and my 2010 MacBook Air runs at 1440x900.
That's just the default scaling (1:2), if you go into the display settings you can change the virtual resolution to 1680x1050, 1440x900 or 1024x640.
> Or maybe it was the Font Handling - I was hoping that it would look okay (just small) at Andale Mono 4 - but it pixelated.
Andale Mono 4? As in 4 points?
> Basically, make the Screen look 2560x1600 to Terminal.app, but have the OS treat it as 1280x800
I fear I don't get what you mean anymore than previously.
You are right - I'm wrong. 2880x1800, not 2560x1600.
What I want is for the screen to appear to be 1440x900 (the 1:2 scaling you mentioned), or 1680x1050, but for the Terminal Windows to appear to be as though they were on a 2560x1600 screen - really small.
I thought I could do that by making the font Andale Mono 4, and I was hoping that the font would be displayed properly because of the retina screen - but it was still pretty unreadable. You can't really get "microprinting" on a retina screen - it still has a ways to go in terms of resolution.
For all manufacturers this would reduce battery life, and for many models, significantly so. And it would obviously push up the price range.
It's a trade-off between various things, as usual. What Linus thinks may work for him (I happen to agree), but I can easily see someone wanting an ultraportable with basically VGA resolution and battery lasting entire day of active usage.
It's not a question of architecture. The real issue is that batteries are expensive and heavy, and few people buy laptops based on battery life. So, while you can take an ultra-thin laptop w/ SSD, add 2 lbs of batteries, and be good to go, it's now just a normal laptop with poor specs and a long battery life that few people actually care about when they can just plug in or use an external battery.
iPads, on the other hand, are highly dependent on battery life. And they get to sacrifice a lot of specs and use worse hardware than a typical 'ultraportable' without people really noticing, as it's a different type of device. Still, you will find plenty of Android tablets that get stuck in the spec wars and end up with terrible battery life.
It's not that ridiculous. Intel integrated graphics can acceptably run a laptop of that resolution (see the Retina MBP). When Intel's new IGPs come out next year, it really should be a non-issue.
It is ridiculous in the sense that basically only Intel's latest IGPs can run it acceptably. Laptops cover a whole spectrum of quality ranges for various purposes; it would be incredibly silly to make such a demanding resolution "the standard".
Also, not even the Mac app ecosystem properly supports those high resolutions; imagine what a mess there would be among Windows applications. Augh.
Windows is actually not too bad. I'm running Windows 8 on my Retina MBP in high-DPI mode, and it looks crisp and most apps degrade gracefully (some look absurdly small, but that's rarer).
Admittedly, this isn't something that could happen today. But Intel's IGP are the low end. In a few years, they'll be everywhere. It's difficult to argue that something that requires a modern integrated graphics chip is anything too demanding, when you look at how far ahead dedicated chips are.
It doesn't. I'm just refuting the argument that if iPads can do it, laptops can too. Those are two wildly different architectures that are only beginning to converge, touch interface and resolution being prime candidates for feature convergence.
Currently there are 311 comments in this story. It's hard to find any mention of it, but does anyone actually run their Retina Macbook Pro at native resolution for every day usage, coding / programming, etc? I have mine hooked up to an external 2560 x 1440 Apple Cinema Display.
"So with even a $399 tablet doing 2560x1600 pixel displays, can we please just make that the new standard laptop resolution? Even at 11"? Please. Stop with the "retina" crap, just call it "reasonable resolution". The fact that laptops stagnated ten years ago (and even regressed, in many cases) at around half that in both directions is just sad.
I still don't want big luggable laptops, but that 1366x768 is so last century. Christ, soon even the cellphones will start laughing at the ridiculously bad laptop displays.
And the next technology journalist that asks you whether you want fonts that small, I'll just hunt down and give an atomic wedgie. I want pixels for high-quality fonts, and yes, I want my fonts small, but "high resolution" really doesn't equate "small fonts" like some less-than-gifted tech pundits seem to constantly think.
In fact, if you have bad vision, sharp good high-quality fonts will help.
#noexcuses"
No. My screen is 1440x900 and it's honestly as big as I want it to go. See, what usually happens is that when you increase resolution, everything becomes smaller. Desktop icons, text in apps, web sites. Yes you can manually resize everything but it's a pain and it doesn't always work right. Web pages get deformed since you're only changing the fonts. And it's not like I'm ever watching movies in higher resolution than HD. So I'll keep my 'low' resolution screens.
What really needs to happen is for desktops to become resolution independent so that you can set the size of the elements based upon your preference and not some fixed pixel grid.
It is very difficult to design UIs in the desktop style that can stand up to DPI changes like that and not end up with text rendering problems or pixel cracks between elements.
That's why OS X only supports 1x and 2x HiDPI; everything else just resizes the whole screen.
Your operating system might not be able to handle more resolution, but the OS that comes with pretty much any new computer (Windows 8 or Mountain Lion) is going to handle it fine, so there's no reason not to have higher DPI on new laptops.
The difference is how the signal is sent. When you change the resolution in XP for instance, a low res picture gets sent to the display which then has to up sample it, and it looks sucky. On the new MBP the picture sent is always the native resolution of the monitor, you just change the size of the UI elements according to your preference.
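A sketch of that scaled-display pipeline as it is commonly described for the Retina MBP (the function name and dict keys are mine, just to illustrate): the UI is laid out at a virtual "looks like" resolution, rendered at 2x that size to an offscreen buffer, then resampled to the panel's native resolution before being sent out.

```python
# Sketch of the scaled-display pipeline described above: layout happens at a
# virtual resolution, rendering at 2x that, and the result is resampled to
# the panel's native resolution, so the display always receives a native
# signal rather than a low-res one it has to upsample itself.
def render_pipeline(virtual, native, scale=2):
    vw, vh = virtual
    backing = (vw * scale, vh * scale)  # offscreen render target
    return {"layout": virtual, "render": backing, "output": native}

# Default mode: 1440x900 points render at exactly native size, no resample.
print(render_pipeline((1440, 900), (2880, 1800)))
# "More space" mode: 1680x1050 points render at 3360x2100, then get scaled
# down to the 2880x1800 panel.
print(render_pipeline((1680, 1050), (2880, 1800)))
```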
Every now and then Linus says something slightly stupid, maybe even on purpose. The "shock title" effect.
Obviously 2560x1600 on 11" is as utterly useless as it is on tablets. 1366 is equally dumb. Whatever goes in between is generally fine. I start being happy at around 1920 for 13" (and its 4:3 friend) since, after that, I can't see pixels at all, and in native mode I can't see the text too well either.
I don't think that's a personal thing.
I am currently reading books from the Humble Bundle ebooks collection. Just plain regular text books on my tablet is fine, but the comic ones (SMBC, XKCD) are not readable. It is nothing to do with the size, and everything to do with the tablet screen being 150 ppi. Being able to read that is not "utterly useless".
As you can see from many of the comments, people do want to buy higher pixel density screens on their laptops. Apple even came out with laptops where that is the distinguishing feature. But if your choices are outside of Apple, then you simply cannot buy high pixel density screens.
Laser printers when they first came out were 300 dpi. That is a very good indication that those kind of pixel densities make for better legibility. Sure you can read stuff at 75dpi, but it isn't as productive.
GUIs go to great lengths to work around the low ppi, using anti-aliasing, (auto) hinting, subpixel rendering and various other techniques. This kind of thing generally isn't available for bitmaps such as the comics in the Humble eBook Bundle which are consequently extremely difficult to read (the images are a higher dpi than the screen so it is a screen problem not a source problem).
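A toy sketch of the supersampling flavor of anti-aliasing may make this concrete. This is not how any particular font renderer is implemented (real renderers combine coverage rasterization with hinting and subpixel RGB rendering), just the basic idea of trading resolution for gray levels:

```python
# Toy illustration of one anti-aliasing technique (supersampling): draw a
# shape on a grid several times finer than the target, then average each
# block down to a grayscale "coverage" value between 0 and 1.
def downsample(hires, factor):
    h, w = len(hires), len(hires[0])
    return [
        [
            sum(hires[y + dy][x + dx] for dy in range(factor) for dx in range(factor))
            / (factor * factor)
            for x in range(0, w, factor)
        ]
        for y in range(0, h, factor)
    ]

# A hard diagonal edge drawn at 4x resolution...
hi = [[1 if x < y else 0 for x in range(8)] for y in range(8)]
# ...becomes soft gray coverage values at 1x instead of a jagged staircase.
lo = downsample(hi, 4)
print(lo)  # [[0.375, 0.0], [1.0, 0.375]]
```

The point of the comment above is that this trick needs rendering-time knowledge of the shapes; a pre-rasterized bitmap (like a scanned comic) gets none of it, which is why low-ppi screens hurt those the most.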
You are right that there are people for whom extra ppi, extra cpu, extra memory, extra power saving, better colour gamuts etc won't make a difference. But there are also a group who do want ppi improvements and Apple has done so across their laptop and tablet lines, Asus has done for some of its tablets, and the Nexus 10 has done so too. If nobody bought those then it would demonstrate lack of demand, but people have been buying them. And people really want them for non-Apple laptops too as Linus stated and others have concurred.
In that case your argument should be that "retina" ppi should be the target. 150ppi as on my tablet definitely isn't. 130ppi as on my laptop is not either. I join Linus and others calling for higher ppi displays on laptops. There is debate on the exact number but it would certainly be at least ~220 ppi on laptops and ~300 ppi on phones/tablets.
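For anyone wanting to check the numbers being thrown around in this thread, ppi is just the diagonal pixel count over the diagonal size. The diagonals below are approximate, so treat the outputs as ballpark figures:

```python
import math

def ppi(width_px, height_px, diagonal_inches):
    """Pixels per inch from resolution and (approximate) diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_inches

print(round(ppi(2560, 1600, 10.055)))  # Nexus 10-class tablet: ~300 ppi
print(round(ppi(2880, 1800, 15.4)))    # 15" Retina MBP: ~220 ppi
print(round(ppi(1366, 768, 13.3)))     # typical cheap laptop panel: ~118 ppi
```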
Yeah of course, most terminals are at 80 characters, nobody can expect most Linux developers to have big ass monitors. It will be the standard for a long time, I'm sure.
Maybe some games? Maybe some designery stuff? Maybe some video creation stuff? (It might be useful for doctors and medical images, but I kind of hope they're using special purpose monitors for that stuff).
And does pushing those extra pixels have a cost in energy use?
Yes, it does. A great answer would have included a list of things that people use their computers for where they need this higher resolution, or where they do notice it and it makes a significant difference. Or maybe someone who has this extra resolution but who doesn't find it any better than a lower resolution, with a list of what they use their machine for.
Or perhaps someone has links to great information. Because while that stuff is findable with a bit of www searching it can be tricky to sort the wheat from the chaff.
If we're limiting the question to the specific "me" (rather than a general hypothetic user) - I do a lot of work with text. Text in web browsers and text in word processors. I do a little bit of film watching - just DVDs. I don't particularly care about battery life, but I'm happy with lower performance if there's a clear eco benefit.
In pixel density, clearly, but not in raw resolution yet (except maybe the 4.7-inchers, not sure about those).
But they're inching closer and closer; the standard Android resolution is 720p, which is only a step away from the bullshit 1366x768 laptop "standard".
But in other areas (colour accuracy, viewing angles, contrast) they're already miles ahead.
I think this is mainly because of Apple's aggressive use of high quality displays in their iOS devices, competitors had to try to match or surpass them and the competition has given most phones really nice displays.
Hopefully once "Retina" displays in the MacBook Pros become slightly more affordable this will have the same effect on the laptop market.
I've been thinking the same thing lately. 2560x1600 needs to be the standard resolution for all laptops from 11"-15". In fact the 15"+ ones can start having 4k resolution (300 PPI) about 2 years from now, as both Intel integrated GPU's and ARM GPU's will support that resolution.
Mouahah, more pixel nonsense. Yeah, sure, put this resolution on an 11" screen, and have a sluggish GPU handle what's moving on screen on a $300-400 laptop.
This is nonsense. You need a certain number of pixels for a good-looking picture, but the benefit of ever-larger resolutions follows a log curve: it flattens as you go up and up, since you notice the pixels less and less.
I am surprised to see Linus making this kind of claim, he used to be more practically-focused. Now he sounds like a marketing guy from Apple.
I say, why stop at 2560x1600? That is ridiculously low. Make 10,000x7,000 the new standard laptop resolution. Yes, we can. Tomorrow, please. Even if the capacity and the plants to make it do not exist yet.
Pfft. I run a 27" 2560x1440 display on Intel HD4000 all the time. I don't know where your comment is coming from, but in case you didn't read the post, the technology is here. The tablets, you know, with these 2560x1600 screens, have GPUs that power them. Meaning the technology is available and could (and ought to) be in laptops as well.
I really wonder about the people complaining about running 2560x1600 on a HD4000. Four years ago I was happily driving both an external 2560x1600 monitor and the built in 1440x900 panel with a 8600M, which was a good bit slower.
People really underestimate how far Intel has come.
I think comparing 2560x1440 at 27" and 13" is not fair; the pixel density is not the same, and in my experience is actually beyond any difference we can see. I've yet to see a double-blind study that proves otherwise.
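Short of a double-blind study, the geometry is at least easy to check. A rough sketch, assuming the common ~1 arcminute rule of thumb for 20/20 visual acuity (the function name and viewing distances are my own illustrative choices; the exact threshold is debatable):

```python
import math

# Angular size of a single pixel at a given viewing distance, in arcminutes.
# Pixels subtending more than ~1 arcminute are generally resolvable by an
# eye with normal acuity; below that they blend together.
def pixel_arcminutes(ppi, viewing_distance_inches):
    pixel_size = 1.0 / ppi  # pixel pitch in inches
    radians = 2 * math.atan(pixel_size / (2 * viewing_distance_inches))
    return math.degrees(radians) * 60

# 27" 2560x1440 (~109 ppi) at a typical 24" desktop viewing distance:
print(round(pixel_arcminutes(109, 24), 2))  # ~1.3 arcmin: pixels resolvable
# A ~220 ppi retina-class laptop panel at 18":
print(round(pixel_arcminutes(220, 18), 2))  # ~0.9 arcmin: below the threshold
```

So by this rule of thumb the 27" desktop panel is still on the visible side of the line at normal distances, while the retina-class laptop panel is just under it.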
So? The smaller screen is more dense but it doesn't change the fact that 10" 2560x1440 displays exist. And they're not prohibitively expensive. (See: The Nexus 10, with that screen).
I can tell you what: if I go from 2560x1440 on a 27" back to my Macbook Air and want to rip my eyes out... we can stand to improve.
Obviously, without scaling the UI, this resolution would be unusable on a 10" screen, you wouldn't be able to click much of anything. Linux is good at DPI scaling, Windows 8 is supposedly better.
I don't know if you read MY comment, but the capacity is not there. It will take years until you can supply this resolution for all standard displays on the market. It requires new production lines, and only Apple can do it now because they make only one model of everything, which helps them scale in volume. Get real.
And try running games at that resolution with your crappy GPU and you'll have a nice slideshow that makes you feel like you're on a 286 playing Doom all over again.