When working in UIKit, developers are warned not to use images larger than 1024x1024, because that is the limit on the size of a texture in OpenGL ES. I wouldn't be surprised to see this doubled for the retina display, with Apple using that as the hard limit.
( You should avoid creating UIImage objects that are greater than 1024 x 1024 in size. Besides the large amount of memory such an image would consume, you may run into problems when using the image as a texture in OpenGL ES or when drawing the image to a view or layer - http://developer.apple.com/library/ios/#documentation/uikit/...)
There's also no reason you can't split the image up (e.g. 2048x2048 -> 4x1024x1024) and render it in 4 pieces. Where hard limitations like that exist, implementations frequently sidestep them.
If you do this though, make sure you split at JPEG block boundaries. That means that every column should have a width divisible by 8 except the rightmost, and every row should have a height divisible by 8 except the bottom. Otherwise you may see a noticeable seam.
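As a rough sketch of the boundary math (nothing iOS-specific here, and `tile_spans` is a made-up helper name), the interior cut points just need to be rounded down to multiples of 8 so they land on JPEG block edges:

```python
def tile_spans(total, max_tile, block=8):
    """Split `total` pixels into spans no wider than `max_tile`, with every
    interior boundary on a multiple of `block` (the final edge may not be)."""
    # Round the tile size down to a block multiple so every interior cut
    # lands on a JPEG block boundary and no seam appears.
    step = (max_tile // block) * block
    spans = []
    start = 0
    while start < total:
        end = min(start + step, total)
        spans.append((start, end))
        start = end
    return spans

# A hypothetical 2048-pixel-wide image split into tiles at most 1024 wide:
print(tile_spans(2048, 1024))  # [(0, 1024), (1024, 2048)]
```

For an image whose width isn't a block multiple, only the last span comes out ragged, which matches the "every column except the rightmost" rule above.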
I think it's actually just been doubled to 2048x2048. OpenGL ES 2.0 seems to define 64x64 as the minimum texture size required to meet the spec, which is really not much at all.
We ran into this issue while building our HTML5/Mobile WebKit panorama viewer. It took a long time to figure out what was going on. I almost feel WebKit would have done better to just skip the image and log an error to the console: resizing your image without your permission typically causes rendering bugs, since it literally scales your image down in-line without telling you.
We didn't realize it could be circumvented with PNGs; that's an interesting idea, though it certainly hurts on the bandwidth side.
I expect the iPad 3's high resolution to cause a lot of websites pain on the bandwidth side. When it comes to the web, people aren't as picky with their high-resolution desktop displays as they are likely to be with their iPads.
That's a great rule for general purposes, but it's intended for situations where you have lots of small assets that you are asking the client to download.
In this instance we're talking about 2 megapixel images, so the extra HTTP requests are not even going to register in the mix. With modern browsers pulling down 6+ parallel downloads, breaking the image up might make plenty of sense.
If you use domain sharding you can get more parallel downloads on older browsers, which reduces the problems somewhat (older browsers have a low parallel-request limit, and static resources carry unneeded cookies, but there's still overhead). That's a separate issue, though; it's best to avoid the problem altogether by minimizing HTTP requests (as you suggest).
Well we already slice our images, and even those slices are too big. We were annoyed to have to maintain another set of files just for the iPad. So we just use a smaller (and blurrier when scaled up) image at the moment. I was hoping they would just open it up :)
I tested and confirmed that if you save the JPEG using progressive encoding, it displays fine on the iPad. Apple's "Big Hero" image also uses progressive JPEG encoding.
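For anyone who wants to check their own assets: progressive JPEGs carry an SOF2 marker (0xFFC2) where baseline JPEGs carry SOF0 (0xFFC0). A simplified sketch of a marker scan (my own helper, not a library function; it ignores rare cases like 0xFF fill bytes):

```python
def is_progressive_jpeg(data: bytes) -> bool:
    """Walk the JPEG marker segments and report whether the frame header
    is SOF2 (progressive DCT) rather than SOF0 (baseline DCT)."""
    if data[:2] != b'\xff\xd8':              # SOI marker opens every JPEG
        raise ValueError("not a JPEG")
    i = 2
    while i + 4 <= len(data):
        if data[i] != 0xFF:                  # lost sync; give up
            break
        marker = data[i + 1]
        if marker == 0xC2:                   # SOF2: progressive
            return True
        if marker == 0xC0:                   # SOF0: baseline
            return False
        if marker in (0xD9, 0xDA):           # EOI / start-of-scan: stop
            break
        # every other segment carries a 2-byte big-endian length
        length = int.from_bytes(data[i + 2:i + 4], 'big')
        i += 2 + length
    return False
```

Running this over the "Big Hero" image should return True if the claim above holds.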
This doesn't explain why regular JPEGs are treated special though.
A different issue, but in the same vein: PNG images above a certain height failed to load on the iPhone 4. We ran into that problem with our mobile testing tool; converting the images to JPG solved it.
> JPEG images can be up to 32 megapixels due to subsampling, which allows JPEG images to decode to a size that has one sixteenth the number of pixels. JPEG images larger than 2 megapixels are subsampled—that is, decoded to a reduced size. JPEG subsampling allows the user to view images from the latest digital cameras.
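To make the quoted numbers concrete: one sixteenth of the pixels means halving each dimension twice. A toy model of that behavior (my own illustration of the documented rule, not Apple's actual code; I'm also assuming "2 megapixels" means 2x1024x1024):

```python
MAX_DECODED_PIXELS = 2 * 1024 * 1024  # "2 megapixels" per the quoted docs

def decoded_size(width, height, limit=MAX_DECODED_PIXELS):
    """Halve each dimension (JPEG subsampling works in powers of two, down
    to 1/4 per axis, i.e. 1/16 of the pixels) until the image fits."""
    scale = 1
    while width * height > limit and scale < 4:
        width, height = width // 2, height // 2
        scale *= 2
    return width, height

# A hypothetical 8-megapixel (3264x2448) camera image:
print(decoded_size(3264, 2448))  # (1632, 1224) -- one quarter of the pixels
```

A 32-megapixel image hits the 1/16 floor (scale 4 per axis), which is exactly why 32 MP is the stated ceiling: anything larger can't be subsampled under the 2 MP decode limit.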
Had this problem with a large background image (over 2 megapixels, I believe). The solution we came up with was to split the image in two and then use CSS3 multiple backgrounds on the iPad and iPhone. At the time, those were the only culprits.
I remember hitting a similar limit in webOS. There's a WebKit image-cache setting, set either at build time or startup time, for how large an image is allowed to persist in the internal decoded-image cache. When an image exceeds that, the decoder produces a scaled-down version that's grown when displayed. We could adjust it on our platform with a setting in /etc/palm/browser.conf, but it wasn't adjustable without making the core file system writable and restarting the system manager.
This bug is sort of funny (in the frustrating way) when you're looking for high-resolution wallpapers for the iPad. At least on InterfaceLIFT on-device, you get a 1024x1024 image when trying to fetch the iPad3,x version.
Question: does the HTMLImageElement actually contain fewer pixels, or is it just displayed low-res? Could you take your jpeg, paint it on a canvas using script, and then display it full-resolution?
On a mostly unrelated note, I wish blatantly marketing-driven terms like "retina display" would stop getting thrown around like this. Thanks to Apple's latest marketing adjustment for the iPad 3, it turns out the phone I've owned for over 2 years now actually possesses a "retina display" (N900, for anyone wondering). At the viewing distance specified by Apple, the human retina is in fact capable of discerning a far higher pixel density anyway.
How is he "throwing around" this term? He's specifically talking about the iPad 3. Apple uses this marketing term to signal to the purchaser that they are going to get a certain pixel density. I see retina display, I know I will get a "good looking" display by Apple's standards. If I trust Apple, it means something. That's the purpose of marketing and branding. So what if your phone also has the same pixel density?
Do you have a problem with marketing in general? Is it a problem that Audi makes quattro cars or that Volkswagens have 4motion or BMWs have cars with 'x' in the model number? They only signify 4WD after all.
BTW, iPhone 4 is a 3.5 inch 640×960 resolution (326 ppi). N900 is 3.5 inch 800 × 480 resolution (267 ppi). Not that it matters either way.
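The ppi figures are just the diagonal pixel count over the diagonal size. A quick back-of-envelope check (note the naive 3.5-inch calculation lands within a few ppi of the quoted numbers; the official figures presumably use the exact panel dimensions):

```python
import math

def ppi(w_px, h_px, diagonal_in):
    """Pixel density: diagonal resolution divided by diagonal size in inches."""
    return math.hypot(w_px, h_px) / diagonal_in

print(round(ppi(640, 960, 3.5)))  # iPhone 4: ~330 (quoted as 326)
print(round(ppi(800, 480, 3.5)))  # N900: 267
```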
No, I have a problem with marketing that is intentionally deceptive and can easily confuse non-technical users into believing they're paying for something they're not.
Neither of your car manufacturer examples demonstrate this. However, when a certain US cellular carrier slapped a "4G" sticker on 3G technology simply to make their product appear competitive in the market that was not OK.
Similarly, implying to users the pixel density of their device matches the resolving capabilities of the retina, when actual scientific studies demonstrate otherwise, is not OK with me.
A subset of human retinas can discern a higher pixel density, in the same way that a subset of the human population is capable of running the 4 minute mile. It doesn't make us all professional middle-distance runners.
The majority of people don't even have 20/20 uncorrected vision much less the ability to discern the pixels on a Retina display.
That would be a nice explanation if it actually fit the numbers. That subset of human retinas actually turns out to be 75% of the population. As for average vision, 35% of people can see better than 20/20 with no aids, and once glasses/contacts are added this number jumps significantly.
Apple's displays may well be termed true "20/20 displays" but calling them "retina displays" seems disingenuous.
I was referring to studies made by actual doctors not PR campaigns.
Edit: For those downvoting this I should have probably provided supporting evidence, though oddly no one treated the parent post in the same way. Feel free to follow the references at the base of this article: http://clarkvision.com/imagedetail/eye-resolution.html
It's funny how you send as "supporting evidence" studies that are non-specific to the subject matter, ones that just talk about the eye and its capabilities while never testing the actual "retina display" math and numbers. It's like you point us to some huge volume of "Ophthalmology" and say "I'm right and here's the proof".
Yes, both those articles confirm Apple has created the "20/20 display" not the "retina display" (which turns out to make a difference to a large part of the population). The article I linked to talks about pixel density calculations, I'm sorry if this was too much effort but all the numbers are there for anyone that cares about the actual resolving power of the retina. Surely you're not going to suggest there are serious scientific sources that focus specifically on Apple screens? I thought actual research was what you made your initial comment about.
>Yes, both those articles confirm Apple has created the "20/20 display" not the "retina display" (which turns out to make a difference to a large part of the population). The article I linked to talks about pixel density calculations, I'm sorry if this was too much effort but all the numbers are there for anyone that cares about the actual resolving power of the retina.
We don't care about the resolving power of the retina for some bizarro "Heroes" type people with super-sight, or in the abstract. We care that we can't distinguish individual pixels or nearly can't distinguish them. Which is the case.
That "large part of the population"? Actually minuscule.
>Surely you're not going to suggest there are serious scientific sources that focus specifically on Apple screens? I thought actual research was what you made your initial comment about.
I didn't suggest that in the first place. What I wrote was that "actual doctors and optical specialists" agree that the claims are accurate. Plus, you don't need "actual research on Apple screens": if you know their DPI and the viewing distance, you can work it out from general research on the eye, and it will hold true for any other manufacturer too.
So, here's another actual expert (notice the Ph.D. and vision scientist parts?):
William H.A. Beaudot, Ph.D., is a vision scientist who was a research associate at McGill University in Montreal and is the founder of KyberVision. “In my opinion, Apple’s claim is not just marketing, it is actually quite accurate based on a 20/20 visual acuity,” said Beaudot.
(...) Dr. Soneira also claims in the Wired article that the term Retina Display is more of a marketing term and is “superamplified imaginary nonsense.” Beaudot doesn’t agree. “Since this display is able to provide a visual input to the retina with a spatial frequency up to 50 cycles per degree when viewed from a distance of 18-inches, it almost matches the retina resolution according to the Nyquist-Shannon sampling theorem,” said Beaudot. “As such, Apple new display device can be called without dispute a Retina Display. Could it get better? Sure, but so far this is the closest thing ever done in display technology for the consumer market that matches the human retina resolution.”
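Beaudot's "50 cycles per degree" figure can be sanity-checked with small-angle arithmetic (my own check, assuming the iPhone 4's 326 ppi figure and his 18-inch distance; one cycle = 2 pixels per the sampling theorem):

```python
import math

def cycles_per_degree(ppi, distance_in):
    """Spatial frequency the display delivers to the eye: pixels subtended
    per degree of visual angle, divided by two (one cycle = 2 pixels)."""
    pixels_per_degree = ppi * distance_in * math.tan(math.radians(1))
    return pixels_per_degree / 2

# iPhone 4 (326 ppi) at the 18-inch distance Beaudot cites:
print(round(cycles_per_degree(326, 18)))  # 51
```

That comes out at roughly 51 cycles per degree, consistent with the quote.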
Plus, as a tech geek, shouldn't you be jumping with joy that we have achieved widespread adoption of such high DPI screens, instead of arguing for minor details on their naming, and if they match the retina perfectly or just 20/20 vision etc?
So I had previously wondered about this "minority of the population" claim and I looked up the stats. Apparently 35% of the population has better than 20/20 vision without any corrective measures. With glasses/contacts/etc. that number jumps to 75%. I think it's safe to say "heroes" is a bit of an exaggeration.
I don't know why the expert you're quoting specifies an 18-inch viewing distance when the article you appear to have quoted from was speaking of the iPhone 4; Apple claims a 12-inch viewing distance there, as far as I'm aware. Then he makes some vague statement about it being possible to do better, and that this is simply the closest we've gotten to retina resolution so far. Sometimes academics just want a bit of press time because it makes them look good.
Anyway, it's great we're getting increasing DPI on our devices, I'm just a bit afraid we'll get stuck and not progress to true retina displays because the consumers will be convinced they already have it anyway.