I think he meant ColorSync[1] or ICC support for color management. An image that uses a color space other than sRGB (like Adobe RGB or CMYK) will look different on iOS than it does on OS X.
Apparently, iOS 4 did add ICC support in CoreGraphics for apps to use [2], but it seems to me that Apple’s own apps don’t take advantage of it.
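If an app does want to honor embedded profiles, the ImageIO/CoreGraphics route is roughly the following. This is only a sketch in modern Swift, and convertedToSRGB is just a name I made up: ImageIO hands you a CGImage tagged with the embedded profile, and drawing it into an sRGB bitmap context is what actually converts the pixels.

    import Foundation
    import CoreGraphics
    import ImageIO

    // Load an image honoring its embedded ICC profile and return an
    // sRGB copy suitable for an unmanaged display. (Sketch; name is mine.)
    func convertedToSRGB(_ url: URL) -> CGImage? {
        guard let source = CGImageSourceCreateWithURL(url as CFURL, nil),
              let image = CGImageSourceCreateImageAtIndex(source, 0, nil),
              let srgb = CGColorSpace(name: CGColorSpace.sRGB),
              // Drawing into an sRGB context is what performs the conversion
              // from the embedded profile (Adobe RGB, CMYK, ...) to sRGB.
              let ctx = CGContext(data: nil,
                                  width: image.width,
                                  height: image.height,
                                  bitsPerComponent: 8,
                                  bytesPerRow: 0,
                                  space: srgb,
                                  bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue)
        else { return nil }
        ctx.draw(image, in: CGRect(x: 0, y: 0,
                                   width: image.width, height: image.height))
        return ctx.makeImage()
    }

From there the converted CGImage can be displayed as usual, and it should roughly match what a color-managed Mac app shows, within the limits of the panel itself.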
Calibrating the screen and supporting embedded profiles in images are two different things.
It would be nice if images with embedded profiles rendered correctly on screen, e.g. in Safari on the iPhone, using a canned display profile, on the assumption that Apple has accurately profiled the average iPhone display and that the variance between individual screens isn't too high. From what I can tell, though, the iPhone doesn't do color management at all; the display is apparently close enough to sRGB to count as good enough for a consumer device.
On the other hand, I'd assume the number of people who would want to hook up an external colorimeter and build a profile for their particular iPhone isn't a market big enough for anyone (let alone Apple) to bother with. And since the iPhone has no mechanism to actually apply such a profile, it would only be useful for converting images for the iPhone in a desktop app, but in that case why would you care about your particular iPhone?
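That desktop-side "convert for the device" step would amount to the same kind of conversion plus writing the result back out, something like this (again just a sketch, using ImageIO's CGImageDestination API with the stock JPEG UTI; writeJPEG is my own name):

    import Foundation
    import CoreGraphics
    import ImageIO

    // Write an (already sRGB-converted) CGImage back out as a JPEG so it
    // looks reasonable on a screen that does no color management of its own.
    func writeJPEG(_ image: CGImage, to url: URL) -> Bool {
        guard let dest = CGImageDestinationCreateWithURL(url as CFURL,
                                                         "public.jpeg" as CFString,
                                                         1, nil)
        else { return false }
        CGImageDestinationAddImage(dest, image, nil)
        return CGImageDestinationFinalize(dest)
    }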