This is exactly the wrong attitude, in my opinion. How does it help users to suddenly have most apps feel ancient? Is it really something to be proud of that for the next year we'll be working on replacing existing utilities so that they feel "right" and "fresh," instead of doing what we should be doing: thinking of actual new software that is worthwhile to write?
I've been saying this for a while, but I think what is happening, and what many developers haven't noticed yet, is that we have exhausted the utility of software for software's sake. The interesting stuff happening in mobile now has nothing to do with "design" in the traditional sense anymore. It's not enough to just have a coder and a designer on the team. The really cool stuff is all about what your phone actually allows you to do in the real world. Look at apps like Uber, Postmates, Spotify, and Twitch.tv. Most of these have terrible UIs, but that's not the point. The point is that they allow you to do things. I can have a car on my doorstep! I can listen to almost any song I want. They're not just another calculator app or news reader, so who cares if it's not the prettiest or easiest thing to use in the world. They are an interface to actually useful services. Software was interesting on its own a decade ago, but the industry has grown up; it's time to do things now. That doesn't mean that "UI and UX" don't matter, it just means their definition changes and grows beyond just how you tap things on glass and what pixels you choose to animate.
The reason that iOS 7 seems comforting in the way it's described in this article is that it gives developers who haven't realized this something to do again. Marco is absolutely right: for a long time it has felt like all the major categories have been covered on the App Store. That's a good thing! It means we've solved lots of problems. We shouldn't daydream of a day when those problems get artificially unsolved so we can have another shot at them. We should move on.
I think Marco is saying this is good from an independent iOS developer's perspective, not necessarily from a user's perspective. As somebody who makes his living as an independent app developer, I'm sure he's thrilled about this. It's undeniably an opportunity for independent developers who don't have to maintain an existing app. Whether it's good for users or the ecosystem in general is a totally different discussion.
Indeed. One of the things Marco is glossing over is how the iOS 7 overhaul makes iOS development riskier:
- Will I (and the apps I'm planning) be able to adapt to the UI changes smoothly or not?
- How are these changes going to play out on the iPad?
- Does Apple have any other foundation-shifting changes planned before iOS 7 comes out?
- Will these changes impact user behavior?
- And so on.
From the perspective of an independent developer starting out, none of this matters. Their risk is already off-the-charts (and the downside limited by time and effort, for the most part), so all they care about is the increased reward potential. They don't even really care if that increased reward potential is coming from changes in absolute rewards and/or relative rewards vs other developers.
For everyone else, the risk vs reward calculation might look quite different.
That is, anyone who isn't an independent developer starting out: those with existing apps, and larger organizations that perhaps have other apps which will require additional work just to maintain their current state.
> I've been saying this for a while but I think what is happening, and what many developers haven't noticed yet, is that we have exhausted the utility of software for software's sake...
> Software was interesting on its own a decade ago, but the industry has grown up, its time to do things now.
Smartphones are important because they allow people to use computers in a manner that wasn't feasible in the past because of the size of computing devices.
However, smartphones have not produced any important innovations from a software point of view. They are not innovative because they don't need to be; they are simply the logical consequence of the advances in the miniaturization of hardware.
The market saturation you describe is not the result of a glut of innovation, but rather is just a consequence of developers porting existing desktop functionality onto a smartphone. You can take pictures and use voice recognition software to ask where nearby restaurants are if you carry around a desktop or a laptop; you just probably wouldn't want to.
Actually, smartphones are a good example of how non-innovative and illogical the computing world can be.
Underneath the touch-based interfaces, they run on a kernel written in a memory-unsafe, non-garbage-collected, insecure, and inexpressive language that was designed to target the PDP-11. Because the operating system is so flawed and incompetent that it can't be trusted to prevent rogue processes from stepping on others, you end up with the lack of a usable file system and awful communication between apps. Of course, the joke is that these restrictions are almost worthless, because you can almost always bypass the system security and jailbreak the device; then maybe you can do something useful with your phone.
It's part of a larger trend in the computing world to ignore history, reinvent wheels, and call it innovation. Case in point: web apps, which, unless you are replacing a legacy application, are a terrible idea. Not only will they always face latency issues, but they also use awful languages designed to serve and manipulate text, and are at least twice as slow as native code. Software on smartphones is interesting in the same way that a "Hello world!" program is interesting to a C programmer; the programmer is so surprised that he has fought an unfriendly, unintuitive system full of kludges and managed to output text to the screen without the program crashing and dumping core that he doesn't notice how trivial the output is.
Of course, what we have today is better than nothing. In a way, the massive and unnecessarily resource intensive process that we have now is uplifting because it shows the determination of the human spirit; just imagine what we could accomplish if we invested in the right things.
This is similar to suggesting that a desktop computer isn't innovative because it's just a few peripherals connected to a very fast Turing machine.
Some examples of Smartphone innovation (in my opinion):
At 2:00 AM in bed, when I'm exhausted and realize I have to wake up at 7:00 AM, but I'm tired enough that I might not be able to reach my alarm clock, and I can just flex my thumb and say, "Wake me up at 6:00 AM" before drifting off to sleep - I call that innovation. I don't know if you recall, but there was a time when the big "thing" at hotels was having them give you a wakeup call, so you wouldn't have to figure out how the alarm clock in the room worked (if they had one). I haven't done that in 5+ years. I used to do it every night when I was in a hotel.
I'm out walking at night in Redwood City, and Dark Sky pops up and tells me it's going to start raining in my neighborhood in about 15 minutes - so I turn around and grab an umbrella.
I'm out at a customer luncheon in Singapore, and they ask me about a presentation I did a while ago, on my laptop, and I know Backblaze has backed it up, so I'm able to grab it from my iPhone while standing there and mail it to them - that real-time delivery makes a positive impression. (Replace Backblaze with Dropbox/iCloud/SkyDrive/favorite cloud mechanism that you can connect to from your mobile device.)
Listening to a great German song in a taxi in Aschaffenburg, and I pull up Shazam - and 90 seconds later I have it on my iPhone (and it's now been synced with iTunes Match and is on my iPad and laptop).
Late at night in Palo Alto, and I need a ride home - and Uber tells me there will be a driver at the restaurant in 4 minutes, shows them driving towards me, and makes sure they show up. As somebody who has spent hours waiting for cabs in the middle of the night - I call that innovation.
Walking down the stairs at Southwark in London, and getting the real-time update from TubeMap that the London Bridge station is closed due to a station fire - turning right around and grabbing a cab to an important symposium.
I could obviously go on - but I find it so hard to believe you don't recognize all of this as "innovation".
I think he knows. The disagreement here is that characterizing pure-CS-stuff as "innovation" and the other stuff as not is an incredibly narrow way to define innovation.
It feels like LowKarmaAccount really defines innovation by how far down the stack it is, which is a wholly arbitrary (and IMO nonsensical) criterion.
By that criterion a helicopter is not innovative because it's just a slight rearrangement of a fixed-wing aircraft, an "evolutionary" progression. By that criterion the Internet is not innovative because it was just the natural evolution and consequence of radio and telephony.
droopyEyelids was correct in saying that I was talking about software innovation from a CS standpoint, so I didn't respond to the first reply to my comment. After reading your comment, I realized that I should make it clear that I'm using David A. Wheeler's definition of software innovation [1]. It was linked to on HN a few years ago [2]; it has an excellent listing of important innovations. The list was made to show that the most important computer innovations weren't patented.
Wheeler does include the Internet (internetworking using datagrams, leading up to the Internet's TCP/IP) as an important innovation.
On the page I linked to [1], Section 6 is called "What is not an important software innovation?", which inspired my post in part. I'll quote a portion of it below:
" As I noted earlier, many important events in computing aren’t software innovations, such as the announcements of new hardware platforms. Indeed, sometimes the importance isn’t in the technology at all; when IBM announced their first IBM PC, neither the hardware nor software was innovative - the announcement was important primarily because IBM’s imprimateur made many people feel confident that it was “safe” to buy a personal computer.
"An obvious example is that smartphones are not a software innovation. In the mid-2000s, smartphones rapidly became more common. By "smartphone" I mean a phone that can stay connected to the Internet, access the internet with a web browser capable of running programs (e.g., in Javascript), and install local applications. There's no doubt that widespread smartphone availability has had a profound impact on society. But while smartphones have had an important social impact, smartphones do not represent any siginificant software innovation. Smartphones typically run operating systems and middleware that are merely minor variants of software that was already running on other systems, and their software is developed in traditional ways.
"Note that there are few software innovation identified in recent times. I believe that part of the reason is that over the last number of years some key software markets have been controlled by monopolies. Monopolies typically inhibit innovation; a monopoly has a strong financial incentive to keep things more or less the way they are. Also, it’s difficult to identify the “most important” innovations within the last few years. Usually what is most important is not clear until years after its development. Software technology, like many other areas, is subject to fads. Most “exciting new technologies” are simply fashions that will turn out to be impractical (or only useful in a narrow niche), or are simply rehashes of old ideas with new names."
I wonder whether, even with this incredibly strict definition of "software innovation," anything on a smartphone might still rise to the level of innovation.
I'm thinking of the GeoLocation applications - which just never existed before, ever - like RunKeeper. Or the BodyTelemetry apps, like FitBit. I'm still trying to think if it's possible to categorize Dark Sky as innovation. Or the always-available voice recognition of Siri.
None of it rises to the level of "Software innovation from a CS standpoint" if we look carefully? Honestly asking now.
By LowKarmaAccount/David Wheeler's definition, no. If you browse the list in Wheeler's article, the only things that jump out at the "application feature" level are word processing and the spreadsheet. Clearly, Runkeeper, Dark Sky, and the like are not going to make that list. It really is focused on the "pure CS/EE" side of things, and since smartphone app development is mostly business as usual with a few different quirks, none of it is likely to qualify.
I'm thinking that something like "robust speech recognition and parsing" might fit on there (though it's tough to determine when such a system would be considered mature enough, and simultaneously differentiated from full strong AI). But that's not a smartphone innovation, it just happens to be a good use case.
It really comes down to the definition/interpretation of "innovation," and I think many people (myself included) would feel slighted when certain things are excluded, but I can see how "taking things that already exist and putting them together in new ways or usefully extrapolating on them" (which is what any "innovative" app has done) doesn't really fit. As he explained about the smartphone, something can be world-changing in a very real way without necessarily being innovative.
To respond to the quote about "few software innovations identified in recent times": there's been plenty of innovation in the last decade or so. It might not be quite as fundamental as the innovations of the 20th century - perhaps because all those fundamental things needed to be invented but now have been - but it's there. These are not directly smartphone related, because I agree that most of the innovation in smartphones in particular has been in areas other than computer science, but your posts and the page you linked to are more broadly critical of progress in computer science, and most of the items below are present in or related to smartphones. Anyway:
SMP - parallelism not among separate, isolated computers whose fastest connection is probably an Ethernet port, but among multiple cores in the same die, accessing the same core RAM. Of course someone had a SMP system decades ago, it's not that complicated an idea, but only recently has it become ubiquitous and critical to taking continued advantage of Moore's Law. Although it's fundamentally a hardware innovation, the ways we write multithreaded programs have evolved to take advantage of it - it's only recently that the sort of software used on millions of systems has had to focus on fine-grained locking and lock-free algorithms, rather than slapping big locks on everything with the only downside being latency. And more unorthodox approaches are gaining traction: CSP, though invented 35 years ago, is being rediscovered with Go, various languages have experimented with software transactional memory (e.g. PyPy, Clojure), and Intel's new processors will finally bring hardware transactional memory mainstream, which might finally realize the dream of fast transactional memory.
GPUs - massive parallelism of relatively small, minimally branching algorithms, again on a single chip found in millions of devices; again, a hardware innovation that requires new ways to write software. Yes, I know how old the transputer is, but now it's mainstream.
Virtual machines - a new consummation of the idea of time sharing, innovative in practice if not theory. It's my personal opinion that they're a massive hack: a poor man's multi-user system that accomplishes no more than a traditional kernel could have, with the right operating system design, and with all the kludginess you'd expect from a system based on hacking kernels designed to run directly on the hardware into running side-by-side. But when disk space and RAM became cheap enough that it became obvious that each user of a server should have their own isolated copy of all software, allowing them to install and maintain whatever versions of whatever packages they need, Unix had developed so much around the idea of a central administrator that the new paradigm had to evolve rather than be designed. But who cares? Worse is better - the heritage of Unix and perhaps its conqueror. However it came about, ordinary users of multi-user systems now have more power on average than ever before. Consider the difference between a PHP script hosted on shared hosting and a modern webapp stack. And maybe a new design will come around to replace it all, one of these days.
Closely related, cloud computing - I suppose driven by the decreasing price of hardware. The idea of computing as a commodity is hardly new, but in the last few years it has become a reality: virtual servers can now be spun up and down in minutes, as part of massive server farms provided as a service to a huge number of users, for low cost. This is fundamentally changing the way software is designed: scalability is easier than ever, but it has become more and more useful to write distributed systems that can tolerate internal failure.
HTML5. You can enter a URL and instantly and safely run fast code. Yes, it's just another VM; yes, Java had this a long time ago. But we avoided some of Java's mistakes and CPUs are faster, so who knows, write-once-run-anywhere might become a reality this time.
Sandboxing. We might still be stuck with C and unsafe code, but sandboxing is starting to make software a lot harder to exploit anyway. Software security in general is receiving a lot of attention these days.
Functional programming and other languages with strong static type systems have had a resurgence lately. Going back a bit farther, you could say the same about dynamic programming languages such as Python, Ruby, and JavaScript. There are so many different languages which have taken different paths that it's hard to identify a single clear direction that programming languages have gone in, but there are some commonalities, and they add up to a significant amount of incremental innovation. There is a lot more to say about this, but I'm getting tired and it would take a lot to do it justice.
Ways to write software: test driven development, agile, etc.
> How does it help users to all of a sudden have most apps feel ancient?
It helps users in that the "new design" is an improvement over the "old design".
A case study, the Windows file picker dialog:
Win3.1 dialogs felt foreign in 95, just as XP-style apps feel foreign next to Vista-style apps. There is no debate that from Windows 3.1 to Windows 7 the file picker dialog improved each time: easier item selection, easier reading, easier tree traversal, quicker arrival at favorites...
What is true for this dialog is true for the whole system. rundll32 tabbed preference panes survive in Win8 to this day. They felt like they came from the future back on Win95; today they're old and crufty, alien and inscrutable.
This is not change for the sake of change. Design is how it works. Good design is thorough and holistic. If you only improve part of the system, you end up with a hodgepodge of UI versions with no conventions across the board, which results in overall bad usability and maintainability.
I agree with you, but I think that an occasional design shakeup hits people on a different level, subconsciously. A new look affects the feel of an application, which can actually affect how people view the substantial portions of an app.
That's a valid point, but isn't this a forum of mostly developers and tech-savvy people? While your point stands, I have a feeling that a lot of readers of HN will be looking at this from the OP's point of view.
It seems hypocritical to laud fragmentation when it happens in iOS and decry it in Android. These changes seem similar to the Android differences between Gingerbread and post-Honeycomb. The resulting effect on developers will be the same. iOS 7 has made it apparent which bloggers are unable or unwilling to be fair in their criticism when it comes to Apple. The mismatched gradients on the new icons are beautiful to them, the wireframe and confusing UI elements are revolutionary, and fragmentation is simply creating fertile ground for change. Great.
This is a very good point. As a major Apple fan I completely agree with you. After looking over the iOS changes I see absolutely nothing that is more "revolutionary" than what we have now in iOS 6, or even what Android and Windows Phone offer.
At best, it looks like iOS7 is nothing more than a mix of Android and Windows Phone plus a few extra UI elements.
I am not a big fan of Marco's obsession with drama, but I do respect his opinion and find his take on certain things within the tech industry enlightening. Unfortunately, this post does seem like nothing more than turning a blind eye to what Apple has done with iOS 7.
Combine Marco and Gruber's [1] insight - This is a major house cleaning/forest fire. Someone (Jony, et al.) has stepped back, looked at the confusion that has developed over the last 3-5 years on the iPhone, and has decided to recreate the iPhone as a conceptual platform. The introduction (and hopefully consistency) of multiple dimensions onto the phone, and a cross-system approach towards geometry, means that iOS7 will be a complete re-think of how applications work and behave on the iPhone.
And don't be so quick to say "or even what Android and Windows Phone offer..." - those are awesome platforms, and Apple can learn a lot from them (as Android and Windows have learned from the iPhone). Indeed, the "Cards" approach to multitasking is actually reminiscent of what Palm's webOS did [2] - so Apple has learned from them as well.
I am very intrigued as to how well the multitasking/notification-driven-background/user-interaction-dependent CPU scheduling for background processes will work - that's a pretty innovative solution to the common problem of craptastic background apps that you don't think about sucking your battery dry. [edit: And all the discussions about apps checking in when you've got the radios on, when you are powered up, etc. - I've been dreaming about exactly that for 2+ years. Finally! (Non-ironically)]
Don't kid yourself - iOS 7 is the big one, and it will likely be as ugly as the upgrade from 10.6.8 to 10.7.0 was on OS X. My prediction is that iOS 8 will basically be fixing all the glitches and problems created by this complete platform rethink - but, sometimes, to make an omelet...
Listen to half a dozen developers at WWDC right now - see if, based on what they have seen so far with the DP, they aren't completely rethinking their approach towards how their apps need to work/behave to feel "modern."
I'm not up in the city right now, but from the few podcasts I've listened to - people are already commenting on how "dated" their app feels.
And - with the exception of the Mail/Calendar (Apple Favoritism, at its worst) and wacky "geofencing wakeup in the background techniques", all of our backgrounded Apps on iOS have basically gone to sleep permanently (until user pulled them back into the foreground) unless they were a GPS/VOIP/Music/Newstand app (and even some newsstand apps performed poorly with background downloads - I'm looking at you NYT).
In addition to the 3-D geometry (where I'm expecting lots of slide over sheets as a new metaphor) - I think all developers are right now considering how they can take advantage of an App that they can "wake up" remotely from a notification and perform activity with.
(And as an extra thought: you do not need to go far to imagine what apps with these new functionalities will be like - Android has had them for a long time, and Android apps are basically very similar to iOS ones.)
There is a lot more new than just background downloads.
And it will change how developers think about their applications. Dynamic Type, UIKit Dynamics, and UIMotionEffect will change a lot, as will new navigation/transition schemes.
Background downloads will change apps, sure. But does that really constitute a "complete re-think of how applications work"? It's the same feature, just implementable in a better way.
And people are in shock after years of iOS looking pretty much the same. When everyone actually looks at things objectively, they'll realise that their apps need to look a little different, behave a little different, but overall be pretty much the same apps they were before.
Although that does sound like hyperbole, the point is that they will be doing a "complete re-think of how applications work" - even though the end result will be much the same as it is at present.
You are conflating design and under-the-hood changes. I have seen absolutely nobody complain about the new APIs you mention, even though they might need a few iterations to work - just like 10.7 did. That is fine, that is how Apple has always worked. Apps that were quick to embrace features of 10.7/10.8 have always gotten a solid push on the Mac App Store, for example.
What annoys many users (and every developer is also a user) is change for the sake of change. There is absolutely no technical reason for changing the style of icons on iOS 7. Clear fit right in on the iOS 6 home screen and was a joy to use regardless.
That is true, but do the new APIs offer something that would cause some revolutionary re-think and remolding of the vast majority of apps in the App Store?
Sure, the new APIs might offer some great features for apps, and I am sure they will definitely be used in the majority of upcoming apps, but my problem (maybe not a problem, just a curious observation or minor annoyance) is Marco taking a completely nonobjective stance on the iOS 7 changes. I am completely OK with his opinion, but the blind hyperbole is a bit annoying.
I'm also an Apple fan. I love my Macbook Pro Retina, it's the best computer I've ever owned. Being a fan of a company shouldn't blind you to their missteps when they happen though. It shouldn't lead to unfair application of criticism.
You're absolutely right. Much of the caterwauling hasn't been fair, though. This isn't a final release we are discussing; it's a beta. It's a redesign that has been on the boards for less than 7 months, and it looks that way. Does it need more work? Yes, absolutely. I don't think a single engineer or designer at Apple is sitting back in their chair and thinking "job done." What it is is an interesting direction. Accusations of 'missteps' are off the mark. It's difficult to have a sensible discussion, as the signal-to-noise in any Apple discussion is just awful. Criticism is good; however, most are utterly incapable of offering it constructively, which is plainly bad.
Android would not have a real fragmentation issue if 70% of all users could easily update and did so within 12 months. An important distinction that negates this argument, I think. (I am an Android user for my day-to-day phone.)
But the fact is that they haven't upgraded, regardless of whether that's due to unwillingness or due to unavailability. Willingness to upgrade doesn't matter if you can't upgrade; you're still stuck on the old version.
That is absolutely, utterly subjective and anecdotal. I think blaming Google for iOS 6 adoption not being (even) quicker is grasping at straws that in all likelihood don't exist.
"The Dec. 12 reinstatement of Google Maps on iOS has apparently been enough for some of those reticent users to finally make the upgrade to iOS 6. After achieving 10 million downloads in the first 48 hours available, MoPub, the San Francisco-based mobile ad exchange that monitors more than 1 billion ad impressions a day and supports more than a dozen ad networks and 12,000 apps, says there has been a 29 percent increase in unique iOS 6 users in the five days following Google Maps' release on iOS.
"
BTW, I'm not "blaming" Google; if anything, I would be blaming Apple for having a sub-par map application (as a frequent international traveler, I can tell you it's still not up to par with Google's maps) - but I'm not blaming anyone. Just recognizing it was a pretty significant issue for many people.
Given I've also known a few unrelated people who did this, I believe your blanket dismissal is foolish when it's simple to falsify the thesis: check iOS 6 adoption after Google Maps was released.
> Jelly-bean is out about eight months and is at 28 some-odd percent.
That clearly shows that Android users are happy with the current release and see no need to switch to a newer version because core features are missing completely, as is the case with iOS.
I can't tell if you're being sarcastic. In case you're not, I've been an Android user for years now, and that is most certainly not what it shows. The vast majority of people can't upgrade to Jelly-bean, because their device, manufacturers, or carriers don't support it or won't release the update for their particular phone. For example, the Droid Bionic was on Gingerbread with no way to upgrade (without rooting) until October 2012 [1]. And the upgrade was for Ice Cream Sandwich which was already 12 months old by that time (and Jelly-bean, the next version after ICS, had already been out for 3 months by that time) [2].
iOS 6 is at 93% and it hasn't even been a year yet. Where's Jelly Bean? Who's still stuck writing apps that are compatible with Gingerbread and having to use ActionBarSherlock?
The thing is, a number of developers are just writing for 4.0 and above.
I'm not saying fragmentation on Android isn't bad. What I'm arguing is that fragmentation on iOS is being framed as a good thing (actually a "great" thing) by the author of the post.
You're probably right. But Marco's argument hinges on the idea that:
- there will be fragmentation
- hence it will be difficult for developers to write apps that make the most of iOS6 and iOS7
- hence this is an opportunity for nimble new players to enter the market and capture the iOS7 userbase
I agree that there probably won't be fragmentation, but the point is, Marco is saying that if there were, it would be a brilliant opportunity. Therein lies the mental gymnastics.
I think he's more saying that people will be reluctant and/or unable to completely rethink their apps design and functionality and this will present an opportunity for new developers who build things that are uniquely suited to iOS 7 and beat the old slow-moving incumbents.
"Most [developers] can’t afford to drop support for iOS 6 yet. (Many apps still need to support iOS 5. Some unlucky souls even need to support 4.3.) So they need to design for backwards compatibility, which will be extremely limiting in iOS 7."
That's because you are framing the issue as something it's not. It's not fragmentation in the same sense as Android. 93% of Apple devices are on the latest iOS version. Around 30% of Android devices run the latest major release, and the number is actually quite a bit lower if you start to consider point releases. Even worse is the fragmentation caused by different vendors and chipset manufacturers writing different libraries and drivers. See libstagefright, which has major differences across devices even on the same version of the OS; for example, different native pixel formats are supported by different vendors, so writing any kind of custom-protocol video streaming app on Android is a stone cold bitch to get working across devices at 60 fps, and that's only targeting 4.0+. It gets way worse when you include older versions that may or may not have libstagefright (most 3.0+ devices have it, but a few don't). I'm most familiar with video and audio, but other areas have the same issue.
Contrast that with iOS: sure, different APIs get added, but at least the same version of the OS behaves the same across devices. Sure, for a very short time iOS will be split between versions, but that period will be exceedingly short, and you'll never have the true fragmentation problems in iOS that Android has.
He's not saying it's a great thing for users - he's saying it's a great thing for developers looking to enter a crowded space, because there will be some room to enter the market targeting only iOS 7 while established players are figuring out how to juggle the multiple OS versions.
I'm not necessarily disagreeing with the author. Marco talked down on Android for years because of its fragmentation. It's just hypocritical that it's only good if it happens on iOS.
In one of the Google IO talks, Google has been encouraging developers to develop only on 4.0 and above. To quote, "go to where the puck is going" and "develop the best app possible for every phone".
The point that others are making in the thread is this:
If you are an independent iOS developer looking to hit a majority of devices, just develop for iOS 7 - pretty much everyone will be on that in a year.
If you are an independent Android developer looking to do the same - you need to target at least ICS, Jelly Bean, and Gingerbread to get 90%+ market share. (And with Gingerbread support comes ActionBarSherlock and other fun stuff.) I'm in the middle of this right now for my app, and hating it.
iOS's 'fragmentation' between iOS 6 and 7 is totally different from Android's because it's temporary. There'll be a few-month period where established apps haven't updated their UI. iOS has insanely quick (in comparison to other OSes) adoption rates: 15% of devices upgraded to iOS 6 within 24 hours of release, 50% two weeks out, etc. [1]
So there will be apps with the old UI for people running iOS 7, who will decide to look for a new thing to do X for the first time in years - and thus a market opportunity. Whereas with Android, the problem is that phones overwhelmingly remain on the OS they shipped with. It's not a matter of waiting 3 weeks for a fitting UI; it's that you have to develop for a 1-3 year old OS or write off half the market.
[1] Lost the URL because I'm writing this on iOS 7 and it's pretty goddamn buggy, but it looks like others have provided it.
"iOS's 'fragmentation' between ios6 and 7 is totally different than androids because its temporary."
Says who? This change is much more major and, despite what Gruber and Marco would like to have you believe, is not design perfection personified, and is likely to be polarizing.
Let's forget about the z-axis and new interface updates, there are some major new features in iOS 7 that I'm sure a lot of people will want to upgrade to within the first month. (I predict faster adoption for 4/4s/5 users than we saw in iOS 6)
In decreasing order of importance (my opinion)
o Pre-launch background updating (Finally!) - I spend 3-5 minutes every morning, and another 3-5 minutes every night, downloading my podcasts before my walk home. Annoying. Now, in theory, they'll download for me in the morning and at night. Awesome - particularly as my iPhone 5 is happily sitting in an elevation dock at 100% power for most of the day, or plugged in at home for most of the night. It will be interesting to see what developers do with notification-launched background apps - hopefully it won't be abused. (See the sketch after this list for roughly how the fetch hook works.)
o AirDrop for iPhone (Finally! How many times have I wanted to get a file/image/content off my iPhone onto my laptop just before a plane took off?)
o Swipe Control Center - another "Finally" - getting closer to Android parity/rooted-iPhone parity - I tweak the brightness/lock rotation/pause/play music and settings pretty often. This will lengthen the lifespan of my home button (currently double click + swipe left to get lock + music settings).
o Enhanced Camera/photo management - looks really nice - lot of people will like this - even the die-hard Camera+/Instagram/Flickr/Google+ photo types. I do feel kind of bad for Camera Noir - their window was pretty meager. :-(
o iTunes Radio - particularly for all of us who already subscribe to iTunes Match.
o Advanced Siri - I use Siri many times a day, looking forward to this.
The reality of Mobile Device evolution is that four-mobile-phone-years is roughly equivalent to ten-desktop-years or six-laptop-years.
And, let's be clear - it's only a prohibitively long time if you wish to continue to get the new feature-release platforms. Indeed, it's only today, June 11, 2013, that the original iPhone, released in 2007, is being obsoleted by Apple [1], a full five years after it was discontinued.
Is there another company that is updating 4 year old phones? Is there an Android phone released with 2.2 that can run 4.2? I think Apple's backwards compatibility track record is the best there is, but here's to hoping for better!
Not exactly a Google update, but the open source nature of Android makes efforts like CyanogenMod possible. They support many old devices.
My Motorola Defy came with 2.1 Eclair. Runs 4.2 as of now, thanks to Cyanogen. I keep it because it's built like a tank; all phones should have IP67 certification.
Still, bear in mind that it's a pocket computer, built in 2009, with a 600 MHz processor and 256 MB of RAM. Modern pocket computers have at least twice the processing power, two (or more) cores, and 4 times the RAM. They can also offload a whole lot more to the GPU. It was going to hit a ceiling eventually.
If you look at the devices cleared for iOS 7, it's almost definitely a RAM overhead issue. All the 512 MB+ devices (4, 4S, 5, Touch 5, iPad 2+ and Mini) are getting iOS 7. None of the 256 MB devices (Touch <5, iPhone <4, iPad 1) are getting it.
Does it suck? Yeah, definitely. At some point, though, you have to make the tough decision and say, "We have to compromise on either our vision or supporting older devices." When you get to that point, the choice isn't too hard. Is there anyone out there who would really argue that watering down iOS 7 is worth the increased support? (Especially with the US cellphone market and its subsidies being what it is?)
I feel for iPod Touch 4 owners. Like another poster said, those were on sale a few weeks ago. Every other unsupported device is terribly old, at least as far as the pocket computing world is concerned.
No need to spend hundreds of dollars; just sit behind the curve a little bit. Spent $68 on an Optimus S with Ting a year ago, and it gets everything I need to do done at less than the cost of my Verizon dumbphone contract. It's as fast as the computer I took to college not that many years ago, and I got a lot done on that. Battery runs for days too, even with considerable use.
Looking forward to upgrading to an S2 or S3 when prices drop to frivolity. A larger screen will be nice.
Seriously? Four years is quite a long time in today's tech world. Think if you had a four year Android or Windows (whoops) phone if that was a reasonable thought. Bottom line - if you think 4 years ISN'T a long time for a hundreds of dollar device then there's not much hope for you.
The reality is that you're in a vanishingly small minority.
And I'm not hand-waving that notion either. I write apps and I've seen device usage numbers for a number of apps across a number of verticals - 3GS usage really is a vanishingly low number, well below 1%.
Devs don't have anything against old phones, we don't care how often you give Apple your money. If enough people still used the 3GS, we'd support the 3GS.
When you're in the sub-1% group, without some contract guaranteeing support, expecting support is unreasonable.
And Apple didn't become the most valuable company in the world by not playing to the consumer-driven culture, making sure there is a good reason for people to drop hundreds (or thousands) each year on updated smartphones, tablets, and laptops.
It's not like your device will stop working the day iOS 7 is released. Your phone won't have any features taken away, and you can keep using it until the hardware fails. (I'm still using my Nokia N900.)
A lot of iOS 6 features rely on Apple infrastructure, e.g. iCloud, which will probably become even less reliable on iOS 6 as the months roll by. So no, the device becomes less useful. Try using iTunes on Lion if you don't believe me.
> those not so old iPhones that are stuck on iOS 6 forever.
Of all the devices supported by iOS 6, only the iPhone 3G S and the 4th gen iPod touch won’t get the upgrade. The iPads that could run iOS 6 will also be able to run iOS 7.
The iPhone 4 has been out for 3 years and will be able to run iOS 7 when it is released later this year. How many 3 year old Android, Blackberry and Windows Phone models do you know of that can run the newest version of the OS?
(This post was edited to reflect that the 4th gen iPod touch won’t get the upgrade to iOS 7. The ‘iPod touch 16GB’ was listed on iOS 7’s page, I understood that to mean the 4th gen model, which also came in 16GB capacity.)
>The iPads and iPod touches that could run iOS 6 will also be able to run iOS 7.
Not quite, at least as far as the iPod Touch goes. iOS 7 will only run on the 5th generation iPod Touch (see http://www.apple.com/ios/ios7/features/ - bottom of the page), the version with the 4-inch screen.
Apple was selling the 4th generation device up until May 30 of this year. In other words, there's people who bought this thing two weeks ago who will not be able to upgrade to iOS 7.
This is mostly a good point, but is missing an important difference. A lot of important Android APIs are provided as part of Google Play services and made available back to Android 2.3, whereas an iPhone 4 running iOS 7 still will be unable to use Siri.
I understand your point, and it’s a good one, but iPhone 4 isn’t allowed to run Siri because of hardware reasons, not API support. The noise reduction in newer models is a lot better than on iPhone 4. Hackers have managed to get Siri running on iPhone 4, it just doesn’t work as well.
That would be the 3GS. I had mine for 3 years before I upgraded to the 5. It felt like a pretty old phone. What are the odds that people who have a 3-4 year old smartphone really want to use a lot of apps?
I have a 3GS. And even though I've had it for so long, it still doesn't feel dated to me. Apple did a good job with the hardware and the software support. It is finally starting to feel a bit sluggish after almost 4 years of very heavy use. Compare that with my old phones which I essentially threw away after a year or so and never missed them...and the crappy Razr that died after a year.
With each successive generation selling more than all previous generations combined, it's not a big issue for long. Obviously this pattern can't last, and I'm not sure if it holds for the iPhone 5 even, but it's a big factor in the equation.
Edit: removed bad link to old article. I can't find the Asymco graph I'm after, sorry... Maybe it was someone else's?
Fragmentation implies "fragments". iOS 6 has a 93% install base. An ever-shrinking 7% isn't much of a fragment to worry about. iOS 7, due to its new-shiny factor, will likely see even faster adoption than iOS 6.
It really helped that Apple introduced over-the-air updates and PC Free (activation without a computer) in iOS 5.
iOS 7 introduces automatic updates, but I don’t know whether that’s only for apps or for the OS as well. If both, then versions past iOS 7.0.0 should have an even faster adoption rate.
I think you forgot the part where Android fragmentation leaned more towards the 3 year old version. 2.3 is still huge. If Android had simply evolved fast and all the new phones had the new Android the problem wouldn't be so bad.
I don't know about "dropping fast" - it's slowly dropping away, even after they changed how they gathered statistics to only count devices visiting the Android store.
And really, I'm not entirely shocked that a Twitter-owned service is getting more sharing on Twitter compared with a service that gets a reasonable amount of obstruction from Twitter.
That doesn't mean people will riot if it isn't available on their device. If you've got Android 2.x and can't get Vine, you likely don't really care as you haven't ever used Vine.
I think you forgot the parts where even devices with a 3 year old OS [1] regularly get updates of many APIs that have been (sensibly) decoupled from the OS, have access to substantial compatibility libraries and so on.
[1] By which I mean Froyo, Gingerbread doesn't hit its third birthday until December (at the earliest).
"iOS 7 is different. It isn’t just a new skin: it introduces entirely new navigational and structural standards..."
Beyond the parallax effect, what are these new navigation and structural changes? I'm not trolling here, I'm genuinely curious. I'm about to build a new iOS app and I did not see major navigation or structural changes that would drastically affect how I design an app's UI or UX.
Yeah, I think this is somewhat of an overstatement. UINavigationController? Still there. UITabBar? Still there. UIHamburgerController? Not there. Another radically different way to organize UIViewControllers? Nope.
That being said, there are a few new things that do stand out:
1) Everything is supposed to be fullscreen as often as possible, with the chrome hiding as much as possible. "Deference to content."
2) Transitioning between view controllers should ideally "zoom" in to new content - like the Calendar app zooming into a day view from a month view, or a year's worth of photos zooming into a collection of chronological "collections." In other words, your transitions should communicate _how_ the view controllers relate, rather than just that you're replacing one with another. (See the transition sketch after this list.)
3) Text Everywhere. (only new on iOS)
4) More abstract but also more physical (read: mimicking the laws of nature, like physics, rather than mimicking things humans have made, read: skeuomorphism)
One change highlighted in the keynote presentation: instead of tapping the ‘Back’ button on the upper left corner of a UITableView layout, in iOS 7 you swipe horizontally from the left edge of the screen.
Excellent point on swiping horizontally instead of a "Back" button. That's a great change, and one I already try to incorporate, as the dedicated header bar is (or was) distinctly iOS.
If I'm building a cross-platform app (or have the intention of going cross-platform eventually), I try to steer clear of any interactions or UI elements that are too obviously from one platform or another.
I really appreciate apps that have their own distinct look and feel and don't use "proprietary" UI features (like the header bar and back button). The most recent app that I've admired is Dots. It doesn't look Androidy or iOSy. It looks like its own thing. The team behind that did a fantastic job.
I agree that Dots is great, but it's a game - the UI is by nature specifically tailored to the application.
If I'm using a news reader or productivity app, I hate having to spend time learning a novel interface. The more conventional the UI, the more easily and quickly I can use it. Unless uniqueness adds a tangible benefit, it's just a pain in the butt.
FWIW, I'm using the iOS 7 beta, and this behavior is inconsistently present. It works in nested lists, but other screens, e.g. the music Now Playing screen, don't do it at all. Hopefully it gets OS-wide integration.
iOS 6 added a swipe gesture on all navigation bar back buttons, and even in 6.1.4 it doesn't work in Apple's Messages out of all apps. I would be surprised if this new gesture ended up working consistently.
The Back buttons have to be one of the weakest elements of the current UI. I know from experience that less-computer-literate people are blind to them. We'll have to see how easily they pick up on the new conceit.
There's a lot of awesome new multitasking and background stuff you can do now. It's a mix between the UI changes and the underlying parts that I think Marco is referring to.
You're probably just being obtuse, but: he just means the principle of things coming from somewhere or going to somewhere, as someone once put it. Reinforcing conceptual relationships via physical movement and manipulation that mirrors how things may behave in the physical world, to some degree. See zoom-in/swim-into interfaces, or the way screens transition rather than just blink into view, etc.
SpriteKit is going to be huge. Expect about a year of getting developers ready with it, and good at it, then Apple TV may have something awesome for us.
Compared to Cocos2d, I'm not that interested in Sprite Kit as a framework. But combined with the new gamepad APIs, I think you might be on to something in regards to the Apple TV.
Interesting suggestion - I was surprised to see such a direct competitor to cocos2d, and despite the development budget of Apple I'm a little skeptical of their commitment to building a game engine. Maybe it'll form the basis of a better set of game engines. I'm definitely skeptical of their physics engine over Box2D.
I'm getting pretty tired of blog post titles that give no real hint at all as to the topic. My mind labels them as "pretentious" because they're pretending to be deep when they're not; by that I mean that rarely do the posts offer up any kind of non-trivial insight. (Compare with PG's similarly titled essays.) Earlier today we had "Two i's" from DHH's crew (where you had to infer from reading the post that it was i for interesting and i for important). Dustin Curtis recently had "Glass". Please, you will still be the coolest people on the planet if you don't try to title your posts per the ineffable style of Apple product advertisements, and as a bonus we will even like you and your work a little bit more.
You would, of course, prefer something like, "iOS 7 reset creates new opportunities for Developers" or "iOS reboot creates conditions ripe for disruption" or, ....
Honestly - "Fertile Ground" is a pretty damn good title. I can already imagine people casually discussing it at WWDC: "Did you catch Marco's 'Fertile Ground' post?"
I would be happy with "Apple's iOS 7 is Fertile Ground", "Hands-On with Google Glass", and "Two I's for Productivity: Interesting and Important Tasks".
Of course, these have all lost their Zen master cachet, but for me that's a distinct advantage. Good writing doesn't need to be cool.
He has a lot of titles like, "Mountain Lion" and "Walter Isaacson’s ‘Steve Jobs’ " and "You Do Not Need to Manually Manage iOS Multitasking " but he also mixes things up (in a positive manner, I would argue), with, "I’ma Set It Straight, This Watergate " (about iBooks Author not following standards for eBooks), and "Get the Fainting Chair" - about Google being shocked by Apple not renewing their license for Google Maps on the new iPhone.
I guess it's unfortunate that Blogs don't have subtitles.
Marco's not attempting to communicate clearly in the sense of "push data down foobarbazqux's infohole." He's trying to communicate clearly in the sense of "grab foobar's attention, state his perspective provocatively, and attempt to not only reach an audience but make them think and talk."
Clearly he's succeeded, judging from the conversation going on in this post. Marco is occasionally pretty obnoxious, and his ideas are frequently not well thought out, but I like writers who attempt to be not only clear but compelling.
Paul Graham has plenty of terrible essay names, and plenty of terrible essays at that. One of the things that frustrates me about him as a writer is that on occasion he attempts to strike an "objective" tone while offering a skewed and entirely subjective perspective. Besides, conveying pure abstract information should not be the point of an essay. Tone, emotion, purpose, and construction should all be deliberate. Otherwise you get Mashable—ultraspecific titles geared to ultraspecific articles which are so drained of anything beyond pure bullet-point content that you could train a machine to read and interpret it. It's such a waste.
It takes you two seconds to click on a link and look at it, ten seconds tops to decide if you're going to gain something by reading it. If you're clicking on so many links per day that twelve seconds here and there is putting a dent in your productivity/well-being, then there is a worse problem here than ambiguously-titled essays, and ironically, it's a problem that you'll start to solve by seeking more challenging pieces of writing to tackle to distract yourself from the constant useless information mill that the Internet so readily provides.
To be honest, I only clicked on Fertile Ground out of a yearning for self-punishment. I knew exactly what kind of article to expect from a title like that, it's just that I like to complain when it happens that I get what I want and I also don't like what I want. The sense of superiority makes me feel a bit better, kind of like those cranky old guys that think most everyone else is an idiot. The main difference with PG seems to be that I have to read 5000 words before arriving at that all-too-familiar sinking feeling. Well, I hope I've contributed to the overall cynicism levels around here, and I wish you an absolutely rotten day.
I was only complaining because I literally had no idea whatsoever that it was about Apple, never mind iOS 7, and I wouldn't have opened the link if I had known that. Is "Apple's iOS 7 is Fertile Ground" really that atrocious?
I can understand how given the context of WWDC yesterday it might be more obvious, and that there is an argument for treating blog posts as ephemera, but in general I like it when things can maintain their meaning for months, years, or indefinitely.
Look to other operating systems that evolved their UI in similar fashion and a few of their dominant software players over the years:
Windows: Office, QuickBooks, Quicken, IE/Chrome/Firefox/Netscape (which have shifted in favor over the years, but not because of UI changes in the OS)
Mac OS (X and classic): Adobe Photoshop, Illustrator, Pro Tools, Office
UI changes, even major ones, have had little to no effect on the dominant software titles for those systems. There have occasionally been new categories of software introduced. For instance, high quality video editing software for the home market, which was made possible by better home cameras and major advances in speed and resources of home computers. Pervasive internet allowed the browser wars to happen. It wasn't minor UI changes in the OS that allowed new players to come onto the scene, it was major technological advances.
If you go back far enough, you can argue that the change from command line to GUI allowed for just such a revolution as described (it definitely did: WordStar/WordPerfect lost to Word, Lotus lost to Excel, AutoCAD nearly lost its throne, etc.). But nobody in their right mind is arguing that iOS 6 to iOS 7 is the difference between DOS and Windows 3.1, or between an Apple IIe and a Lisa or the first Macintosh.
History isn't always the best indicator in the tech industry, but in absence of other indicators, I'll bet on history repeating in some form.
The chrome around something like Photoshop is a small part of the entire experience, and unlikely to change a purchase decision. This is not true for many smartphone apps, where the look and feel is one of the key features and selection criteria. Apps that don't update will feel crappy and old and that'll put them at a major disadvantage.
This will be a chance to catch any "lazy" dominant apps if they don't upgrade soon enough. I'm not sure if there are any dominant players that are lazy - seems like you'd need to keep on your toes if you're gonna stay dominant. So not a whole lot might change, anyways.
There were some notable changes in the Classic-to-X transition: QuarkXPress replaced by InDesign, BBEdit replaced by TextMate. Probably a few others, too.
Adobe's tools stood unchanged in the X switch, in no small part because they got X versions out pretty quickly that felt relatively at home. Quark, IIRC, took forever to make an X version, so Adobe was able to bring out an X page layout app and eat their lunch. Of course, it didn't hurt that everyone pretty much loathed XPress, especially its heaviest users...
While I agree there were some shakeups, I think it also proves my "major technological change" point. Mac OS to OS X was a major technological change: it went from a pretty but clunky single-tasking OS to a quite advanced UNIX-based multitasking OS. UI was the smallest of the changes that happened there. There was also a major processor and architecture change in that story line. Big differences. Having a native app was mandatory to take advantage of a huge jump in capability.
I hope I'm not in some crazy minority here, but I actually value stability and UIs not changing radically all the time. I feel like there's a craze afoot at the moment to "redesign everything all the time" and I'm not a big fan of it. If you're in tech, sure, it's the "price of progress" but for the lay user (i.e. the people who pay us money for software) it's just annoying.
That was my thought too. I'm a bit agog at seeing a developer lauding a platform owner for burning down the platform. If you're going to invest time and money in a platform, the last thing you want is to find out after that investment has been sunk that the platform steward likes to periodically smash everything.
Marco argues that while this behavior is bad for incumbents, it's good for new entrants, but that doesn't make any sense; if you're a new entrant, your goal over the long term is to become an incumbent. Blowing everything up may benefit you today, but if you survive long enough to see the next demolition spree, then you'll be the one getting 'sploded. It's like arguing that living next to a volcano is good for development because it periodically clears out old building stock.
A good platform is one you can build a business on. Sudden, dramatic change that flushes your investment to date down the toilet is bad for business.
A platform that is popular or has the most apps is not necessarily the "best", as Windows proves. I'm not sure that Apple ever wanted 750,000 apps on iOS. It seems that they should prefer 100,000 great apps to 750k mediocre ones.
The iPhone platform was designed between 2005 and 2007, and while it has served its purpose very well, I don't think we should limit innovation for the sake of supporting old apps.
I think his point was not that the platform is being "burned down" but that it is going to "arise anew, like a phoenix from the ashes" - it's not like iOS will be gone in three months, but it will be markedly different.
Also - he was highlighting that this will suck somewhat for incumbents (which, having sold Instapaper and "The Magazine", he is not), but that it offers great opportunities for people willing to fully commit to the new iOS 7 worldview (who in turn may be wiped out in another 8 years - but hey, 8 years is a long time).
Regarding "If you're going to invest time and money in a platform, the last thing you want is to find out after that investment has been sunk that the platform steward likes to periodically smash everything."
Heck, based on a small sampling of anecdotes from iOS developers I know, I'd bet 90% of all iOS developers make the majority of their income on an app in less than six months, fewer than 10% see significant income over a full year, and less than 1% have multi-year income from an app sufficient for a comfortable six-figure salary. The nature of iOS application development is different from desktop application development, and for most (excluding the top 5,000 apps), the real money is in new applications.
So - yes, this may be jarring for that 1%, but the other 99% have a great opportunity opening up for the next six to nine months.
I agree with you. The iOS aesthetic, though imperfect, was stable and quite nice. In my opinion, Apple should have merely eliminated some of the more egregious skeuomorphisms and brought their apps more in line with the semi-flat look that the OS has had for years. This sort of total visual upheaval is superfluous, if not detrimental. From what I've seen, it looks insultingly bad.
There are better ways to allocate development time than reimplementing an already-fine OS.
I have to disagree. Inside-app navigation in iOS is, truth be told, bodgy and confusing. Throw on top of that the need to handle notifications and app intercommunication and multitasking better and some systematic rethinking was really needed. I agree that graphically the new "commercial-art retro-chic meets skinny Helvetica" look feels like hard going - "it's sorbet and Love Hearts for dinner, everyone!" - but hopefully they'll have tidied and calmed things a bit by launch.
I agree that there are myriad functional issues with iOS. The actual look and feel is not one, though, which is more upsetting considering that their iOS team could be spending time fixing more pressing problems in OS functionality.
Just because a change was necessary doesn't mean this change was necessary.
I'm still flabbergasted that Apple did their big overhaul (apparently) without addressing inter-application communication, default apps, Siri APIs or most of the other "power user" issues that would have helped them grow their computing paradigm. To me, increasing the scope and power of iOS devices while maintaining the predictability and control that are at the heart of Apple's take on mobile/touch-based computing is the problem Apple has to solve over the next few years. Judging from what I've heard about iOS 7, it sounds like Apple couldn't disagree more.
There's a huge tendency for designers to overstate the importance of "looking modern" as an end in itself. A lot of people honestly wouldn't care if the buttons on their phone were modelled to look like 3D photorealistic rainbow poop. The designers tell them they don't need cases, but they all buy the most garish, hideous cases - to say nothing of the bedazzler people.
Sign me up as skeptical re: the coming app store revolution.
I noticed something curious on page 10 of Apple’s iOS 7 Transition Guide[1], in the section “Things Every App Should Do”:
“Examine your app for hard-coded UI values – such as sizes and positions – and replace them with those you derive dynamically from system-provided values. Use Auto Layout to help your app respond when layout changes are required.”
Now, I may be reading too much into this, but the use of the word ‘when’ sounds to me like Apple is preparing products with resolutions other than those of the current iOS devices. I think there would be a market for a budget iPhone with a smaller screen, and a high-end iPad with a larger screen.
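To make Apple's guideline concrete, here is a minimal sketch of trading a hard-coded frame for system-derived values and Auto Layout (modern Swift for readability; the helper name and its metrics are illustrative, not from the transition guide):

    import UIKit

    // Hypothetical helper: lay out a title label without hard-coded
    // frames, using a system-provided font metric and Auto Layout so
    // the result adapts to whatever screen sizes ship next.
    func addTitleLabel(to container: UIView) -> UILabel {
        let label = UILabel()
        // Instead of: label.frame = CGRect(x: 20, y: 30, width: 280, height: 24)
        label.font = UIFont.preferredFont(forTextStyle: .headline) // system value
        label.translatesAutoresizingMaskIntoConstraints = false    // opt in to Auto Layout
        container.addSubview(label)
        NSLayoutConstraint.activate([
            label.leadingAnchor.constraint(equalTo: container.leadingAnchor, constant: 16),
            label.trailingAnchor.constraint(equalTo: container.trailingAnchor, constant: -16),
            label.topAnchor.constraint(equalTo: container.topAnchor, constant: 8)
        ])
        return label
    }

Nothing here depends on the device's point dimensions, which is exactly the property that would survive a new screen size.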
Also, the icon size for apps is different in iOS 7. It’s going up from 114x114 to 120x120.
Unless Apple has significantly improved new app discovery on the App Store, I don't think we are going to see any changes.
The algorithms that Apple uses for the top lists promote established players, the search function sucks, and the interface for scrolling through lists of apps is so slow and clunky that it discourages users from exploring beyond the first 5-6 results in any list.
So when Android is fragmented, it is awful and is the reason the platform sucks. When iOS is fragmented, it's innovative and fresh, masterfully executed to bring new opportunities to developers. Got it.
How do you even draw such a connection in your head? Nobody once, ever, has said that the new UI that came with ICS was a bad thing. The problem is that even now, years later, the most in-use Android version is three years old.
Android fragmentation is about VERSIONS. APIs, bug fixes, security updates, and available features to a lesser extent. It has NOTHING to do with look and feel. At all. This tactic from Android fans is sad. Just expand the definition of fragmentation until it no longer means anything.
Ridiculous. When 90% of devices are using the latest OS within a year of release, trying to call that fragmentation is complete bullshit. No platform in history has been able to tout those kinds of upgrade numbers, and it's a distinct advantage for both users and developers.
Fragmentation is about your userbase being split across incompatible platforms and the paralyzing effect that has on development, which is precisely what Marco was lauding as a good thing in this post.
I did, before it hit Hacker News. The point he was making is that this is a massive look and feel disruption, and will be a rough transition for major players that have an established but now dated look and feel.
That isn't fragmentation. iOS 7 is still compatible with apps written for iOS 6 and iOS 7 apps will still work with iOS 6. The same cannot be said for ICS apps running on older devices. Even in-house apps like Chrome are ICS or bust. Even now, multiple years later, the most popular version of Android won't run Chrome or any other ICS exclusive apps.
However, iOS 7 apps will LOOK & FEEL out of place on iOS 6 and vice versa, which is where the disruption occurs. This isn't fragmentation, and no amount of arguing will make that the case.
On top of that, with the notable exception of the iPod touch 4, all iOS devices sold for the last four years will run iOS 7. That means this period of disruption will last for six months to a year, max. Once again, no fragmentation.
That is not what he's saying at all. He's saying there is disruption and an opportunity now, not that continued disruption (fragmentation) is a good thing!
For what it's worth, I think he's right about it being an opportunity. If you can throw away some percentage of your potential market and leave them to your competitors, you might be able to work that to your advantage. This isn't really a novel concept; every major change in every major ecosystem undergoes a similar period.
I mostly take umbrage at the specific attitude taken towards this, when the same move in similar ecosystems (the ICS/JB upgrade, for example, which saw a huge market open up for "Holo-themed" apps) was mocked and derided as pulling the rug out from under users' feet.
iOS' consistency has been tirelessly lauded as a good thing, until Apple goes and changes it. I'm happy with progress and change, and am fine with the broken eggs required to make that particular omelette; I just think it's funny how the pundits' headlines change based on how their particular horse is doing.
> I mostly take umbrage at the specific attitude taken towards this, when the same move in similar ecosystems (the ICS/JB upgrade, for example, which saw a huge market open up for "Holo-themed" apps) was mocked and derided as pulling the rug out from under users' feet.
I would love to see such an example. As a follower of many Apple themed blogs, I saw nothing but good thoughts directed towards the release of ICS, which was a sorely needed UI revamp. I don't remember anything even close to this sentiment being expressed.
Perhaps I just read too much HN. ;) I'll see if I can dig up a few examples.
Edit: After 15 minutes of googling, I'm unable to find an example to back up my assertion; I withdraw it. I'm still pretty sure it's out there, but I won't ask you to take my word for it. :)
I'm a big fan of the general visual appearance and behaviors of ICS and Jellybean! After Froyo and Gingerbread, it was a relief. With all the Project Butter work, Android has become much better. I wish they would block carriers from messing up the UI.
iOS is a lot different from Android. I don't think people realize how conservative Apple is. In the olden days, Apple was very late to the table with basic OS features like memory protection. I don't really see iOS 7 as fragmentation so much as a Gingerbread->Ice Cream Sandwich-style transition, where Apple realized they were off track and needed to correct course.
Android Fragmentation is a problem because what version do you write your application for? 2.3? 4.0? Ideally 4.1/4.2, but that will miss 70% of your market who CAN'T upgrade because their Telco is not offering 4.1/4.2. Not because their hardware can't support it.
Anybody on an iPhone 4, 4S, or 5 who wants an iOS 7 app, can have it within a day of the iOS 7 release.
The downside of offering only one major phone a year is that you don't occupy much shelf space in a Best Buy or AT&T store - and someone casually looking for a phone, instead of a particular phone, is 90% unlikely to choose you.
The upside is that when you offer a new OS, everyone gets to upgrade.
No, you write for the minimum API level your app requires. In a lot of cases, this is 2.3.3! If you want to target a wide audience and still use some 3.0/4.0 features, there are nifty polyfills available, too. Google is further improving this by extracting many of the super useful APIs out into a version-agnostic package that is updated independent of carrier say-so. Your API level 10 apps will run quite happily on Gingerbread, Honeycomb, and JB devices. Furthermore, Google supports multiple APK deployments, so you can actually build your application with different features for different API levels, and publish them under the same entry in the Play store; each device is routed to the proper APK.
However, despite all of this, you are talking about the exact same thing that Marco is. He says you "ideally" want to target iOS7, but you can't afford to do so if it means cutting out your iOS6 users. His conjecture is that since the established players are stuck on iOS6 (Gingerbread), the up-and-comers will have a shot to disrupt the market among the iOS7 (Jelly Bean) users. The assertion, flat out, is that "the big players being unable to upgrade their iOS6 software to meet iOS7 standards due to legacy support requirements is good for the app ecosystem", which is a pure RDF spin on the whole fragmentation issue that the Apple ecosystem loves talking about so much.
Marco's baseline assumption, if you read his full argument, is subtle. While it contains a few elements (as you've cited) of legacy support concerns - his principal thrust is not so much that you "can't afford to do so if it means cutting out your iOS6 users", rather, he primarily postulates:
"I don’t think most developers of mature, non-trivial apps are going to have an easy time migrating them well to iOS 7. Even if they overcome the technical barriers, the resulting apps just won’t look and feel right. They won’t fool anyone."
That is - the legacy app writers are going to have a huge challenge in trying to make their iOS 6 apps look "good" on iOS 7 - this is not a problem for someone starting from a greenfield scenario.
I really do think I get the thrust of his argument. I just don't think it holds up.
Let me ask this - why wouldn't app makers just migrate their apps to an iOS7 look and let the (by the arguments in this thread) vanishingly small non-iOS7 contingent deal with something that doesn't fit the theme of their system? If 90% of your userbase is going to be running iOS7, wouldn't it make more sense to just let 10% of your userbase have an app that doesn't feel like it matches the system (but definitely looks nice) rather than risk someone coming along and making a nicer-looking version of your app to steal the other 90% of your userbase?
The only way this becomes an actual talking point is if there is some reason that vendors can't upgrade their software to iOS7 guidelines, and that's the point at which you begin to experience fragmentation.
>Let me ask this - why wouldn't app makers just migrate their apps to an iOS7 look and let the (by the arguments in this thread) vanishingly small non-iOS7 contingent deal with something that doesn't fit the theme of their system?
It's not really about iOS 7 guidelines, it's about iOS 7 APIs, and you can't use iOS 7 APIs on iOS 6 devices. So if you still want to support iOS 6, your code has ifdefs everywhere and is a mess.
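To make that concrete: in practice the gating is often runtime feature detection rather than literal ifdefs. A minimal sketch in modern Swift (the parallax effect is just an illustrative iOS 7-only API; the helper name is mine):

    import UIKit

    // Sketch: add an iOS 7 motion effect only where the class exists,
    // so the same binary still runs (without the effect) on iOS 6.
    func addParallaxIfAvailable(to view: UIView) {
        guard NSClassFromString("UIInterpolatingMotionEffect") != nil else {
            return // iOS 6: silently skip the effect
        }
        let effect = UIInterpolatingMotionEffect(keyPath: "center.x",
                                                 type: .tiltAlongHorizontalAxis)
        effect.minimumRelativeValue = -10
        effect.maximumRelativeValue = 10
        view.addMotionEffect(effect)
    }

Multiply that by every iOS 7-only call in a non-trivial app and the "mess" the parent describes becomes clear.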
I don't know if his argument holds up either. I think he's approaching it with some baggage as the former proprietor of Instapaper. On his podcasts, he frequently discussed his timing for when he could simply require 5.x as the minimum iOS version, based on his user base.
That's the other problem with abandoning even a small percentage of your customer base: even though their existing app will continue to work the way it always has, the fact that they can't keep upgrading to newer features might result in negative reviews.
I agree with you, btw - any app maker worth their salt, if they truly believe that 90% of the iOS userbase will be on iOS 7 in 3-6 months, should be prepared to completely abandon iOS 6 (except for those few targeting iPhone 3GS and older iPod/iPad customers - a big-fish/small-pond competitive technique) and focus all of their energies on iOS 7 development.
Ironically, this creates a positive feedback loop: as no more apps are written for pre-iOS 7, people migrate to iOS 7 more quickly, resulting in more developers completely focusing on iOS 7...
Marco's counterpoint might be, "The set of interests/resources/skills/focus that allowed a developer to build a leading iOS 6 app might not be present for the new 7.0 paradigms, with their 3-D Z-axis stacking of translucent tiles, inclinometer responsiveness, and background processing. Someone who has an entire week at WWDC (yes, the videos are available - but nothing replaces 30-40 hours of onsite time) plus all the developer networking (and drinking) that takes place might drive ahead and find the sweet spot in this new world."
Take, for example, Instapaper - perhaps a hungry up-and-comer will deliver a fully featured, iOS 7-ready read-it-later app, complete with background loading, fully 3D sheet-sliding of documents, and light/colorful iOS 7 palette brilliance, several months before Instapaper could be rebuilt. It's also possible (probable, as it turns out) that the original author of Instapaper didn't have the energy to rebuild it because he'd moved on to other things. And we haven't touched on this: upgrading an existing app to iOS 7 gains a vendor no revenue (unless they have some IAP model), but new iOS 7 apps gain their vendors plenty.
In other words (and this hasn't been voiced yet), there is a lot of incentive to write NEW iOS 7 apps but, unless you have a top-5,000 app on the App Store, much less incentive to put a lot of energy into rewriting/upgrading an existing app for iOS 7.
It's not even comparable. Android has many more levels of fragmentation than iOS ever will and Google knows that they screwed up. You have display resolution * display density * OS version * carrier OS modifications * device hardware (hard keyboard/buttons/trackpad/...).
And the OS is specifically built to handle that. Layouts aren't done in visual pixels, but in density-independent units. The layout engine itself is designed to accommodate different display sizes and aspect ratios. The entire OS is built around a capabilities system that is designed to make it run on devices that may or may not have a camera or trackpad or whatnot.
Android fragmentation is about the availability of API levels, not the variance in hardware.
> And the OS is specifically built to handle that.
Then it does a pretty crappy job. Why does the Play Store silently block apps from your device if this is the case? Why does every Android dev publish a list of supported devices? Because it's nowhere close to this simple, and can't be solved by improvements in the OS, which by the way don't reach the vast majority of users, hence the problem.
Well, my personal experience is that things break on different Android devices for various and sundry reasons despite the existence of dp. If apps run on the iOS simulator you rarely find issues on an actual device except performance issues. But you'd better test on the Note 2, Galaxy S2, Galaxy S3, Galaxy Tab, Nexus 7, HTC Desire C, LG Elite, ...
Another point that I think has been understated thus far:
Android, WP, and iPhone's visual philosophies are closer now than ever before. I can only assume this will lead to fewer "ecosystem-exclusive" apps, which I think is a net positive for everyone.
You give way too much importance to the visual part; the reasons some apps are ecosystem-exclusive are technical or legal limitations. Writing this on a rooted Android phone with a third-party keyboard, through a VPN...
I keep hearing about these similarities, but most seem superficial/only visual. The way menu/table UI systems work isn't necessarily similar. How does that help apps get onto every system?
I felt this way after iOS started to flourish in 2008-2009.
"Why didn't I start building iPhone apps back then."
Now, with iOS 7 and Marco saying this, I can anticipate the same sensation if I don't hop on board after this refresh.
This is wrong. The vast majority don't want their OS (whether it is phone, desktop, laptop, etc.) to suddenly change. They want stability. They have invested the time and effort to learn the UI and integrate it into their lives, change disrupts this and wastes their time.
If you force them into learning a new way to do things you have just reduced the friction of them switching to some other platform. And that's how you lose customers.
It's honestly not that different. Messaging is just as easy as before. Using Calendar is just as easy as before. All of the apps that aren't Apple's are the exact same and will be for the beginning of the official release. You underestimate people and their ability to adapt to change.
Thousands of developers will be perfectly happy to drop iOS 5 and 6 support and remodel their apps for iOS 7 – myself included – because:
a. The new APIs and Xcode look lovely to work with.
b. Dropping support for iPhone 3G and 3GS devices four years after they were released doesn't feel unreasonable.
c. Apple has a long history of featuring apps that use their latest APIs. Having your app featured is still the only reasonable hope to make money in the App Store, unless your business model revolves around selling Smurfberries.
d. Many developers will have been holding back from making major app changes because they were waiting to see how iOS 7 would change the design language. Now that they know, they can spend the Summer redesigning.
e. Apple are openly inviting developers to "reimagine your apps on iOS 7" - that's the language they've used in their developer emails.
f. A successful developer with a widely read blog has just come out and said that everyone who drops support for older iOS versions to build afresh on top of iOS 7 stands to gain a lot.
So there will probably be a huge host of "new, nimble" apps with new takes on tired old setups come Autumn.
But I bet a lot of torch app developers are feeling very hard done by.
Although he wasn't aware of the exact details or extent of the changes, rumors of a total overhaul had been circulating for months. He probably anticipated having to put in hundreds if not thousands of hours of development time to prepare for iOS 7. At the very least, this must have been a factor in his decisions.
What users need is not more choice; as Marco said, the App Store is crowded. What users need is higher quality apps, which iOS 7 will purportedly enable. Therefore, developers who take advantage of this opportunity will make more money than those who don't.
I am using the beta on my primary phone, and I am starting to see what Marco is talking about.
I agree with you, although I think Apple is betting that "looks like iOS 7" will effectively come to mean "works better, thanks to features only available in iOS 7".
See this tweet by @flyosity:
"Damn, the UIKit animation/dynamics effects in iOS 7 are some seriously futuristic stuff. Can't wait to see what people do with it."
iOS 7 is in many ways a new beginning for Apple, its users, and its developers, but I think the OA relies a bit too much on hyperbole. In the DP there weren't really many navigational changes, and the UI was still instantly familiar. Yes, there are new APIs and other neat rendering tricks, and you can't discount the influence of an application's look and feel, but... it didn't feel very different, at least from an experience perspective. It looks markedly better (apart from the 5-6 odd-looking icons on the home screen), and designing around that will perhaps be the most daunting challenge.
It will separate the best from the worst, however, and this beginning, this chance to start fresh, is what I look forward to.
As a developer who hasn't published an app in the iOS App Store, I love the point Marco's making. One of my biggest hesitations in developing an iOS application has been: how the hell can I differentiate myself? By being one of the first iOS 7 apps! I don't have to have some crazy sense of design, or think too hard about what gradients I use, since my app will stand out from the start. I think that's one of the main points of the argument.
As a developer who has friends who have developed "non-trivial" iOS apps: damn. This is pretty much exactly what happened (and is still happening) on Android with pre-/post-3.0 applications. Making sure that the UI works on both categories of devices is just awful. There are a few projects out there to help (ActionBarSherlock, HoloEverywhere), but it takes a lot of diligence, ESPECIALLY if you're trying to do combination tablet and phone apps.
A lot of the posts I've read in this thread are missing the point of the post. It's not just about change, it's not about fragmentation, it's about the excitement for newcomers joining an ecosystem that has felt supersaturated for years. It may not actually shake the foundation of the App Store, but it at least allows new talent to enter on the same playing field as those who have been developing iOS applications for years. That's just exciting.
That's where the opportunity is. If "eventually" is six months, then you have six months to make your mark. By then, if you play your cards right and/or you get lucky, you can get in a good position and try to hold it, which you may not have had the chance before.
Good design is long-lasting - #7 of Dieter Rams' ten principles of good design.
I wonder how many app designers will realise that the apps they produced were subject to the fashion of the design of the operating system, and now that the fashion and trend has moved, whether the app designers will be confident enough to apply their own timeless design.
The iOS7 colour palette and style is fresh and new (to iOS users), but it is just the next fashion, and as fresh and new as it feels today, it will feel equally old and stale (like iOS6) at some point in the future.
Good design is long-lasting. App designers should concentrate on getting their design right for their application, and not just follow the trend and wear the attire of the operating system.
Marco is right that when the fashion changes, those who cannot keep up with fashion leave a large opportunity for those who can. I also agree that there is also a lot of money to be made by being one of those who can follow fashion closely.
But from a design perspective... those who follow others (the operating system) rather than having the courage to lead (the right design for the app), will always be subject to vulnerability when the fashion changes.
1. Given Apple's history of screwing developers, I doubt they are doing it for developers.
2. It's my opinion, but I think the hybrid new design is much worse than the old one. I like the old design: it's distinctive, not dated.
3. How is starting from scratch good for users? Remember, we are here for users, not developers. Also, there are lots of apps not affected by this change: games and apps with their own UI come to mind.
I've seen this in several places, but when people say Apple "changed everything about how you use it" in this version of iOS, what did they actually change? The only big things I've seen in the OS are the flat iconography and the pull-up menu from the bottom with the settings toggles. Everything else looked approximately identical functionally (at first glance, anyway).
There are changes, for instance, to how multitasking works that an application can take advantage of. If one app doesn't take advantage of them and another does, I may use the latter more often. For instance, there was an app that used push notifications and the icon badge to show the current temperature, and for a time I used it over The Weather Channel's app. Small things like this create opportunities; this is what Marco was saying.
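For instance, one of the new hooks is background fetch, which lets an app refresh itself periodically without being in the foreground. A minimal sketch of the delegate plumbing, in modern Swift, assuming the "fetch" background mode is enabled in the app's Info.plist:

    import UIKit

    class AppDelegate: UIResponder, UIApplicationDelegate {
        func application(_ application: UIApplication,
                         didFinishLaunchingWithOptions launchOptions:
                             [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
            // Ask the system to wake the app opportunistically for fetches.
            application.setMinimumBackgroundFetchInterval(
                UIApplication.backgroundFetchIntervalMinimum)
            return true
        }

        func application(_ application: UIApplication,
                         performFetchWithCompletionHandler completionHandler:
                             @escaping (UIBackgroundFetchResult) -> Void) {
            // e.g. refresh the weather here and update the icon badge,
            // then report whether new data arrived.
            completionHandler(.newData)
        }
    }

An app that adopts this can show fresh data the moment it's opened; one that doesn't will feel stale by comparison - exactly the kind of small edge described above.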
Apple is one of the companies that can really get away with an en masse OS transition, simply because they are basically guaranteed to get 80% of users on board in less than six months: they control the distribution mechanism and the device hardware itself. They can plan for these transitions.
Google and to a lesser extent Microsoft screwed up and are struggling to be able to keep their users on the latest and greatest. This is such a huge advantage for Apple that it can't be overstated.
On Android at least 36% of devices are still running Gingerbread (which is 2.5 years old). Android 4.x is finally up over 50% after being out for just over a year and a half.
So, whenever Google gets around to Android 5.0, it will probably be a whole year later (or more) before that is the mainstream targetable version of Android.
As a developer, you could argue this gives you more time to get around to building against the new APIs, but at the same time that's remarkably slow user uptake compared to iOS.
I disagree with the significance being given to these UI changes. Many App Store niches are dominated by incredibly ugly apps that are functional and get the job done. Take a look at WhatsApp for a prime example.
This is precisely why I think the modern approach to software UX and design is crazy. Fifteen years ago, under the Windows hegemony, every sane designer relied heavily on the widgets, metaphors, and flow defined by the system, and MS could almost flawlessly upgrade the look of the entire ecosystem with XP's Luna - without breaking the user experience.
Now everybody is looking for some virtual perfection in a different place and the user gets an awful, inconsistent, non-customizable and anti-interoperable clutter. WHY?
I've faced this on other platforms, and it's easily overcome with software patterns you should already be using. If you have a good separation of concerns and the UI is truly decoupled from your logic and models, you should not have any issues. If you don't already have that, use this time to refactor and move forward. Some things won't be possible across both platforms; find those now and address them first.
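A sketch of the kind of decoupling that describes - model and presentation logic with no UIKit dependency, so an iOS 7 reskin only touches the view layer (all names here are hypothetical):

    import Foundation

    // Pure model: knows nothing about how it is drawn.
    struct Article {
        let title: String
        let savedAt: Date
    }

    // Pure presentation logic: testable, and untouched by any UI overhaul.
    struct ArticleListPresenter {
        let articles: [Article]

        func rowTitle(at index: Int) -> String {
            articles[index].title
        }

        func rowSubtitle(at index: Int) -> String {
            DateFormatter.localizedString(
                from: articles[index].savedAt, dateStyle: .medium, timeStyle: .none)
        }
    }

    // An iOS 6-styled and an iOS 7-styled view controller can both consume
    // this presenter; only their drawing code differs.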
I'm not sure how a new OS update that changes more things than usual is fragmentation, especially when that OS has a very small number of devices that it runs on and a user base that by and large quickly updates to the new releases. It might be more of a disruptive update than usual, it might be good or bad for users in the short term, it might be good or bad for developers in the short term, but it's not fragmentation.
How can this be sustainable? All this fragmentation across versions - iOS >=7 and <7, all the different versions of Android, Windows 8 Metro, etc. How many different platform-specific versions are service providers like, say, Pandora supposed to create?
I feel like unless the app is taking advantage of some inherent hardware capability of the phone/tablet everything should eventually be HTML5.
Your example illustrates what Marco is saying. A big player that needs to support all those different versions of different platforms may take more time updating their iOS app to adopt the latest standards. An indie developer, who has more agility and can choose to support only iOS 7, now has a possible advantage, if s/he chooses to take it.
"Most can’t afford to write two separate interfaces. (It’s a terrible idea anyway.)"
Why? If the newcomers can afford to write an interface for iOS7, the established players also can. If there is money in it for newcomers, there is money in it for established players too. This article assumes that established players are dumb. They will estimate how users adopt iOS7 over time, and act accordingly.
I think it assumes that established players are conservative, not dumb. They will do the least effort necessary to make their app look good in iOS 7, at least at first.
Aren't the well/better-funded dev shops and apps going to make the transition more easily? Won't the smaller ones feel the pain of backward/multi-version support more acutely? Didn't we just empower the status quo? True, there'll be net new revs with no legacy issues, but I'm not sure how game-changing that is.
Or, in the six months before the new shiny iOS appears, every single dominant or near-dominant application on the app store decides that it makes more sense for them to build a version that targets the other ecosystem.
I've been a Rails developer for a while now, contemplating whether to focus my energies on a JS frontend framework or on learning iOS. I think this post was all I needed to make a decision. Thanks Marco.
In case you've ever wondered why Marco frequently dismisses the Hacker News community, the comments on this article are a fantastic example. Never has "give it five minutes" been more apt.