This is exactly the wrong attitude in my opinion. How does it help users to all of a sudden have most apps feel ancient? Is it really something to be proud of? That for the next year we'll be working on replacing existing utilities so that they feel "right" and "fresh" instead of doing what we should be doing: thinking of actual new software that is worthwhile to write.
I've been saying this for a while but I think what is happening, and what many developers haven't noticed yet, is that we have exhausted the utility of software for software's sake. The interesting stuff happening in mobile now has nothing to do with "design" in the traditional sense any more. It's not enough to just have a coder and a designer on the team. The really cool stuff is all about what your phone actually allows you to do in the real world. Look at apps like Uber, Postmates, Spotify, and Twitch.tv. Most of these have terrible UIs, but that's not the point. The point is that they allow you to do things. I can have a car on my doorstep! I can listen to almost any song I want. They're not just another calculator app or news reader, so who cares if it's not the prettiest or easiest thing to use in the world. They are an interface to actually useful services. Software was interesting on its own a decade ago, but the industry has grown up; it's time to do things now. That doesn't mean that "UI and UX" don't matter, it just means their definition changes and grows beyond just how you tap things on glass and what pixels you choose to animate.
The reason that iOS 7 seems comforting in the way it's described in this article is because it gives developers who haven't realized this something to do again. Marco is absolutely right: for a long time it has felt like all the major categories have been covered on the App Store. That's a good thing! It means we've solved lots of problems. We shouldn't daydream of a day when those problems get artificially unsolved so we can have another shot at them. We should move on.
- Will I (and the apps I'm planning) be able to adapt to the UI changes smoothly or not?
- How are these changes going to play out on the iPad?
- Does Apple have any other foundation-shifting changes planned before iOS 7 comes out?
- Will these changes impact user behavior?
- And so on.
For everyone else, the risk vs reward calculation might look quite different.
> Software was interesting on its own a decade ago, but the industry has grown up; it's time to do things now.
Smartphones are important because they allow people to use computers in a manner that wasn't feasible in the past because of the size of computing devices.
However, smartphones have not produced any important innovations from a software point of view. They are not innovative because they don't need to be; they are simply the logical consequence of advances in the miniaturization of hardware.
The market saturation you describe is not the result of a glut of innovation, but rather a consequence of developers porting existing desktop functionality onto a smartphone. You can take pictures and use voice recognition software to ask where nearby restaurants are if you carry around a desktop or a laptop; you just probably wouldn't want to.
Actually, smartphones are a good example of how non-innovative and illogical the computing world can be.
Underneath the touch-based interfaces, they run on a kernel written in a memory-unsafe, non-garbage-collected, insecure, and inexpressive language that was designed to target the PDP-11. Because the operating system is so flawed and incompetent that it can't be trusted to prevent rogue processes from stepping on others, you end up with the lack of a usable file system and awful communication between apps. Of course, the joke is that these restrictions are almost worthless because you can almost always bypass the system security and jailbreak the device; then maybe you can do something useful with your phone without them.
It's part of a larger trend in the computing world to ignore history, reinvent wheels, and call it innovation. Case in point: web apps, which, unless you are replacing a legacy application, are a terrible idea. Not only will they always face latency issues, but they also use awful languages designed to serve and manipulate text and are at least twice as slow as native code. Software on smartphones is interesting in the same way that a "Hello world!" program is interesting to a C programmer: the programmer is so surprised that he has fought an unfriendly, unintuitive system full of kludges and managed to output text to the screen without the program crashing and dumping core that he doesn't notice how trivial the output is.
Of course, what we have today is better than nothing. In a way, the massive and unnecessarily resource intensive process that we have now is uplifting because it shows the determination of the human spirit; just imagine what we could accomplish if we invested in the right things.
Some examples of smartphone innovation (in my opinion):
At 2:00 AM in bed, when I'm exhausted and realize I have to wake up at 7:00 AM, but I'm tired enough that I might not be able to reach my alarm clock, I can just flex my thumb and say, "Wake me up at 6:00 AM" before drifting off to sleep - I call that innovation. I don't know if you recall, but there was a time when the big "thing" at hotels was having them give you a wakeup call, so you wouldn't have to figure out how the alarm clock in the room worked (if they had one). I haven't done that in 5+ years. Used to do it every night when I was in a hotel.
I'm out walking at night in Redwood City, and Dark Sky pops up and tells me it's going to start raining in my neighborhood in about 15 minutes - so I turn around and grab an umbrella.
I'm out at a customer luncheon in Singapore, and they ask me about a presentation I did a while ago, on my laptop, and I know Backblaze has backed it up, so I'm able to grab it from my iPhone while standing there and mail it to them - that real-time delivery makes a positive impression. (Replace Backblaze with Dropbox/iCloud/SkyDrive/favorite cloud mechanism that you can connect to from your mobile device)
Listening to a great German song in a taxi in Aschaffenburg, and I pull up Shazam - and 90 seconds later I have it on my iPhone (and it's now been synced with iTunes Match and is on my iPad and laptop)
Late at night in Palo Alto, and I need a ride home - and Uber tells me there will be a driver at the restaurant in 4 minutes, shows them driving towards me, and makes sure they show up. As somebody who has spent hours waiting for cabs in the middle of the night - I call that innovation.
Walking down the stairs at Southwark in London, and getting the real-time update from TubeMap that the London Bridge station is closed due to a station fire - turning right around and grabbing a cab to an important symposium.
I could obviously go on - but I find it so hard to believe you don't recognize all of this as "innovation".
It feels like LowKarmaAccount really defines innovation as how far down the stack it is, which is a wholly arbitrary (and IMO nonsensical) criterion.
By that criterion a helicopter is not innovative because it's just a slight rearrangement of fixed-wing aircraft, an "evolutionary" progression. By that criterion the Internet is not innovative because it was just the natural evolution and consequence of radio and telephony.
Wheeler does include the Internet (internetworking using datagrams, leading up to the Internet's TCP/IP) as an important innovation.
On the page I linked to, Section 6 is called "What is not an important software innovation?", which inspired my post in part. I'll quote a portion of it below:
" As I noted earlier, many important events in computing aren’t software innovations, such as the announcements of new hardware platforms. Indeed, sometimes the importance isn’t in the technology at all; when IBM announced their first IBM PC, neither the hardware nor software was innovative - the announcement was important primarily because IBM’s imprimateur made many people feel confident that it was “safe” to buy a personal computer.
"Note that there are few software innovation identified in recent times. I believe that part of the reason is that over the last number of years some key software markets have been controlled by monopolies. Monopolies typically inhibit innovation; a monopoly has a strong financial incentive to keep things more or less the way they are. Also, it’s difficult to identify the “most important” innovations within the last few years. Usually what is most important is not clear until years after its development. Software technology, like many other areas, is subject to fads. Most “exciting new technologies” are simply fashions that will turn out to be impractical (or only useful in a narrow niche), or are simply rehashes of old ideas with new names."
I'm thinking of the geolocation applications - things that just never existed before, ever, like RunKeeper. Or the body-telemetry apps, like Fitbit. I'm still trying to think if it's possible to categorize Dark Sky as innovation. Or the always-available voice recognition of Siri.
None of it rises to the level of "Software innovation from a CS standpoint" if we look carefully? Honestly asking now.
I'm thinking that something like "robust speech recognition and parsing" might fit on there (though it's tough to determine when such a system would be considered mature enough, and simultaneously differentiated from full strong AI). But that's not a smartphone innovation, it just happens to be a good use case.
It really comes down to the definition/interpretation of "innovation," and I think many people (myself included) would feel a slight when certain things are excluded, but I can see how "taking things that already exist and putting them together in new ways or usefully extrapolating on them" (which is what any "innovative" app has done) doesn't really fit. As he explained about the smartphone, something can be world-changing in a very real way, without necessarily being innovative.
SMP - parallelism not among separate, isolated computers whose fastest connection is probably an Ethernet port, but among multiple cores in the same die, accessing the same core RAM. Of course someone had a SMP system decades ago, it's not that complicated an idea, but only recently has it become ubiquitous and critical to taking continued advantage of Moore's Law. Although it's fundamentally a hardware innovation, the ways we write multithreaded programs have evolved to take advantage of it - it's only recently that the sort of software used on millions of systems has had to focus on fine-grained locking and lock-free algorithms, rather than slapping big locks on everything with the only downside being latency. And more unorthodox approaches are gaining traction: CSP, though invented 35 years ago, is being rediscovered with Go, various languages have experimented with software transactional memory (e.g. PyPy, Clojure), and Intel's new processors will finally bring hardware transactional memory mainstream, which might finally realize the dream of fast transactional memory.
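To make the "big lock vs. fine-grained locking" point concrete, here is a minimal Swift sketch (the counter types and the eight-way striping are invented for illustration, not taken from any real codebase): the first version serializes every update behind one lock, the second stripes the data so updates to unrelated keys can proceed on different cores at the same time.

```swift
import Foundation

// Coarse-grained: one lock serializes every access, so extra cores mostly wait.
final class CoarseCounter {
    private var counts: [String: Int] = [:]
    private let lock = NSLock()

    func increment(_ key: String) {
        lock.lock(); defer { lock.unlock() }
        counts[key, default: 0] += 1
    }
}

// Finer-grained: stripe the data across several locks so unrelated keys
// can be updated concurrently. (A toy design, not production code.)
final class StripedCounter {
    private static let stripeCount = 8
    private var buckets: [[String: Int]]
    private let locks: [NSLock]

    init() {
        buckets = Array(repeating: [:], count: StripedCounter.stripeCount)
        locks = (0..<StripedCounter.stripeCount).map { _ in NSLock() }
    }

    func increment(_ key: String) {
        let i = abs(key.hashValue % StripedCounter.stripeCount)
        locks[i].lock(); defer { locks[i].unlock() }
        buckets[i][key, default: 0] += 1
    }
}
```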
GPUs - massive parallelism of relatively small, minimally branching algorithms, again on a single chip found in millions of devices; again, a hardware innovation that requires new ways to write software. Yes, I know how old the transputer is, but now it's mainstream.
Virtual machines - a new consummation of the idea of time sharing, innovative in practice if not theory. It's my personal opinion that they're a massive hack: a poor man's multi-user system that accomplishes no more than a traditional kernel could have with the right operating system design, with all the kludginess you'd expect from hacking kernels designed to run directly on hardware into running side by side. But when disk space and RAM became cheap enough that it became obvious each user of a server should have their own isolated copy of all software, allowing them to install and maintain whatever versions of whatever packages they need, Unix had developed so much around the idea of a central administrator that the new paradigm had to evolve rather than be designed. But who cares? Worse is better - the heritage of Unix, and perhaps its conqueror. However it came about, ordinary users of multi-user systems now have more power on average than ever before. Consider the difference between a PHP script hosted on shared hosting and a modern webapp stack. And maybe a new design will come around to replace it all, one of these days.
Closely related, cloud computing - I suppose driven by the decreasing price of hardware. The idea of computing as a commodity is hardly new, but in the last few years it has become a reality: virtual servers can now be spun up and down in minutes, as part of massive server farms provided as a service to a huge number of users, for low cost. This is fundamentally changing the way software is designed: scalability is easier than ever, but it has become more and more useful to write distributed systems that can tolerate internal failure.
HTML5. You can enter a URL and instantly and safely run fast code. Yes, it's just another VM; yes, Java had this a long time ago. But we avoided some of Java's mistakes and CPUs are faster, so who knows, write-once-run-anywhere might become a reality this time.
Sandboxing. We might still be stuck with C and unsafe code, but sandboxing is starting to make software a lot harder to exploit anyway. Software security in general is receiving a lot of attention these days.
Ways to write software: test driven development, agile, etc.
It helps users in that the "new design" is an improvement over the "old design".
A case study, the Windows file picker dialog:
Win 3.1 dialogs felt foreign on Windows 95, just as XP-style apps feel foreign next to Vista-style apps. There is no debate that from Windows 3.1 to Windows 7 the file picker dialog improved each time: easier item selection, easier reading, easier tree traversal, quicker arrival at favorites...
What is true for this dialog is true for the whole system. rundll32 tabbed preference panes survive in Windows 8 to this day. They felt like they came from the future back then on Windows 95, while today they're old and crufty, alien and inscrutable.
This is not change for the sake of change. Design is how it works. Good design is thorough and holistic. If you only improve part of the system, you end up with a hodgepodge of UI versions with no conventions across the board, which results in poor usability and maintainability overall.
At best, it looks like iOS7 is nothing more than a mix of Android and Windows Phone plus a few extra UI elements.
I am not a big fan of Marco's obsession with drama but I do respect his opinion and find his take on certain things within the tech industry enlightening. Unfortunately, this post does seem like nothing more than turning a blind eye to what Apple has done with iOS 7.
And don't be so quick to say, "or even what Android and Windows phone offers.." - Those are awesome platforms, and Apple can learn a lot from them (as Android and Windows have learned from the iPhone.) Indeed, the "Cards" approach in multi-tasking is actually reminiscent of what Palm's WebOS did  - so Apple has learned from them as well.
I am very intrigued as to how well the multitasking/notification-driven background/user-interaction-dependent CPU scheduling for background processes will work - that's a pretty innovative solution to the common problem of craptastic background apps that you don't think about sucking your battery dry. [edit: And all the discussions about apps checking in when you've got the radios on, when you are powered up, etc... - I've been dreaming about exactly that for 2+ years. Finally! (Non-ironically)]
Don't kid yourself - iOS 7 is the big one, and it will likely be as ugly as the upgrade from 10.6.8 to 10.7.0 was on OS X. My prediction is that iOS 8 will basically be fixing all the glitches and problems created by this complete platform rethink - but, sometimes, to make an omelet...
Forest Fire Indeed!
Of course it won't. That just reads like hyperbole. Apps on iOS7 will work almost entirely the same as they do on iOS6 and below.
I'm not up in the city right now, but from the few podcasts I've listened to, people are already commenting on how "dated" their apps feel.
And - with the exception of Mail/Calendar (Apple favoritism at its worst) and wacky "geofencing wakeup in the background" techniques, all of our backgrounded apps on iOS have basically gone to sleep permanently (until the user pulled them back into the foreground) unless they were a GPS/VOIP/music/Newsstand app (and even some Newsstand apps performed poorly with background downloads - I'm looking at you, NYT).
In addition to the 3-D geometry (where I'm expecting lots of slide over sheets as a new metaphor) - I think all developers are right now considering how they can take advantage of an App that they can "wake up" remotely from a notification and perform activity with.
And people are in shock after years of iOS looking pretty much the same. When everyone actually looks at things objectively, they'll realise that their apps need to look a little different, behave a little different, but overall be pretty much the same apps they were before.
What annoys many users (and every developer is also a user) is change for the sake of change. There is absolutely no technical reason for changing the style of icons on iOS 7. Clear fit right in on the iOS 6 home screen and was a joy to use regardless.
Sure, the new APIs might offer some great features for apps and I am sure they will definitely be used in the majority of upcoming apps, but my problem (maybe not a problem, just a curious observation or minor annoyance) is Marco taking a completely nonobjective stance on the iOS 7 changes. I am completely ok with his opinion but the blind hyperbole is a bit annoying.
I guess we're moving the goal posts again.
I'm betting over 50% installed base will be on iOS 7 in less than a month.
Wow-level context: Jelly Bean has been out about eight months and is at 28-some-odd percent. http://hothardware.com/News/Android-Jelly-Bean-Penetration-U...
I'll see your anecdotal and raise you a citation.
"The Dec. 12 reinstatement of Google Maps on iOS has apparently been enough for some of those reticent users to finally make the upgrade to iOS 6. After achieving 10 million downloads in the first 48 hours available, MoPub, the San Francisco-based mobile ad exchange that monitors more than 1 billion ad impressions a day and supports more than a dozen ad networks and 12,000 apps, says there has been a 29 percent increase in unique iOS 6 users in the five days following Google Maps' release on iOS.
BTW, not "blaming" Google, if anything, I would be blaming Apple for having a sub-par map application (As a frequent international traveler, I can tell you it's still not up to par with Google's maps) - but I'm not blaming anyone. Just recognizing it was a pretty significant issue for many people.
That clearly shows that Android users are happy with the current release and see no need to switch to a newer version because core features are missing completely, as is the case with iOS.
I'm not saying fragmentation on Android isn't bad. What I'm arguing is that fragmentation on iOS is being framed as a good thing (actually a "great" thing) by the author of the post.
iOS isn't going to fragment - the vast majority of the user base will immediately upgrade to iOS 7.
- there will be fragmentation
- hence it will be difficult for developers to write apps that make the most of iOS6 and iOS7
- hence this is an opportunity for nimble new players to enter the market and capture the iOS7 userbase
I agree that there probably won't be fragmentation, but the point is, Marco is saying that if there were, it would be a brilliant opportunity. Therein lies the mental gymnastics.
"Most [developers] can’t afford to drop support for iOS 6 yet. (Many apps still need to support iOS 5. Some unlucky souls even need to support 4.3.) So they need to design for backwards compatibility, which will be extremely limiting in iOS 7."
Contrast that with iOS: sure, different APIs get added, but at least the same versions of the OS behave the same across devices. Sure, for a very short time iOS will be split between versions, but that split will be exceedingly short, and you'll never have in iOS the true fragmentation problems that Android has.
In one of the Google I/O talks, Google encouraged developers to develop only for 4.0 and above. To quote, "go to where the puck is going" and "develop the best app possible for every phone".
If you are an independent iOS developer looking to hit a majority of devices, just develop for iOS 7 - pretty much everyone will be on that in a year.
If you are an independent Android developer looking to do the same - you need to target at least ICS, Jelly Bean and Gingerbread to get 90%+ market share. (And with Gingerbread support comes ActionBarSherlock and other fun stuff.) I'm in the middle of this right now for my app, and hating it.
When the iOS version adoption looks like the first table in the following page, then you'll have a fair point:
So there will be apps with the old UI, and people running iOS 7 will go looking for a new way to do X for the first time in years - thus a market opportunity. Whereas with Android, the problem is that phones overwhelmingly remain on the OS they shipped with. It's not a matter of waiting 3 weeks for a fitting UI; it's that you have to develop for a 1-3 year old OS or write off half the market.
Lost the URL because I'm writing this on iOS 7 and it's pretty goddamn buggy, but it looks like others have provided it.
Says who? This change is much more major and, despite what Gruber and Marco would like to have you believe, is not design perfection personified, and is likely to be polarizing.
In decreasing order of importance (my opinion)
o Pre-launch background updating (Finally!) - I spend 3-5 minutes every morning, and another 3-5 minutes every night, downloading my podcasts before my walk home. Annoying. Now, in theory, they'll download for me in the morning and at night. Awesome - particularly as my iPhone 5 is happily sitting in an elevation dock at 100% power for most of the day, or plugged in at home for most of the night. It will be interesting to see what developers do with notification-launched background apps - hopefully it won't be abused. (A rough sketch of the new background-fetch hooks follows this list.)
o Airdrop for iPhone (Finally! How many times have I wanted to get a file/image/content off my iPhone onto my Laptop just before a plane took off)
o Swipe Control Center - another "Finally" - getting closer to Android parity/rooted iPhone parity - I tweak the brightness/lock rotation/pause/play music and settings pretty often. This will lengthen the lifespan of my home button (double click + swipe left currently to get lock + music settings).
o Enhanced Camera/photo management - looks really nice - lot of people will like this - even the die-hard Camera+/Instagram/Flickr/Google+ photo types. I do feel kind of bad for Camera Noir - their window was pretty meager. :-(
o iTunes radio - particularly for all of us who already subscribe to iTunes Match.
o Advanced Siri - I use Siri many times a day, looking forward to this.
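For anyone curious what the "pre-launch background updating" bullet above looks like in code: it maps to the background-fetch hooks announced for iOS 7. Here's a minimal Swift sketch, with a hypothetical PodcastLibrary standing in for the app's real refresh logic:

```swift
import UIKit

// Hypothetical stand-in for the app's real feed-refresh code.
enum PodcastLibrary {
    static func refreshLatestEpisodes(completion: (Bool) -> Void) {
        // ...fetch feeds, save any new episodes...
        completion(true)
    }
}

class AppDelegate: UIResponder, UIApplicationDelegate {

    func application(_ application: UIApplication,
                     didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
        // Opt in to background fetch; the system picks the actual schedule
        // based on how people use the app, plus power and network state.
        application.setMinimumBackgroundFetchInterval(UIApplication.backgroundFetchIntervalMinimum)
        return true
    }

    func application(_ application: UIApplication,
                     performFetchWithCompletionHandler completionHandler: @escaping (UIBackgroundFetchResult) -> Void) {
        // Called while the app is "asleep": do a short refresh, then report back.
        PodcastLibrary.refreshLatestEpisodes { gotNewEpisodes in
            completionHandler(gotNewEpisodes ? .newData : .noData)
        }
    }
}
```

The system, not the app, decides when to actually fire these wakeups, which is the "user-interaction-dependent CPU scheduling" mentioned upthread.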
And, let's be clear - it's only a prohibitively long time if you wish to continue to get the new feature-release platforms - indeed, it's only today, June 11, 2013, that the original iPhone, released in 2007, is being obsoleted by Apple, a full five years after it was discontinued.
My Motorola Defy came with 2.1 Eclair. Runs 4.2 as of now, thanks to Cyanogen. I keep it because it's built like a tank; all phones should have IP67 certification.
If you look at the devices cleared for iOS 7, it's almost definitely a RAM overhead issue. All the 512 MB+ devices (4, 4S, 5, Touch 5, iPad 2+ and Mini) are getting iOS 7. None of the 256 MB devices (Touch <5, iPhone <4, iPad 1) are getting it.
Does it suck? Yeah, definitely. At some point, though, you have to make the tough decision and say, "We have to compromise on either our vision or supporting older devices." When you get to that point, the choice isn't too hard. Is there anyone out there who would really argue that watering down iOS 7 is worth the increased support? (Especially with the US cellphone market and its subsidies being what it is?)
I feel for iPod Touch 4 owners. Like another poster said, those were on sale a few weeks ago. Every other unsupported device is terribly old, at least as far as the pocket computing world is concerned.
Looking forward to upgrading to an S2 or S3 when prices drop to frivolity. A larger screen will be nice.
Tools needn't be new nor flashy to be powerful.
And I'm not hand-waving that notion either. I write apps and I've seen device usage numbers for a number of apps across a number of verticals - 3GS usage really is a vanishingly low number, well below 1%.
Devs don't have anything against old phones, we don't care how often you give Apple your money. If enough people still used the 3GS, we'd support the 3GS.
When you're in the sub-1% group, without some contract guaranteeing support, expecting support is unreasonable.
To you the product's less than a year old, but the reality is that the technology in that product is closer to three years old already.
I say this as an iPhone 3GS owner that still hasn't upgraded. It's a bitter pill, but that's the nature of technology.
I really think 3-4 years is more than reasonable support for a mobile device in the current market.
Of all the devices supported by iOS 6, only the iPhone 3GS and the 4th gen iPod touch won't get the upgrade. The iPads that could run iOS 6 will also be able to run iOS 7.
The iPhone 4 has been out for 3 years and will be able to run iOS 7 when it is released later this year. How many 3 year old Android, Blackberry and Windows Phone models do you know of that can run the newest version of the OS?
(This post was edited to reflect that the 4th gen iPod touch won’t get the upgrade to iOS 7. The ‘iPod touch 16GB’ was listed on iOS 7’s page, I understood that to mean the 4th gen model, which also came in 16GB capacity.)
Not quite, at least as far as the iPod Touch goes. iOS 7 will only run on the 5th generation iPod Touch (see http://www.apple.com/ios/ios7/features/ - bottom of the page), the version with the 4-inch screen.
Apple was selling the 4th generation device up until May 30 of this year. In other words, there are people who bought this thing two weeks ago who will not be able to upgrade to iOS 7.
My fourth generation iPod Touch runs iOS 6 but does not qualify for iOS 7. Fifth gen only.
iOS 7 introduces automatic updates, but I don’t know whether that’s only for apps or for the OS as well. If both, then versions past iOS 7.0.0 should have an even faster adoption rate.
Vine only supports 4.0 and above & there's no uproar. It is a Top 3 app on Google Play.
Are people really likely to get in an uproar over Vine?
Facebook, Twitter, Instagram, sure... but Vine?
And really, I'm not entirely shocked that a Twitter-owned service is getting more sharing on Twitter compared with a service that gets a reasonable amount of obstruction from Twitter.
By which I mean Froyo; Gingerbread doesn't hit its third birthday until December (at the earliest).
Beyond the parallax effect, what are these new navigation and structural changes? I'm not trolling here, I'm genuinely curious. I'm about to build a new iOS app and I did not see major navigation or structural changes that would drastically affect how I design an app's UI or UX.
That being said, there are a few new things that do stand out:
1) Everything is supposed to be fullscreen as often as possible, with the chrome hiding as much as possible. "Deference to content."
2) Transitioning between view controllers should ideally "zoom" in to new content, like the Calendar app zooming into a day view from a month view, or a year's worth of photos zooming into a collection of chronological "collections." In other words, your transitions should communicate _how_ the view controllers relate, not just that you're replacing one with another. (See the sketch after this list.)
3) Text Everywhere. (only new on iOS)
4) More abstract but also more physical (read: mimicking the laws of nature, like physics, rather than mimicking things humans have made, read: skeuomorphism)
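On point 2, here's a minimal sketch of a "zoom in" transition using the custom view controller transition API that arrived with iOS 7 (UIViewControllerAnimatedTransitioning), written in today's Swift; the duration and scale values are arbitrary:

```swift
import UIKit

// A toy "zoom in" transition: the incoming controller scales up from 80%,
// hinting that it's a closer look at the previous screen rather than an
// unrelated replacement.
final class ZoomInTransition: NSObject, UIViewControllerAnimatedTransitioning {
    func transitionDuration(using context: UIViewControllerContextTransitioning?) -> TimeInterval {
        return 0.35
    }

    func animateTransition(using context: UIViewControllerContextTransitioning) {
        guard let toView = context.view(forKey: .to) else {
            context.completeTransition(false)
            return
        }
        context.containerView.addSubview(toView)
        toView.transform = CGAffineTransform(scaleX: 0.8, y: 0.8)
        toView.alpha = 0

        UIView.animate(withDuration: transitionDuration(using: context), animations: {
            toView.transform = .identity
            toView.alpha = 1
        }, completion: { finished in
            context.completeTransition(finished && !context.transitionWasCancelled)
        })
    }
}
```

You'd hand an instance of this back from a navigation controller delegate or transitioning delegate; the point is that the animation itself says "this screen is a detail of the last one."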
There are many more changes, you can read about them in the new iOS 7 Human Interface Guidelines: http://www.slideshare.net/evgenybelyaev16/mobile-hig-2278458...
If I'm building a cross-platform app (or have the intention of going cross-platform eventually), I try to steer clear of any interactions or UI elements that are too obviously from one platform or another.
I really appreciate apps that have their own distinct look and feel and don't use "proprietary" UI features (like the header bar and back button). The most recent app that I've admired is Dots. It doesn't look Androidy or iOSy. It looks like its own thing. The team behind that did a fantastic job.
If I'm using a news reader or productivity app, I hate having to spend time learning a novel interface. The more conventional the UI, the more easily and quickly I can use it. Unless uniqueness adds a tangible benefit, it's just a pain in the butt.
But hey, it's a (really buggy!) beta.
I'm getting pretty tired of blog post titles that give no real hint at all as to the topic. My mind labels them as "pretentious" because they're pretending to be deep when they're not; by that I mean that rarely do the posts offer up any kind of non-trivial insight. (Compare with PG's similarly titled essays.) Earlier today we had "Two i's" from DHH's crew (where you had to infer from reading the post that it was i for interesting and i for important). Dustin Curtis recently had "Glass". Please, you will still be the coolest people on the planet if you don't try to title your posts per the ineffable style of Apple product advertisements, and as a bonus we will even like you and your work a little bit more.
Honestly - "Fertile Ground" is a pretty damn good title. I can already imagine people casually discussing it at WWDC: "Did you catch Marco's 'Fertile Ground' post?"
Of course, these have all lost their Zen master cachet, but for me that's a distinct advantage. Good writing doesn't need to be cool.
The best writers put a lot of effort into their titles - great example is Gruber - http://daringfireball.net/archive/
He has a lot of titles like, "Mountain Lion" and "Walter Isaacson’s ‘Steve Jobs’ " and "You Do Not Need to Manually Manage iOS Multitasking " but he also mixes things up (in a positive manner, I would argue), with, "I’ma Set It Straight, This Watergate " (about iBooks Author not following standards for eBooks), and "Get the Fainting Chair" - about Google being shocked by Apple not renewing their license for Google Maps on the new iPhone.
I guess it's unfortunate that Blogs don't have subtitles.
Clearly he's succeeded, from the conversation going on in this post. Marco is occasionally pretty obnoxious, and his ideas are not always well thought out, but I like writers who attempt to be not only clear, but compelling.
Paul Graham has plenty of terrible essay names, and plenty of terrible essays at that. One of the things that frustrates me about him as a writer is that on occasion he attempts to strike an "objective" tone while offering a skewed and entirely subjective perspective. Besides, conveying pure abstract information should not be the point of an essay. Tone, emotion, purpose, and construction should all be deliberate. Otherwise you get Mashable—ultraspecific titles geared to ultraspecific articles which are so drained of anything beyond pure bullet-point content that you could train a machine to read and interpret it. It's such a waste.
It takes you two seconds to click on a link and look at it, ten seconds tops to decide if you're going to gain something by reading it. If you're clicking on so many links per day that twelve seconds here and there is putting a dent in your productivity/well-being, then there is a worse problem here than ambiguously-titled essays, and ironically, it's a problem that you'll start to solve by seeking more challenging pieces of writing to tackle to distract yourself from the constant useless information mill that the Internet so readily provides.
I can understand how given the context of WWDC yesterday it might be more obvious, and that there is an argument for treating blog posts as ephemera, but in general I like it when things can maintain their meaning for months, years, or indefinitely.
Look to other operating systems that evolved their UI in similar fashion and a few of their dominant software players over the years:
Windows: Office, QuickBooks, Quicken, IE/Chrome/Firefox/Netscape (which have shifted favor over the years, but not because of UI changes in the OS)
Mac OS (X and classic): Adobe Photoshop, Illustrator, Pro Tools, Office
UI changes, even major ones, have had little to no effect on the dominant software titles for those systems. There have occasionally been new categories of software introduced. For instance, high quality video editing software for the home market, which was made possible by better home cameras and major advances in speed and resources of home computers. Pervasive internet allowed the browser wars to happen. It wasn't minor UI changes in the OS that allowed new players to come onto the scene, it was major technological advances.
If you go back far enough, you can argue that the change from command line to GUI allowed for exactly the sort of revolution described here (it definitely did: WordStar/WordPerfect lost to Word, Lotus lost to Excel, AutoCAD nearly lost its throne, etc.). But nobody in their right mind is arguing that iOS 6 to iOS 7 is the difference between DOS and Windows 3.1, or between an Apple IIe and the Lisa or the first Macintosh.
History isn't always the best indicator in the tech industry, but in absence of other indicators, I'll bet on history repeating in some form.
This will be a chance to catch any "lazy" dominant apps if they don't upgrade soon enough. I'm not sure if there are any dominant players that are lazy - seems like you'd need to keep on your toes if you're gonna stay dominant. So not a whole lot might change, anyways.
Adobe's tools held their ground through the OS X switch, in no small part because they got OS X versions out pretty quickly that felt relatively at home. Quark, IIRC, took forever to make an OS X version, so Adobe was able to bring out an OS X page layout app and eat their lunch. Of course, it didn't hurt that everyone pretty much loathed XPress, especially its heaviest users...
Marco argues that while this behavior is bad for incumbents, it's good for new entrants, but that doesn't make any sense; if you're a new entrant, your goal over the long term is to become an incumbent. Blowing everything up may benefit you today, but if you survive long enough to see the next demolition spree, then you'll be the one getting 'sploded. It's like arguing that living next to a volcano is good for development because it periodically clears out old building stock.
A good platform is one you can build a business on. Sudden, dramatic change that flushes your investment to date down the toilet is bad for business.
The iPhone platform was designed between 2005 and 2007, and while it served its purpose very well, I don't think that we should limit innovation for the sake of supporting old apps.
Also - he was highlighting that this will suck somewhat for incumbents (which, having sold Instapaper and "The Magazine", he is not), but that it offers great opportunities for people willing to fully commit to the new iOS 7 worldview (who in turn may be wiped out in another 8 years - but hey, 8 years is a long time).
Regarding, "If you're going to invest time and money in a platform, the last thing you want is to find out after that investment has been sunk that the platform steward likes to periodically smash everything."
Heck, based on a small sampling of anecdotes from iOS developers I know - I bet 90% of all iOS developers make the majority of their income on an app in less than six months, fewer than 10% experience "significant income" over a full year, and less than 1% have multi-year income from an app that is enough for a comfortable six-figure salary. The nature of iOS application development is different from desktop application development, and, for most (excluding the top 5,000 apps), the real money is in new applications.
So - yes, this may be jarring for that 1% - but the other 99% have a great opportunity opening for the next six - nine months.
There are better ways to allocate development time than reimplementing an already-fine OS.
I'm still flabbergasted that Apple did their big overhaul (apparently) without addressing inter-application communication, default apps, Siri APIs or most of the other "power user" issues that would have helped them grow their computing paradigm. To me, increasing the scope and power of iOS devices while maintaining the predictability and control that are at the heart of Apple's take on mobile/touch-based computing is the problem Apple has to solve over the next few years. Judging from what I've heard about iOS 7, it sounds like Apple couldn't disagree more.
Sign me up as skeptical re: the coming app store revolution.
”Examine your app for hard-coded UI values – such as sizes and positions – and replace them with those you derive dynamically from system-provided values. Use Auto Layout to help your app respond when layout changes are required.”
Now, I may be reading too much into this, but the use of the word ‘when’ sounds to me like Apple is preparing products with other resolutions than are on the current iOS devices. I think there would be a market for a budget iPhone with a smaller screen, and a high-end iPad with a larger screen.
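In practice, Apple's guidance boils down to replacing hard-coded frames with constraints. A minimal Swift sketch of the idea (the controller and label names are invented; the specific anchor API is the modern Swift way of expressing it, not the iOS 7-era syntax):

```swift
import UIKit

final class ArticleViewController: UIViewController {
    private let titleLabel = UILabel()

    override func viewDidLoad() {
        super.viewDidLoad()
        view.addSubview(titleLabel)

        // Instead of a hard-coded frame like CGRect(x: 20, y: 64, width: 280, height: 44),
        // express the layout as constraints so it adapts to whatever screen it lands on.
        titleLabel.translatesAutoresizingMaskIntoConstraints = false
        NSLayoutConstraint.activate([
            titleLabel.topAnchor.constraint(equalTo: view.layoutMarginsGuide.topAnchor, constant: 8),
            titleLabel.leadingAnchor.constraint(equalTo: view.layoutMarginsGuide.leadingAnchor),
            titleLabel.trailingAnchor.constraint(equalTo: view.layoutMarginsGuide.trailingAnchor)
        ])
    }
}
```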
Also, the icon size for apps is different in iOS 7. It’s going up from 114x114 to 120x120.
The algorithms that Apple uses for the top lists promote established players, the search functions suck, and the interface for scrolling through lists of apps is so slow and clunky that it discourages users from exploring beyond the first 5-6 results in any list.
Android fragmentation is about VERSIONS. APIs, bug fixes, security updates, and available features to a lesser extent. It has NOTHING to do with look and feel. At all. This tactic from Android fans is sad. Just expand the definition of fragmentation until it no longer means anything.
Ridiculous. When 90% of devices are using the latest OS within a year of release, trying to call that fragmentation is complete bullshit. No platform in history has been able to tout those kinds of upgrade numbers, and it's a distinct advantage for both users and developers.
Fragmentation is about your userbase being split across incompatible platforms and the paralyzing effect that has on development, which is precisely what Marco was lauding as a good thing in this post.
That isn't fragmentation. iOS 7 is still compatible with apps written for iOS 6 and iOS 7 apps will still work with iOS 6. The same cannot be said for ICS apps running on older devices. Even in-house apps like Chrome are ICS or bust. Even now, multiple years later, the most popular version of Android won't run Chrome or any other ICS exclusive apps.
However, iOS 7 apps will LOOK & FEEL out of place on iOS 6 and vice versa, which is where the disruption occurs. This isn't fragmentation, and no amount of arguing will make that the case.
On top of that, with the notable exception of the iPod touch 4, all iOS devices sold for the last four years will run iOS 7. That means this period of disruption will last for six months to a year, max. Once again, no fragmentation.
I mostly take umbrage at the specific attitude taken towards this, when the same move in similar ecosystems (the ICS/JB upgrade, for example, which saw a huge market open up for "Holo-themed" apps) was mocked and derided as pulling the rug out from under users' feet.
iOS' consistency has been tirelessly lauded as a good thing, until Apple goes and changes it. I'm happy with progress and change, and am fine with the broken eggs required to make that particular omelette; I just think it's funny how the pundits' headlines change based on how their particular horse is doing.
I would love to see such an example. As a follower of many Apple themed blogs, I saw nothing but good thoughts directed towards the release of ICS, which was a sorely needed UI revamp. I don't remember anything even close to this sentiment being expressed.
Edit: After 15 minutes of googling, I'm unable to find an example to back up my assertion; I withdraw it. I'm still pretty sure it's out there, but I won't ask you to take my word for it. :)
iOS is a lot different than Android. I don't think people realize how conservative Apple is. In the olden days, Apple was very late to the table with basic OS features like memory protection. I don't really see iOS7 as fragmentation as much as a Gingerbread->Ice Cream Sandwich transition where Apple realizes they were off track and needs to correct.
Anybody on an iPhone 4, 4S, or 5 who wants an iOS 7 app, can have it within a day of the iOS 7 release.
The downside of offering only one major phone a year is that you don't occupy much shelf space in a Best Buy/AT&T store - and someone casually looking for a phone, instead of a particular phone, is 90% unlikely to choose you.
The upside is that when you offer a new OS, everyone gets to upgrade.
That's the Android fragmentation issue.
However, despite all of this, you are talking about the exact same thing that Marco is. He says you "ideally" want to target iOS7, but you can't afford to do so if it means cutting out your iOS6 users. His conjecture is that since the established players are stuck on iOS6 (Gingerbread), the up-and-comers will have a shot to disrupt the market among the iOS7 (Jelly Bean) users. The assertion, flat out, is that "the big players being unable to upgrade their iOS6 software to meet iOS7 standards due to legacy support requirements is good for the app ecosystem", which is a pure RDF spin on the whole fragmentation issue that the Apple ecosystem loves talking about so much.
"I don’t think most developers of mature, non-trivial apps are going to have an easy time migrating them well to iOS 7. Even if they overcome the technical barriers, the resulting apps just won’t look and feel right. They won’t fool anyone."
That is - the legacy app writers are going to have a huge challenge in trying to make their iOS 6 apps look "good" on iOS 7 - this is not a problem for someone starting from a greenfield scenario.
Let me ask this - why wouldn't app makers just migrate their apps to an iOS7 look and let the (by the arguments in this thread) vanishingly small non-iOS7 contingent deal with something that doesn't fit the theme of their system? If 90% of your userbase is going to be running iOS7, wouldn't it make more sense to just let 10% of your userbase have an app that doesn't feel like it matches the system (but definitely looks nice) rather than risk someone coming along and making a nicer-looking version of your app to steal the other 90% of your userbase?
The only way this becomes an actual talking point is if there is some reason that vendors can't upgrade their software to iOS7 guidelines, and that's the point at which you begin to experience fragmentation.
It's not really about iOS 7 guidelines, it's about iOS 7 APIs, and you can't use iOS 7 APIs on iOS 6 devices. So if you still want to support iOS 6, your code has ifdefs everywhere and is a mess.
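For what it's worth, the usual way to keep one codebase without ifdef soup is a runtime/availability check around the iOS 7-only calls. In today's Swift that's #available (at the time it was respondsToSelector checks in Objective-C, but the idea is the same). A rough sketch, using edgesForExtendedLayout, one of the layout properties new in iOS 7, as the example:

```swift
import UIKit

// Runtime guard instead of compile-time ifdefs: one binary, two code paths.
func configureLayout(for viewController: UIViewController) {
    if #available(iOS 7.0, *) {
        // iOS 7: opt out of the new "content extends under the bars" layout
        // where the older design assumed an opaque navigation bar.
        viewController.edgesForExtendedLayout = []
    } else {
        // iOS 6 and earlier: nothing to do; the old layout rules still apply.
    }
}
```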
That's the other problem with abandoning even a small percentage of your customer base - even though their existing app will continue to work the way it always has, the fact that they can't continue to upgrade to newer features might result in negative reviews.
I agree with you, btw - that any app maker worth their salt, if they truly believe that 90% of the iOS userbase will be on iOS 7 in 3-6 months, should be prepared to completely abandon iOS 6 (except for those few that are targeting the iPhone 3GS and older iPod/iPad customers - a big fish/small pond competitive technique) and focus all of their energies on iOS 7 development.
Ironically, this creates a positive feedback loop - as no more apps are being written for pre-iOS 7, people more quickly migrate to iOS 7, resulting in more developers completely focusing on iOS 7...
Marco's counterpoint might be: "The set of interests/resources/skills/focus that allowed a developer to build a leading iOS 6 app might not be present for the new 7.0 paradigms, with their 3-D Z-axis geometry stacking of translucent tiles, inclinometer responsiveness, and background processing. Someone who has an entire week at WWDC (yes, the videos are available - but nothing replaces 30-40 hours of onsite time) plus all the developer networking (and drinking) that takes place might drive ahead and find the 'sweet spot' in this new world."
Take, for example, Instapaper - perhaps a hungry up-and-comer will deliver a fully featured, iOS 7-ready read-it-later app, complete with background loading, fully 3D sheet sliding of documents, light/colorful iOS 7 palette brilliance, etc., several months before Instapaper could be rebuilt. It's also possible (probable, as it turns out) that the original author of Instapaper didn't have the energy to rebuild Instapaper because they'd moved on to other things. And we haven't touched on this: "Upgrading an existing app to iOS 7 gains a vendor no revenue (unless they have some IAP model)" - but it does gain vendors of new iOS 7 apps lots of revenue.
In other words (and this hasn't been voiced yet) - there is a lot of incentive for NEW iOS 7 apps, but, unless you are a top 5,000 App on the AppStore, much less incentive to put a lot of energy into rewriting/upgrading an existing app to iOS 7.
Android fragmentation is about the availability of API levels, not the variance in hardware.
Then it does a pretty crappy job. Why does the Play Store silently block apps from your device if this is the case? Why does every Android dev publish a list of supported devices? Because it's nowhere close to this simple, and can't be solved by improvements in the OS, which by the way don't reach the vast majority of users, hence the problem.
Android, WP, and iPhone's visual philosophies are closer now than ever before. I can only assume this will lead to fewer "ecosystem-exclusive" apps, which I think is a net positive for everyone.
If you force them into learning a new way to do things you have just reduced the friction of them switching to some other platform. And that's how you lose customers.
a. The new APIs and Xcode look lovely to work with.
b. Dropping support for iPhone 3G and 3GS devices four years after they were released doesn't feel unreasonable.
c. Apple has a long history of featuring apps that use their latest APIs. Having your app featured is still the only reasonable hope to make money in the App Store, unless your business model revolves around selling Smurfberries.
d. Many developers will have been holding back from making major app changes because they were waiting to see how iOS 7 would change the design language. Now that they know, they can spend the Summer redesigning.
e. Apple are openly inviting developers to "reimagine your apps on iOS 7" - that's the language they've used in their developer emails.
f. A successful developer with a widely read blog has just come out and said that everyone who drops support for older iOS versions to build afresh on top of iOS 7 stands to gain a lot.
So there will probably be a huge host of "new, nimble" apps with new takes on tired old setups come Autumn.
But I bet a lot of torch app developers are feeling very hard done by.
Tumblr, Instapaper, "The Magazine.", atp.fm?
> laughing his way to the bank
Error: type mismatch identified.
I am using the beta on my primary phone, and I am starting to see what Marco is talking about.
This isn't always true. Word of mouth is a much more powerful force than raw quality. Look at eBay, for example.
Most apps luckily don't have many network effects. But I won't be switching PDF readers just because the new one looks more like iOS 7.
See this tweet by @flyosity:
"Damn, the UIKit animation/dynamics effects in iOS 7 are some seriously futuristic stuff. Can't wait to see what people do with it."
There are so many more interesting targets, I hope we don't focus our best on new skins for flashlight apps.
It will separate the best from the worst, however, and this beginning, this chance to start fresh, is what I look forward to.
As a developer who has friends who have developed "non-trivial" iOS apps, damn. This is pretty spot on exactly what happened (happening still even) on Android with pre/post 3.0 applications. Making sure that the UI works on both categories of devices is just awful. There are a few projects out there to help (ActionBarSherlock, HoloEverywhere), but it takes a lot of diligence, ESPECIALLY if you're trying to do combination tablet and phone apps.
A lot of the posts I've read on this thread are missing the point of the post. It's not just about change, it's not about fragmentation, it's about the excitement for newcomers to join an ecosystem that has felt supersaturated for years. It may not actually shake the foundation of the App Store, but it at least allows new talent to enter on the same playing field as those who have been developing iOS applications for years. That's just exciting.
There is some money in porting something between the old and the new playing field, but the incumbents will eventually update.
I wonder how many app designers will realise that the apps they produced were subject to the fashion of the design of the operating system, and now that the fashion and trend has moved, whether the app designers will be confident enough to apply their own timeless design.
The iOS7 colour palette and style is fresh and new (to iOS users), but it is just the next fashion, and as fresh and new as it feels today, it will feel equally old and stale (like iOS6) at some point in the future.
Good design is long-lasting. App designers should concentrate on getting their design right for their application, and not just follow the trend and wear the attire of the operating system.
Marco is right that when the fashion changes, those who cannot keep up with fashion leave a large opportunity for those who can. I also agree that there is also a lot of money to be made by being one of those who can follow fashion closely.
But from a design perspective... those who follow others (the operating system) rather than having the courage to lead (the right design for the app), will always be subject to vulnerability when the fashion changes.
2. It's my opinion. I think the hybrid new design is much worse than the old one. I like the old design. It's different and not old.
3. How is starting from scratch good for users? Remember, we are here for users and not developers. Also, there are lots of apps not affected by this change: games and apps with their own UI come to mind.
Google and to a lesser extent Microsoft screwed up and are struggling to be able to keep their users on the latest and greatest. This is such a huge advantage for Apple that it can't be overstated.
On Android at least 36% of devices are still running Gingerbread (which is 2.5 years old). Android 4.x is finally up over 50% after being out for just over a year and a half.
So, whenever Google gets around to Android 5.0, it will probably be a whole year later (or more) before that is the mainstream targetable version of Android.
As a developer, you could argue this gives you more time to get around to building against the new APIs, but at the same time that's remarkably slow user uptake compared to iOS.
Now everybody is looking for some virtual perfection in a different place and the user gets an awful, inconsistent, non-customizable and anti-interoperable clutter. WHY?
I feel like unless the app is taking advantage of some inherent hardware capability of the phone/tablet everything should eventually be HTML5.
Why? If the newcomers can afford to write an interface for iOS7, the established players also can. If there is money in it for newcomers, there is money in it for established players too. This article assumes that established players are dumb. They will estimate how users adopt iOS7 over time, and act accordingly.