"If the cross-platform experience is subpar, Apple should just let these apps fail in the market"
Perhaps, like Nintendo, they learned the lessons from the collapse of the home video game market in 1983. When Nintendo was contemplating developing the NES, they took a deep look at what had caused the collapse. What they concluded was that the main cause of death was the market being flooded with too many crappy games.
Originally, if you wanted to write, say, an Intellivision game, you went and got a job with either Mattel or APh Technological Consulting (the company that did the hardware design, system software, and many of the early games for Mattel). If you wanted to do an Atari game, you went and got a job with Atari.
The games at this time were all pretty good. A consumer could go out, buy a game based just on the information printed on the box, and go home and be pretty sure they'd have a good experience.
As time went on, a few more companies joined the party. Activision and Imagic, for instance. These companies were started by people who had worked for Mattel or Atari or their contractors like APh, and generally produced quality games.
A consumer still could be confident that plunking down $40 or whatever for a new game, based just on the box description, would be a good move.
More time passed, and companies that had little or no connection to Atari and Mattel jumped in, using information gleaned by reverse engineering the consoles and system software. The information was not always complete, and they didn't know all the tricks and techniques that we authorized developers knew for squeezing greatness out of the hardware. They produced a lot of crap games.
Consumers now found that spending $40 on a game was a big gamble. They had to work to get good games--be aware of brands, read reviews. They stopped buying--all games, not just the bad games.
Nintendo's conclusion was that their new console must be locked down. Only developers that Nintendo approved would be allowed to produce cartridges. This way, they could ensure that quality remained high and get the now-shy consumers to come back and give games another chance.
It clearly worked--and consoles have been locked down ever since, and the console game market is huge.
This is exactly what I thought the original app approval process would be for: an Apple "seal of quality". That would be a fine trade-off for users -- they may only get the approved apps, but at least those apps are screened for quality.
However, that isn't what the approval process is. There are literally thousands of crappy applications that were happily approved and are now clogging up every category in the App Store. It seems non-trivial app rejections are not done on behalf of the user but solely to protect Apple's own interests. Remember when a bunch of high-quality Google Voice apps disappeared from the store? And that's just one example; there have been many more.
And now they're rejecting apps not based on their quality, but based on the programming language or development environment used to create them. How is that at all relevant to the user? This is entirely about protecting Apple's own interests and the comparison to Nintendo's lock down of the NES is not applicable.
> This is entirely about protecting Apple's own interests and the comparison to Nintendo's lock down of the NES is not applicable.
What I don't understand is how this isn't anticompetitive behaviour. By creating an app store and lock-in for application vendors, Apple become the only provider in the market. They now appear to be leveraging that monopoly to restrict another market, that of developer tools, to their commercial advantage and at the expense of a competitor.
I'm not a lawyer and don't live in the US where presumably any legal action would be brought, but can someone please explain to me how this isn't black-and-white illegal under US law?
Because they don't have a monopoly on smartphones. They control their own platform, but as long as there's a reasonably competitive smartphone market out there, it's not illegal.
First, we have the hardware, and as you say, Apple is certainly in a competitive market with its iPhone offering.
Then, for each type of hardware, we have the software that runs on it. Anyone could write software to run on Apple's hardware, but because Apple lock down the phone and run the only app store in town, they have a de facto monopoly on the supply of software to iPhone users.
Finally, we have software development tools. Again, anyone could write tools to help software developers using Apple's hardware. Indeed, according to recent reports connected to this story, many people have, from Adobe's Flash CS5 team to fans of Haskell.
Again, I'm no lawyer, but I would expect that Apple would be perfectly within its rights to lock down the software that can run on its device, but would not be allowed to use that power to unduly influence the secondary market of how that software can be made.
In all likelihood, Apple would argue—successfully—that the market should not be defined as the "iPhone software" market but as the "smartphone software" market, at which point they no longer have a monopoly and are no longer subject to anti-trust scrutiny.
What gets interesting is when/if Apple's dominance continues to grow to the point that the "iPhone software" market is effectively the "smartphone software" market. At that point, the DoJ and/or FTC will almost certainly decide that many of these policies are anticompetitive and initiate some sort of action to remedy the situation.
So in some sense, Apple needs to worry about becoming too successful: it is only because they haven't achieved total dominance that they can get away with being so ruthless.
>> In all likelihood, Apple would argue—successfully—that the market should not be defined as the "iPhone software" market but as the "smartphone software" market, at which point they no longer have a monopoly and are no longer subject to anti-trust scrutiny.
"Apple responsible for 99.4% of mobile app sales in 2009"
... it doesn't take a PhD to acknowledge that the App Store IS the mobile SW market - thus I would rather disagree that Apple's argument SHOULD be successful in case somebody (or some agency) should bring a legal challenge to the new draconian policies - which are obviously abusing Apple's dominance ...
the question for me is more technical: who can bring such a legal challenge? can developers do that? can users? or only a govt. agency ... in which case - considering the influence and connections Apple has in Washington - that might not happen ...
would it be possible for developers to team up in a class-action suit, which could then trigger a govt. investigation?
The fact that 99.4 cents of every dollar spent "after the hardware purchase" goes to the Apple platform.. doesn't make it a monopoly.. it merely means that it is the most successful add-on market..
A good example for comparison.. is a bit old, but illustrates this particular point beautifully.
The fact that more companies are interested in producing add-on stuff for a product, and that consumers are more interested in BUYING that stuff.. doesn't mean that a company is anticompetitive/antitrust-regulated/a monopoly..
Volkswagen Beetle aftermarket parts spent some 30+ years as the king of aftermarket parts.. everything from "third party" replacement OEM-style parts (stuff that matched the original but was cheaper for whatever reason) to stuff that essentially completely changed the product into something else (dune buggy conversions, engine swaps, totally different interiors, etc)
Was VW a monopoly because for 30 years 4 out of every 5 dollars spent on "aftermarket parts" was spent on Beetle bits? No, and no one ever thought to call it one.. it was just a hugely successful model that didn't change every 11 months, and therefore was a fixed point in space for manufacturers to target.. but more importantly, CONSUMERS WERE BUYING.. as opposed to your avg Ford/GM/Chrysler buyer, who for the most part does NOT just go out and buy total conversion kits or hopped-up engine parts.. there were many manufacturers who made parts for various successful models such as muscle cars over the years, and still do.. they didn't cry about antitrust because VW add-on makers made more money, nor did they cry that VW should change the way they made the Beetle so that "Beetle engines" would fit in any car (or vice versa)
There is NO ONE who could bring a class-action and win, and there is no way that adobe could sue and win either.
Because "customer choices", when customers have real choice, do not make a monopoly; rather, removing customer choice is what creates a monopoly.
you raise an excellent point and explain it very well with your VW example -
but still, it seems the aforementioned laymen had the right intuition - the App Store's draconian policies are finally being looked at by regulators ...
"According to a person familiar with the matter, the Department of Justice and Federal Trade Commission are locked in negotiations over which of the watchdogs will begin an antitrust inquiry into Apple's new policy of requiring software developers who devise applications for devices such as the iPhone and iPad to use only Apple's programming tools."
Anyone who buys apps would have standing, imo, but the developers and dev-tool makers might have a better chance. Hell, if you go even farther, you could say other smartphone makers have standing - but they'd have to make a different kind of challenge.
This is a chicken-and-egg situation. Apple has apps, so more people buy iPhones than other phones. With Adobe's dev tools, devs could make apps for more than one platform - that is a threat to Apple. When Apple shuts this down, it is an anticompetitive act against other phone makers, because if devs could put apps on more than one platform, then other platforms would become more appealing.
'Software that runs on the iPhone' is, in the end, too narrow a window to consider for antitrust. If the iPhone were the only smartphone game in town, then the situation would be different, but as things stand, it's not.
Apple's market is one of their own making, and is in fact something of a submarket. They control the platform, period, and when you agree to the terms of their developer agreement, you agree to be bound by them, so you are subject to the same rules as everybody else. They don't have any market monopolized, they merely have their corner of the market locked down.
And they aren't locking in anybody. You don't forfeit rights to your source code, and you're welcome to write a cross-platform app using some shared code library you build in-house. You just have to build apps natively, rather than use some watered-down, piss-poor common language that breaks standards on all platforms. This is the same reason Adobe apps suck on Macs. They have tried to abstract away the OS from their applications to the point where they don't look, act, or function properly on any platform. They look like ass and run like ass on everything.
Adobe apps function great on Windows and better on Linux than on Macs... have you ever considered the fact that your overpriced Mac is what sucks? No - 'cause Jobs told you it was Adobe's fault.
There are so many flaws in this reasoning, I don't know where to start.
#1 - Two of the worst carts preceding the '83 crash were E.T. and Pac-Man, both developed and produced by Atari itself, not these mysteriously inferior 3rd parties you're alluding to. And how many games has Apple, which logically has the most know-how on the platform, produced? None.
#2 - You're making an oranges to apples comparison anyway. The video game market was not crashed by the availability of cross-compilers or tools that lowered the bar of entry. Similarly, Nintendo did not solve the problem by restricting what tools developers could use. They solved it with a strict editorial process.
#3 - Video game production in 1983 required producing and marketing physical goods. It relied on predictable "hits", just like AAA game development, to recoup the considerable outlay required to get these games in front of consumers in the first place. iPhone games are virtual, and the marketing for many of them is non-existent (simply because I can't spend $0.25 CPC on Google trying to sell my $0.99 app). Additionally, there's a long tail of developers creating a more robust landscape of content. There can be tons of failures and still leave plenty of room for successes. Just look at how many games on the iPod have made it big. Many of them came from virtual "nobodies".
#4 - In 1983, there was no manifestation of "wisdom of the crowds" to guide any consumer purchases. Word of mouth was about it. Today, at Apple's scale, one can find dozens of opinions about the quality of a game that only 0.01% of total users may actually purchase.
Many of these problems continue to persist in the locked down AAA console world that you seem to be so fond of. You know, I can accidentally buy 50 terrible iPod games and still spend less money than I would have spent accidentally buying 1 terrible PS3 game.
Poor comparison. All you need to be an authorized Apple Developer is $99. They aren't really trying to lock down the developer market, only the developer-compiler market.
How so? They've spent the money and the time to create a great platform. It seems well within their rights to dictate a lot of the rules around what goes on the platform they created.
And as mentioned above, they don't have a monopoly in the market, so from Apple's perspective, 'if you don't like it, there are other opportunities' ... developing for the bberry :)
"They've spent the money and the time to create a great platform."
They've spent some money and time, and they have taken a lot of other people's work: Mach, gcc, Smalltalk, BSD, etc.
"they dont have a monopoly in the market"
That's not so clear to me. I own an iPhone and an iPad even though I think they really suck technically. But there is content available for them that simply is not available for other platforms.
OK. I come from a different background of programming than traditional consumer apps. In my "normal" job, I get a very small white list of programming languages: "C", "C++" (but no templates, no multiple inheritance, no operator overloading), "Ada", assembly. Within each of these languages I am not allowed specific features. Recursion? OMG. I did that once and never again. Big no-no. Even a simple thing like:
if (A && B)
{
}
else if (A && !B)
{
}
is not legal, as you cannot fully test one of the four condition combinations in the "else if()" case. Even "default" branches in "switch" statements that cannot be executed (because the cases cover 100% of the available options) can be an issue.
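To make that concrete, here is a minimal C sketch of the kind of restructuring such rules push you toward (the function and its return codes are hypothetical, not from any particular coding standard): splitting the compound conditions into nested single-condition branches means every branch can actually be driven by a test.

```c
#include <stdbool.h>

/* With `if (A && B) ... else if (A && !B) ...`, the combination
 * A true / B true never reaches the else-if (the first branch takes
 * it), so one of the four input combinations of the else-if condition
 * can never be exercised there. Nesting single conditions instead
 * makes every branch reachable, so full branch coverage is
 * demonstrable. */
int classify(bool a, bool b)
{
    if (a) {
        if (b) {
            return 1;  /* was: if (A && B) */
        } else {
            return 2;  /* was: else if (A && !B) */
        }
    }
    return 0;  /* neither compound condition held */
}
```

A test harness can now hit all four paths with the four (a, b) input pairs, which is exactly what the coverage rules demand.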
What I am getting at is, with programming there are rules. The rules Apple have are actually very minor and very easy to stay within. If you program to make political statements, choose Android or BB or Symbian. If you program to make money, pick the platform that will do that and follow the rules. If it stops making you money, move on. If you don't like the rules and don't want to abide by them, move on.
You can enjoy programming while still staying within the rules. Sometimes, it is part of the fun and challenge.
Programming isn't a game for many people; it's a profession. The time you spend learning their platform and languages is a sunk cost, and you need to recoup it through money-making products. By restricting developers to their APIs and languages, they are trying to lock in developers and users.
Apple is trying to accomplish the same kind of lock-in that Microsoft managed with Windows. And we better nip this thing in the bud, because Apple would screw us even worse than Microsoft has.
Learning another language is an overhead, an investment you make that increases your skills and broadens your abilities. If you want to write applications for iPhone OS and take advantage of the huge market that Apple has created, then learning Objective-C and Apple's IDE is an investment in time that you need to make. It's just what you need to do. Many developers from other backgrounds have already started programming for iPhone OS and met with great success. If you're experienced in object-oriented program design, or C++, then the transition isn't that difficult.
As a long time Mac user, I've experienced a lot of Mac applications that have been straight ports from other platforms and they are, for the most part, pretty awful. I can understand from this why Apple wants its developers to code iApps natively.
This 'lock in' makes perfect sense for Apple in other ways too, ways in which end users and developers will benefit. Imagine that Apple allow apps to be ported from Flash. Developers would stop coding natively for iPhone OS, since they would be able to create their apps in Flash and distribute them as web apps at the same time, reaching a greater audience. Then add in Android, Blackberry & other export options for Flash. Soon enough Flash would be the only IDE in use, and platforms such as iPhone OS would be at the mercy of Adobe.

If Apple were to introduce new features and efficiencies to their hardware and APIs, they would have to wait for Adobe to implement them in its Flash translation layer before the features would really become available to end users. Even the most willing and motivated of developers would not be able to get around that; they would have to wait for Adobe. So in the end, Apple would lose sales and credibility, and good developers would get screwed because they wouldn't be able to outpace their competitors in updating their apps to take advantage of new features. Everyone becomes 'locked in' to Adobe. Given Adobe's poor history when it comes to timely bug fixes and support of its OS X applications, I do not think that this 'lock in' would be a nice place to find yourself, whether you're Apple, a developer or an end user.
If you don't like Apple's stance then develop for other platforms and buy other products. But if you want to be in on the action, then accept the rules as they are not unreasonable and will ultimately benefit everyone.
It doesn't look silly when taken in context with the rest of their strategy of controlling the entire toolchain to maintain consistency across apps. That strategy becomes harder when everyone goes off and uses different tools.
It's the same strategy that they've always used with their software/hardware combination, which has worked beautifully in the case of the mac.
"The games at this time were all pretty good. A consumer could go out, buy a game based just on the information printed on the box, and go home and be pretty sure they'd have a good experience."
Rose tinted glasses. They still managed to release buggy, downright broken software. It wasn't about quality, it was about control.
We need to realize that iPhone OS is actually just that... an OS. What you are saying is kinda like saying that we would be incapable of finding valuable web apps or locally installable apps on Windows/Linux/whatever OS we use on our PC. Because there are no restrictions there and never have been.
1. It's only a matter of time until Apple realizes this model is not good for anyone.
2. Android (which, by the way, is backed by the very people who specialize in filtering crap out of other platforms) will dominate the next few years.
Nowadays developers produce their games for all consoles, e.g. FIFA 10 is available on Xbox, PlayStation, Nintendo Wii, DS, PC, Mac and, amazingly, mobile too! You can be sure a company like EA will not align itself exclusively with any particular platform! What happened in 1983 is irrelevant in today's market. Apple is trying to control the mobile telecommunications market and put their competitors out of business... However what will kill the iPhone 4G is the free apps for ads idea. Free apps will not compensate for users suffering incessant advertisements being shoved in their face every time they open an app. Apple are losing the plot!
"However what will kill the iPhone 4G is the free apps for ads idea."
That idea started 1.5 years ago. Now you get Google ads in your free apps. Now, as an option, you can have iAds or Google ads. So all Apple did was open up an option (read: a choice) for developers already doing ads in apps.
(HN is, IMO, above the rest but it's certainly not immune to mob rule)
Besides, with the money involved in high rankings you'd have to constantly police the system against gaming, which would be, I'm guessing, more work than policing the submissions directly as they do now.
It'd be an issue, but it would be easier for Apple to protect against gaming than it is for Digg or Reddit. Unlike Digg, Apple can get access to unique hardware identifiers. They also have some experience with DRM. Is it possible to get around that? Sure, but it's much harder than simply gaming IP addresses and cookies.
(First my credentials: I was an avid player of Atari 2600 games at the time, so I remember the period in question first-hand.)
My take is not that there was a decline in quality due to any sort of technical reason, it was due to a drop in the quality of the gameplay design and playtesting. That, in turn, I'm guessing was due to the number of Atari 2600 game creators increasing past the threshold of GOOD game designers & playtesters available. After all, it was a relatively new field at the time, video game design. Perhaps it reached a point where there were say 50 new games being "designed" concurrently but there were only 20-30 good designers. Whereas before, it was under that threshold.
This was my theory because I've heard your position stated a few times on the web, but from direct experience I remember it being more a drop in the quality of gameplay rather than in code quality or technical polish.
I think that it is entirely a myth that the video game crash of '83 had anything at all to do with a decline in the quality of games. The fact that a couple of anticipated games (E.T. and Pac-Man) turned out to be dogs is coincidental. There had been dogs all along. What actually happened was that the videogame fad had finally run its course. For a while, videogames were novel enough that consumers were willing to buy just about anything that they could play on a video screen. Then, as invariably happens with fads, they were old hat. It wasn't just console games that hit a slump--it was arcade games and computer games as well. Videogames had to rebuild a market based not upon the novelty of playing games on a video screen, but upon the quality and features of the individual games.
Or just don't let "bad" apps in the store -- regardless of how they were developed. I mean, duh. I'm having a hard time seeing how it is anything but an attack on Flash.
If Apple's strategy of toolkit lockdown (to improve app quality, performance, and differentiation) is overly draconian, their platform will fail in the market.
I don't think there's any government intervention forcing people to work on (or not work on) app store apps.
I think apple is doing well in the (presumably free, it's way more open than other places) market, because they are very good at negotiating their property rights.
I guess it's just fashionable to imply the "bad guys" are "communists"
Do you mean the FCC approval process for new cell phones is a massive barrier to entry? I'm sure it's not free, but I can't imagine they'd want more than a few dozen phones and $100k of studies. Maybe half a million?
The average person's tolerance for faulty software is lower than it should be. I guess Apple wants to raise expectations, so that people are locked into the iPhone. ("OMG, that android app has ITS OWN KEYBOARD!!!111".)
Of course, this is why I don't use C, C++, or Objective-C. I'm always a little surprised when someone writes code in one of those languages that actually runs.
Yeah, the Linux kernel, Mozilla, Chrome, Safari, Emacs, vi, Mac OS X, iPhone OS, Windows 7, Google search, Apache, Nginx, the very first web browser, et cetera are utter crap.
As an Emacs developer, I can tell you with 100% confidence that the C part of Emacs is utter crap.
The good news is that there isn't very much of it.
The rest of the software is buggier than it should be. My web browser has remotely-exploitable security holes. Random drivers in Linux randomly regress as the version number increases. OS X and Windows 7 crash for no reason, and don't support enough hardware.
The only program I use regularly that doesn't crash on me is Xmonad. And guess which language that isn't written in.
Xmonad claims to be about 1000 lines of Haskell. I'd say that's equivalent to about 10-100k of C.
It doesn't crash because it's simple, not because it's written in a good language.
Well you'd probably have 100X as many bugs in a C program (it takes 10X as many LOC, and I bet bugs scale with N^2), so Haskell is a bit better, but language isn't as important as scope. Big programs have more bugs.
Actually, it doesn't crash because a theorem prover was run over the code to prove that it wouldn't crash. I believe some obscure stackset crashes were preemptively found this way.
It's not just lines of code; Haskell's BDSM-oriented type system is often infuriating, but it's remarkably good at catching bugs at compile time. Haskell programs tend to have a surprisingly small bug rate once you can actually get the things to compile.
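For what it's worth, you can approximate a little of that discipline even in C. A rough sketch (the types and functions here are made up purely for illustration): wrapping raw numbers in distinct single-member struct types turns unit mix-ups, which a plain `double` happily accepts, into compile-time type errors.

```c
/* Distinct single-member structs give otherwise-identical doubles
 * incompatible types. Passing a `seconds` where `milliseconds` is
 * expected then fails to compile, instead of silently producing a
 * wrong answer at runtime. */
typedef struct { double value; } milliseconds;
typedef struct { double value; } seconds;

/* Conversions must go through an explicit, named function. */
static milliseconds from_seconds(seconds s)
{
    return (milliseconds){ s.value * 1000.0 };
}

static milliseconds add_ms(milliseconds a, milliseconds b)
{
    return (milliseconds){ a.value + b.value };
}
```

It's a pale imitation of what a real type system gives you, but it catches the same class of bug at the same stage: before the program ever runs.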
While I agree that C (and C++, and Obj-C) have some problems that make it really easy to write completely buggy software, I wouldn't go as far as blaming programming languages for buggy apps.
Plus, I don't understand why people keep praising XMonad for its lack of bugs. It has some bugs that are quite annoying (e.g. stuck windows that don't close, or getting locked with only one tile on a screen) -- on the other hand, I've never seen a bug in metacity.
To be fair, the part of Emacs implemented in Lisp looks nothing like poetry. I wouldn't call it ``utter crap'', but it's a lot messier than I thought. Some of the default packages look like straight C code translated verbatim to Lisp.
My take is that even if it's easier to screw things up in C, that doesn't mean that if you program in a higher level language, you'd automatically produce elegant code.
I can tell you, though, that this doesn't really matter in real life. The compiler warns you when you use a free variable, so it's pretty hard to accidentally misuse a dynamic variable. There are pathological cases that people point out, but these rarely matter in elisp that most people actually write.
Programming Emacs is a little different from programming other systems, but once you use its idioms instead of the ones you took from your favorite language, everything works quite nicely.
Oh, it's not so much a problem of using a free variable by accident--where the compiler can help you out--but of not being able to use proper closures.
I agree that emacs lisp can still be used to productively write software. People put up with much worse things.
Not sure why this is getting upvoted (and the grandparent downvoted), because it is completely beside the point. I'm always amazed when any of my code runs, but all the more so when I write C. Apple is strangely not supporting any modern language that reduces the chance of subtle and hard-to-find bugs.
"Of course, this is why I don't use C, C++, or Objective-C. I'm always a little surprised when someone writes code in one of those languages that actually runs."
It really scares me when people who claim to be programmers say stuff like that. If what you say is true -- that you are "surprised when someone writes code in one of those languages that actually runs" -- then you really should not be programming.