New iPhone Agreement Bans Flash-to-iPhone Compiler & Others (daringfireball.net)
525 points by swilliams on Apr 8, 2010 | 480 comments



Getting away from the frenzied rhetoric, my opinion is that what Apple really wants to prevent is people releasing multi-platform compilers. So taking Flash as just one example, if I can build one app and the compiler can make me an iPhone executable, an Android executable, and so forth, Apple don't want that.

In my experience so far with such "cross platform compatibility layers," they always produce results that water down each platform's individual strengths and differentiations. And of course, instead of the developer being locked into the phone platform, they are locked into the compatibility layer's platform.

Adobe's Flash compiler is a classic maneuver to "commoditize your complements," as Joel put it so well. Apple don't want to be commoditized, especially if it means having apps that don't take advantage of the iPhone's strengths.

Adobe want to lock developers into Flash and commoditize everything else as Flash-delivery devices. Apple want to commoditize applications and lock developers into their APIs.

http://www.joelonsoftware.com/articles/StrategyLetterV.html


If the cross-platform experience is subpar, Apple should just let these apps fail in the market.


"If the cross-platform experience is subpar, Apple should just let these apps fail in the market"

Perhaps, like Nintendo, they learned the lessons from the collapse of the home video game market in 1983. When Nintendo was contemplating developing the NES, they took a deep look at what had caused the collapse. What they concluded was that the main cause of death was the market being flooded with too many crappy games.

Originally, if you wanted to write, say, an Intellivision game, you went and got a job with either Mattel or APh Technological Consulting (the company that did the hardware design, system software, and many of the early games for Mattel). If you wanted to do an Atari game, you went and got a job with Atari.

The games at this time were all pretty good. A consumer could go out, buy a game based just on the information printed on the box, and go home and be pretty sure they'd have a good experience.

As time went on, a few more companies joined the party. Activision and Imagic, for instance. These companies were started by people who had worked for Mattel or Atari or their contractors like APh, and generally produced quality games.

A consumer still could be confident that plunking down $40 or whatever for a new game, based just on the box description, would be a good move.

More time passed, and companies that had little or no connection to Atari and Mattel jumped in, using information gleaned by reverse engineering the consoles and system software. The information was not always complete, and they didn't know all the tricks and techniques we authorized developers knew to squeeze greatness out of the hardware. They produced a lot of crap games.

Consumers now found that spending $40 on a game was a big gamble. They had to work to get good games--be aware of brands, read reviews. They stopped buying--all games, not just the bad games.

Nintendo's conclusion was that their new console must be locked down. Only developers that Nintendo approved would be allowed to produce cartridges. This way, they could ensure that quality remained high, and get the now shy consumers to come back and give games another chance.

It clearly worked--and consoles have been locked down ever since, and the console game market is huge.


This is exactly what I thought the original app approval process would be for: an Apple "seal of quality". That would be a fine trade-off for users -- they may only get the approved apps, but at least they're screened for quality.

However, that isn't what the approval process is. There are literally thousands of crappy applications that were happily approved and are clogging up every category in the App Store. It seems non-trivial app rejections are not done on behalf of the user but solely to protect Apple's own interests. Remember when a bunch of high-quality Google Voice apps disappeared from the store? And that's just one example; there have been many more.

And now they're rejecting apps not based on their quality, but based on the programming language or development environment used to create them. How is that at all relevant to the user? This is entirely about protecting Apple's own interests and the comparison to Nintendo's lock down of the NES is not applicable.


> This is entirely about protecting Apple's own interests and the comparison to Nintendo's lock down of the NES is not applicable.

What I don't understand is how this isn't anticompetitive behaviour. By creating an app store and lock-in for application vendors, Apple become the only provider in the market. They now appear to be leveraging that monopoly to restrict another market, that of developer tools, to their commercial advantage and at the expense of a competitor.

I'm not a lawyer and don't live in the US where presumably any legal action would be brought, but can someone please explain to me how this isn't black-and-white illegal under US law?


Because they don't have a monopoly on smartphones. They control their own platform, but as long as there's a reasonably competitive smartphone market out there, it's not illegal.


But aren't there really three markets here?

First, we have the hardware, and as you say, Apple is certainly in a competitive market with its iPhone offering.

Then, for each type of hardware, we have the software that runs on it. Anyone could write software to run on Apple's hardware, but because Apple lock down the phone and run the only app store in town, they have a de facto monopoly on the supply of software to iPhone users.

Finally, we have software development tools. Again, anyone could write tools to help software developers using Apple's hardware. Indeed, according to recent reports connected to this story, many people have, from Adobe's Flash CS5 team to fans of Haskell.

Again, I'm no lawyer, but I would expect that Apple would be perfectly within its rights to lock down the software that can run on its device, but would not be allowed to use that power to unduly influence the secondary market of how that software can be made.


In all likelihood, Apple would argue—successfully—that the market should not be defined as the "iPhone software" market but as the "smartphone software" market, at which point they no longer have a monopoly and are no longer subject to anti-trust scrutiny.

What gets interesting is when/if Apple's dominance continues to grow to the point that the "iPhone software" market is effectively the "smartphone software" market. At that point, the DoJ and/or FTC will almost certainly decide that many of these policies are anticompetitive and initiate some sort of action to remedy the situation.

So in some sense, Apple needs to worry about becoming too successful: it is only because they haven't achieved total dominance that they can get away with being so ruthless.


>> In all likelihood, Apple would argue—successfully—that the market should not be defined as the "iPhone software" market but as the "smartphone software" market, at which point they no longer have a monopoly and are no longer subject to anti-trust scrutiny.

"Apple responsible for 99.4% of mobile app sales in 2009"

http://arstechnica.com/apple/news/2010/01/apple-responsible-...

... it doesn't take a PhD to acknowledge that the App Store IS the mobile software market - thus - I would rather disagree that Apple's argument SHOULD be successful in case somebody (or some agency) brings a legal challenge to the new draconian policies - which are obviously an abuse of Apple's dominance ...

the question for me is more technical: who can bring such a legal challenge? Can developers do that? Can users do that? Or can only a govt. agency do so? In which case - considering the influence and connections Apple has in Washington - that might not happen ...

Would it be possible for developers to team up in a class-action suit which could then trigger a govt. investigation?


OK, this is a common failing in the minds of laymen..

The fact that 99.4 cents of every dollar spent "after the hardware purchase" goes to the Apple platform.. doesn't make it a monopoly.. it merely means that it is the most successful add-on market..

A good example for comparison.. is a bit old, but illustrates this particular point beautifully.

The fact that more companies are interested in producing add-on stuff for a product, and that consumers are more interested in BUYING that stuff.. doesn't mean that a company is anti-competitive/antitrust regulated/a monopoly..

The Volkswagen Beetle spent some 30+ years as the king of aftermarket parts.. everything from "third party" replacement OEM-style parts (stuff that matched the original but was cheaper for whatever reason) to stuff that essentially completely changed the product into something else (dune buggy conversions, engine swaps, totally different interiors, etc.)

Was VW a monopoly because for 30 years 4 out of every 5 dollars spent on "aftermarket parts" was spent on Beetle bits? No, and no one ever thought to consider or call it one.. it was just a hugely successful model that didn't change every 11 months, and therefore was a fixed point in space for manufacturers to target.. but more importantly CONSUMERS WERE BUYING.. as opposed to your avg Ford/GM/Chrysler buyer, who for the most part does NOT just go out and buy total conversion kits/hopped-up engine parts.. there were many manufacturers who made parts for various successful models such as muscle cars over the years, and still do.. they didn't cry about antitrust because VW add-on makers made more money, nor did they cry that VW should change the way they made the Beetle so that "beetle engines" would fit in any car (or vice versa)

There is NO ONE who could bring a class action and win, and there is no way that Adobe could sue and win either.

Because "customer choices" when they have real choice, do not make a monopoly, rather removing customer choice creates a monopoly.


You raise an excellent point and explain it very well with your VW example -

but still it seems the aforementioned laymen had the right intuition - as the App Store's draconian policies are finally being looked at by regulators ...

"According to a person familiar with the matter, the Department of Justice and Federal Trade Commission are locked in negotiations over which of the watchdogs will begin an antitrust inquiry into Apple's new policy of requiring software developers who devise applications for devices such as the iPhone and iPad to use only Apple's programming tools."

http://www.nypost.com/p/news/business/an_antitrust_app_buvCW...


> the question for me is more technical = who can bring such a legal challenge?

Presumably Adobe are the most likely candidates, and they also have serious legal firepower.


Anyone who buys apps would have standing, imo. But the developers/dev-tool makers might have a better chance. Hell, if you go even farther, you could say other smartphone makers have standing - but they'd have to make a different kind of challenge.

This is a chicken-and-egg situation. Apple has apps, so more people buy iPhones than other phones. With Adobe's dev tools, devs could make apps for more than one platform - that is a threat to Apple. When Apple shuts this down, it is an anticompetitive act against other phone makers, because if devs could put apps on more than one platform, then other platforms would become more appealing.


'Software that runs on the iPhone' is, in the end, too narrow a window to consider for antitrust. If the iPhone were the only smartphone game in town, then the situation would be different, but as things stand, it's not.


Apple's market is one of their own making, and is in fact something of a submarket. They control the platform, period, and when you agree to the terms of their developer agreement, you agree to be bound by them, so you are subject to the same rules as everybody else. They don't have any market monopolized, they merely have their corner of the market locked down.

And they aren't locking in anybody. You don't forfeit rights to your source code; you're welcome to write a cross-platform app using some shared code library you build in-house. You just have to build apps natively, rather than use some watered-down, piss-poor common language that breaks standards on all platforms. This is the same reason Adobe apps suck on Macs. They have tried to abstract away the OS from their applications to the point where they don't look, act, or function properly on any platform. They look like ass and run like ass on everything.


Adobe apps function great on Windows, and better on Linux than on Macs... have you ever considered the fact that your overpriced Mac is what sucks? No - 'cause Jobs told you it was Adobe's fault.


There are so many flaws in this reasoning, I don't know where to start.

#1 - Two of the worst carts preceding the '83 crash were E.T. and Pac-Man, both developed and produced by Atari itself, not these mysteriously inferior 3rd parties you're alluding to. And how many games has Apple, which logically has the most know-how on the platform, produced? None.

#2 - You're making an oranges to apples comparison anyway. The video game market was not crashed by the availability of cross-compilers or tools that lowered the bar of entry. Similarly, Nintendo did not solve the problem by restricting what tools developers could use. They solved it with a strict editorial process.

#3 - Video game production in 1983 required producing and marketing physical goods. It relied on predictable "hits", just like AAA game development, to recoup the considerable outlay required to get those games in front of consumers in the first place. iPhone games are virtual, and the marketing for many of them non-existent (simply because I can't spend $0.25 CPC on Google trying to sell my $0.99 app). Additionally, there's a long tail of developers creating a more robust landscape of content. There can be tons of failures and still plenty of room for successes. Just look at how many games on the iPod have made it big. Many of them came from virtual "nobodies".

#4 - In 1983, there was no manifestation of "wisdom of the crowds" to guide any consumer purchases. Word of mouth was about it. Today, at Apple's scale, one can find dozens of opinions about the quality of a game that only 0.01% of total users may actually purchase.

Many of these problems continue to persist in the locked down AAA console world that you seem to be so fond of. You know, I can accidentally buy 50 terrible iPod games and still spend less money than I would have spent accidentally buying 1 terrible PS3 game.


Referring to your first point, Apple has released a game for the iPhone: Texas Hold'em.

http://itunes.apple.com/us/app/texas-holdem/id284602850?mt=8

Though I have not purchased this game, so I cannot comment on whether it diminishes your first point.


Poor comparison. All you need to be an authorized Apple Developer is $99. They aren't really trying to lock down the developer market, only the developer-compiler market.


And what if I don't want to write in one of the languages mentioned?


Then don't, thank you!


Sure. But it seems silly of Apple to restrict all iPhone developers to a (very) short whitelist of languages.


How so? They've spent the money and the time to create a great platform. It seems well within their rights to dictate a lot of the rules around what goes on the platform they created.

And as mentioned above, they don't have a monopoly in the market, so, from Apple's perspective, 'if you don't like it, there are other opportunities' ... developing for the BBerry :)


"They've spent the money and the time to create a great platform."

They've spent some money and time, and they have taken a lot of other people's work: Mach, gcc, Smalltalk, BSD, etc.

"they dont have a monopoly in the market"

That's not so clear to me. I own an iPhone and an iPad even though I think they really suck technically. But there is content available for them that simply is not available for other platforms.


Great platform it may be, but this rule is insane. Who cares how the compiled code was produced? What matters is how it runs on the platform.


OK. I come from a different programming background than traditional consumer apps. In my "normal" job, I get a very small whitelist of programming languages: C, C++ (but no templates, no multiple inheritance, no operator overloading), Ada, assembly. Within each of these I am not allowed specific features. Recursion. OMG. I did that once and never again. Big no-no. Even simple things like:

   if (A && B)
   {
   }
   else if (A && !B)
   {
   }
is not permitted, as you cannot fully test one of the four condition combinations in the "else if()" case. Even "default" labels in switch statements that can never be executed (because the cases cover 100% of the available options) can be an issue.
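
To spell out the coverage problem with a runnable sketch (the A/B flags and classify function are hypothetical, just for illustration): of the four (A,B) input combinations, (true,true) always takes the first branch, so the "else if" decision is never evaluated for exactly the case you would need to show B independently affecting it.

    #include <stdbool.h>
    #include <stdio.h>

    /* Sketch of the untestable-branch rule above. The (true,true)
       input is consumed by the first branch, so the "else if"
       condition is never evaluated for it, and that decision can
       never be fully condition-covered. */
    static const char *classify(bool A, bool B)
    {
        if (A && B)
            return "first branch";   /* reached by (T,T) */
        else if (A && !B)
            return "second branch";  /* reached by (T,F) only */
        return "neither";            /* (F,T) and (F,F) */
    }

    int main(void)
    {
        for (int a = 0; a <= 1; a++)
            for (int b = 0; b <= 1; b++)
                printf("A=%d B=%d -> %s\n", a, b, classify(a, b));
        return 0;
    }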

What I am getting at is, with programming there are rules. The rules Apple have are actually very minor and very easy to stay within. If you program to make political statements, choose Android or BB or Symbian. If you program to make money, pick the platform that will do that and follow the rules. If it stops making you money, move on. If you don't like the rules and don't want to abide by them, move on.

You can enjoy programming while still staying within the rules. Sometimes, it is part of the fun and challenge.


Programming isn't a game for many people; it's a profession. The time you spend learning their platform and languages is a sunk cost, and you need to recoup it through money-making products. By restricting developers to their APIs and languages, they are trying to lock in developers and users.

Apple is trying to accomplish the same kind of lock-in that Microsoft managed with Windows. And we better nip this thing in the bud, because Apple would screw us even worse than Microsoft has.


Learning another language is an overhead, an investment you make that increases your skills and broadens your abilities. If you want to write applications for iPhone OS and take advantage of the huge market that Apple has created, then learning Objective-C and Apple's IDE is an investment in time that you need to make. It's just what you need to do. Many developers from other backgrounds have already started programming for iPhone OS and met with great success. If you're experienced in object-oriented program design, or C++, then the transition isn't that difficult.

As a long time Mac user, I've experienced a lot of Mac applications that have been straight ports from other platforms and they are, for the most part, pretty awful. I can understand from this why Apple wants its developers to code iApps natively.

This 'lock in' makes perfect sense for Apple in other ways too, ways in which end users and developers will benefit. Imagine that Apple allowed apps to be ported from Flash. Developers would stop coding natively for iPhone OS, as they would be able to create their apps in Flash and distribute them as web apps at the same time, reaching a greater audience. Then add in Android, BlackBerry and other export options for Flash. Soon enough Flash would be the only IDE in use, and platforms such as iPhone OS would be at the mercy of Adobe.

If Apple were to introduce new features and efficiencies to their hardware and APIs, they would have to wait for Adobe to implement them in its Flash translation layer before the features would really become available to end users. Even the most willing and motivated of developers would not be able to get around that; they would have to wait for Adobe. So in the end, Apple would lose sales and credibility, and good developers would get screwed because they wouldn't be able to outpace their competitors in updating their apps to take advantage of new features. Everyone becomes 'locked in' to Adobe. Given Adobe's poor history when it comes to timely bug fixes and support of its OS X applications, I do not think that this 'lock in' would be a nice place to find yourself, whether you're Apple, a developer or an end user.

If you don't like Apple's stance then develop for other platforms and buy other products. But if you want to be in on the action, then accept the rules as they are not unreasonable and will ultimately benefit everyone.


> Within each of these areas I am not allowed specific features. Recursion. OMG. I did that once and never again. Big nono.

We tend to frown on loops and mutating variables.


Oh, I never questioned their right to do so. My comment was meant literally: They look silly to me.


It doesn't look silly when taken in context with the rest of their strategy of controlling the entire toolchain to maintain consistency across apps. That strategy becomes harder when everyone goes off and uses different tools.

It's the same strategy that they've always used with their software/hardware combination, which has worked beautifully in the case of the Mac.


Oh, I also don't want to question that it may be a good strategy.


"Apple responsible for 99.4% of mobile app sales in 2009"

http://arstechnica.com/apple/news/2010/01/apple-responsible-....

Check the numbers - the App Store IS a monopoly ...


"The games at this time were all pretty good. A consumer could go out, buy a game based just on the information printed on the box, and go home and be pretty sure they'd have a good experience."

Rose-tinted glasses. They still managed to release buggy, downright broken software. It wasn't about quality; it was about control.


You don't achieve quality without control.


You don't achieve quality with anti-competitive terms of use, either.


We're well past that anyways. 185,000 apps? What percentage of those are complete and utter crap?


A lot less than if Apple had opened the floodgates.


Conversely, how many apps have been made worse by archaic restrictions by Apple?


5


We need to realize that iPhone OS is actually just that... an OS. What you are saying is kinda like saying that we would be incapable of finding valuable web apps or locally installable apps on Windows/Linux/whatever OS we use on our PC. Because there are no restrictions there and never have been.

1. It's only a matter of time until Apple realizes this model is not good for anyone. 2. Android (which, by the way, is powered by the very people who specialize in filtering out the crap on other platforms) will dominate the next few years.


Nowadays developers produce their games for all consoles; e.g. FIFA 10 is available on Xbox, PlayStation, Nintendo Wii, DS, PC, Mac and, amazingly, mobile too! You can be sure companies like EA will not align themselves exclusively with any particular platform! What happened in 1983 is irrelevant in today's market. Apple is trying to control the mobile telecommunications market and put their competitors out of business... However, what will kill the iPhone 4G is the free-apps-for-ads idea. Free apps will not compensate users for suffering incessant advertisements shoved in their face every time they open an app. Apple are losing the plot!


"However what will kill the iPhone 4G is the free apps for ads idea."

That idea started 1.5 years ago. Now you get Google ads in your free apps. Now, as an option, you can have iAds or Google ads. So all Apple did was open up an option (read: choice) for developers already doing ads in apps.

Choice is now bad? Or is it good?


Couldn't you have a non-fascist solution to this?

Do something like Reddit, Digg, or Hacker News... let people guinea-pig apps and upvote/downvote them and sort in each category.


Because those sites aren't full of bullshit...?

(HN is, IMO, above the rest but it's certainly not immune to mob rule)

Besides, with the money involved in high rankings you'd have to constantly police the system against gaming, which would be, I'm guessing, more work than policing the submissions directly as they do now.


It'd be an issue, but it would be relatively easier for Apple to protect against gaming than for Digg or Reddit. Unlike Digg, Apple can get access to unique hardware identifiers. They also have some experience with DRM. Is it possible to get around that? Sure, but it's much harder than simply gaming IP addresses and cookies.


Except most iPhone apps accepted to the store already are utter crap.


Great post, but I'll disagree on one point.

(First my credentials: I was an avid player of Atari 2600 games at the time, so I remember the period in question first-hand.)

My take is not that there was a decline in quality due to any sort of technical reason, it was due to a drop in the quality of the gameplay design and playtesting. That, in turn, I'm guessing was due to the number of Atari 2600 game creators increasing past the threshold of GOOD game designers & playtesters available. After all, it was a relatively new field at the time, video game design. Perhaps it reached a point where there were say 50 new games being "designed" concurrently but there were only 20-30 good designers. Whereas before, it was under that threshold.

This was my theory because I've heard your position stated a few times on the web, but from direct experience I remember it being more a drop in the quality of gameplay rather than in code quality or technical polish.


I think that it is entirely a myth that the video game crash of '83 had anything at all to do with a decline in the quality of games. The fact that a couple of anticipated games (E.T. and Pac-Man) turned out to be dogs is coincidental. There had been dogs all along. What actually happened was that the videogame fad had finally run its course. For a while, videogames were novel enough that consumers were willing to buy just about anything that they could play on a video screen. Then, as invariably happens with fads, they were old hat. It wasn't just console games that hit a slump--it was arcade games and computer games as well. Videogames had to rebuild a market based not upon the novelty of playing games on a video screen, but upon the quality and features of the individual games.


Or just don't let "bad" apps in the store -- regardless of how they were developed. I mean, duh. I'm having a hard time seeing how it is anything but an attack on Flash.


Apple is letting them fail in the market. On their competitors' devices.

Don't worry, if you savor the cross-platform software experience there will be plenty of options for you. At bargain prices, in fact.


Yeah, that's true. iPhone users don't even want Google Voice.


Oh yeah? How about calling Europe with one click on AT&T's network for 2c a minute?

... please - you Apple fan-boys should try things before jumping to conclusions ...

I dumped my iPhone for a Nexus specifically because of Google Voice and Apple's draconian control of the platform ...


Sarcasm, my friend. Take a look at my post history and call me an "Apple fan-boy" again...


If Apple's strategy of toolkit lockdown (to improve app quality, performance, and differentiation) is overly draconian, their platform will fail in the market.


Apple don't believe in a free market / free competition. If they did, then they wouldn't have an App Store.


I don't think there's any government intervention forcing people to work on (or not work on) app store apps.

I think Apple is doing well in the (presumably free; it's way more open than other places) market because they are very good at negotiating their property rights.

I guess it's just fashionable to imply the "bad guys" are "communists".

Do you mean the FCC approval process for new cell phones is a massive barrier to entry? I'm sure it's not free, but I can't imagine they'd want more than a few dozen phones and $100k of studies. Maybe half a million?


Part of what people like about Apple's products is the inclusion of Apple's opinions. Both the App Store and the iTouch Platform are Apple products.


Apple isn't interested in proving a point about market viability, they just don't want that on their platform.


The average person's tolerance for faulty software is lower than it should be. I guess Apple wants to raise expectations, so that people are locked into the iPhone. ("OMG, that android app has ITS OWN KEYBOARD!!!111".)

Of course, this is why I don't use C, C++, or Objective-C. I'm always a little surprised when someone writes code in one of those languages that actually runs.


Yeah, the Linux kernel, Mozilla, Chrome, Safari, Emacs, vi, Mac OS X, iPhone OS, Windows 7, Google search, Apache, Nginx, the very first web browser, et cetera are utter crap.


As an Emacs developer, I can tell you with 100% confidence that the C part of Emacs is utter crap.

The good news is that there isn't very much of it.

The rest of the software is buggier than it should be. My web browser has remotely-exploitable security holes. Random drivers in Linux randomly regress as the version number increases. OS X and Windows 7 crash for no reason, and don't support enough hardware.

The only program I use regularly that doesn't crash on me is Xmonad. And guess which language that isn't written in.


Xmonad claims to be about 1000 lines of Haskell. I'd say that's equivalent to about 10-100k of C.

It doesn't crash because it's simple, not because it's written in a good language.

Well, you'd probably have 100X as many bugs in a C program (it takes 10X as many LOC, and I bet bugs scale with N^2), so Haskell is a bit better, but language isn't as important as scope. Big programs have more bugs.


Actually, it doesn't crash because a theorem prover was run over the code to prove that it wouldn't crash. I believe some obscure StackSet crashes were preemptively found this way.

http://neilmitchell.blogspot.com/2007/05/does-xmonad-crash.h...


It's not just lines of code; Haskell's BDSM-oriented type system is often infuriating, but it's remarkably good at catching bugs at compile time. Haskell programs tend to have a surprisingly small bug rate once you can actually get the things to compile.


While I agree that C (and C++, and Obj-C) have some problems that make it really easy to write completely buggy software, I wouldn't go as far as blaming programming languages for buggy apps.

Plus, I don't understand why people keep praising XMonad for its lack of bugs. It has some bugs that are quite annoying (e.g. stuck windows that don't close, getting locked with only one tile on a screen) -- on the other hand, I've never seen a bug in Metacity.


To be fair, the part of Emacs implemented in Lisp looks nothing like poetry. I wouldn't call it "utter crap", but it's a lot messier than I thought. Some of the default packages look like straight C code translated verbatim to Lisp.

My take is that even if it's easier to screw things up in C, that doesn't mean that if you program in a higher level language, you'd automatically produce elegant code.


Emacs Lisp is a mess. It doesn't even have lexical scoping.


The development version does.

I can tell you, though, that this doesn't really matter in real life. The compiler warns you when you use a free variable, so it's pretty hard to accidentally misuse a dynamic variable. There are pathological cases that people point out, but these rarely matter in elisp that most people actually write.

Programming Emacs is a little different from programming other systems, but once you use its idioms instead of the ones you took from your favorite language, everything works quite nicely.


Oh, it's not so much a problem of using a free variable by accident--where the compiler can help you out--but of not being able to use proper closures.

I agree that emacs lisp can still be used to productively write software. People put up with much worse things.


Not sure why this is getting upvoted (and the grandparent downvoted), because it is completely beside the point. I'm always amazed when any of my code runs, but all the more so when I write C. Apple is strangely not supporting any modern language that reduces the chance of subtle and hard-to-find bugs.


"Of course, this is why I don't use C, C++, or Objective-C. I'm always a little surprised when someone writes code in one of those languages that actually runs."

It really scares me when people who claim to be programmers say stuff like that. If what you say is true, that you are "surprised when someone writes code in one of those languages that actually runs", then you really should not be programming.


Enjoy being Apple serfs, developers! The guys over at Titanium [1] can give you genuflecting lessons if you need.

Apple has definitely crossed over to the dark side. After 26 years of being a fanboy, they've finally exceeded what I can stomach.

[1] http://developer.appcelerator.com/blog/2010/04/apple-4-0-and... [2] http://ipadmakesmesad.blogspot.com


You're beholden to whatever platform you develop on. That's simply the way it always is.

The web isn't appropriate for the apps I want to write yet, so I can't develop on a perfectly open platform and expect to find customers. And I can't reasonably create my own platform and create the software I want to create.

So I have to pick a platform that can reasonably support the apps I want to write and that gives me a reasonable chance to make a living at it. I'm going to be somebody's sharecropper.

When I have more resources, I can consider supporting multiple platforms to mitigate my risk. But until then, pointing out that we develop at the pleasure of the platform holder is redundant, and the difference between more- and less-restrictive platforms is splitting hairs.


This change doesn't just cover Flash though; it also hits all of the Mono cross-compilers (and Scheme, and everything else that happens to cross-compile).

I can't see any way that this turns out as remotely positive for developers.


Apple has consistently demonstrated that they don't care about what developers want. They seem to believe, and have so far been shown to be correct, that if they can get the consumers, developers will follow. The only way I can see this changing is if a killer app that consumers want becomes available on another platform and not on iPhone OS due to Apple's restrictions.


I think you're forgetting that it's also really fun to make an app that has the best "look and feel". I can make things that make people say "Wow" a lot more easily on the iPhone than on Android.

Developers will make apps where it is fun to make apps. I haven't had any horror stories with the app store, so it's still just more fun to make iPhone software than anything else.


The developers who make apps 'for fun' aren't producing the blockbusters that make the appstore so important to the iPhone.


No, but they do make up a fair number of apps that populate the top 100 lists for various categories (even if they're not runaway hits on the most profitable list).


You really think developing software in 25-year-old technologies (Objective-C, Cocoa) is fun???


Coming from a lot of .NET at work it's been fun to learn something new, and play around with a new environment (Xcode). So yeah it's been fun. I don't know how long it will last, I've also not written anything terribly complex, so perhaps the pain points will surface more over time.

Handing my phone to my friends and telling them I made what they are looking at has been really fun. Also, telling anyone with an iPhone how they can just search the app store and find my little app is way fun. My Mom installed it even!


> Apple has consistently demonstrated that they don't care about what developers want.

The facts disagree.

The App Store model is unusual and it is not perfect. Few developers have any experience with Cocoa or Objective-C. Developers must use a Mac. iPhone software only runs on the iPhone and is not easily ported to other platforms.

Despite all of that, Apple has attracted developers to the App Store and the iPhone in numbers nobody would have predicted. Meanwhile, all other mobile platforms are rushing to duplicate the model.

It would seem Apple knows exactly what developers want.


No. Apple knows exactly how much developers will put up with.


Google, Microsoft, RIM, and Palm are wishing it was that easy.


Apple is operating under an effective first-mover advantage. That conveys a lot of benefits that let them get away with a lot of abuses. If/when the other players in the market catch up, that won't hold quite as true.


"It would seem Apple knows exactly what developers want."

Aye, money outweighs freedom even today.


Money is freedom.


Depends how many strings are attached to it.


Apple only cares about their image. They make that "glow" effect on products to attract customers, and developers go where the people are.


It doesn't cover any Scheme that compiles to C since that would be indistinguishable from regular C/C++ code, at least on Apple's side.


Say one wants to develop a touch user interface for a visual programming language for music composition, and the core engine happens to be written in Common Lisp (say, something like PWGL or Open Music) - why can't that be in Lisp? It is compiled, native and would use the Cocoa libs.

There are not only consumers who want to download ebooks with ads. There could be an area of innovative and experimental use where universities want to develop novel applications - applications that might be written in Smalltalk, Lisp, Haskell - or any other language that can compile and is not Objective-C. In many cases the innovation lies in the core logic and the innovative use of a touch screen. Why should I develop my core logic in a way that is tied to the iPhone or iPad (assuming that the iPad will get the same developer agreement), and where I have to use a relatively low-level language like Objective-C?

It is one thing to assume what Apple's target is (Adobe, cross-platform frameworks, ...) and another to consider who else is affected by these clauses that developers have to agree to.


The university example is a good one. I wonder if they will be covered by wireless app distribution, which will bypass the App Store. From Apple's iPhone Enterprise Developer page: "Deploy proprietary, in-house applications to authorized users in your company. The iPhone Developer Enterprise Program is available to companies with 500 or more employees and a Dun & Bradstreet number."


Because companies with fewer than 500 employees have no need for custom apps...


Nah, that's bullshit. Applications written in a higher-level language and compiled down to Obj-C could be a lot better than apps written in straight Obj-C, for the same reason that applications written in Obj-C are better than applications written in straight ARM assembly.


It's not just with the consoles. I used to develop Lotus Notes applications. The problem with Notes was that the barrier to entry for developers was so low that it hurt the platform, and it made it hard to distinguish good developers and good apps from the chaff. When a bunch of unqualified developers fill the world with junk, it gives the entire platform a reputation for being junky.

I now develop iPhone applications, and I personally agree with Apple in this respect. If you want to develop iPhone applications, spend some time and become proficient with the tools.

And, @raganwald is correct that it's still lock-in, it's just a matter of where the lock is.


Your lock-in argument is wrong. Of course there are always dependencies. But it's important to be able to choose which dependencies are the right ones for your application. Mandating a particular API isn't just lock-in. It's lock-in lock-in. You're signing away the right to choose the right tool for the job.


Apple should filter by quality, not by what language/technology was used to make the app.

It makes more sense to reject fart apps -- or at least the 100th fart app. It does not make sense to reject apps written in Python or Clojure, for example.

To put it another way: they are disproportionately turning away the better quality devs, not the lower quality ones.


Democracy is similar. I used to have a nice life as a french king, until the stupid peasants took over. Now everything is fucked. I feel so locked in.


Performance is an easy excuse. A terrible programmer can do worse than a cross-compiler. If Apple really cared about the so-called performance, they would judge apps by their quality, not by the process of how they are created.


Adobe's response, from a game-theoretic perspective, is to adhere to

   3.3.1 — Applications may only use Documented APIs
   in the manner prescribed by Apple and must not use
   or call any private APIs. Applications must be 
   originally written in Objective-C, C, C++, or 
   JavaScript as executed by the iPhone OS WebKit 
   engine, and only code written in C, C++, and 
   Objective-C may compile and directly link against
   the Documented APIs (e.g., Applications that link 
   to Documented APIs through an intermediary translation 
   or compatibility layer or tool are prohibited).
by simply generating Objective-C as their object code. Developers then compile this for their target platform (iPhone) and stay within the bounds of 3.3.1. I see Apple is trying to put the kibosh on this with the "originally written" clause, but that will be a stretch to enforce.
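
To make that concrete, here is a minimal sketch of what such a tool's emitted source might look like (the AS3_* names and the translator itself are invented for illustration). The point is that the output is ordinary Objective-C that compiles and links against documented UIKit APIs like any hand-written app; only the provenance differs.

    // Hypothetical output of a Flash-to-Objective-C translator.
    // The mechanical naming and flattened temporaries are typical
    // tells of generated code, but everything here is plain
    // Objective-C calling only documented APIs.
    #import <UIKit/UIKit.h>

    @interface AS3_Sprite_42 : UIView
    @end

    @implementation AS3_Sprite_42
    // Translated from ActionScript: this.x += 1;
    - (void)as3_onEnterFrame_0
    {
        CGRect tmp_0 = self.frame;
        tmp_0.origin.x += 1.0f;
        self.frame = tmp_0;
    }
    @end

A reviewer looking only at the compiled binary would see nothing but documented UIKit calls; the "originally written" requirement is the only thing being skirted.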


If I understand "originally written" correctly, doesn't this also kill Objective-J and GWT?


Yes, but then the app still wouldn't be "originally written" in Objective-C.


And how could this ever be proven?


Apple just sues you if they suspect something, and the court will find out.


No, Apple rejects your app from the App Store until you can prove its provenance.


OK. So it's even easier for Apple.


This is addressed in the original post. As of now, projects made with tools other than the approved ones would have obvious file layout characteristics.

Theoretically, you could make it hard to tell the difference, but in practice it's pretty easy to tell machine-generated from human-written code.
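
For instance (a sketch; the rt_* helper names are invented for illustration), translation layers tend to funnel every API call through a handful of generic dispatch helpers, and those helpers leave a recognizable fingerprint in the shipped binary's symbol table even without source access:

    #import <UIKit/UIKit.h>

    // Hypothetical runtime helpers a compatibility layer might ship;
    // the names are invented, purely for illustration.
    extern id rt_create(const char *className, CGRect frame);
    extern void rt_invoke(id target, const char *selectorName, id arg);
    extern id rt_str(const char *utf8);

    // Hand-written Objective-C: direct calls against documented APIs.
    static UILabel *handWritten(CGRect frame)
    {
        UILabel *label = [[UILabel alloc] initWithFrame:frame];
        label.text = @"Hello";
        return label;
    }

    // Machine-generated style: everything goes through the layer's
    // dispatch helpers, whose exported symbols (rt_create, rt_invoke,
    // ...) are easy to spot in the compiled app.
    static void generated(CGRect frame)
    {
        rt_invoke(rt_create("UILabel", frame), "setText:", rt_str("Hello"));
    }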


You don't submit your source code to Apple as part of approval.


Yet! Wait for it, serfs, wait for it!


Unfortunately rewriting a cross compiler takes more time than rewriting a sentence in a license agreement.


Agreed; in my experience any good software for a platform should take as much advantage of the platform as possible, and then work around platform limitations. Cross-platform apps look non-native, miss subtle platform conventions, and ultimately feel awkward on a platform as distinctive as iPhone OS. This seems to be a consistent line of Apple's, ultimately aiming to project the "quality" image of the platform and not dilute the valued brand. This is the same line John Gruber suggested when pondering the purge of overly explicit apps. It has not yet come down to purging crapware developed natively with Xcode, but with thousands of apps in each category, the purgatory moment in one form or another (e.g. if not purging outright, then subjectively separating into premium and "others" stores) probably is not too far ahead.


It's not just about quality.

As with everything Apple does - there are company-centric motives, which have been neatly balanced against a set of consumer-centric motives.

Apple is run by smart people, who realise they have a dedicated following. By making the 'we don't want to diminish the quality of the App pool' argument clear, they allow their ardent followers to do battle for them.

The corollary of this is that Apple ensure that their platform receives the developer investment it requires, enabling the company to become a permanent fixture in the mobile market.

If Flash developers didn't have to make a new investment of time and money to learn their platform - what would stop this pool of developers from leaving Apple's side tomorrow?

They want full control over what is allowed _into_ their market, and they want a dedicated team of developers who won't walk away.

If Flash was allowed, neither of these requirements would be assured.


Sure, 'multiplatform compiling/targeting' is one thing. But banning non-approved languages is silly. Even Apple is funding projects like RubyCocoa and bringing Python compatibility to the Cocoa API. If somebody wants to write an app using Cocoa Touch but finds Ruby, Python, Lua, or Mono to be a better fit for the project or their own capabilities than Obj-C, why stop them from doing so? If the apps still have to interface with Cocoa somehow, how are such apps any more or less native than one written in Obj-C? Such a policy does nothing to stop somebody from writing a cross-platform targeting framework that uses Obj-C. The language is not the real issue.
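
For what it's worth, bridges like RubyCocoa and MonoTouch ultimately drive the same public Objective-C runtime that an Xcode-built app does. A minimal sketch of what a bridge effectively does under the hood, using the documented runtime functions objc_getClass, sel_registerName, and objc_msgSend (illustrative, not any bridge's actual generated code):

    #import <Foundation/Foundation.h>
    #import <objc/runtime.h>
    #import <objc/message.h>

    int main(void)
    {
        NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

        // Resolve a class and selectors by name, then message the
        // object through the public runtime -- the same dispatch
        // every compiled Objective-C program ultimately uses.
        Class stringClass = objc_getClass("NSString");
        SEL fromUTF8 = sel_registerName("stringWithUTF8String:");
        SEL upper = sel_registerName("uppercaseString");

        id s = ((id (*)(Class, SEL, const char *))objc_msgSend)(
            stringClass, fromUTF8, "hello");
        id up = ((id (*)(id, SEL))objc_msgSend)(s, upper);

        NSLog(@"%@", up);  // prints HELLO

        [pool drain];
        return 0;
    }

However the call gets there - hand-written, RubyCocoa, or MonoTouch bindings - it lands on the same documented Apple libraries, which is the point.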


Think OS 4; think opening up multitasking to 3rd-party apps. Perhaps that's why they are being so anal about how apps should be written: if they are planning to make changes to the low-level libraries to support 3rd-party app multitasking, then they are going to want to make sure that 3rd-party apps are linked against them.


This argument makes no sense. These third-party tools are using only the publicly documented Apple APIs. They are simply doing so through an intermediary layer. It has nothing to do with using private or low-level libraries.


What does language have to do with that? You can include your own C libraries within your project if you want; the language is no different. If people use other language interfaces to Cocoa, like RubyCocoa or MonoTouch, what is the problem? They are still using Apple's libraries. The language has nothing to do with any of that.


Yeah, but there's no actual multitasking in iPhone OS 4, just some 'background' system services. Apps can't run in the background. And the clause doesn't say "you need to link to our libraries"; it says "you need to author your app in [languages]", etc.


Imagine if Microsoft had come out with an App Store for Windows 7 and decreed that the only way to run apps on Windows 7 was to get them from the Microsoft App Store. If you wanted to create apps for Windows 7, you'd have to use Visual Studio and a Microsoft compiler, you'd have to pay an annual fee to be an accepted Microsoft App Store developer, and if you wanted to charge for your application, you'd have to pay Microsoft a 30% commission on the sale price.

People would lose their damn minds.

If you're the dominant OS in the smartphone market or in the desktop market, where's the difference?

Apple has to unlock the iPhone and let people get their apps from wherever they want.

If people want the security of knowing an app is Apple approved to work and play nice with others on their systems, they can go through the App Store.

It's not their rules, but the fact that they remove choice from the market for both consumers and developers by FORCING themselves into the consumer/developer relationship as a restrictive middleman.


But Apple is NOT the dominant OS.

1) Symbian 2) RIM 3) Apple 4) MS 5) Android

Android and MS might have swapped in the last month or so; they are close. As for choice, consumers don't seem to mind. You have tons of "choice" on Symbian, Android and WM for app selection. The App Store is beating them all.

Developers will go where the money is. It is a job. You play by the rules or move on. Simple, really.


I can understand Apple wanting to lock down their property to make sure what comes on it is quality and secure. Look at MS: anyone can develop almost anything on it, and what we have is a massive bug and virus trap.

BUT... Microsoft got into trouble with the competition commission over the tight rein it had on Windows. Now what are we saying: Apple should be allowed to get away with nearly the same things Microsoft got fined for, just because they are Apple?

If Apple just lets this happen, and lets iPhone apps be developed on other OSes/SDKs/whatever, then if a developer wants to produce a piss-poor version of something, let them. Apple can then say yea or nay when the app goes into the store. They are still going to make money; their phones are still going to be bought in droves.

Open the doors, Apple; you might let something good in.


Well, I cannot agree with you on the Apple/Microsoft issue. Microsoft has a near monopoly, or de facto monopoly if we call it that.

What MS decides does affect the whole market. What Apple decides affects Apple users only. But I, as an Apple user, am affected by all the dumb anti-standard decisions MS has made. My web experience is crippled and I have to deal with filename extensions, even more so now in OS X 10.6.

Under Mac OS 9 I only had to know if a file was going to be used on a Windows machine and then add the proper extension to the name. But it's ugly and wrong.


I agree with Apple. 80% of apps are crap and are deleted after a few weeks; that's not a good app, and Apple doesn't want to be seen as having a junkyard of crap apps. Would you delete Photoshop after a couple of weeks? No. Why? Because it's a good bit of kit, and Apple want to keep their image in line with that.

I am not a developer, but I have been investigating starting a development company, and I have seen thousands of bedroom chancers on the forums....

Yeah man, let's make an app and cash in. Let's make an app called Twitbook, it's a cross between a twit and a book, we will get 1,000 downloads a day and we will spend all the profit on weed. Yeah man, good idea! OK, let's start... we can't code, but we can use 3rd-party tools that even my mum can use, and we are done.

Yeah, I am exaggerating slightly, but that's how the market is going. It means the app store is constantly full of shite apps, and it will get worse if these 3rd parties are allowed to run riot and let any Tom, Dick and Harry release apps, which will happen if it's allowed. Apple does not want to encourage that, and nor do I. I want quality, not quantity, in apps.

This is a new phenomenon for the world, so Apple are bound to make mistakes in how they operate it, and they have realised this is a bad thing that is happening to THEIR brand.

I have seen so many good, serious developers with good apps; they complain they are not getting seen on the app store, and the above is the main factor in that: piles of it.

These guys can code with their eyes closed, and yes, the 3rd-party apps may be an inconvenience, but I am sure they can work round it and actually get the coverage and sales they deserve.

In the short term it's not good, but the long of it is that it will be good for the people who know what they are doing, hence bringing us apps that don't sit on our phones for a few days and get ditched.

Why are people screaming about this? It's all about money on both sides, not the future development of the up-and-coming kids, end of.

If I rent a room in your nice house and start pissing on the carpet, would you want to boot me out?

As far as I remember it was always Apple being victimised by other OS and software companies. How the tables have turned, and good on 'em.


I'm sorry, but that is a terrible excuse for what they've done. As a Flash developer hit hard by the recession, having another (strong) string to my bow was essential for survival. Like others in my industry, I don't have time to completely retrain every time a new product comes onto the market. Being able to produce for a different device using my current skills is a blessing. Apple should be thoroughly ashamed of themselves, and I for one will be approaching the Monopolies Commission regarding this.


While it sucks when times are tough, if you want to develop for a new platform to make money, then make and take the time necessary to learn the platform and its tools.

Your complaint here is disingenuous. When you learned to be a Flash developer, did you complain that Macromedia should be ashamed 'cos they didn't build their tool in HTML & CSS?

If you want to develop for the iPhone, then develop for the iPhone.


I sympathize with your economic plight. But the fact that you're having a hard time making a living using Flash doesn't give Apple the obligation to provide you with a development platform.

Since the iPhone has less than 20 percent of the smartphone market, it seems unlikely that the Monopolies Commission will be interested.


Apple isn't obliged to actively deprive people of development platforms either.


I thought of doing the same thing with my PL/M experience. Decided to learn new toolsets/languages instead :-)

You will find yourself in serious trouble moving forward if you tie yourself to knowing only a single company's toolset, tying your future to the well-being of that company's tools. Especially Adobe's: a company as fickle as Apple.


What you're saying makes sense, but at the end of the day - it doesn't matter what Apple wants to do, what matters is what they have actually done.

If they wanted to eliminate cross-platform apps, they could have just as easily put something which specifically mentions that in their terms of service. It wouldn't be any more ridiculous than the conditions they have put in place right now.

These new terms of service effectively bar tools like MonoTouch - a development environment that exclusively targets the iPhone OS.


This is seriously putting me off. I was not thinking of a third party approach, but it limits me when I want to use my OWN scripting language when developing apps.

Does this stretch to build scripts??


Technically yes, but it's not like they would ever know.

It's pretty obviously an out clause so they can kick Adobe in the pants, and possibly prevent app-mills from popping up all over, completely saturating their approval process for the app store. I doubt they'll go after an individual developer who isn't obviously using some mass-market code generator to pump out apps.


There was a sentence from Apple: Apple-approved tools could be used. Don't expect Flash to be one of them, and I totally hate Flash.

When they produce a less buggy Flash that doesn't have enough memory leaks to kill a new computer with 4GB of RAM and a Core 2 Duo running only Outlook and Google Maps in IE7, then maybe I'll also be interested in running Flash on my phone. ATM Flash is blocked on my Nokia.


Okay, so cross-platform compatibility layers water down each platform's individual strengths and differentiations. What happens when a small team of developers needs to rewrite their application to reach more devices? While the platform with the most users will probably get the most polish, the rest will probably flounder at v0.9. I guess Apple could be counting on being the platform that gets the attention. Let's see how that works out for them in the next 5 years!


Apple is really trying hard to enclose and seal a walled-garden environment; that is much more lucrative for them. HOWEVER, 98% of video is currently viewed via Flash. If Adobe has a cross-compiler that works and apps can be translated WELL onto the Apple platform, why should Apple care? If they don't work, the user will decide!! Is this Microsoft all over again? And is it anti-competitive? A legal suit looms over Apple's head. One I suspect they will lose.


MonoTouch isn't really a cross-platform toolkit. It is not providing a replacement for Apple's APIs, just allowing them to be called from C# and allowing you to use extra code you have in C# that Apple by definition doesn't have.

The modern C# language provides many advantages over the Objective-C used on the iPhone, which doesn't even have garbage collection.

If they intend to succeed in the enterprise they should have encouraged MonoTouch, not squashed it.


How can anyone now build a business and future with Apple when it could all be taken away at a moment's notice?


I presume all these other programming environments allow one to target the iPhone/iPad/iPod using a PC. Is Apple, with all their (new) clauses, trying to say in a roundabout way, "If yer wanna target our iDevices, buy a Mac"? After all, Mac sales are where they make the most money.


20 years ago I used to rave about how Objective-C was 10 years ahead of anything else out there. Unfortunately it's now 10 years behind C#...


This is a great point.


They don't want anyone peeing in their pool. They purge the porn-spam apps; they purge the RSS-reader apps. I personally think this is a good thing, because developing a native iPhone app in C/C++/Objective-C means you are more likely to have a vested interest in the iPhone/Mac platform beyond making a quick buck with a flatulence app. At the very least, you're a more dedicated developer.


"At the very least, you're a more dedicated developer"

What a rose-tinted way to describe lock-in.


This is the reason why the Mac still has better-quality freeware and shareware applications than Windows has commercial applications.

It was more difficult to program for the Mac than for DOS. Hell, I can make a decent DOS application. But in my younger years, doing the same on the Mac was way more troublesome.

OK, I haven't really programmed Cocoa, just done some experiments: I made the calculator and the currency converter Apple has as tutorials. And yes, it simplifies a lot.

But easy programming comes at a cost in quality. Taking the step from DOS programming to programming for Apple's System 1 through Mac OS 9 was huge: event loops, memory heaps, etc. Those who knew programming had few problems. Those who made hello-world apps had huge problems, aka me.

The greater the challenge programming poses, the greater the programs its creators will make.


This is all about controlling the quality of applications, and it's a good thing. Congrats to Apple. They want to control the quality of their apps by forcing people to code at a low level of abstraction, and they are right: high-level abstractions, when badly coded, are a mess.


You can't seriously argue that it's "quality" driving this decision. I've downloaded far too many apps that turned out to be absolute garbage to believe quality is a factor in which apps they approve. It's about controlling the apps and the devs, not the quality.


I don't like this at all. Maybe they only meant to hit Flash, and maybe not, but as written this is in direct opposition to one of the most important principles of software development. Apple themselves must have benefited countless times from writing software in layers. But no layer above their layers is permitted?

I wonder if the open source world can successfully fight back, by making compilers that generate code the app store police can't tell from hand written.


"I wonder if the open source world can successfully fight back, by making compilers that generate code the app store police can't tell from hand written."

I think the answer is definitely yes. Apple's software engineering is not that great. There is always some hole in Safari that allows root access to the entire device. They can't get atomic syscalls working in OS X. Does anyone really think they can recruit and afford people that can tell computer-generated software from hand-written software?

My guess is that this is a scare tactic to keep anyone thinking of supporting two platforms at once to "not want to risk it" and go for the iPhone instead. More users, only so many hours that the developer can be awake, safer to just go with the iPhone. (Of course, you are already risking it anyway; use the wrong multi-touch gesture -- app denied. Use a Google service -- denied. Do something useful that Apple wishes they thought of first -- denied. And people wonder why there are so many fart apps...)

My next guess is that this tactic will be successful. People seem to adore doing whatever Apple tells them to do. It frightens me.

What I've learned from iPhone vs. Android (among other things) is that people will pick pretty and mean over average and nice.


I am not sure that it'll be exceedingly difficult to determine if an app was compiled using another tool, and not written in Objective-C.

I'd expect most languages and frameworks to leave distinctive signature functions and code patterns in the compiled binary. If Apple decides to enforce this, they won't have a hard time.

On top of that, if they miss it and let a bunch of apps in, then later determine those apps were cross-compiled, they can revoke the current versions and block that developer...


But of course, this makes for unhappy users too. "I just lost the $5.99 I paid for that app!"


NetShare users kept using the app months after Apple pulled it. I think people finally stopped using it to tether because an update finally broke it, not because Apple deleted it off people’s phones…


Easy - they can and should just refund the money if they take the app away.


> They can't get atomic syscalls working in OS X.

Have you filed a bug?


Nope. I enjoy watching users of proprietary software suffer. Since I don't use OS X, why would I care, anyway?


Even if it can, that's not the point.

Is this the kind of company you want to build software for? This is bullshit. Do I really even want to play along anymore?

This company makes great products, but they can do completely dickish things.


Even if they can tell the difference, C code is C code. It shouldn't matter whether it was generated, so long as it uses kosher API calls. Rejecting it would be no different from dictating which editor I'm allowed to write code in.


Even if they can, the policy is going to be a strong deterrent from using those tools. One would rather be less efficient than risk getting caught and being rejected (maybe even blacklisted?) from the App Store.


Yeah, the "no compatibility layers" bullshit is just that, bullshit.


I don't really have a politically correct way to say this: What a horseshit maneuver by Apple.


I have been a very loyal mac user for years and love it, but I'm seriously considering going back to Linux. I am not a free software "zealot" and don't mind some closed-ness, but this is getting absolutely insane.

Enforcing which LANGUAGES can be used on a platform?!? Insane!

Edit: I've been looking at these guys:

http://www.system76.com/

(I don't work with or have any vested interest in them, but they look cool.)


Boycotting the Mac platform (which is pretty much free of any such absurdity) doesn't make a lot of sense to me. That's where Apple is doing things right. You should of course, shun iPhone OS devices if you feel this way.

The system76 laptops are probably generic machines from Clevo, Sager or some such with a custom badge. Alienware used to do the same thing.


If I had to guess, the iPhone and iPod Touch are seen by Apple as more successful endeavors than the Mac has been of late. So I wouldn't be surprised if Apple's iPhone policies start leaking their way onto the Mac too.


If you believe Trefis, the iPhone makes up a little over 50% of AAPL's value as compared to ~20% from the Mac line, so anyone that thinks of it as more successful is right: http://www.trefis.com/company?hm=AAPL.trefis&hk=34d9a244...

I really hope that this ridiculousness doesn't start to bleed through - the MacBook Pro is pretty much the only laptop I've ever considered usable for development, and I don't know what I'd do if I had to abandon it...


Get a Sony Vaio Z, and install Lucid?


Yes, this is my second Sony laptop in the last 6 years (I had the first one for 5 years, and it's still in use without any hardware problems!). The displays are top notch, as is the keyboard on the new one, the best keyboard I've seen. Of course that's purely subjective ;)


Sadly, an IBM Selectric keyboard won't fit in a notebook ;-)

I especially loved the one that came with the 3278 terminal. And the clicky ones, like the ones that came with the 3290.

Nowadays, when on my desk, the netbook is hooked up to a Microsoft natural keyboard. I would like the Sun keyboard, but Sun won't ship it to me in Brazil and local dealers want... US$400 for it.


That's what I'm worried about. The instant that happens, I'm gone.


I think I'm already gone :( I am now seriously considering mowing down my iMac and MacBook and putting Linux on them. (Not that this makes a lick of difference, but it will make me feel better)


It does make a difference. Linux needs more users, both active and passive. I'm actually surprised how many people that would be the target audience for Linux have come to use OS X instead.


User interface, user interface, quality of hardware, and user interface.

"Quality of hardware" boils down to user interface... just at the hardware level.

Their UI ability is the secret of Apple's success. Everyone else treats UI and design like an afterthought.


Yes, but it's the content creators that are developing the apps and making the iPad/iPhone a more valuable platform.

Boycotting Apple's content creation tools and consequently not developing for the iPad/iPhone is one way to send a message.


I've been wondering about this a bit lately. I've so far chosen not to support the iPhone OS for the reason you mention, instead favoring Android.

However, I'm ultimately interested in serving my application to the largest number of people, and it seems clear to me that the App Store is the best way to achieve that today. So, do you cave in and develop an iPhone version? Or, do you stand by your morals, and, in turn, limit your audience?


If you're already developing for Android, stick with what you're doing and you'll soon be serving your app to a much larger market. Android growth is pretty strong: http://news.cnet.com/8301-30684_3-20001788-265.html http://www.pcworld.com/article/193653/google_android_hits_ma...


I wonder if Leo Laporte is going to be right that Apple will eventually stop making Macs so they can focus on more closed or controlled platforms like the iPod, iPhone, and now the iPad. They seem to be pretty good at making money with the closed systems.

Maybe the next iBook will just be a foldable iPad with the same closed OS.


Maybe someday but for now Mac sales are going up -- not down. I can't imagine they'd walk away from that market. I could see the two platforms merging in certain ways. For instance if you added a multi-touch screen to the current MacBook, retaining the keyboard/touchpad, you'd have an interesting platform. You'd have desktop multi-touch apps very similar to iPad apps but you'd also be able to use traditional OSX apps. There could be some system wide multi-touch features along the lines of what they're doing with the multi-touch trackpads already to enhance these applications.


I predict the differences will shrink and disappear. The desktops and laptops will get touch input and iPhone apps, and the closed platforms will round out in specs and features.


The hardest thing to me about ditching Mac would be that Apple is the only company capable of doing a user interface.

ALL other user interfaces by ALL other vendors suck.

For some reason, no human beings on the entire planet other than those that work at One Infinite Loop in Cupertino are capable of doing a UI.


This is a massive exaggeration. I went from OSX to Ubuntu and I am more productive on Ubuntu in interface terms. The dock is very silly and graphical when you have quicksilver or gnome-do. Chrome is a better browser and browser UI experience than Safari in my opinion (and lots of people like firefox and opera more). The Finder vs Nautilus differences aren't big enough really, there are pros and cons to each. Apple stuff is well designed until you want to go past just cookie cutter customizations and then the openness IS the UI because I define it to work how I want to.

The same carries over to other things. Sure, the install and uninstall of applications on OSX is a great UI, but apt-get just seriously leaves it in the dust. The problem with Linux was never that it had a bad UI or was too customizable. The problem with Linux was actual hardware bugs in drivers, lack of Office, lack of Flash, lack of games. All of which have mostly been addressed, aside from games (which is a sore point for Apple too). UI is far too overrated compared to actual features.

And it's all based on a myth. Steve Jobs ringing up Sun over their Looking Glass interface and threatening them with UI patents is just ridiculous. As are Apple's claims to fame with interfaces taken from Xerox.

So yes, I agree Apple are very good, possibly the best at UIs. But my points are 1) UIs aren't as massively important as people say, 2) Apple's "innovations" have been overrated 3) your statement that "no human beings on the entire planet other than those that work at One Infinite Loop in Cupertino are capable of doing a UI" is just ridiculous.


As much as I agree with the "it's not that bad" opinion, I can't understand why you say "UI is far too overrated compared to actual features". No "standard user" wants to know about apt-get. They have Synaptic and others. Normal users don't want to use the console at all. The command line is good for developers and power users, and that's a minority. The majority wants to work based on recognition, not recall, because they don't care enough to remember things.


I agree, that's why Ubuntu comes with the "Ubuntu Software Center" which is basically a frontend for those tools my grandma could use.


What Apple really has down about UIs is sparseness. "Exterminate features." I loathe clutter, both in real life and in technology.

When something on a mac looks like it should be able to be frobbed, dragged, etc., it usually can and in exactly the way that it seems like it should.

This is not the case on Windows, Linux, or anything else.


For me, Apple's UI advantage comes from two things:

1. It's way the hell better than Windows. No competition here. Windows is horribly clunky, and if you disagree with me, try using both for a while.

2. It has less of a tendency to plunge you into pesky little technical details than Linux. I personally don't mind fiddling with a config file every now and then, and the more recent Ubuntu versions are getting surprisingly good about this, but Linux still demands more effort to get a good, productive environment going. And Flash support still sucks.

Of the two, I prefer Linux in terms of usability. I doubt most non-technical users would agree. (Chrome, though, is just unambiguously better than Safari in every way. I use both regularly and don't want to dislike either of them.)


I can't stand the Mac UI. In every way that it's better than Windows, there's some way that it's worse. The dock is a terrible interface element, and having a menu at the top of the screen may have made sense when screens were 512×342 pixels, but it makes little sense with giant-resolution screens and multiple monitors.

I'd say that all user interfaces from all vendors suck, including Apple's.


There’s a reason it’s at the top of the screen, and it has nothing to do with screen resolution. Try looking up Fitts' Law (http://en.wikipedia.org/wiki/Fittss_law).

Essentially, the menu items are infinitely tall hit targets… no matter how fast the mouse moves towards them, one can never overshoot them vertically. Menus in Windows and most *nix environments require both horizontal and vertical precision.

Furthermore, why are you bothering with the Dock when Exposé and Spotlight (or Quicksilver, Launchbar, etc.) offer great power user alternatives?

Personally, I still can’t stand how Windows and Linux make no distinction between applications and windows.


> There’s a reason it’s at the top of the screen, and it has nothing to do with screen resolution.

I understand the original reason for it. However, the menu can be an extremely large distance from what you're working on now. I'm typing this on a multi-monitor machine, and this app is totally self-contained on one monitor. Why must I move my mouse across two monitors just to use this app's menu?

> Personally, I still can’t stand how Windows and Linux make no distinction between applications and windows.

I can't stand how you can close all windows on a Mac and still have the process around, with the only indication being a slight difference in the menu bar. Amazingly confusing.

Now this may be personal preference but it still shows that the Mac GUI isn't some ultimate model of perfection that everyone agrees on.


It's only true that you can't overshoot them if your trajectory is perfectly vertical. With a diagonal trajectory (which is usually the case), you can still overshoot the item you were aiming for.

Also, Fitts' Law says that the time to select a target depends on both the distance to the target and the size of the target. Putting the menu on the edge of the screen makes the effective size of the targets bigger, but in some cases it also means they are much further away.

As I type this, the menu bar is about 4x further away than the top of my browser window.
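
For reference, the standard Shannon formulation of Fitts' Law, where MT is the movement time, D the distance to the target, W the target's width along the axis of motion, and a and b empirically fitted constants:

    MT = a + b \log_2\left(1 + \frac{D}{W}\right)

An edge-of-screen menu makes W effectively unbounded in the vertical direction (the "infinitely tall target" point above), but a larger D still pushes MT up.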


"Essentially, the menu items are infinitely tall hit targets…"

And they violated that rule in the dock, unless that's been fixed in the last couple of versions.


Although historically I have always agreed with you, I dare say Microsoft is starting to figure it out. Slowly but surely. Win7 is almost a pleasant experience.


I just got my new laptop yesterday, and while some of this is surely due to the SSD or the New Laptop Smell, Windows 7 is EASILY the best experience I've ever had with a Microsoft product. The new taskbar looks wonderful, the design is up there with Apple, and everything is so freaking fast it is unbelievable.

I booted my other laptop to transfer some data over and waiting for Vista to load to the point where I could interact with Chrome was like pulling teeth.


Windows 7 is usable for two reasons:

1) They removed much of the "originality" they introduced in Vista.

2) They copied more stuff from Apple, and a little bit from the Linux ecosystem.

Oh... and what's their fetish with that sickening shade of teal/blue? Yuck! It looks like smurf vomit.


I doubt that Microsoft has copied anything from the Linux ecosystem. And with Linux innovations such as moving the window buttons around for no reason and upsetting their users, I don't see why they would.

Linux fans need to relax a bit and realize they're number three for a reason, and should work harder, not complain harder.

Edit: I'm sorry, do I sound bitter? Maybe I've grown disillusioned with the difference between the reality in the trenches and fans extolling the features of product X, where X is any of Linux, Mac, Windows, iPhone, etc.


I've started using it this week. Alt+Tab becomes more and more broken with every release. There was a time when Excel was the only application that didn't behave properly.


I honestly wonder what you mean by Alt-Tab being broken? Surely it does switch windows, doesn't it?


The stack doesn't work consistently.

In the old days, when you minimised something, it always went to the bottom of the stack. (Except Excel, which had a silent entry in the stack when you had more than one document open. Outlook has been bizarre for a while, too.)

In one of the NT4 service packs (I think) this changed so that if something minimised to the system tray (rather than the taskbar) it worked differently. I'm not sure what Vista did, but XP was bearable.

Now there are lots of variables - different apps respond to being removed from focus differently, I've read that the number of open applications affects stack response to minimise also.

Windows used to be very friendly to rapid keyboard-only operation. You could drop things in the start menu and activate them with two keystrokes. Alt+tab was dependable. No longer.


Hmm. I changed the theme to classic mode and it seems to work like I expect it to.


    ALL other user interfaces by ALL other vendors suck.
I don't understand your enthusiasm for Aqua (I've got a Mac), but I understand what you're saying in a general sense, because I came to a similar conclusion about ten years ago.

If this is a big thing for you, I'd recommend the path I took after Be folded. Accept that complex user interfaces have bad tradeoffs (platform dependence, inflexibility). Find a full-screen tty you like (I use iTerm because Apple key + Enter takes it full screen and gets rid of Aqua) and return to living in the habitat of your ancestors!

A few things are inherently visual: paint programs, 3d games and movie editing. Everything that is not can be done effectively on the console. These interactions are often far superior to GUIs.

There's a learning curve. But once you're over it you'll have enormous power at your disposal and won't ever get locked in again.

Two things make this far easier than it has been previously:

1) Python. The standard library contains everything you'd want for pushing a system around. You can hammer out powerful tools in Python in a casual manner, at a speed that has not been available to mere mortals before. And you can get it on a variety of platforms.

2) Web browsers. It's now easy to get high-quality web browsers on any platform you'd want to use. Where you do need to produce a GUI, you can knock up a trim webapp with HTML and forms.


> The hardest thing to me about ditching Mac would be that Apple is the only company capable of doing a user interface.

Some systems have a CLUE (Command Line User Environment).


"Boycotting the Mac platform (which is pretty much free of any such absurdity)"

For now.

That's the thing about Apple's capricious, passive-aggressive contract language... you have no idea, and no way to even guess, if your business model will be the next one they target for termination.


Ubuntu will run on most laptops, including MacBooks, so there's no need to go with a special vendor.

https://help.ubuntu.com/community/MacBook

I actually went the opposite route and installed mac os on a netbook.

http://gizmodo.com/5156903/how-to-hackintosh-a-dell-mini-9-i...


I have a Starling Netbook from system76. Nothing to write home about, but I've had few, if any real problems with it.

It's a little flimsy, physically.

And I am a proud free software zealot, so I moved it to debian.


Hmm... if it's flimsy I probably wouldn't like it. One thing I don't like about Apple hardware is how you have to treat it like a family heirloom or it turns into a scratched up mess. Anything less hardy than that would be unusable.

Toughbooks are also promising, but they're light on the specs... could someone please make a durable and well-specced laptop?


I wasn't particularly gentle with my MacBook Pro and it never got scratched or damaged.


I moved to a Thinkpad (from a Powerbook) two years ago and couldn't be happier. Ubuntu just works, and the hardware feels more solid than the Powerbook did.


IBM?


IBM laptops aren't IBM anymore; the line was bought by Lenovo a few years ago.


That's strange, because I also have a System76 Starling and it seems very sturdy to me. I even dropped it once and it was none the worse for wear. It's been a very hardy and well-functioning machine for me.


I have a laptop from them. It works well.


Well, you're well on your way to becoming a "free software zealot". See, free software zealots by and large are what they are because they invested a ton of time and effort in some platform, only to have it destroyed by a stupid or greedy management decision.


It's almost like refusing to accept patches that are written in anything but vanilla C. It's almost like they are claiming if you can't at least code in C you probably aren't smart enough to submit patches anyway.


> I am not a free software "zealot" and don't mind some closed-ness

Maybe we free software "zealots" were not such zealots after all when we said that proprietary software allows its owners to treat their users badly, and that eventually this happens with every piece of proprietary software. Just saying. It amazes me how surprised users of proprietary software are every time they get screwed by their masters, even though this has been happening for the last 30 years.


There are products and there are platforms.

Products are OK to be proprietary as long as the value provided is top-notch. Sorry, but I don't see professional designers using Gimp over Photoshop.

Relying on a platform for your existence is a different story. But as a business you need alliances with other businesses, and not just in software. Anybody can pull the plug on you; that's why reputation matters, and in many cases it's all you need.

About free software: programmers need to eat too. I use open source everywhere myself, but for the last 7 years I've been doing consultancy work (turn-key apps that are never released in any form, or web services that put a lock on your data ... the worst kind of closed systems). And until you teach me a business model that would empower me to work on "free software" while providing for my family, I'll keep doing it.

Until then it's only fair I get paid for my work, that's why I consider the free software philosophy as extremist bullshit.


My entire business (and livelihood) is centered on configuring and adapting free software to fit the needs of businesses. I contribute along the way because what one person needs, likely many need.

Free software is an infrastructure. I drive on it daily delivering value and getting paid.


You're wrong: every decent product is a platform. Photoshop and GIMP have plugins developed on top of them, and professional designers are very much reliant on "photoshop the platform" for their existence.


Let's see: there's training and consulting; there's offering a free version and charging for a more robust paid version; there's paid customization; and don't forget ads. There are many ways to build a business around "free software". Red Hat, Canonical, and Google are just some of the companies that have found valid business models around "free software".


> Red Hat, Canonical, and Google are just some of the companies that have found valid business models around "free software".

That's a fallacy.

The majority of open-source sponsors are selling closed systems or services to sponsor their "free software" involvement. Google doesn't have a business model around "free software". Neither does IBM or Sun.

I'm also not Mozilla and my apps would probably never get in front of 40% of all Internet users. Even if I'm that lucky, it's probably not going to be a desktop app that's used to search for stuff.

To get paid for customizations, your software also has to be really popular for businesses (consumers don't pay for that, they either endure it or search for something better).

Did I mention that I don't live in Silicon Valley or Cambridge, but in an Eastern European country? So training is off.

I already mentioned consulting, but then I would be a hypocrite if I promoted the free software ideology while working on the worst kind of closed software, wouldn't I?

Anything else?


> > Red Hat, Canonical, and Google are just some of the companies that have found valid business models around "free software".

> That's a fallacy.

> The majority of open-source sponsors are selling closed systems or services to sponsor their "free software" involvement.

Red Hat, Canonical and a large chunk of IBM GS wouldn't be able to sell those services with proprietary products - the OSS licensing of various Linux OSs, Apache, etc. are the basis for them being able to have an audience to sell their services.

I agree with you re: Google. They're not a services business, and could have written their own OS or used a proprietary one without affecting revenue.


"I agree with you re: Google. They're not a services business, and could have written their own OS or used a proprietary one without affecting revenue."

Not sure what licensing fees would have done to their overhead costs in the early days.


> Did I mentioned that I don't live in Silicon Valley nor in Cambridge, but in an Eastern European country? So training is off.

Maybe you should have a look at this again... I've been on a trip to Romania for training (about architecture of some specific piece of OSS) at some point. As long as you can provide good training, it could work for you too.


Does Canonical make a profit? A business model that loses money isn't really all that valid.

And the software that makes Google their money is proprietary, unless you can show me the link to the AdWords server source.


Phusion (makers of Ruby Enterprise Edition and Phusion Passenger) definitely make a profit.

I think they basically just do it through consulting + a little support and training. (See http://phusion.nl/services). But since they wrote Passenger and REE, I'm sure they can command a very high rate.


Thank you, I agree completely and you've said it much better than I could have.


This is human nature, it's not related to source openness. Open source would just allow you to start over after you have been treated badly if you have the needed development skills (hello Mr. Drepper).

It's also worth noting that the users aren't getting screwed at all by Apple, only the developers and only a small fraction of them.


I don't get it. You don't have to "start over"; you continue from where you left off. And only some members of the community need development skills.

Secondly, users are harmed by these actions, as there will be fewer developers making apps for them; plus, Apple is potentially stifling innovation.


It's on purpose. Apple willfully sacrifices innovation for control and UX. Users accept it for UX.


You do realize that your decision is purely emotional and irrational? NEVER make decisions when you are emotional, unless it's a life-and-death matter and you must act immediately.


Yeah, I really thought all the hysteria about how Apple will crush small children's hopes and dreams by distributing a closed platform that does not allow tinkering was off base until now. This has much more the flavor of intent to put a stranglehold on the device above and beyond handicapping their competitors.


With much deliberation, I had gotten to where I could accept the iPad, a window on the future of computing, being a walled (and barbed-wired, and guarded) garden. I was planning to get a 3G iPad if I had any money left after taxes, and a WiFi one for my wife soon. But this... I think this is farther than I'm willing to go. And even if they were to back down on this requirement (as they did on the original NDA)... I'm rambling, but I feel hurt by this.


Just make a webapp :/ all the freedom you need.


Freedom to have no access to the camera or microphone, for example?


It'll get there... You can already have access to the multi-touch stuff via javascript.

idk, I've never thought the appstore was anything but a stop gap personally, and developing for it a crazy risk.
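
For what it's worth, Mobile Safari already exposes touch events to page script. A minimal sketch (the element id here is made up for illustration):

    // assumes some element with id "canvas" exists on the page (hypothetical)
    var el = document.getElementById('canvas');
    el.addEventListener('touchmove', function (e) {
      e.preventDefault(); // stop Safari from panning the page
      // e.touches holds one entry per finger currently on the screen
      for (var i = 0; i < e.touches.length; i++) {
        var t = e.touches[i];
        console.log('finger ' + i + ' at ' + t.pageX + ',' + t.pageY);
      }
    }, false);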


It won't get there before the iPhone is obsolete. You really think that web apps are ever going to get access the microphone or the camera?

Also, a stop gap? I think there's an app for that.


"Mobile" is what will become obsolete. These new devices are handheld computers pretending to be phones. Someday they will embrace this reality, and inherit all of the progress that's already played out on PCs/laptops - browsers, flash, etc.


Handheld computers (with and without a phone) have been around forever. Apple has changed the game from that open reality to this and have been quite successful with it. You have it backwards.


Handheld computers capable of rendering standard web pages have not been around for very long, unless I'm overlooking something.


Moore's law. Even desktop computers capable of rendering today's web pages haven't been around forever. But a web browser is just an app -- and all platforms have them now. I'm not sure what point you are trying to make.


>> "You really think that web apps are ever going to get access the microphone or the camera?"

Definitely, if other phones or devices have such capabilities, and a killer webapp comes out that uses them.

If I were Apple, I'd just shut the App Store down to get rid of all the ungrateful developers. Sorry, but it's their show. They get to decide if you can play, and what the rules are. If you don't like it, don't play with them.


Seems like Apple thinks the same restriction applies. If you use parenscript or coffeescript (or GWT) to generate your JavaScript, you are technically violating the contract you signed with Apple.

Of course, if you didn't sign that, then you can do whatever you want.
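
For instance, here is roughly what a compiler like CoffeeScript emits for a trivial function (a sketch; the exact boilerplate varies by version), which is the kind of output that would technically fall under the clause as written:

    // CoffeeScript source:
    //   square = (x) -> x * x
    // compiles to JavaScript along these lines:
    (function() {
      var square;
      square = function(x) {
        return x * x;
      };
    }).call(this);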


If they started actively censoring/blocking websites that they determine haven't written javascript by hand, then that would be theonion.com territory (And the end of their sales) :)


> then that would be theonion.com territory

As opposed to where they are right now? Enforcing what higher level language you write in before it gets compiled down to machine code?


The web = open

Native apps that run directly on their hardware = closed - you play by their rules or not at all.

shrug pretty clear difference IMHO.


Yup, freedom to have: no access to hardware, no ability to do anything computationally intensive, no filesystem access, no storage, no disconnected access, ...


There is storage--and the ability to run offline--thanks to HTML5, and the same web apps will work the same (more-or-less) on Android and webOS (Palm) phones. You get a SQL db (up to 5MB), plus a persistent key-value store and a window-local "session" key-value store that isn't persistent.

But yeah, no camera/GPS/microphone/etc.
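
A sketch of the two mechanisms as they exist in Mobile Safari today (the database name and schema are made up for illustration):

    // key-value store, persists across launches
    localStorage.setItem('highScore', '9001');
    var score = localStorage.getItem('highScore');

    // Web SQL database, up to 5MB without prompting the user
    var db = openDatabase('notes', '1.0', 'Example notes DB', 5 * 1024 * 1024);
    db.transaction(function (tx) {
      tx.executeSql('CREATE TABLE IF NOT EXISTS notes (id INTEGER PRIMARY KEY, body TEXT)');
      tx.executeSql('INSERT INTO notes (body) VALUES (?)', ['hello']);
    });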


You can now get location from the JavaScript APIs in the Safari browser, following the iPhone 3.0 release.

http://blog.bemoko.com/2009/06/17/iphone-30-geolocation-java...
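
For example, something along these lines works in Mobile Safari 3.0+ (a minimal sketch of the standard W3C geolocation API that the linked post describes):

    // prompts the user for permission, then reports coordinates
    navigator.geolocation.getCurrentPosition(
      function (pos) {
        alert('lat ' + pos.coords.latitude + ', lon ' + pos.coords.longitude);
      },
      function (err) {
        alert('no fix: ' + err.message);
      },
      { enableHighAccuracy: true, timeout: 10000 }
    );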


How well does that work for gaming? I like playing around with Unity3D; I have a hard time thinking that HTML5 is going to give the same experience.



Putting aside the intention of this agreement which does seem targeted at the Flash-to-iPhone compiler, this just seems overly broad and silly from a "how things work" point of view.

Let's say I write an iPhone app originally in Scheme (like this guy did: http://jlongster.com/blog/2009/06/17/write-apps-iphone-schem...), and compile it down to C, which is then compiled to object code and linked against the iPhone libraries. At this point, the object code is the same (or functionally the same in terms of its syscalls, library calls, and general program flow) as if I had originally written it in C, except that I would have lost the unique developer efficiencies I got from using Scheme in the first place. I'm not saying Scheme is better or should be an officially sanctioned source language for the iPhone SDK. I'm just saying, where the rubber meets the road -- object code linking against libraries and making certain calls -- there is no difference to the computer what the original source was.

Seems very silly.


I had the same thought and wondered, if a developer did follow such a path, how would Apple identify that they did? I'm not sure Apple could detect that such a method was used. Although, I suppose they could ban an app even if they only suspected that it had been developed in such a fashion.


1) They may not be able to detect all frameworks, but they will easily be able to detect the top N frameworks, where N scales with how serious they are about enforcing this.

2) The app approval process is so opaque that it might be hard to tell if they were banning you due to this reason (note I hear things have improved, so this point may be out of date).


Say hello to the new boss, same as the old boss. The corporate pissing matches have started in earnest...

When Alan Kay said Apple would take over the world with an iPad, I don't think he realized that eToys or anything like eToys would never be allowed to run on the device and that a majority of the apps will probably have commercial spots embedded within them. Actually it reminds me of "educational" tv all over again...

I'm starting to see the point of people who complain about the consume vs. create nature of the iPad...


> Say hello to the new boss, same as the old boss

Who do you mean by this? Microsoft doesn't make sense in this context.


"a majority of the apps will probably have commercial spots embedded within them"

The majority of apps on the app store already do. Free apps, ad-supported. Apple just wants the ads to not suck so much, but hey, nobody has to use iAds.


"but hey, nobody has to use iAds"

Yet.


Well, Steve did flat-out say in the Q&A session today that they weren't going to require anyone to use their built-in services for any of that, and Apple doesn't really have a history of changing what they've said they're gonna do.


I don't know one way or the other about their history of going back on direct claims, and that would be a pretty sprawling area to get real data on. But they certainly do have a history of locking the phone OS down more and more and controlling it further and further.

I really wouldn't put it past them, and I think today's a bad day to err on the side of "things won't change."


"Your contract when you buy the iPad is you're going to watch the spots. I guess there's a cetain allowance for people going to the bathroom..."


...until Apple changes its mind.
