In my experience so far with such "cross platform compatibility layers," they always produce results that water down each platform's individual strengths and differentiations. And of course, instead of the developer being locked into the phone platform, they are locked into the compatibility layer's platform.
Adobe's Flash compiler is a classic maneuver to "commoditize your complements," as Joel put it so well. Apple don't want to be commoditized, especially if it means having apps that don't take advantage of the iPhone's strengths.
Adobe want to lock developers into Flash and commoditize everything else as Flash-delivery devices. Apple want to commoditize applications and lock developers into their APIs.
Perhaps, like Nintendo, they learned the lessons from the collapse of the home video game market in 1983. When Nintendo was contemplating developing the NES, they took a deep look at what had caused the collapse. What they concluded was that the main cause of death was the market being flooded with too many crappy games.
Originally, if you wanted to write, say, an Intellivision game, you went and got a job with either Mattel or APh Technological Consulting (the company that did the hardware design, system software, and many of the early games for Mattel). If you wanted to do an Atari game, you went and got a job with Atari.
The games at this time were all pretty good. A consumer could go out, buy a game based just on the information printed on the box, and go home and be pretty sure they'd have a good experience.
As time went on, a few more companies joined the party. Activision and Imagic, for instance. These companies were started by people who had worked for Mattel or Atari or their contractors like APh, and generally produced quality games.
A consumer still could be confident that plunking down $40 or whatever for a new game, based just on the box description, would be a good move.
More time passed, and companies that had little or no connection to Atari and Mattel jumped in, using information gleaned by reverse engineering the consoles and system software. The information was not always complete, and they didn't know all the tricks and techniques we authorized developers knew to squeeze greatness out of the hardware. They produced a lot of crap games.
Consumers now found that spending $40 on a game was a big gamble. They had to work to get good games--be aware of brands, read reviews. They stopped buying--all games, not just the bad games.
Nintendo's conclusion was that their new console must be locked down. Only developers that Nintendo approved would be allowed to produce cartridges. This way, they could ensure that quality remained high, and get the now-shy consumers to come back and give games another chance.
It clearly worked--and consoles have been locked down ever since, and the console game market is huge.
However, that isn't what the approval process is. There are literally thousands of crappy applications that were happily approved and are clogging up all categories in the app store. It seems non-trivial app rejections are not done on behalf of the user but solely to protect Apple's own interests. Remember when a bunch of high-quality Google Voice apps disappeared from the store? And that's just one example; there have been many more.
And now they're rejecting apps not based on their quality, but based on the programming language or development environment used to create them. How is that at all relevant to the user? This is entirely about protecting Apple's own interests and the comparison to Nintendo's lock down of the NES is not applicable.
What I don't understand is how this isn't anticompetitive behaviour. By creating an app store and lock-in for application vendors, Apple become the only provider in the market. They now appear to be leveraging that monopoly to restrict another market, that of developer tools, to their commercial advantage and at the expense of a competitor.
I'm not a lawyer and don't live in the US where presumably any legal action would be brought, but can someone please explain to me how this isn't black-and-white illegal under US law?
First, we have the hardware, and as you say, Apple is certainly in a competitive market with its iPhone offering.
Then, for each type of hardware, we have the software that runs on it. Anyone could write software to run on Apple's hardware, but because Apple lock down the phone and run the only app store in town, they have a de facto monopoly on the supply of software to iPhone users.
Finally, we have software development tools. Again, anyone could write tools to help software developers using Apple's hardware. Indeed, according to recent reports connected to this story, many people have, from Adobe's Flash CS5 team to fans of Haskell.
Again, I'm no lawyer, but I would expect that Apple would be perfectly within its rights to lock down the software that can run on its device, but would not be allowed to use that power to unduly influence the secondary market of how that software can be made.
What gets interesting is when/if Apple's dominance continues to grow to the point that the "iPhone software" market is effectively the "smartphone software" market. At that point, the DoJ and/or FTC will almost certainly decide that many of these policies are anticompetitive and initiate some sort of action to remedy the situation.
So in some sense, Apple needs to worry about becoming too successful: it is only because they haven't achieved total dominance that they can get away with being so ruthless.
"Apple responsible for 99.4% of mobile app sales in 2009"
... it doesn't take a PhD to acknowledge that the AppStore IS the mobile SW market - thus - I would disagree that Apple's argument SHOULD succeed if somebody (or some agency) were to bring a legal challenge to the new draconian policies - which are obviously an abuse of Apple's dominance ...
the question for me is more technical: who can bring such a legal challenge? Can developers do that? Can users? Or can only a govt. agency ... in which case - considering the influence and connections Apple has in Washington - that might not happen ...
would it be possible for developers to team up in a class-action suit, which could then trigger a govt. investigation?
The fact that 99.4 cents of every dollar spent "after the hardware purchase" goes to Apple's platform.. doesn't make it a monopoly.. it merely means that it is the most successful add-on market..
A good example for comparison.. is a bit old, but illustrates this particular point beautifully.
The fact that more companies are interested in producing add-on stuff for a product and that consumers are more interested in BUYING that stuff.. doesn't mean that a company is anti competitive/antitrust regulated/a monopoly..
Volkswagen beetle aftermarket parts spent some 30+ years as the king of aftermarket parts .. everything from "third party" replacement oem style parts (stuff that matched the original but was cheaper for whatever reason) as well as stuff that essentially completely changed the product into something else (dune buggy conversions, engine swaps, totally different interiors, etc)
Was VW a monopoly because for 30 years 4 out of every 5 dollars spent on "aftermarket parts" was spent on Beetle bits? No, and no one ever thought to consider or call it one.. it was just a hugely successful model that didn't change every 11 months, and therefore was a fixed point in space for manufacturers to target.. but more importantly CONSUMERS WERE BUYING.. as opposed to your avg Ford/GM/Chrysler buyer, who for the most part does NOT just go out and buy total conversion kits/hopped-up engine parts.. there were many manufacturers who made parts for various successful models such as muscle cars over the years, and still do.. they didn't cry about antitrust because VW add-on makers made more money, nor did they cry that VW should change the way they made the Beetle so that "beetle engines" would fit in any car (or vice versa)
There is NO ONE who could bring a class-action and win, and there is no way that adobe could sue and win either.
Because "customer choices," when customers have real choice, do not make a monopoly; rather, removing customer choice creates one.
but still it seems the aforementioned laymen had the right intuition - the App Store's draconian policies are finally being looked at by regulators ...
"According to a person familiar with the matter, the Department of Justice and Federal Trade Commission are locked in negotiations over which of the watchdogs will begin an antitrust inquiry into Apple's new policy of requiring software developers who devise applications for devices such as the iPhone and iPad to use only Apple's programming tools."
Presumably Adobe are the most likely candidates who also have serious legal firepower.
This is a chicken-and-egg situation. Apple has apps, so more people buy iPhones than other phones. With Adobe's dev tools, devs could make apps for more than one platform - that is a threat to Apple. When Apple shuts this down it is an anticompetitive act against other phone makers, because if devs could put apps on more than one platform, then other platforms would become more appealing.
And they aren't locking anybody in. You don't forfeit rights to your source code; you're welcome to write a cross-platform app using some shared code library you build in-house. You just have to build apps natively, rather than use some watered-down, piss-poor common language that breaks standards on all platforms. This is the same reason Adobe apps suck on Macs: they have tried to abstract the OS away from their applications to the point where they don't look, act, or function properly on any platform. They look like ass and run like ass on everything.
#1 - Two of the worst carts preceding the '83 crash were E.T. and Pac-Man, both developed and produced by Atari itself, not these mysteriously inferior 3rd parties you're alluding to. And how many games has Apple, which logically has the most know-how on the platform, produced? None.
#2 - You're making an oranges to apples comparison anyway. The video game market was not crashed by the availability of cross-compilers or tools that lowered the bar of entry. Similarly, Nintendo did not solve the problem by restricting what tools developers could use. They solved it with a strict editorial process.
#3 - Video game production in 1983 required producing and marketing physical goods. It relied on predictable "hits", just like AAA game development, to recoup the considerable outlay required to get these games in front of consumers in the first place. iPhone games are virtual and the marketing for many of them is non-existent (simply because I can't spend $0.25 CPC on Google trying to sell my $0.99 app). Additionally, there's a long tail of developers creating a more robust landscape of content. There can be tons of failures and still plenty of room for successes. Just look at how many games on the iPod have made it big. Many of them came from virtual "nobodies".
#4 - In 1983, there was no manifestation of "wisdom of the crowds" to guide any consumer purchases. Word of mouth was about it. Today, at Apple's scale, one can find dozens of opinions about the quality of a game that only 0.01% of total users may actually purchase.
Many of these problems continue to persist in the locked down AAA console world that you seem to be so fond of. You know, I can accidentally buy 50 terrible iPod games and still spend less money than I would have spent accidentally buying 1 terrible PS3 game.
Though I have not purchased this game and cannot comment on whether it diminishes your first point.
And as mentioned above, they don't have a monopoly in the market, so, from Apple's perspective, 'if you don't like it, there are other opportunities' ... developing for the bberry :)
They've spent some money and time, and they have taken a lot of other people's work: Mach, gcc, Smalltalk, BSD, etc.
"they dont have a monopoly in the market"
That's not so clear to me. I own an iPhone and an iPad even though I think they really suck technically. But there is content available for them that simply is not available for other platforms.
if (A && B) { /* one case */ }
else if (A && !B) { /* another case */ }
What I am getting at is, with programming there are rules. The rules Apple have are actually very minor and very easy to stay within. If you program to make political statements, choose Android or BB or Symbian. If you program to make money, pick the platform that will do that and follow the rules. If it stops making you money, move on. If you don't like the rules and don't want to abide by them, move on.
You can enjoy programming while still staying within the rules. Sometimes, it is part of the fun and challenge.
Apple is trying to accomplish the same kind of lock-in that Microsoft managed with Windows. And we better nip this thing in the bud, because Apple would screw us even worse than Microsoft has.
As a long time Mac user, I've experienced a lot of Mac applications that have been straight ports from other platforms and they are, for the most part, pretty awful. I can understand from this why Apple wants its developers to code iApps natively.
This 'lock in' makes perfect sense for Apple in other ways too, ways in which end users and developers will benefit. Imagine that Apple allowed apps to be ported from Flash. Developers would stop coding natively for iPhone OS, since they would be able to create their apps in Flash and distribute them as web apps at the same time, reaching a greater audience. Then add in Android, Blackberry & other export options for Flash. Soon enough Flash would be the only IDE in use, and platforms such as iPhone OS would be at the mercy of Adobe.

If Apple were to introduce new features and efficiencies to their hardware and APIs, they would have to wait for Adobe to implement them in its Flash translation layer before the features would really become available to end users. Even the most willing and motivated of developers would not be able to get around that; they would have to wait for Adobe.

So in the end, Apple would lose sales and credibility, and good developers would get screwed because they wouldn't be able to outpace their competitors in updating their apps to take advantage of new features. Everyone becomes 'locked in' to Adobe. Given Adobe's poor history when it comes to timely bug fixes and support of its OS X applications, I do not think that this 'lock in' would be a nice place to find yourself, whether you're Apple, a developer or an end user.
If you don't like Apple's stance then develop for other platforms and buy other products. But if you want to be in on the action, then accept the rules as they are not unreasonable and will ultimately benefit everyone.
We tend to frown on loops and mutating variables.
It's the same strategy that they've always used with their software/hardware combination, which has worked beautifully in the case of the mac.
check the numbers - the AppStore IS a monopoly ...
Rose tinted glasses. They still managed to release buggy, downright broken software. It wasn't about quality, it was about control.
1. It's only a matter of time until apple realizes this model is not good for anyone.
2. Android (which, by the way, is powered by the very people who specialize in filtering out the crap on other platforms) will dominate the next few years.
That idea started 1.5 years ago. Now you get Google ads in your free apps. Now, as an option, you can have iAds or Google ads. So all Apple did is open an option (read choice) for developers already doing ads in apps.
Choice is now bad? Or is it good?
Do something like Reddit, Digg, or Hacker News... let people guinea-pig apps and upvote/downvote them and sort in each category.
(HN is, IMO, above the rest but it's certainly not immune to mob rule)
Besides, with the money involved in high rankings you'd have to constantly police the system against gaming, which would be, I'm guessing, more work than policing the submissions directly as they do now.
(First my credentials: I was an avid player of Atari 2600 games at the time, so I remember the period in question first-hand.)
My take is not that there was a decline in quality due to any sort of technical reason, it was due to a drop in the quality of the gameplay design and playtesting. That, in turn, I'm guessing was due to the number of Atari 2600 game creators increasing past the threshold of GOOD game designers & playtesters available. After all, it was a relatively new field at the time, video game design. Perhaps it reached a point where there were say 50 new games being "designed" concurrently but there were only 20-30 good designers. Whereas before, it was under that threshold.
This was my theory because I've heard your position stated a few times on the web, but from direct experience I remember it being more a drop in the quality of gameplay rather than in code quality or technical polish.
Don't worry, if you savor the cross-platform software experience there will be plenty of options for you. At bargain prices, in fact.
... please - you Apple fan-boys should try things before jumping to conclusions ...
I dumped my iPhone for a Nexus specifically because of Google Voice and Apple's draconian control of the platform ...
I think apple is doing well in the (presumably free, it's way more open than other places) market, because they are very good at negotiating their property rights.
I guess it's just fashionable to imply the "bad guys" are "communists"
Do you mean the FCC approval process for new cell phones is a massive barrier to entry? I'm sure it's not free, but I can't imagine they'd want more than a few dozen phones and $100k of studies. Maybe half a million?
Of course, this is why I don't use C, C++, or Objective-C. I'm always a little surprised when someone writes code in one of those languages that actually runs.
The good news is that there isn't very much of it.
The rest of the software is buggier than it should be. My web browser has remotely-exploitable security holes. Random drivers in Linux randomly regress as the version number increases. OS X and Windows 7 crash for no reason, and don't support enough hardware.
The only program I use regularly that doesn't crash on me is Xmonad. And guess which language that isn't written in.
It doesn't crash because it's simple, not because it's written in a good language.
Well you'd probably have 100X as many bugs in a C program (it takes 10X as many LOC, and I bet bugs scale with N^2), so Haskell is a bit better, but language isn't as important as scope. Big programs have more bugs.
Plus, I don't understand why people keep praising XMonad for its lack of bugs. It has some bugs that are quite annoying (e.g. stuck windows that don't close, getting locked with only one tile on a screen) -- on the other hand, I've never seen a bug in metacity.
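The back-of-the-envelope claim above (a C port takes roughly 10x the lines, and bugs grow with the square of program size) can be turned into a toy model. The defect rate and the quadratic exponent are the commenter's guesses, not established results:

```python
def estimated_bugs(loc, bugs_per_kloc=10, exponent=2.0):
    """Defect count under a superlinear scaling assumption (hypothetical)."""
    kloc = loc / 1000
    return bugs_per_kloc * kloc ** exponent

haskell = estimated_bugs(2_000)    # a 2 KLOC Haskell program
c_port = estimated_bugs(20_000)    # the same program at 10x the lines in C

ratio = c_port / haskell           # 10x the code, 100x the predicted bugs
```

Whether bugs really scale quadratically is debatable; the point the model illustrates is that program size (and its exponent) matters far more than the language.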
My take is that even if it's easier to screw things up in C, that doesn't mean that if you program in a higher level language, you'd automatically produce elegant code.
I can tell you, though, that this doesn't really matter in real life. The compiler warns you when you use a free variable, so it's pretty hard to accidentally misuse a dynamic variable. There are pathological cases that people point out, but these rarely matter in elisp that most people actually write.
Programming Emacs is a little different from programming other systems, but once you use its idioms instead of the ones you took from your favorite language, everything works quite nicely.
I agree that emacs lisp can still be used to productively write software. People put up with much worse things.
It really scares me when people who claim to be programmers say stuff like that. If what you say is true, that you are "surprised when someone writes code in one of those languages that actually runs," then you really should not be programming.
Apple has definitely crossed-over to the dark side. After 26 years of being a fanboy, they've finally exceeded what I can stomach.
The web isn't appropriate for the apps I want to write yet, so I can't develop on a perfectly open platform and expect to find customers. And I can't reasonably create my own platform and create the software I want to create.
So I have to pick a platform that can reasonably support the apps I want to write and that gives me a reasonable chance to make a living at it. I'm going to be somebody's sharecropper.
When I have more resources, I can consider supporting multiple platforms to mitigate my risk. But until then, pointing out that we develop at the pleasure of the platform holder is redundant and the differences between more- and less-restrictive platforms is splitting hairs.
I can't see any way that this turns out as remotely positive for developers.
Developers will make apps where it is fun to make apps. I haven't had any horror stories with the app store, so it's still just more fun to make iPhone software than anything else.
Handing my phone to my friends and telling them I made what they are looking at has been really fun. Also, telling anyone with an iPhone how they can just search the app store and find my little app is way fun. My Mom installed it even!
The facts disagree.
The App Store model is unusual and it is not perfect. Few developers have any experience with Cocoa or Objective-C. Developers must use a Mac. iPhone software only runs on the iPhone and is not easily ported to other platforms.
Despite all of that, Apple has attracted developers to the App Store and the iPhone in numbers nobody would have predicted. Meanwhile, all other mobile platforms are rushing to duplicate the model.
It would seem Apple knows exactly what developers want.
Aye, money outweighs freedom even today.
Consumers who want to download ebooks with ads are not the only ones affected. There could be an area of innovative and experimental use where universities want to develop novel applications - applications that might be written in Smalltalk, Lisp, Haskell - or any other language that can compile and is not Objective-C. In many cases the innovation lies in the core logic and the innovative use of a touch screen. Why should I develop my core logic in a way that ties it to the iPhone or iPad (assuming that the iPad will get the same developer agreement), and why should I have to use a relatively low-level language like Objective-C?
It is one thing what you assume Apple's target (Adobe, cross platform frameworks, ...) is and another thing who else is also affected by these clauses the developers have to agree to.
I now develop iPhone applications, and I personally agree with Apple in this respect. If you want to develop iPhone applications, spend some time and become proficient with the tools.
And, @raganwald is correct that it's still lock-in, it's just a matter of where the lock is.
It makes more sense to reject Fart apps -- or at least the 100th Fart app. It does not make sense to reject apps written in Python or Clojure, for example.
To put it another way: they are disproportionately turning away the better quality devs, not the lower quality ones.
3.3.1 — Applications may only use Documented APIs
in the manner prescribed by Apple and must not use
or call any private APIs. Applications must be
originally written in Objective-C, C, C++, or
JavaScript as executed by the iPhone OS WebKit
engine, and only code written in C, C++, and
Objective-C may compile and directly link against
the Documented APIs (e.g., Applications that link
to Documented APIs through an intermediary translation
or compatibility layer or tool are prohibited).
Theoretically, you could make it hard to tell the difference, but in practice it's pretty easy to tell machine-generated from human-written code.
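A minimal sketch of how such detection might work, assuming a hypothetical marker list (neither the markers nor the approach are Apple's actual review tooling): scan the printable strings in a binary for runtime signatures that cross-compilers typically leave behind.

```python
import re

# Hypothetical markers: cross-compiler runtimes tend to embed
# recognizable symbol names and strings in the binaries they emit.
KNOWN_MARKERS = [b"monotouch", b"mono_runtime", b"swf", b"unity3d"]

def printable_strings(blob, min_len=4):
    """Extract ASCII runs from a binary blob, like the Unix strings(1) tool."""
    return re.findall(rb"[ -~]{%d,}" % min_len, blob)

def cross_compiler_markers(blob):
    """Return any known runtime markers found among the blob's strings."""
    haystack = b" ".join(printable_strings(blob)).lower()
    return sorted(m for m in KNOWN_MARKERS if m in haystack)

# A blob containing a Mono runtime symbol gets flagged; a plain
# Objective-C style symbol table does not.
flagged = cross_compiler_markers(b"\x00\x01__mono_runtime_init\x00main\x00")
clean = cross_compiler_markers(b"\x00main\x00_objc_msgSend\x00")
```

A real reviewer would more likely inspect Mach-O symbol tables and linked frameworks than raw strings, but the principle is the same: generated code carries fingerprints.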
As with everything Apple does - there are company-centric motives, which have been neatly balanced against a set of consumer-centric motives.
Apple is run by smart people, who realise they have a dedicated following. By making the 'we don't want to diminish the quality of the App pool' argument clear, they allow their ardent followers to do battle for them.
The corollary of this is that Apple ensure that their platform receives the developer investment it requires, enabling the company to become a permanent fixture in the mobile market.
If Flash developers didn't have to make a new investment of time and money to learn their platform - what would stop this pool of developers from leaving Apple's side tomorrow?
They want full control over what is allowed _into_ their market, and they want a dedicated team of developers who won't walk away.
If Flash was allowed, neither of these requirements would be assured.
People would lose their damn minds.
If you're the dominant OS in the smartphone market or in the desktop market, where's the difference?
Apple has to unlock the iPhone and let people get their apps from wherever they want.
If people want the security of knowing an app is Apple approved to work and play nice with others on their systems, they can go through the App Store.
It's not their rules, but the fact that they remove choice from the market for both consumers and developers by FORCING themselves into the consumer/developer relationship as a restrictive middleman.
Android and MS might have swapped in the last month or so. They are close. As for choice, consumers don't seem to mind. You have tons of "choice" on Symbian, Android and WM for app selection. The AppStore is beating them all.
Developers will go where the money is. It is a job. You play by the rules or move on. Simple really.
BUT... Microsoft got into trouble with the competition commission over the tight rein it had on Windows. Now what are we saying, that Apple should be allowed to get away with nearly the same things Microsoft got fined for, just because they are Apple?
If Apple just lets this happen, and lets iPhone apps be developed with other OSs/SDKs/whatever, then if a developer wants to produce a piss-poor version of something, let them. Apple can still say yea or nay when the app goes into the Store. They are still going to make money; their phones are still going to be bought in droves.
Open the doors Apple, you might let something good in.
What MS decides affects the whole market. What Apple decides affects Apple users only. But I, as an Apple user, am affected by all the dumb anti-standard decisions MS has made. My web experience is crippled and I have to deal with filename extensions, even more so now in OS X 10.6.
Under Mac OS 9, I only had to know if a file was going to be used on a Windows machine and then add the proper extension to the name. But it's ugly and wrong.
I am not a developer, but I have been investigating starting a development company, and I have seen 1,000s of bedroom chancers on the forums....
Yeah man lets make an app and cash in, lets make an app called twitbook, it's a cross between a twit and a book we will get 1,000 downloads a day and we will spend it all the profit on weed, yeah man good idea ! Ok lets start... we can't code but we can use 3rd party apps that even my mum can use and we are done.
Yeah, I am exaggerating slightly, but that's how the market is going. It means the app store is constantly full of shite apps and will get worse if these 3rd parties are allowed to run riot and let any Tom, Dick and Harry release apps, which will happen if it's allowed. Apple does not want to encourage that, and nor do I; I want quality, not quantity, apps.
This is a new phenomenon for the world so Apple are bound to make mistakes on how they operate it and they have realised this is a bad thing that is happening to THEIR brand.
I have seen so many good, serious developers with good apps; they complain they are not getting seen on the app store, and the above is the main factor for that: piles of it.
These guys can code with their eyes closed, and yes, the 3rd-party apps may be an inconvenience, but I am sure they can work round it and actually get the coverage and sales they deserve.
In the short term it's not good, but in the long term it will be good for the people who know what they are doing, hence bringing us apps that don't sit on our phones for a few days and get ditched.
Why are people screaming about this? It's all about money on both sides, not future development of the up-and-coming kids, end of.
If I rent a room in your nice house and start pissing on the carpet would you want to boot me out ?
As far as I remember it was always Apple being victimised by other OS and software companies; how the tables have turned, and good on 'em.
Your complaint here is disingenuous. When you learned to be a Flash developer, did you complain that Macromedia should be ashamed cos they didn't build their tool in HTML & CSS?
If you want to develop for the iPhone, then develop for the iPhone.
Since the iPhone has less than 20 percent of the smartphone market, it seems unlikely that the Monopolies Commission will be interested.
You will find yourself in serious trouble moving forward if you tie yourself to a single company's toolset and tie your future to the well-being of that company's toolset. Especially Adobe's: a company as fickle as Apple.
If they wanted to eliminate cross-platform apps, they could have just as easily put something which specifically mentions that in their terms of service. It wouldn't be any more ridiculous than the conditions they have put in place right now.
These new terms of service effectively bar tools like Monotouch - a development environment that exclusively targets the iPhone OS.
Does this stretch to build scripts??
It's pretty obviously an out clause so they can kick Adobe in the pants, and possibly prevent app-mills from popping up all over, completely saturating their approval process for the app store. I doubt they'll go after an individual developer who isn't obviously using some mass-market code generator to pump out apps.
When they produce a less buggy Flash that doesn't have memory leaks big enough to kill a new computer with 4GB of RAM and a Core 2 Duo running only Outlook and Google Maps in IE7, then maybe I'll also be interested in running Flash on my phone. ATM Flash is blocked on my Nokia.
The modern C# language provides many advantages over the Objective-C used on the iPhone, which doesn't even have garbage collection.
If they intend to succeed in the enterprise, they should have encouraged MonoTouch, not squashed it.
What a rose-tinted way to describe lock-in.
It was more difficult to program for Mac than for DOS. Hell, I can make a decent DOS application. But in my younger years doing the same on Mac was way more troublesome.
OK, I haven't programmed Cocoa, just done some experiments, made the calculator and currency converter Apple has as tutorials. And yes, it simplifies a lot.
But simple programming comes with a cost in quality. Taking the step from DOS programming to programming for Apple's System 1 through Mac OS 9 was huge. Event loops, memory heaps, etc. Those who knew programming had few problems. Those who made hello world apps had huge problems, a.k.a. me.
The greater the challenge in programming, the greater the programs its practitioners will create.
I wonder if the open source world can successfully fight back, by making compilers that generate code the app store police can't tell from hand written.
I think the answer is definitely yes. Apple's software engineering is not that great. There is always some hole in Safari that allows root access to the entire device. They can't get atomic syscalls working in OS X. Does anyone really think they can recruit and afford people that can tell computer-generated software from hand-written software?
My guess is that this is a scare tactic to keep anyone thinking of supporting two platforms at once to "not want to risk it" and go for the iPhone instead. More users, only so many hours that the developer can be awake, safer to just go with the iPhone. (Of course, you are already risking it anyway; use the wrong multi-touch gesture -- app denied. Use a Google service -- denied. Do something useful that Apple wishes they thought of first -- denied. And people wonder why there are so many fart apps...)
My next guess is that this tactic will be successful. People seem to adore doing whatever Apple tells them to do. It frightens me.
What I've learned from iPhone vs. Android (among other things) is that people will pick pretty and mean over average and nice.
I'd expect many languages and frameworks to have distinctive signature functions and patterns of code. If Apple decides to enforce this, they won't have a hard time.
On top of that, if they miss it and let a bunch of apps in, then later determine those apps were cross-compiled, they can revoke the current versions and block that developer...
Have you filed a bug?
Is this the kind of company you want to build software for? This is bullshit. Do I really even want to play along anymore?
This company makes great products, but they can do completely dickish things.
Enforcing which LANGUAGES can be used on a platform?!? Insane!
Edit: I've been looking at these guys:
(I don't work with or have any vested interest in them, but they look cool.)
The system76 laptops are probably generic machines from Clevo, Sager or some such with a custom badge. Alienware used to do the same thing.
I really hope that this ridiculousness doesn't start to bleed through - the MacBook Pro is pretty much the only laptop I've ever considered usable for development, and I don't know what I'd do if I had to abandon it...
I especially loved the one that came with the 3278 terminal. And the clicky ones, like the ones that came with the 3290.
Nowadays, when on my desk, the netbook is hooked up to a Microsoft natural keyboard. I would like the Sun keyboard, but Sun won't ship it to me in Brazil and local dealers want... US$400 for it.
"Quality of hardware" boils down to user interface... just at the hardware level.
Their UI ability is the secret of Apple's success. Everyone else treats UI and design like an afterthought.
Boycotting Apple's content creation tools and consequently not developing for the iPad/iPhone is one way to send a message.
However, I'm ultimately interested in serving my application to the largest number of people, and it seems clear to me that the App Store is the best way to achieve that today.
So, do you cave in and develop an iPhone version? Or, do you stand by your morals, and, in turn, limit your audience?
Maybe the next iBook will just be a foldable iPad with the same closed OS.
ALL other user interfaces by ALL other vendors suck.
For some reason, no human beings on the entire planet other than those that work at One Infinite Loop in Cupertino are capable of doing a UI.
The same carries over to other things. Sure, the install and uninstall of applications on OS X is a great UI. But apt-get just seriously leaves it in the dust. The problem with Linux was never that it had a bad UI or was too customizable. The problem with Linux was actual hardware bugs in drivers, lack of Office, lack of Flash, lack of games. These have mostly been addressed, aside from games (which is a sore point for Apple too). UI is far too overrated compared to actual features.
And it's all based on a myth. Steve Jobs ringing up SUN over their looking glass and threatening with UI patents is just ridiculous. As are the claims to Apple fame with interfaces taken from Xerox.
So yes, I agree Apple are very good, possibly the best at UIs. But my points are 1) UIs aren't as massively important as people say, 2) Apple's "innovations" have been overrated 3) your statement that "no human beings on the entire planet other than those that work at One Infinite Loop in Cupertino are capable of doing a UI" is just ridiculous.
When something on a mac looks like it should be able to be frobbed, dragged, etc., it usually can and in exactly the way that it seems like it should.
This is not the case on Windows, Linux, or anything else.
1. It's way the hell better than Windows. No competition here. Windows is horribly clunky, and if you disagree with me, try using both for a while.
2. It has less of a tendency to plunge you into pesky little technical details than Linux. I personally don't mind fiddling with a config file every now and then, and the more recent Ubuntu versions are getting surprisingly good about this, but Linux still demands more effort to get a good, productive environment going. And Flash support still sucks.
Of the two, I prefer Linux in terms of usability. I doubt most non-technical users would agree. (Chrome, though, is just unambiguously better than Safari in every way. I use both regularly and don't want to dislike either of them.)
I'd say the all user interfaces from all vendors suck, including Apple.
Essentially, the menu items are infinitely tall hit targets… no matter how fast the mouse moves towards them, one can never overshoot them vertically. Menus in Windows and most *nix environments require both horizontal and vertical precision.
Furthermore, why are you bothering with the Dock when Exposé and Spotlight (or Quicksilver, Launchbar, etc.) offer great power user alternatives?
Personally, I still can’t stand how Windows and Linux make no distinction between applications and windows.
I understand the original reason for it. However, hitting the menu is an extremely large distance from what you might be working on now. I'm typing this on a multi-monitor machine -- this app is totally self-contained on one monitor. Why must I move my mouse across two monitors just to use this app's menu?
> Personally, I still can’t stand how Windows and Linux make no distinction between applications and windows.
I can't stand how you can close all windows on a Mac and still have the process around, with the only indication being a slight difference in the menu bar. Amazingly confusing.
Now this may be personal preference but it still shows that the Mac GUI isn't some ultimate model of perfection that everyone agrees on.
Also, Fitts' Law says that the time to select a target grows with the distance to it and shrinks with its size. Putting the menu on the edge of the screen makes the effective size of the targets bigger, but in some cases it also means that they are much further away.
As I type this, the menu bar is about 4x further away than the top of my browser window.
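Fitts' law can be sketched numerically. The Shannon formulation below uses made-up constants `a` and `b` (they are device- and user-dependent in reality), but it shows how a screen-edge menu's large effective target can offset extra travel distance:

```python
import math

def fitts_time(distance, width, a=0.1, b=0.15):
    """Predicted movement time (seconds) via the Shannon formulation of
    Fitts' law. The constants a and b are arbitrary illustration values."""
    return a + b * math.log2(distance / width + 1)

# A screen-edge menu acts as an "infinitely tall" hit target: the cursor
# cannot overshoot it vertically, so its effective width is much larger.
near_small = fitts_time(distance=200, width=20)   # nearby in-window menu item
far_edge   = fitts_time(distance=800, width=200)  # distant top-of-screen menu
```

With these illustrative constants, the distant edge menu with a tall effective target still predicts faster acquisition than a nearby but small in-window item, which is precisely the trade-off the comments above are debating.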
And they violated that rule in the dock, unless that's been fixed in the last couple of versions.
I booted my other laptop to transfer some data over and waiting for Vista to load to the point where I could interact with Chrome was like pulling teeth.
1) They removed much of the "originality" that they did from Vista.
2) They copied more stuff from Apple, and a little bit from the Linux ecosystem.
Oh... and what's their fetish with that sickening shade of teal/blue? Yuck! It looks like smurf vomit.
Linux fans need to relax a bit and realize they're number three for a reason and work harder not complain harder.
Edit: I'm sorry, do I sound bitter? Maybe I've grown disillusioned with the difference between the reality in the trenches and fans extolling the features of product X, where X is any of Linux, Mac, Windows, iPhone, etc.
In the old days, when you minimised something, it always went to the bottom of the stack. (Except Excel, which had a silent entry in the stack when you had more than one document open. Outlook has been bizarre for a while, too.)
In one of the NT4 service packs (I think) this changed so that if something minimised to the status bar (rather than the task bar) it worked differently. I'm not sure what Vista did, but XP was bearable.
Now there are lots of variables - different apps respond to being removed from focus differently, I've read that the number of open applications affects stack response to minimise also.
Windows used to be very friendly to rapid keyboard-only operation. You could drop things in the start menu and activate them with two keystrokes. Alt+tab was dependable. No longer.
ALL other user interfaces by ALL other vendors suck.
If this is a big thing for you, I'd recommend the path I took after Be folded. Accept that complex user interfaces have bad tradeoffs (platform dependence, inflexibility). Find a full-screen tty you like (I use iterm because with apple+key + enter it goes full screen and gets rid of aqua) and return to living in the habitat of your ancestors!
A few things are inherently visual: paint programs, 3d games and movie editing. Everything that is not can be done effectively on the console. These interactions are often far superior to GUIs.
There's a learning curve. But once you're over it you'll have enormous power at your disposal and won't ever get locked in again.
Two things make this far easier than it has been previously:
1) Python. The standard library contains everything you'd want to do to push a system around. You can hammer out powerful tools in python in a casual manner and at a speed that has not been available to mere mortals before. You can get it on a variety of platforms.
2) Web browsers. It's now easy to get high-quality web browsers on any platform you'd want to use. Where you do need to produce a GUI, you can knock up a trim webapp with HTML and forms.
Some systems have a CLUE (Command Line User Environment).
That's the thing about Apple's capricious, passive-aggressive contract language... you have no idea, and no way to even guess, if your business model will be the next one they target for termination.
I actually went the opposite route and installed mac os on a netbook.
It's a little flimsy, physically.
And I am a proud free software zealot, so I moved it to debian.
Toughbooks are also promising, but they're light on the specs... could someone please make a durable and well-specced laptop?
Maybe we free software "zealots" were not such zealots after all when we said that proprietary software allows its owners to treat its users badly, and that eventually this happens with every piece of proprietary software. Just saying. It amazes me how surprised users of proprietary software are every time they get screwed by their masters, even though this has been happening for the last 30 years.
Products are OK to be proprietary as long as the value provided is top-notch. Sorry, but I don't see professional designers using GIMP over Photoshop.
Relying on a platform for your existence is a different story. But as a business you need alliances with other businesses, and not just in software. And everybody can pull the plug on you, that's why reputation matters and in many cases it's all you need.
About free software: programmers need to eat too. I use open source everywhere myself, but for the last 7 years I've been doing consultancy work (turn-key apps that are never released in any form, or web services that put a lock on your data ... the worst kind of closed systems). And until you teach me a business model that would let me work on "free software" while providing for my family, I'll keep doing it.
Until then it's only fair I get paid for my work, that's why I consider the free software philosophy as extremist bullshit.
Free software is an infrastructure. I drive on it daily delivering value and getting paid.
That's a fallacy.
The majority of open-source sponsors are selling closed systems or services to sponsor their "free software" involvement. Google doesn't have a business model around "free software". Neither does IBM or Sun.
I'm also not Mozilla and my apps would probably never get in front of 40% of all Internet users. Even if I'm that lucky, it's probably not going to be a desktop app that's used to search for stuff.
To get paid for customizations, your software also has to be really popular for businesses (consumers don't pay for that, they either endure it or search for something better).
Did I mention that I don't live in Silicon Valley or Cambridge, but in an Eastern European country? So training is off.
I already mentioned consulting, but then I would be a hypocrite if I promoted the free software ideology while working on the worst kind of closed software, wouldn't I?
> That's a fallacy.
> The majority of open-source sponsors are selling closed systems or services to sponsor their "free software" involvement.
Red Hat, Canonical and a large chunk of IBM GS wouldn't be able to sell those services with proprietary products - the OSS licensing of various Linux OSs, Apache, etc. are the basis for them being able to have an audience to sell their services.
I agree with you re: Google. They're not a service business and could have written their own OS or used a proprietary one and not affect revenue.
Not sure what licensing fees would have done to their overhead costs in the early days.
Maybe you should have a look at this again... I've been on a trip to Romania for training (about architecture of some specific piece of OSS) at some point. As long as you can provide good training, it could work for you too.
And the software that makes Google their money is proprietary, unless you can show me the link to the AdWords server source.
I think they basically just do it through consulting + a little support and training. (See http://phusion.nl/services). But since they wrote Passenger and REE, I'm sure they can command a very high rate.
It's also worth noting that the users aren't getting screwed at all by Apple, only the developers and only a small fraction of them.
Secondly, users are harmed by these actions as there will be fewer developers making apps for them, plus, Apple are potentially stifling innovation.
idk, I've never thought the appstore was anything but a stop gap personally, and developing for it a crazy risk.
Also, a stop gap? I think there's an app for that.
Definitely, if other phones or devices have such capabilities, and a killer webapp comes out that uses them.
If I was Apple I'd just shut the App store down to get rid of all the ungrateful developers. Sorry, but it's their show. They get to decide if you can play, and what the rules are. If you don't like it, don't play with them.
Of course, if you didn't sign that, then you can do whatever you want.
As opposed to where they are right now?
Enforcing what higher level language you write in before it gets compiled down to machine code?
Native apps that run directly on their hardware = closed - you play by their rules or not at all.
shrug pretty clear difference IMHO.
But yeah, no camera/GPS/microphone/etc.
Let's say I write an iPhone app originally in Scheme (like this guy did: http://jlongster.com/blog/2009/06/17/write-apps-iphone-schem...), and compile it down to C, which is then compiled to object code and linked against the iPhone libraries. At this point, the object code is the same (or functionally the same in terms of its syscalls, library calls, and general program flow) as if I had originally written it in C, except that I would have lost the unique developer efficiencies I got from using Scheme in the first place. I'm not saying Scheme is better or should be an officially sanctioned source language for the iPhone SDK. I'm just saying, where the rubber meets the road -- object code linking against libraries and making certain calls -- there is no difference to the computer what the original source was.
Seems very silly.
2) The app approval process is so opaque that it might be hard to tell if they were banning you due to this reason (note I hear things have improved, so this point may be out of date).
When Alan Kay said Apple would take over the world with an iPad, I don't think he realized that eToys or anything like eToys would never be allowed to run on the device and that a majority of the apps will probably have commercial spots embedded within them. Actually it reminds me of "educational" tv all over again...
I'm starting to see the point of people who complain about the consume vs. create nature of the iPad...
Who do you mean by this? Microsoft doesn't make sense in this context.
The majority of apps on the app store already do. Free apps, ad-supported. Apple just wants the ads to not suck so much, but hey, nobody has to use iAds.
I really wouldn't put it past them, and I think today's a bad day to err on the side of "things won't change."
Telling developers which languages to use? Might as well stick with enterprise software development, more innovation going on over there Apple!
1 - Perhaps, during the beta, they want code written in C (and derivatives) to aid in debugging framework bugs. Rather than "My App doesn't WORK!!!!! (Using MonoTouch Vxxyy)", they can get a repro using a stack they know and control top to bottom. In this case, I can see it as justified.
2 - If it's specifically to target cross-compilers as a business decision, to ensure that iPhone only gets "exclusives", then I suppose it's their prerogative. It will probably backfire long-term, as their market share isn't that dominant. "Browser, Palm, Android + all else" or "iPhone" makes the iPhone a secondary port target versus a first class platform.
If it's any other reason, it is ridiculous. It makes me seriously question my upcoming Mac Pro & iPad purchases. Even though I currently develop for Mac and iPhone using only Obj-C, I can't build a development strategy around a schizophrenic "partner". What's next - documentation not written in Pages? Failure to use Steve's favourite colour in the background? It's getting... bizarre.
Checking an app isn't doing anything malicious may be easier if it's written with standard tools.
I think the situation will only improve when Google actually cares about their users and less about the cool tech that comes with every phone.
Hope you guys enjoy the mass exodus of decent developers from your dumbass platform with kiddie languages.
Objective-C, C, and C++ are kiddie languages?
I seriously doubt that there will be any sort of exodus away from iPhone development because of this. Apple provides some awesome development tools, and they want developers to use them because they improve the quality of the applications that people write.
You are right, though there may be an impact on new developers coming into iPhone development. That said, I wouldn't be surprised if at least some developers have had enough of Apple's high-handedness and move away from it, like they did from Windows long ago because of Microsoft doing evil things.
Just an anecdotal data point, but I for one was toying with the idea of buying a MacBook Pro (though not yet an iPhone or iPad) and learning Objective-C, but not any more.
Objective C on Apple is now the programming language for Orcs, working for the fulfilment of Steveron the Dark Lord's World domination plans. What does developer unfriendliness matter if you get "teh new shiny" to play with every year?
Ash Objective C durbatulûk, ash Objective C gimbatul,
Ash Objective C thrakatulûk, agh burzum-ishi krimpatul.
Apple is telling people, "write software like this, because we say so."
Yes, I'm sure it sucks. Your project is collateral damage from Apple's efforts to thwart Adobe and other makeshift cross-platform efforts that generally create sub-par products.
I'm not saying Apple's development tools are perfect or that I like Objective-C, but Apple makes it pretty easy to write great apps their way.
Even if you hate Apple, you should consider looking at ObjC on GNUStep.
Objective-C has only a small fraction of the potential performance of C++ so should not even be compared with it. They are in completely separate categories. Objective-C is used primarily as a higher-level glue for routines written in C. You cannot write all of your routines in Objective-C all the way down, because the performance is dynamic and often worse than languages with smart VMs. LuaJIT, MacRuby, and several others all have better performance (in terms of function dispatch, calling, etc) than Objective-C. If you want Objective-C code to perform well, you just start using C stuff instead of Objective-C classes and methods.
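The dispatch-cost argument above can be made concrete with a toy model. This Python sketch (an illustration only, not Objective-C) contrasts a statically spelled call with a run-time lookup by selector string, which is loosely what `objc_msgSend` does on every message send (modulo its caching):

```python
class Receiver:
    def area(self, w, h):
        return w * h

def dynamic_send(obj, selector, *args):
    # Resolve the method by name at run time, loosely analogous to
    # objc_msgSend resolving a selector. This is a deliberate simplification.
    method = getattr(type(obj), selector)
    return method(obj, *args)

r = Receiver()
direct = r.area(3, 4)                     # fixed call site
dynamic = dynamic_send(r, "area", 3, 4)   # resolved on every send
```

Both calls produce the same answer; the difference is the extra lookup work on each dynamic send, which is why performance-sensitive Objective-C code tends to cache an IMP or drop down to plain C, as the comment above describes.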
With Apple's change to the dev license they are basically saying, "We only want you to use paleo languages. Put away your shiny modern toys and come back to the creaky past with us."
But on the other platforms you are in reality limited (in practice) to one or two languages? I'm looking at you, Android and BlackBerry.
(Windows and Palm don't really count, since nobody's writing for them).
Since I don't know much about Android, are there significant apps (in the Android Marketplace) that use some of these languages?
...or would have been, before this announcement. Ouch.
"When we embarked on the task of developing PowerLoom® which had to be delivered in C++, we were faced with exactly this problem. Our response was to invent a new programming language, called STELLA, that incorporates those aspects of Common Lisp that we deemed essential into a language that can still be translated into efficient, conventional and readable C++ and Java code."
Controlling programmers like this seems positively insane, doesn't it? Does this mean that games can't do scripting in something like Lua either, under the letter of Apple's law?
What they are forbidding now is the usage of any third-party compilers that are producing native executables ... Adobe CS5, MonoTouch, Unity.
It's more than that. They are forbidding you from writing it in 'language X,' converting 'language X' into C/C++/ObjC, and then running that through Apple's developer tools. They state that the program must be originally written in C/C++/ObjC.
Until now, you could link to an interpreter, as long as all the code it ran was bundled with the app as downloaded from the App Store. No code could be generated by the user or downloaded from the net, though.
A lot of games use interpreters. ScummVM is used for the official Broken Sword port, the original SCUMM engine is used in the remake of The Secret of Monkey Island, and a lot of commercial iPhone game frameworks use Python or Lua as a scripting language.
That said, I have an implementation of Conway's game of Life on my iPhone, downloaded from the app store. Life is Turing equivalent, and the user can create his own patterns. I guess the app reviewers didn't know about it (or don't care about it since the applications are very limited).
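The Life example is easy to make concrete. A minimal implementation (a sketch in Python, not whatever that App Store app actually shipped) fits in a few lines, which underlines how blurry the rule is: the rule table here is effectively a tiny program being interpreted.

```python
from collections import Counter

def life_step(cells):
    """One generation of Conway's Life; cells is a set of live (x, y) pairs."""
    counts = Counter((x + dx, y + dy)
                     for (x, y) in cells
                     for dx in (-1, 0, 1)
                     for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    # A cell is alive next step with exactly 3 neighbours,
    # or 2 neighbours if it is already alive.
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in cells)}

blinker = {(0, 1), (1, 1), (2, 1)}  # a period-2 oscillator
```

Stepping the blinker twice returns it to its starting state, and since patterns like these can encode arbitrary computation, the user really is "running code" the reviewers never saw.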
I was hoping to finally develop for the platform thanks to background services, but my moral objection is now at par with my profit motive.
I was planning on building an app in Titanium. I guess I'll wait and see how this shakes out first.
btw, at the time of writing this comment, Titanium's developer center is hosed. Maybe it became self-aware at the same time and committed suicide.
3. Therefore, if you write native iPhone apps, you must use C derivatives that expose raw pointers and direct memory access.
- There is a bug with the compiler not properly counting references when assigning a variable that needs a release to a property marked "retain". The property will increment the reference count, expecting the caller to decrement it, but the static analyzer does not see this, and so fails to warn.
- http://www.openradar.appspot.com/7338181 - CodeSense fails to work when Static Analysis is turned on.
(DISCLAIMER: I am using Xcode 3.2.1 on Snow Leopard 10.6.3)
A more cynical theory: Apple believes they have more than enough developers for their platform, and they need a way to raise the bar. Particularly, they need to limit the pile of me-too apps, and Flash's impending iPhone compatibility threatens to send nothing but.
The benefit to the user in this case is that they have less crap to wade through in the App Store. The harm to the user is that a highly desirable app does not get made for the Apple platform because the developer is put off. And that risk is spectacularly low: any popular app on another platform is quickly cloned, and any killer app that fails to materialize at all... well, the users would never know what they're missing. (The logic breaks down for expensive-to-develop apps, game franchises, and other forms of lock-in, but these cases are few; Apple can negotiate them on a case-by-case basis.)
Is Apple primarily annoyed that Adobe's app and MonoTouch aren't doing it the "Apple" way? Or do they not like that those frameworks present non-standard UIs (I haven't used MonoTouch, so I don't even know if that's the case here)?
I think it's stupid to ban different languages but I can see that writing apps specifically for a platform can make the application better on that platform, rather than having the lowest common denominator between iPhone, Android and Blackberry powered by a scripting language.
But the indirect effect would be to shut people up. Rather than spending much or any effort on peeking inside iPhone apps looking for signature traces of cross-compilation, Apple would benefit because people would stop talking in public about which 3rd-party development platforms they had used for their cool new app, out of fear that mentioning anything besides C, C++ or Obj. C might result in their app being taken off sale - so without Apple doing a single thing different at the technical level, this license provision would have a chilling effect on discussion about competitors' development tools.
MonoTouch apps are written in C#.
I would be hugely disappointed to see Adobe pursue this course of action, and would regard it with significantly more contempt than I currently hold toward Apple.
On what basis?
The end result being fewer apps for Android and a less vibrant software ecosystem.
Companies that have existing resources in Objective C, sure, they may choose to go with the iPhone. Developers who already know Objective C will go with the iPhone. Developers who, like me, don't yet know Objective C, and have been interested in development on the iPhone through PhoneGap or Titanium? We're not going to bother with the iPhone.
So porting from another language is now forbidden? Or is it unoriginal code that is banned?
Either way, the sheer craziness is astounding.
Surely they aren't saying that all code has to be entirely hand-written? If so, then is running search-and-replace on a source file not allowed?
Note: I am in no way saying that Apple's stance on this situation is a good one; in fact, it is absurd. The iPhone 3G is the last Apple product I will ever buy because of this and other iPod/iPad-related anti-developer bullcrap.
What this will absolutely stop is Adobe calling out developers who have added apps to the app store compiled with CS5. Those apps created that way will also be unable to move upwards to the new OS version. I would imagine it will also completely stop Adobe and others from any development on any OS 4 bridge because they would have to agree to the license to test against the actual SDK and they can't do that without legal problems.
(Hint: Apple's version of C is unlike any standard version of C, like C99. So technically, you can't use Apple's tools. Or, you can call anything you want C, just like they do.)
Sure, there is an ideal, but practically there is no consistent definition of the C language.
Apple's C is C99 (as implemented by both gcc and clang). If you wish to take advantage of something like Grand Central Dispatch, they recommend that you do so with their completely optional syntax for blocks (which, aside from being closures AFAIK, are essentially anonymous functions in situ instead of defining a function externally).
You can still use GCD without the new block syntax by using function pointers, leaving you with bog standard C99.
Frankly, gcc implements a lot more non-standard stuff on its own than Apple's blocks have introduced.
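The function-pointer alternative can be sketched without libdispatch at all. This toy Python queue mimics only the call shape of `dispatch_async_f(queue, context, work)`; real GCD schedules work items concurrently on managed threads, while this stand-in just runs them when drained:

```python
from collections import deque

class ToyQueue:
    """Toy stand-in for a dispatch queue (illustration, not real GCD)."""
    def __init__(self):
        self.items = deque()

    def async_f(self, context, work):
        # Mirrors dispatch_async_f's shape: enqueue (function, context).
        self.items.append((work, context))

    def drain(self):
        results = []
        while self.items:
            work, context = self.items.popleft()
            results.append(work(context))
        return results

def square(ctx):
    return ctx * ctx

q = ToyQueue()
for n in (2, 3, 4):
    q.async_f(n, square)
```

A work item is just a function plus a context argument; blocks only let you write the work inline with captured state instead of threading everything through `context`, which is why they are optional sugar rather than a departure from C99.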
Damn Apple is evil.
It is totally off base.
Such a relocation tends to be more visible from a distance.
Just created a new "I'm with Adobe" Facebook Group:
Is this technically possible? If I write my app in language L and also write an L->Obj-C translator, how is Apple going to know? I don't think Apple looks through your source code before putting your app on the App Store(do they)?
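As a rough illustration of why detection is at least plausible: cross-compilers tend to link in a recognizable runtime, and even a naive scan for symbol strings could flag them. The signature list below is entirely invented for this sketch; a real checker would parse the Mach-O symbol table rather than grep raw bytes.

```python
# Hypothetical fingerprints a reviewer's tool might look for. These byte
# strings are made up for illustration; they are not Apple's actual checks.
SIGNATURES = {
    b"mono_runtime": "MonoTouch",
    b"scheme_entry": "Scheme compiler",
    b"lua_pcall":    "Lua interpreter",
}

def guess_toolchain(binary: bytes):
    """Return the names of any known runtimes whose fingerprint appears."""
    return [name for sig, name in SIGNATURES.items() if sig in binary]

sample = b"\x00\x01mono_runtime_init\x00main\x00"
```

A sufficiently careful translator could scrub such traces, but that is an arms race the lone developer probably loses.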
Basically, they reserve the right to ban any sort of DSL/runtime/engine application architecture. Imagine you are building adventure games (like Monkey Island). You'd have a game engine and a domain language in which you describe the game. The beauty of this is that you don't have to rewrite the engine every time you finish a story for a game; you just write in your DSL. The same holds for a range of other applications such as travel guides, recipes for cooking... How can you be sure where they draw the line between data and code?
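A toy version of that engine-plus-DSL split makes the line-drawing problem concrete. Here the "game" is plain data, yet the loop that walks it is effectively an interpreter (a hypothetical example, not any real engine):

```python
# The game description: pure data, in the spirit of a SCUMM-style DSL.
GAME = {
    "cave":  {"text": "A dark cave.",   "exits": {"north": "beach"}},
    "beach": {"text": "A sunny beach.", "exits": {"south": "cave"}},
}

def play(game, start, moves):
    """The 'engine': walk the room graph, ignoring moves with no exit."""
    room = start
    for move in moves:
        room = game[room]["exits"].get(move, room)
    return room
```

If `GAME` arrived in a downloaded file instead of a literal, nothing about the program's behavior would change, which is exactly why "no interpreted code" is so hard to pin down.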
Well, it probably just depends on whether it comes from an Adobe tool or not. Let's see who's next.
Even THQ's Star Wars: Trench Run is affected. You can't really get more high profile than that. I wrote a blog post (http://bit.ly/disQ2C) with some links to various SDKs and their app showcases to give people a bit of an idea of just how many existing apps this SDK change will affect.
If you think of the iPhone as a PC-like platform, yes, it's worth being crazy. But I think of mine as more like a game console. That helps me minimize my own feather ruffling, even when Apple does snarky things to me during the review process.
First of all, lots of companies license many third-party libraries without their source code. Sometimes these libraries are specially compiled with the latest bug fixes for the particular project they're used in.
What the companies might require is usually symbols/debug information/etc. But never the source code!
After all, Microsoft, Sony, Nintendo are direct competitors. Why should an independent publisher (EA, Activision, THQ, Konami) be sharing source-code with them?
Your comment is emitting dangerous levels of irony.
My thought on what they probably meant:
I bet what they're trying to do is prevent intermediate tools from allowing people to "whitewash" private API access.
I say this because the interpretation everyone else is seeing in this clause is obviously a horrible position, and I just don't think that's the intent.
Time will tell!
In order to keep something great from turning to crap, you have to have a gatekeeper. If you don't trust the gatekeeper, that's OK. But I'm hearing a lot of noise from developers who are already beholden to their own gatekeepers (whether it's the core team of your open source project, Microsoft, Oracle/Sun, or whoever). I mean, Adobe?! Could there be a more arbitrary and horrible gatekeeper than Adobe?!
Just because Apple doesn't do what you want doesn't make them better or worse than any maintainer of any "good thing". In the end, it all boils down to whether you trust Apple to do the right thing.
If you don't, then you're welcome to go punt on the umpteen device platforms that are currently vapor. (How many years have we been hearing about "iPod killers", and where are we with that?)
It's OK to complain, but I really would have thought the HN audience would understand that EVERYTHING EVERYWHERE is a trade-off. Yes, we should be debating the rules that Apple imposes, because we have a say in that. No, that doesn't validate rhetoric about Apple being "evil" or "anti-developer" etc.
I couldn't disagree more. The goal of free software is to ensure nobody has that kind of control over you and your tools. If you don't need to organize a fork, great, but they've already agreed they can't stop you.
As for the rhetoric, making tools is what makes you a member of our species. Agreeing not to is more like serfdom than partnership.
And forking is obviously less-than-ideal in terms of maintaining against future core updates. It's a kind of "freedom" that comes with plenty of new burdens.
Apple, on the other hand, has shown a somewhat... avaricious... approach to the app store, rejecting applications for curvy buttons, speech bubbles, or being able to load arbitrary images from the internet. We should all realize that Apple will allow precisely what it wants to with respect to the app store, and nothing more. They don't need to do anything else; consumers will buy their products regardless of how the apps can be created.
I don't like it, but I definitely saw it coming. :/
http://monotouch.net/Store $999 for the Enterprise version? Ouch.
It's a common misconception that you have to have a monopoly to run afoul of antitrust law. You don't. It also goes the other way--having a monopoly is not necessarily a violation of antitrust.
For instance, mergers have been shot down on antitrust grounds even when the merged companies would have had under 50% of the market.
Antitrust is more about the effects of actions on competition, rather than market share, although there tends to be a correlation between the two.
Had they done this months ago, they likely wouldn't have had a problem. But four days before CS5's release? I wouldn't be surprised if Adobe has already filed suit.
Once it was out there, Adobe was likely under no obligation to keep quiet about it.
For example, I don't think automakers are allowed to ban competitors from making compatible parts for their cars. Similarly I think there have been cases where printer makers tried and failed to prevent competitors from making compatible ink cartridges for their printers. No automakers or printer manufacturers have a monopoly though.
Requiring the use of certain tools to produce an otherwise identical product seems to me a violation along similar lines. I'd love to hear an expert opinion on it.
It wasn't until recently that some of us stopped and thought, wait... what if Apple decides to deny all Flash-built apps???
A lot of time was probably lost by developers. There's going to be a lot of crushed dreams.
Plus, instead of spending $600 on CS5, I'll probably buy a Mac Mini.
Of course, I disagree with this approach: there already are widget shops. It's possible to program in a language without really knowing it, writing non-idiomatic code that is buggier, slower, takes longer to develop, and is harder to maintain. The fact that they write in languages that take longer to develop an app in would just mean they'll choose "low quality" while still maintaining the same quantity.
Apple's doing this for the exact same reason they do EVERYTHING else controversial on iPhone OS: because the primary (only?) thing they care about is making the end-user experience as good as possible. In this case, Flash (which I'm sure this is entirely aimed at) doesn't provide as good a user experience as the native UI controls, so they're nixing it while they can, with less bad publicity than they'd have if they waited until after CS5 was released.
I mean, how the hell do you define a "cute toy" vs. an actually innovative app? What's something that is more innovative on Android, say, or a Pre that isn't there on an iPhone? (Google Voice being the obvious exception, but again with the everything-Apple-does-is-about-UX thing.)
>Apps with the same codebase for Android/iPhone will automatically suck as far as the UI is concerned
Hardly. There's a graphics API called OpenGL that works on both iPhone and Android (and your PS3 and computer).
By only allowing the tools they use? It seems like if what you said was true they would do exactly the opposite.
So obviously the App Store is a monopoly - the other 3~10 stores have a miserable 0.6% ... of course that says a lot about Apple's genius and how hopeless its competitors are ... but ...
Still, I think they are more and more abusing their dominant position with their draconian dev rules ... and that should be looked at - it could present some legal challenges for them ... I'm no lawyer - but I think the only way they can get away with it is through the private nature of the dev agreement ...
Incidentally - I really do think Apple is special - a one-of-a-kind genius company - but also because of that, people cut them way too much slack ...
To prove my point - just try this simple mental exercise - try to picture what would happen on this same forum if we were discussing, say, Microsoft - say Microsoft just put out a nice new license for Windows 7 that said you "can develop for W7 only if you use C#, .NET, and Microsoft's own compilers; Flash is banned; and Redmond will need to approve any and all applications" ...
Dude - it's so funny I cannot even finish typing it ... really - it just sounds so crazy - like a story from another world - a world of evil monopolistic corporations that try to control our free lives ... yet all we did with this simple exercise was change one word: Apple -> Microsoft ... and - last time I checked - Windows doesn't even have 99.4% of the market - more like 90%, coupled with a boring, inferior, resource-hogging product that nevertheless comes with all the freedoms democracy brings us ...
And that proves my point - we do cut Apple way too much slack just because they are so awesome at making great products ...
Full disclosure - I owned both an iPhone 2G and a 3G, and just recently dumped the 3G for a Nexus One, because I'd rather live in an imperfect (though rapidly improving) democracy than under a perfect national-socialist system ... I still use my Mac - OS X is awesome and kicks Windows 7's lower back ... and I just got an iPad - only because there are no other choices ... yet ...
... but then, any compiler that can generate the processor's instructions can obviously also be modified to generate Objective-C code. So I guess this will result in all the cross-compilers just changing their target language from LLVM or assembly to ... Objective-C.
I can live with that ... and watch Apple slap themselves on their pretty foreheads ... whatever it is they hoped to accomplish in terms of lock-in :)
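The point above - that retargeting a compiler from assembly to Objective-C source is mostly a matter of swapping the code-generation stage - can be sketched with a toy example. This is a deliberately tiny illustration (the function names and pseudo-assembly format are made up, not any real cross-compiler's output):

```python
# Toy "compiler" for arithmetic expressions with two interchangeable
# backends: the front end and AST are shared, and only the final
# code-generation stage decides whether we emit pseudo-assembly or
# Objective-C-compatible source text.
import ast

OPS = {ast.Add: "+", ast.Sub: "-", ast.Mult: "*", ast.Div: "/"}

def compile_expr(src, backend):
    """Parse an arithmetic expression and hand the AST to a backend."""
    tree = ast.parse(src, mode="eval").body
    return backend(tree)

def asm_backend(node):
    # Emits a flat pseudo-assembly listing for a stack machine.
    if isinstance(node, ast.Constant):
        return [f"PUSH {node.value}"]
    if isinstance(node, ast.BinOp):
        return (asm_backend(node.left) + asm_backend(node.right)
                + [f"OP {OPS[type(node.op)]}"])
    raise ValueError("unsupported node")

def objc_backend(node):
    # Emits a (fully parenthesized) C/Objective-C expression string
    # instead of machine-level code - same AST, different target.
    if isinstance(node, ast.Constant):
        return str(node.value)
    if isinstance(node, ast.BinOp):
        left, right = objc_backend(node.left), objc_backend(node.right)
        return f"({left} {OPS[type(node.op)]} {right})"
    raise ValueError("unsupported node")

print(compile_expr("1 + 2 * 3", asm_backend))
print(compile_expr("1 + 2 * 3", objc_backend))  # "(1 + (2 * 3))"
```

A real cross-compiler is vastly more involved, of course, but the structural point stands: the target language is just the last stage of the pipeline, which is why a ban on the source language is so hard to enforce mechanically.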
Far from ideal, but it may be the best solution at the moment.
It could also simply be a change just so they can see what the response from the development community is - and may change rather soon.
I just can't fathom this particular rule lasting very long... it's not just lock-in, or a little bit evil, it's patently insane.
The requirement to use Apple development tools automatically means you must use Apple hardware to develop on. Ka-ching!
So, as an Apple shareholder, I'm happy about it.
The first thing that came to my mind when I read the title was "wtf". What a really bad move. I wonder how much more the app developers can take.
You could use a JVM-based "close substitute" (e.g., Objective-J -> JS -> Rhino) for the UI portion, which makes JNI calls to an Obj-C backend.
I can't use it?
I hope there are other companies that will release usable tablet hardware.
I've heard you can even get a SLIME/Swank REPL from the emulator or actual hardware. Seems like this beats the iPhone any day; better development cycle, and nobody can ever stop you from selling your app.
The only useful alternative will be a 'free' Linux derivative that does not restrict me in my developer choices and does not want to collect all kinds of data.
And your less paranoid users can just grab your software from your website, and run it on their "stock" devices.
Sorry, that I don't care about running Clojure and/or Android.
About Flash, Java, and other outdated VM-based crap - they are being pragmatic. They don't need any VM to run apps on their hardware. A modern CPU is the only VM necessary. So there will be many fewer complaints about crashes, bad performance, and out-of-memory issues.
Together, these two major problems sank the reputation of Windows, and Apple wants to avoid them.
So there is nothing special here. Apple can dictate rules that help them make more profit, and of course they will.
Want freedom? There's Android. Google is trying very hard to position it as the Linux (a community-driven, open, and free platform) of smartphones.
Nanny Apple wants to dictate the media you take in. Trust me, if it could, it would restrict HTML. Open standards are only good if you control the standards. =)
Don't want to learn Objective-C? Then build a webapp.
Apple is not restricting your rights. Apple is not stealing your freedoms. Apple is not being tyrannical. Apple is not snatching your children away as payment for your access to the App Store. I bet very few of the people doing all the bitching on here are even developing apps. You're the I'm-a-Flash-developer I-can-has-AppStore? type who wants access to the latest opportunity to grab some money bags without having to learn a new trick. Did you try to read a line of Obj-C, realize it has a different syntax than your Flash timeline and AS, and decide to cry cos you might have to work for those riches you're after?
Do you all gripe that the web requires HTML? Did you all complain when Macromedia gave you Flash, which required only a scant understanding of HTML and CSS to call yourself a "developer"? Your complaints are so lame.
Companies make products. They want to control their products and their image as much as they can, because it is their livelihood. If you want to base your livelihood on that product, then you play by their rules. If you want to be outside the rules, then you go your own way, but don't start crying cos they ask you to use their product to make your money in a specified way that protects their name and image.
If you don't want to use the tools, make a web app. If you want to complain that web apps can't do everything a native app can, then learn the platform and make a native app.
Stop claiming that the company that has taken all of the risk, spent all the money on R&D, given you an SDK, documented it exhaustively, and covered the costs of distribution - taking all that WORK and COST off your back - is asking too much when it requests that you repay them by protecting and advancing the brand, using the tools THEY BUILT FOR YOU, so the product and the platform can stay popular and successful and keep making you money.
Good grief. That shit is so hard to swallow, isn't it? Let's all burn our Macs and sound like morons with our Android-or-nothing cries while we quietly go fire up Xcode so we can make some offing money.
This is stupid.