In my experience so far with such "cross platform compatibility layers," they always produce results that water down each platform's individual strengths and differentiations. And of course, instead of the developer being locked into the phone platform, they are locked into the compatibility layer's platform.
Adobe's Flash compiler is a classic maneuver to "commoditize your complements," as Joel put it so well. Apple don't want to be commoditized, especially if it means having apps that don't take advantage of the iPhone's strengths.
Adobe want to lock developers into Flash and commoditize everything else as Flash-delivery devices. Apple want to commoditize applications and lock developers into their APIs.
Perhaps, like Nintendo, they learned the lessons from the collapse of the home video game market in 1983. When Nintendo was contemplating developing the NES, they took a deep look at what had caused the collapse. What they concluded was that the main cause of death was the market being flooded with too many crappy games.
Originally, if you wanted to write, say, an Intellivision game, you went and got a job with either Mattel or APh Technological Consulting (the company that did the hardware design, system software, and many of the early games for Mattel). If you wanted to do an Atari game, you went and got a job with Atari.
The games at this time were all pretty good. A consumer could go out, buy a game based just on the information printed on the box, and go home and be pretty sure they'd have a good experience.
As time went on, a few more companies joined the party. Activision and Imagic, for instance. These companies were started by people who had worked for Mattel or Atari or their contractors like APh, and generally produced quality games.
A consumer still could be confident that plunking down $40 or whatever for a new game, based just on the box description, would be a good move.
More time passed, and companies that had little or no connection to Atari and Mattel jumped in, using information gleaned by reverse engineering the consoles and system software. The information was not always complete, and they didn't know all the tricks and techniques we authorized developers knew to squeeze greatness out of the hardware. They produced a lot of crap games.
Consumers now found that spending $40 on a game was a big gamble. They had to work to get good games--be aware of brands, read reviews. They stopped buying--all games, not just the bad games.
Nintendo's conclusion was that their new console must be locked down. Only developers that Nintendo approved would be allowed to produce cartridges. This way, they could ensure that quality remained high, and get the now-shy consumers to come back and give games another chance.
It clearly worked--and consoles have been locked down ever since, and the console game market is huge.
However, that isn't what the approval process is. There are literally thousands of crappy applications that were happily approved and are now clogging up every category in the App Store. It seems non-trivial app rejections are not done on behalf of the user but solely to protect Apple's own interests. Remember when a bunch of high-quality Google Voice apps disappeared from the store? And that's just one example; there have been many more.
And now they're rejecting apps not based on their quality, but based on the programming language or development environment used to create them. How is that at all relevant to the user? This is entirely about protecting Apple's own interests and the comparison to Nintendo's lock down of the NES is not applicable.
What I don't understand is how this isn't anticompetitive behaviour. By creating an app store and lock-in for application vendors, Apple become the only provider in the market. They now appear to be leveraging that monopoly to restrict another market, that of developer tools, to their commercial advantage and at the expense of a competitor.
I'm not a lawyer and don't live in the US where presumably any legal action would be brought, but can someone please explain to me how this isn't black-and-white illegal under US law?
First, we have the hardware, and as you say, Apple is certainly in a competitive market with its iPhone offering.
Then, for each type of hardware, we have the software that runs on it. Anyone could write software to run on Apple's hardware, but because Apple lock down the phone and run the only app store in town, they have a de facto monopoly on the supply of software to iPhone users.
Finally, we have software development tools. Again, anyone could write tools to help software developers using Apple's hardware. Indeed, according to recent reports connected to this story, many people have, from Adobe's Flash CS5 team to fans of Haskell.
Again, I'm no lawyer, but I would expect that Apple would be perfectly within its rights to lock down the software that can run on its device, but would not be allowed to use that power to unduly influence the secondary market of how that software can be made.
What gets interesting is when/if Apple's dominance continues to grow to the point that the "iPhone software" market is effectively the "smartphone software" market. At that point, the DoJ and/or FTC will almost certainly decide that many of these policies are anticompetitive and initiate some sort of action to remedy the situation.
So in some sense, Apple needs to worry about becoming too successful: it is only because they haven't achieved total dominance that they can get away with being so ruthless.
"Apple responsible for 99.4% of mobile app sales in 2009"
... it doesn't take a PhD to acknowledge that the App Store IS the mobile software market - thus I would disagree that Apple's argument SHOULD be successful should somebody (or some agency) bring a legal challenge to the new draconian policies - which are obviously an abuse of Apple's dominance ...

The question for me is more technical: who can bring such a legal challenge? Can developers do it? Can users? Or can only a government agency ... in which case - considering the influence and connections Apple has in Washington - that might not happen ...

Would it be possible for developers to team up in a class-action suit, which could then trigger a government investigation?
The fact that 99.4 cents of every dollar spent "after the hardware purchase" goes to the Apple platform doesn't make it a monopoly. It merely means that it is the most successful add-on market.

A good example for comparison is a bit old, but it illustrates this particular point beautifully.

The fact that more companies are interested in producing add-on stuff for a product, and that consumers are more interested in BUYING that stuff, doesn't mean that a company is anticompetitive, antitrust-regulated, or a monopoly.

Volkswagen Beetle aftermarket parts spent some 30+ years as the king of the aftermarket: everything from third-party replacement OEM-style parts (stuff that matched the original but was cheaper for whatever reason) to stuff that essentially changed the product into something else entirely (dune buggy conversions, engine swaps, totally different interiors, etc.).

Was VW a monopoly because for 30 years 4 out of every 5 dollars spent on aftermarket parts was spent on Beetle bits? No, and no one ever thought to consider or call it one. It was just a hugely successful model that didn't change every 11 months, and was therefore a fixed point in space for manufacturers to target. But more importantly, CONSUMERS WERE BUYING, as opposed to your average Ford/GM/Chrysler buyer, who for the most part does NOT just go out and buy total conversion kits or hopped-up engine parts. There were many manufacturers who made parts for various successful models such as muscle cars over the years, and still do. They didn't cry antitrust because VW add-on makers made more money, nor did they cry that VW should change the way it made the Beetle so that "Beetle engines" would fit in any car (or vice versa).
There is NO ONE who could bring a class action and win, and there is no way that Adobe could sue and win either.

Because customer choices, when customers have real choice, do not make a monopoly; removing customer choice is what creates a monopoly.
But still, it seems the aforementioned laymen had the right intuition, as the App Store's draconian policies are finally being looked at by regulators ...
"According to a person familiar with the matter, the Department of Justice and Federal Trade Commission are locked in negotiations over which of the watchdogs will begin an antitrust inquiry into Apple's new policy of requiring software developers who devise applications for devices such as the iPhone and iPad to use only Apple's programming tools."
Presumably Adobe are the most likely candidate, and they also have serious legal firepower.
This is a chicken-and-egg situation. Apple has apps, so more people buy iPhones than other phones. With Adobe's dev tools, devs could make apps for more than one platform - that is a threat to Apple. When Apple shuts this down, it is an anticompetitive act against other phone makers, because if devs could put apps on more than one platform, then other platforms would become more appealing.
And they aren't locking in anybody. You don't forfeit rights to your source code; you're welcome to write a cross-platform app using some shared code library you build in-house. You just have to build apps natively, rather than use some watered-down, piss-poor common language that breaks standards on all platforms. This is the same reason Adobe apps suck on Macs. They have tried to abstract the OS away from their applications to the point where they don't look, act, or function properly on any platform. They look like ass and run like ass on everything.
#1 - Two of the worst carts preceding the '83 crash were E.T. and Pac-Man, both developed and produced by Atari itself, not these mysteriously inferior third parties you're alluding to. And how many games has Apple, who logically has the most know-how on the platform, produced? None.
#2 - You're making an oranges to apples comparison anyway. The video game market was not crashed by the availability of cross-compilers or tools that lowered the bar of entry. Similarly, Nintendo did not solve the problem by restricting what tools developers could use. They solved it with a strict editorial process.
#3 - Video game production in 1983 required producing and marketing physical goods. It relied on predictable "hits", just like AAA game development, to recoup the considerable outlay required to get those games in front of consumers in the first place. iPhone games are virtual, and the marketing for many of them is non-existent (simply because I can't spend $0.25 CPC on Google trying to sell my $0.99 app). Additionally, there's a long tail of developers creating a more robust landscape of content. There can be tons of failures and still plenty of room for successes. Just look at how many games on the iPod have made it big. Many of them came from virtual "nobodies".
#4 - In 1983, there was no manifestation of "wisdom of the crowds" to guide any consumer purchases. Word of mouth was about it. Today, at Apple's scale, one can find dozens of opinions about the quality of a game that only 0.01% of total users may actually purchase.
Many of these problems continue to persist in the locked down AAA console world that you seem to be so fond of. You know, I can accidentally buy 50 terrible iPod games and still spend less money than I would have spent accidentally buying 1 terrible PS3 game.
Though I have not purchased this game and cannot comment on whether it diminishes your first point.
And as mentioned above, they don't have a monopoly in the market, so, from Apple's perspective, 'if you don't like it, there are other opportunities' ... developing for the bberry :)
They've spent some money and time, and they have taken a lot of other people's work: Mach, gcc, Smalltalk, BSD, etc.
"they dont have a monopoly in the market"
That's not so clear to me. I own an iPhone and an iPad even though I think they really suck technically. But there is content available for them that simply is not available for other platforms.
if (A && B) { /* one rule applies */ }
else if (A && !B) { /* a different rule applies */ }
What I am getting at is, with programming there are rules. The rules Apple have are actually very minor and very easy to stay within. If you program to make political statements, choose Android or BB or Symbian. If you program to make money, pick the platform that will do that and follow the rules. If it stops making you money, move on. If you don't like the rules and don't want to abide by them, move on.
You can enjoy programming while still staying within the rules. Sometimes, it is part of the fun and challenge.
Apple is trying to accomplish the same kind of lock-in that Microsoft managed with Windows. And we better nip this thing in the bud, because Apple would screw us even worse than Microsoft has.
As a long time Mac user, I've experienced a lot of Mac applications that have been straight ports from other platforms and they are, for the most part, pretty awful. I can understand from this why Apple wants its developers to code iApps natively.
This 'lock in' makes perfect sense for Apple in other ways too, ways in which end users and developers will benefit.

Imagine that Apple allow apps to be ported from Flash. Developers would stop coding natively for iPhone OS, as they would be able to create their apps in Flash and distribute them as web apps at the same time, reaching a greater audience. Then add in Android, Blackberry & other export options for Flash. Soon enough Flash would be the only IDE in use, and platforms such as iPhone OS would be at the mercy of Adobe.

If Apple were to introduce new features and efficiencies to their hardware and APIs, they would have to wait for Adobe to implement them in its Flash translation layer before the features would really become available to end users. Even the most willing and motivated of developers would not be able to get around that; they would have to wait for Adobe. So in the end, Apple would lose sales and credibility, and good developers would get screwed because they wouldn't be able to outpace their competitors in updating their apps to take advantage of new features. Everyone becomes 'locked in' to Adobe.

Given Adobe's poor history when it comes to timely bug fixes and support of its OS X applications, I do not think that this 'lock in' would be a nice place to find yourself, whether you're Apple, a developer or an end user.
If you don't like Apple's stance then develop for other platforms and buy other products. But if you want to be in on the action, then accept the rules as they are not unreasonable and will ultimately benefit everyone.
We tend to frown on loops and mutating variables.
It's the same strategy that they've always used with their software/hardware combination, which has worked beautifully in the case of the mac.
check the numbers - the App Store IS a monopoly ...
Rose tinted glasses. They still managed to release buggy, downright broken software. It wasn't about quality, it was about control.
1. It's only a matter of time until Apple realizes this model is not good for anyone.
2. Android (which, by the way, is powered by the very people who specialize in filtering out the crap on other platforms) will dominate the next few years.
That idea started 1.5 years ago. Now you get Google ads in your free apps. Now, as an option, you can have iAds or Google ads. So all Apple did was open up an option (read: choice) for developers already doing ads in apps.
Choice is now bad? Or is it good?
Do something like Reddit, Digg, or Hacker News... let people guinea-pig apps and upvote/downvote them and sort in each category.
(HN is, IMO, above the rest but it's certainly not immune to mob rule)
Besides, with the money involved in high rankings, you'd have to constantly police the system against gaming, which would be, I'm guessing, more work than policing the submissions directly as they do now.
(First my credentials: I was an avid player of Atari 2600 games at the time, so I remember the period in question first-hand.)
My take is that there wasn't a decline in quality due to any sort of technical reason; it was due to a drop in the quality of the gameplay design and playtesting. That, in turn, I'm guessing, was due to the number of Atari 2600 game creators growing past the threshold of GOOD game designers & playtesters available. After all, video game design was a relatively new field at the time. Perhaps it reached a point where there were, say, 50 new games being "designed" concurrently but only 20-30 good designers, whereas before it was under that threshold.
This is my theory because I've heard your position stated a few times on the web, but from direct experience I remember it being more a drop in the quality of gameplay than in code quality or technical polish.
Don't worry, if you savor the cross-platform software experience there will be plenty of options for you. At bargain prices, in fact.
... please - you Apple fan-boys should try things before jumping to conclusions ...
I dumped my iPhone for a Nexus specifically because of Google Voice and Apple's draconian control of the platform ...
I think Apple is doing well in the (presumably free; it's way more open than other places) market because they are very good at negotiating their property rights.
I guess it's just fashionable to imply the "bad guys" are "communists"
Do you mean the FCC approval process for new cell phones is a massive barrier to entry? I'm sure it's not free, but I can't imagine they'd want more than a few dozen phones and $100k of studies. Maybe half a million?
Of course, this is why I don't use C, C++, or Objective-C. I'm always a little surprised when someone writes code in one of those languages that actually runs.
The good news is that there isn't very much of it.
The rest of the software is buggier than it should be. My web browser has remotely-exploitable security holes. Random drivers in Linux randomly regress as the version number increases. OS X and Windows 7 crash for no reason, and don't support enough hardware.
The only program I use regularly that doesn't crash on me is Xmonad. And guess which language that isn't written in.
It doesn't crash because it's simple, not because it's written in a good language.
Well you'd probably have 100X as many bugs in a C program (it takes 10X as many LOC, and I bet bugs scale with N^2), so Haskell is a bit better, but language isn't as important as scope. Big programs have more bugs.
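Spelled out, that back-of-the-envelope claim (the N^2 scaling is the commenter's guess, not an established law):

    \text{bugs}(N) \propto N^2
    \quad\Rightarrow\quad
    \frac{\text{bugs}(10N)}{\text{bugs}(N)} = \left(\frac{10N}{N}\right)^2 = 100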
Plus, I don't understand why people keep praising XMonad for its lack of bugs. It has some bugs that are quite annoying (e.g. stuck windows that don't close, getting locked with only one tile on a screen) -- on the other hand, I've never seen a bug in Metacity.
My take is that even if it's easier to screw things up in C, that doesn't mean that if you program in a higher level language, you'd automatically produce elegant code.
I can tell you, though, that this doesn't really matter in real life. The compiler warns you when you use a free variable, so it's pretty hard to accidentally misuse a dynamic variable. There are pathological cases that people point out, but these rarely matter in elisp that most people actually write.
Programming Emacs is a little different from programming other systems, but once you use its idioms instead of the ones you took from your favorite language, everything works quite nicely.
I agree that emacs lisp can still be used to productively write software. People put up with much worse things.
It really scares me when people who claim to be programmers say stuff like that. If what you say is true, that you are "surprised when someone writes code in one of those languages that actually runs", then you really should not be programming.
Apple has definitely crossed-over to the dark side. After 26 years of being a fanboy, they've finally exceeded what I can stomach.
The web isn't appropriate for the apps I want to write yet, so I can't develop on a perfectly open platform and expect to find customers. And I can't reasonably create my own platform and create the software I want to create.
So I have to pick a platform that can reasonably support the apps I want to write and that gives me a reasonable chance to make a living at it. I'm going to be somebody's sharecropper.
When I have more resources, I can consider supporting multiple platforms to mitigate my risk. But until then, pointing out that we develop at the pleasure of the platform holder is redundant, and the difference between more- and less-restrictive platforms is splitting hairs.
I can't see any way that this turns out as remotely positive for developers.
Developers will make apps where it is fun to make apps. I haven't had any horror stories with the app store, so it's still just more fun to make iPhone software than anything else.
Handing my phone to my friends and telling them I made what they are looking at has been really fun. Also, telling anyone with an iPhone how they can just search the app store and find my little app is way fun. My Mom installed it even!
The facts disagree.
The App Store model is unusual and it is not perfect. Few developers have any experience with Cocoa or Objective-C. Developers must use a Mac. iPhone software only runs on the iPhone and is not easily ported to other platforms.
Despite all of that, Apple has attracted developers to the App Store and the iPhone in numbers nobody would have predicted. Meanwhile, all other mobile platforms are rushing to duplicate the model.
It would seem Apple knows exactly what developers want.
Aye, money outweighs freedom even today.
It's not only consumers who want to download ebooks with ads. There could be an area of innovative and experimental use where universities want to develop novel applications - applications that might be written in Smalltalk, Lisp, Haskell, or any other language that can compile and is not Objective-C. In many cases the innovation lies in the core logic and the innovative use of a touch screen. Why should I develop my core logic in a way that ties it to the iPhone or iPad (assuming that the iPad will get the same developer agreement), and where I have to use a relatively low-level language like Objective-C?
What you assume Apple's target is (Adobe, cross-platform frameworks, ...) is one thing; who else is affected by these clauses the developers have to agree to is another.
I now develop iPhone applications, and I personally agree with Apple in this respect. If you want to develop iPhone applications, spend some time and become proficient with the tools.
And, @raganwald is correct that it's still lock-in, it's just a matter of where the lock is.
It makes more sense to reject fart apps -- or at least the 100th fart app. It does not make sense to reject apps written in Python or Clojure, for example.
To put it another way: they are disproportionately turning away the better quality devs, not the lower quality ones.
3.3.1 — Applications may only use Documented APIs in the
manner prescribed by Apple and must not use or call any
private APIs. Applications must be originally written in
Objective-C, C, C++, or JavaScript as executed by the
iPhone OS WebKit engine, and only code written in C, C++,
and Objective-C may compile and directly link against the
Documented APIs (e.g., Applications that link to Documented
APIs through an intermediary translation or compatibility
layer or tool are prohibited).
Theoretically, you could make it hard to tell the difference, but in practice it's pretty easy to tell machine-generated from human-written code.
As with everything Apple does - there are company-centric motives, which have been neatly balanced against a set of consumer-centric motives.
Apple is run by smart people, who realise they have a dedicated following. By making the 'we don't want to diminish the quality of the App pool' argument clear, they allow their ardent followers to do battle for them.
The corollary of this is that Apple ensure that their platform receives the developer investment it requires, enabling the company to become a permanent fixture in the mobile market.
If Flash developers didn't have to make a new investment of time and money to learn their platform - what would stop this pool of developers from leaving Apple's side tomorrow?
They want full control over what is allowed _into_ their market, and they want a dedicated team of developers who won't walk away.
If Flash was allowed, neither of these requirements would be assured.
People would lose their damn minds.
If you're the dominant OS in the smartphone market or in the desktop market, where's the difference?
Apple has to unlock the iPhone and let people get their apps from wherever they want.
If people want the security of knowing an app is Apple approved to work and play nice with others on their systems, they can go through the App Store.
It's not their rules, but the fact that they remove choice from the market for both consumers and developers by FORCING themselves into the consumer/developer relationship as a restrictive middleman.
Android and MS might have swapped in the last month or so; they are close. As for choice, consumers don't seem to mind. You have tons of "choice" on Symbian, Android and WM for app selection. The App Store is beating them all.
Developers will go where the money is. It is a job. You play by the rules or move on. Simple really.
BUT... Microsoft got into trouble with the competition commission over the tight rein it had on Windows. Now what are we saying, that Apple should be allowed to get away with nearly the same things Microsoft got fined for, just because they are Apple?
If Apple just lets this happen, and lets iPhone apps be developed on other OSs/SDKs/whatever, then if a developer wants to produce a piss-poor version of something, let them. Apple can then say yea or nay when the app goes into the Store. They are still going to make money; their phones are still going to be bought in droves.
Open the doors Apple, you might let something good in.
What MS decides affects the whole market. What Apple decides affects Apple users only. But I, as an Apple user, am affected by all the dumb anti-standard decisions MS has made. My web experience is crippled and I have to deal with filename extensions, even more so now in OS X 10.6.

Under Mac OS 9 I only had to know whether a file was going to be used on a Windows machine, and then add the proper extension to the name. But it's ugly and wrong.
I am not a developer, but I have been investigating starting a development company, and I have seen thousands of bedroom chancers on the forums...

Yeah man, let's make an app and cash in. Let's make an app called twitbook, it's a cross between a twit and a book, we will get 1,000 downloads a day and we will spend all the profit on weed, yeah man, good idea! OK, let's start... we can't code, but we can use 3rd-party tools that even my mum can use, and we are done.

Yeah, I am exaggerating slightly, but that's the way the market is going. It means the App Store is constantly full of shite apps, and it will get worse if these 3rd parties are allowed to run riot and let any Tom, Dick and Harry release apps, which will happen if it's allowed to. Apple does not want to encourage that, and nor do I; I want quality apps, not quantity.

This is a new phenomenon for the world, so Apple are bound to make mistakes in how they operate it, and they have realised this is a bad thing that is happening to THEIR brand.

I have seen so many good, serious developers with good apps; they complain they are not getting seen on the App Store, and the above is the main factor in that: piles of it.

These guys can code with their eyes closed, and yes, the 3rd-party tools may be an inconvenience, but I am sure they can work around it and actually get the coverage and sales they deserve.

In the short term it's not good, but in the long run it will be good for the people who know what they are doing, hence bringing us apps that don't just sit on our phones for a few days and get ditched.
Why are people screaming about this? It's all about money on both sides, not the future development of the up-and-coming kids, end of.

If I rent a room in your nice house and start pissing on the carpet, would you want to boot me out?

As far as I remember, it was always Apple being victimised by other OS and software companies. How the tables have turned, and good on 'em.
Your complaint here is disingenuous. When you learned to be a Flash developer, did you complain that Macromedia should be ashamed cos they didn't build their tool in HTML & CSS?
If you want to develop for the iPhone, then develop for the iPhone.
Since the iPhone has less than 20 percent of the smartphone market, it seems unlikely that the Monopolies Commission will be interested.
You will find yourself in serious trouble moving forward if you tie yourself to knowing only a single company's toolset, tying your future to the well-being of that company's toolset. Especially Adobe's: a company as fickle as Apple.
If they wanted to eliminate cross-platform apps, they could have just as easily put something which specifically mentions that in their terms of service. It wouldn't be any more ridiculous than the conditions they have put in place right now.
These new terms of service effectively bar tools like MonoTouch - a development environment that exclusively targets the iPhone OS.
Does this stretch to build scripts??
It's pretty obviously an out clause so they can kick Adobe in the pants, and possibly prevent app-mills from popping up all over, completely saturating their approval process for the app store. I doubt they'll go after an individual developer who isn't obviously using some mass-market code generator to pump out apps.
When they produce a less buggy Flash that doesn't have enough memory leaks to kill a new computer with 4GB RAM and a Core 2 Duo running only Outlook and Google Maps in IE7, then maybe I'll also be interested in running Flash on my phone. ATM, Flash is blocked on my Nokia.
The modern C# language provides many advantages over the Objective-C used on the iPhone, which doesn't even have garbage collection.

If they intend to succeed in the enterprise, they should have encouraged MonoTouch, not squashed it.
What a rose-tinted way to describe lock-in.
It was more difficult to program for the Mac than for DOS. Hell, I can make a decent DOS application, but in my younger years doing the same on the Mac was way more troublesome.

OK, I haven't programmed Cocoa, just done some experiments: I made the calculator and the currency converter Apple has as tutorials. And yes, it simplifies a lot.

But simple programming comes with a cost in quality. Taking the step from DOS programming to programming for Apple's System 1 through Mac OS 9 was huge: event loops, memory heaps, etc. Those who knew programming had few problems. Those who made hello-world apps had huge problems, aka me.

The greater the challenge in programming, the greater the programs the creators will make.
I wonder if the open source world can successfully fight back, by making compilers that generate code the app store police can't tell from hand written.
I think the answer is definitely yes. Apple's software engineering is not that great. There is always some hole in Safari that allows root access to the entire device. They can't get atomic syscalls working in OS X. Does anyone really think they can recruit and afford people that can tell computer-generated software from hand-written software?
My guess is that this is a scare tactic to keep anyone thinking of supporting two platforms at once to "not want to risk it" and go for the iPhone instead. More users, only so many hours that the developer can be awake, safer to just go with the iPhone. (Of course, you are already risking it anyway; use the wrong multi-touch gesture -- app denied. Use a Google service -- denied. Do something useful that Apple wishes they thought of first -- denied. And people wonder why there are so many fart apps...)
My next guess is that this tactic will be successful. People seem to adore doing whatever Apple tells them to do. It frightens me.
What I've learned from iPhone vs. Android (among other things) is that people will pick pretty and mean over average and nice.
I'd expect many languages and frameworks to have very distinctive signature functions and patterns in their generated code. If Apple decides to enforce this, they won't have a hard time.

On top of that, if they miss it and let a bunch of apps in, then later on determine those apps were cross-compiled, they can revoke the current versions and block that developer...
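As a sketch of how cheap such a check could be: the toy scanner below greps a compiled binary for telltale runtime strings. The signature list is illustrative only (mono_jit_init is a real Mono entry point; the other two names are made up), and a real review tool would build its list empirically:

    /* Toy cross-compilation detector: flag a binary that contains
       telltale runtime strings. A hypothetical sketch, not Apple's tool. */
    #define _GNU_SOURCE   /* for memmem on glibc; available by default on OS X */
    #include <stdio.h>
    #include <string.h>

    static const char *signatures[] = {
        "mono_jit_init",     /* real Mono/MonoTouch runtime entry point */
        "scm_boot_runtime",  /* invented Scheme-runtime marker */
        "swf_player_init"    /* invented Flash-derived marker */
    };

    int main(int argc, char **argv) {
        if (argc != 2) {
            fprintf(stderr, "usage: %s <app-binary>\n", argv[0]);
            return 2;
        }
        FILE *f = fopen(argv[1], "rb");
        if (!f) { perror(argv[1]); return 2; }

        char buf[1 << 16];
        size_t n;
        int flagged = 0;
        /* Chunked scan; a match straddling a chunk boundary is missed,
           which is acceptable for a sketch. */
        while ((n = fread(buf, 1, sizeof buf, f)) > 0) {
            for (size_t i = 0; i < sizeof signatures / sizeof *signatures; i++) {
                if (memmem(buf, n, signatures[i], strlen(signatures[i]))) {
                    printf("suspicious marker: %s\n", signatures[i]);
                    flagged = 1;
                }
            }
        }
        fclose(f);
        return flagged; /* nonzero exit = "looks cross-compiled" */
    }

Anything fancier (symbol-table statistics, control-flow fingerprints) only makes the reviewer's job easier.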
Have you filed a bug?
Is this the kind of company you want to build software for? This is bullshit. Do I really even want to play along anymore?
This company makes great products, but they can do completely dickish things.
Enforcing which LANGUAGES can be used on a platform?!? Insane!
Edit: I've been looking at these guys:
(I don't work with or have any vested interest in them, but they look cool.)
The system76 laptops are probably generic machines from Clevo, Sager or some such with a custom badge. Alienware used to do the same thing.
I really hope that this ridiculousness doesn't start to bleed through - the MacBook Pro is pretty much the only laptop I've ever considered usable for development, and I don't know what I'd do if I had to abandon it...
I especially loved the one that came with the 3278 terminal. And the clicky ones, like the ones that came with the 3290.
Nowadays, when on my desk, the netbook is hooked up to a Microsoft natural keyboard. I would like the Sun keyboard, but Sun won't ship it to me in Brazil and local dealers want... US$400 for it.
"Quality of hardware" boils down to user interface... just at the hardware level.
Their UI ability is the secret of Apple's success. Everyone else treats UI and design like an afterthought.
Boycotting Apple's content creation tools and consequently not developing for the iPad/iPhone is one way to send a message.
However, I'm ultimately interested in serving my application to the largest number of people, and it seems clear to me that the App Store is the best way to achieve that today.
So, do you cave in and develop an iPhone version? Or, do you stand by your morals, and, in turn, limit your audience?
Maybe the next iBook will just be a foldable iPad with the same closed OS.
ALL other user interfaces by ALL other vendors suck.
For some reason, no human beings on the entire planet other than those that work at One Infinite Loop in Cupertino are capable of doing a UI.
The same carries over to other things. Sure, the install and uninstall of applications on OS X is a great UI, but apt-get just seriously leaves it in the dust. The problem with Linux was never that it had a bad UI or was too customizable. The problem with Linux was actual hardware bugs in drivers, lack of Office, lack of Flash, lack of games. Those have mostly been addressed, aside from games (which is a sore point for Apple too). UI is far too overrated relative to actual features.

And it's all based on a myth. Steve Jobs ringing up Sun over their Looking Glass and threatening them with UI patents is just ridiculous. As are the claims to Apple fame with interfaces taken from Xerox.
So yes, I agree Apple are very good, possibly the best at UIs. But my points are 1) UIs aren't as massively important as people say, 2) Apple's "innovations" have been overrated 3) your statement that "no human beings on the entire planet other than those that work at One Infinite Loop in Cupertino are capable of doing a UI" is just ridiculous.
When something on a mac looks like it should be able to be frobbed, dragged, etc., it usually can and in exactly the way that it seems like it should.
This is not the case on Windows, Linux, or anything else.
1. It's way the hell better than Windows. No competition here. Windows is horribly clunky, and if you disagree with me, try using both for a while.
2. It has less of a tendency to plunge you into pesky little technical details than Linux. I personally don't mind fiddling with a config file every now and then, and the more recent Ubuntu versions are getting surprisingly good about this, but Linux still demands more effort to get a good, productive environment going. And Flash support still sucks.
Of the two, I prefer Linux in terms of usability. I doubt most non-technical users would agree. (Chrome, though, is just unambiguously better than Safari in every way. I use both regularly and don't want to dislike either of them.)
I'd say that all user interfaces from all vendors suck, including Apple's.
Essentially, the menu items are infinitely tall hit targets… no matter how fast the mouse moves towards them, one can never overshoot them vertically. Menus in Windows and most *nix environments require both horizontal and vertical precision.
Furthermore, why are you bothering with the Dock when Exposé and Spotlight (or Quicksilver, Launchbar, etc.) offer great power user alternatives?
Personally, I still can’t stand how Windows and Linux make no distinction between applications and windows.
I understand the original reason for it. However, hitting the menu means traveling an extremely large distance from what you might be working on now. I'm typing this on a multi-monitor machine -- this app is totally self-contained on one monitor. Why must I move my mouse across two monitors just to use this app's menu?
> Personally, I still can’t stand how Windows and Linux make no distinction between applications and windows.
I can't stand how you can close all windows on a Mac and still have the process around, with the only indication being a slight difference in the menu bar. Amazingly confusing.
Now this may be personal preference but it still shows that the Mac GUI isn't some ultimate model of perfection that everyone agrees on.
Also, Fitts' Law says that the time to select a target grows with the distance to it and shrinks with its size. Putting the menu on the edge of the screen makes the effective size of the targets bigger, but in some cases it also means they are much further away.
As I type this, the menu bar is about 4x further away than the top of my browser window.
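For reference, the usual (Shannon) formulation of Fitts' law, with a and b empirical constants:

    T = a + b \log_2\left(\frac{D}{W} + 1\right)

where D is the distance to the target and W is its size along the axis of motion. A screen-edge menu makes W effectively infinite in the vertical direction, driving the log term toward its minimum, while a distant menu bar inflates D.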
And they violated that rule in the dock, unless that's been fixed in the last couple of versions.
I booted my other laptop to transfer some data over and waiting for Vista to load to the point where I could interact with Chrome was like pulling teeth.
1) They removed much of the "originality" that they introduced in Vista.
2) They copied more stuff from Apple, and a little bit from the Linux ecosystem.
Oh... and what's their fetish with that sickening shade of teal/blue? Yuck! It looks like smurf vomit.
Linux fans need to relax a bit and realize they're number three for a reason and work harder not complain harder.
Edit: I'm sorry, do I sound bitter? Maybe I've grown disillusioned with the difference between the reality in the trenches and fans extolling the features of product X, where X is any of Linux, Mac, Windows, iPhone, etc.
In the old days, when you minimised something, it always went to the bottom of the stack. (Except Excel, which had a silent entry in the stack when you had more than one document open. Outlook has been bizarre for a while, too.)
In one of the NT4 service packs (I think) this changed so that if something minimised to the status bar (rather than the task bar) it worked differently. I'm not sure what Vista did, but XP was bearable.
Now there are lots of variables - different apps respond to being removed from focus differently, I've read that the number of open applications affects stack response to minimise also.
Windows used to be very friendly to rapid keyboard-only operation. You could drop things in the start menu and activate them with two keystrokes. Alt+tab was dependable. No longer.
> ALL other user interfaces by ALL other vendors suck.
If this is a big thing for you, I'd recommend the path I took after Be folded. Accept that complex user interfaces have bad tradeoffs (platform dependence, inflexibility). Find a full-screen tty you like (I use iTerm because with apple-key + enter it goes full screen and gets rid of Aqua) and return to living in the habitat of your ancestors!
A few things are inherently visual: paint programs, 3d games and movie editing. Everything that is not can be done effectively on the console. These interactions are often far superior to GUIs.
There's a learning curve. But once you're over it you'll have enormous power at your disposal and won't ever get locked in again.
Two things make this far easier than it has been previously:
1) Python. The standard library contains everything you'd want for pushing a system around. You can hammer out powerful tools in Python in a casual manner and at a speed that has not been available to mere mortals before. You can get it on a variety of platforms.
2) Web browsers. It's now easy to get high-quality web browsers on any platform you'd want to use. Where you do need to produce a GUI, you can knock up a trim webapp with HTML and forms.
Some systems have a CLUE (Command Line User Environment).
That's the thing about Apple's capricious, passive-aggressive contract language... you have no idea, and no way to even guess, if your business model will be the next one they target for termination.
I actually went the opposite route and installed mac os on a netbook.
It's a little flimsy, physically.
And I am a proud free software zealot, so I moved it to debian.
Toughbooks are also promising, but they're light on the specs... could someone please make a durable and well-specced laptop?
Maybe we were not such zealots after all, when we free software "zealots" said that proprietary software allows its owners to treat their users badly, and that eventually this happens with every piece of proprietary software. Just saying. It amazes me how surprised users of proprietary software are every time they get screwed by their masters, even though this has been happening for the last 30 years.
Products are OK to be proprietary as long as the value provided is top-notch. Sorry, but I don't see professional designers using Gimp over Photoshop.
Relying on a platform for your existence is a different story. But as a business you need alliances with other businesses, and not just in software. And everybody can pull the plug on you, that's why reputation matters and in many cases it's all you need.
About free software: programmers need to eat too. Myself, I use open source everywhere, but for the last 7 years I've been doing consultancy work (turn-key apps that are never released in any form, or web services that put a lock on your data ... the worst kind of closed systems). And until you teach me a business model that would empower me to work on "free software" while providing for my family, I'll keep doing it.
Until then it's only fair I get paid for my work, that's why I consider the free software philosophy as extremist bullshit.
Free software is an infrastructure. I drive on it daily delivering value and getting paid.
That's a fallacy.
The majority of open-source sponsors are selling closed systems or services to sponsor their "free software" involvement. Google doesn't have a business model around "free software". Neither do IBM or Sun.
I'm also not Mozilla and my apps would probably never get in front of 40% of all Internet users. Even if I'm that lucky, it's probably not going to be a desktop app that's used to search for stuff.
To get paid for customizations, your software also has to be really popular for businesses (consumers don't pay for that, they either endure it or search for something better).
Did I mention that I don't live in Silicon Valley or in Cambridge, but in an Eastern European country? So training is off.
I already mentioned consulting, but then I would be a hypocrite if I promoted the free software ideology while working on the worst kind of closed software, wouldn't I?
> That's a fallacy.
> The majority of open-source sponsors are selling closed systems or services to sponsor their "free software" involvement.
Red Hat, Canonical and a large chunk of IBM GS wouldn't be able to sell those services with proprietary products - the OSS licensing of various Linux OSs, Apache, etc. are the basis for them being able to have an audience to sell their services.
I agree with you re: Google. They're not a service business and could have written their own OS or used a proprietary one and not affect revenue.
Not sure what licensing fees would have done to their overhead costs in the early days.
Maybe you should have a look at this again... I've been on a trip to Romania for training (about architecture of some specific piece of OSS) at some point. As long as you can provide good training, it could work for you too.
And the software that makes Google their money is proprietary, unless you can show me the link to the AdWords server source.
I think they basically just do it through consulting + a little support and training. (See http://phusion.nl/services). But since they wrote Passenger and REE, I'm sure they can command a very high rate.
It's also worth noting that the users aren't getting screwed at all by Apple, only the developers and only a small fraction of them.
Secondly, users are harmed by these actions as there will be fewer developers making apps for them, plus, Apple are potentially stifling innovation.
idk, I've never thought the appstore was anything but a stop gap personally, and developing for it a crazy risk.
Also, a stop gap? I think there's an app for that.
Definitely, if other phones or devices have such capabilities, and a killer webapp comes out that uses them.
If I was Apple I'd just shut the App store down to get rid of all the ungrateful developers. Sorry, but it's their show. They get to decide if you can play, and what the rules are. If you don't like it, don't play with them.
Of course, if you didn't sign that, then you can do whatever you want.
As opposed to where they are right now?
Enforcing what higher level language you write in before it gets compiled down to machine code?
Native apps that run directly on their hardware = closed - you play by their rules or not at all.
shrug pretty clear difference IMHO.
But yeah, no camera/GPS/microphone/etc.
Let's say I write an iPhone app originally in Scheme (like this guy did: http://jlongster.com/blog/2009/06/17/write-apps-iphone-schem...), and compile it down to C, which is then compiled to object code and linked against the iPhone libraries. At this point, the object code is the same (or functionally the same in terms of its syscalls, library calls, and general program flow) as if I had originally written it in C, except that I would have lost the unique developer efficiencies I got from using Scheme in the first place. I'm not saying Scheme is better or should be an officially sanctioned source language for the iPhone SDK. I'm just saying, where the rubber meets the road -- object code linking against libraries and making certain calls -- there is no difference to the computer what the original source was.
Seems very silly.
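To make that concrete, here's a rough sketch. Both versions below are invented for illustration (real Scheme compilers emit far uglier C, with trampolines and GC hooks), but the point stands: after the C compiler and linker run, the calls into the libraries look the same.

    /* What a developer might write by hand: */
    long add1(long x) {
        return x + 1;
    }

    /* Roughly what a Scheme-to-C compiler might emit for
       (define (add1 x) (+ x 1)): a mangled name plus tagged-fixnum
       unboxing and reboxing. Every identifier here is made up. */
    typedef long scm_obj;  /* tagged word: low bit 1 marks a fixnum */
    #define SCM_FIXNUM(n)  ((scm_obj)((((long)(n)) << 1) | 1))
    #define SCM_INT(o)     (((long)(o)) >> 1)

    scm_obj scm_add1_f0(scm_obj x) {
        return SCM_FIXNUM(SCM_INT(x) + 1);
    }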
2) The app approval process is so opaque that it might be hard to tell if they were banning you due to this reason (note I hear things have improved, so this point may be out of date).
When Alan Kay said Apple would take over the world with an iPad, I don't think he realized that eToys or anything like eToys would never be allowed to run on the device and that a majority of the apps will probably have commercial spots embedded within them. Actually it reminds me of "educational" tv all over again...
I'm starting to see the point of people who complain about the consume vs. create nature of the iPad...
Who do you mean by this? Microsoft doesn't make sense in this context.
The majority of apps on the app store already do. Free apps, ad-supported. Apple just wants the ads to not suck so much, but hey, nobody has to use iAds.
I really wouldn't put it past them, and I think today's a bad day to err on the side of "things won't change."