
I developed on both the Apple II and MS-DOS. The Apple II was OK for developers, by the standards of the day: it had BASIC and an open hardware interface. In my view, the hostility towards developers started with the Mac, and it may have had its roots in good intentions on Apple's part.

Apple wanted users to have a uniform, high-quality UI across all apps. That required some policing and a close relationship with a small number of favored developers. Their API was probably too complex and brittle for widespread, unfettered software development, and they probably didn't want their platform to be defined by a proliferation of "bad" software from developers like me who didn't care about their UI standards. Many users would accept "bad" software that got the job done: small apps for automating business processes, engineering tools (cross-assemblers for microcontrollers), games, etc. Developers turned to MS-DOS, which had a vastly simpler API. I wrote "bad" software myself, i.e., software that respected no centralized UI standard.

I'm not sure Windows was much better before Visual Basic came along; in fact, long into the Windows 3.1 era, programmers still used MS-DOS for simple things.

Today, I think the situation is much different. You can program an Apple because you can program any computer. Most developers have at least a dim grasp of GUI concepts, and the APIs (including their documentation and development tools) have improved to the point where it's easier to use a platform's built-in UI features than to reinvent the wheel. This doesn't necessarily produce great software, but it allows platforms to evolve without breaking existing apps. And you can write "bad" software that is utterly platform independent thanks to JavaScript running in the browser (see the sketch below). Without the "passive curation" of an opaque API, Apple has chosen to actively curate apps instead.
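
To make that concrete, here is a minimal sketch of the kind of "bad" but platform-independent tool I have in mind: a single self-contained HTML file (everything in it is hypothetical, invented for illustration) that converts hex to decimal for microcontroller work and runs unchanged on any OS with a browser:

    <!doctype html>
    <!-- Hypothetical throwaway tool: no native API, no platform UI
         guidelines, nothing for a platform vendor to curate. -->
    <html>
    <body>
      <input id="hex" placeholder="hex value">
      <button onclick="document.getElementById('out').textContent =
          parseInt(document.getElementById('hex').value, 16)">
        to decimal
      </button>
      <span id="out"></span>
    </body>
    </html>

It respects no UI standard and gets one small job done, which is exactly the class of software that once drove developers to MS-DOS.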