I was present at the BMUG meeting on the Berkeley campus when Andy demonstrated Switcher for the first time. He did no talking, but simply ran up two or three applications... and then the whole screen scrolled sideways and the audience was on its feet cheering! The event was amazing, perhaps matched only by Bill Atkinson's demo of HyperCard. Happy days!
Bill Gates: "a really good programmer like you should be able to write at least a thousand lines of code per week".
So the expectation for a really good programmer was around two hundred lines per day, assuming a five-day week. That is not far from today's expectations, even after almost four decades.
Bill Gates: "If it takes ten weeks, and you get paid four thousand dollars per week, that means you should get paid $40,000 for writing it."
So, around $100 per hour, effectively ($4,000 a week over a 40-hour week)? Looks like Steve offered $250 per hour (assuming Bill's calculation is not wildly off the mark). This was back in 1984. Thirty-eight years later, salaries have not increased by that much!
Bill Atkinson, the author of Quickdraw and the main user interface designer, who was by far the most important Lisa implementor, thought that lines of code was a silly measure of software productivity. He thought his goal was to write as small and fast a program as possible, and that the lines of code metric only encouraged writing sloppy, bloated, broken code.
[...]
He was just putting the finishing touches on the optimization when it was time to fill out the management form for the first time. When he got to the lines of code part, he thought about it for a second, and then wrote in the number: -2000.
Dijkstra, of course. From the excellent and predictably poorly received On the cruelty of really teaching computing science[1]:
> My point today is that, if we wish to count lines of code, we should not regard them as "lines produced" but as "lines spent": the current conventional wisdom is so foolish as to book that count on the wrong side of the ledger.
> So, around $100 per hour, effectively? Looks like Steve offered $250 per hour (assuming Bill's calculation is not wildly off the mark). This was back in 1984. Thirty-eight years later, salaries have not increased by that much!
200 lines a day is ridiculous. A good programmer isn't just someone who writes a lot of code. I'd rather hire someone who writes 3 lines a day but fixes bugs along the way than someone who writes 200 lines without proper testing, leaving others to fix the mess. Lines of code is a terrible metric.
Folklore has a lot of these amusing and insightful stories. Here is one which I found amusing (except for the condescending adjective to describe Bill's voice - which I don't agree with), involving both Steve and Bill: https://www.folklore.org/StoryView.py?story=A_Rich_Neighbor_...
Bill Gates' clumsy attempts at lowballing were humorous, but Jobs was the true star of the story with his thinly-veiled threat of a lawsuit if Andy didn't sell it for the price Jobs demanded.
Bill Gates raising his offer because Andy didn't know what to say is also a great detail. And the congratulatory letter shows what a good manager Bill was: Andy was extra motivated to make the Microsoft apps work because he felt he had promised them.
Jobs offering more than twice as much as Gates is nice, but boy, it must have been stressful when he decided to go confrontational.
The various TSRs and application-switching tricks on the PC that had their heyday just before Windows came along were amazing - so many applications had been written to run in low-memory systems that 512 KB or even 1 MB (UMB!) felt like quite a lot.
I had an HP Palmtop that was greased to the skids with things like that - quite a powerful device. Even the modern iPhone doesn't quite do app switching as well, to be honest (the apps are often just reloading).
Well, it works great if the apps aren't flushed -- but after an update about, I dunno, I wanna say 12 to 18 months ago, apps get terminated all the time (at least on my iPhone 12).
Since modern apps sort of expect not to be quit, they're generally quite bad about restoring state.
Back in the early days of iOS it wasn't such a big deal -- apps expected to be quit and were good about restoring state.
(This is a general problem with iOS itself too -- to use one salient example, pre-iOS 7 the phone was rock solid at remembering what you were listening to. Set your phone down, come back however much later, plug in the headphones, hit the pause button on the headphones, and whatever you were listening to starts right back up.
Post iOS 7, not a chance.
In general iOS 7 was just a dramatic downward plunge in software quality, and iOS has never recovered.
Speaking of terrible software quality, now that I've reached 4 paragraphs, it's taking Safari maybe 1 second to echo each keystroke. So maybe some sort of quadratic algorithm processing text in this input box. Good lord.)
> Since modern apps sort of expect not to be quit, they're generally quite bad about restoring state.
Which is a recent development that includes Apple's own apps, too. iOS guidelines used to explicitly state that your app must expect to be terminated, and should restore state.
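For what it's worth, the hooks for doing this still exist. Here's a minimal sketch of scene-based state restoration in Swift -- the activity type string, the "itemID" key, and the placeholder root view controller are made up for illustration; this is not Apple sample code:

    import UIKit

    // Sketch of iOS scene-based state restoration for a single-window app.
    class SceneDelegate: UIResponder, UIWindowSceneDelegate {
        var window: UIWindow?
        var currentItemID: String?

        // iOS asks for this when the scene is backgrounded; return an activity
        // describing where the user is so it can be handed back after a relaunch.
        func stateRestorationActivity(for scene: UIScene) -> NSUserActivity? {
            let activity = NSUserActivity(activityType: "com.example.viewing-item")
            activity.addUserInfoEntries(from: ["itemID": currentItemID ?? ""])
            return activity
        }

        // On the next launch, the saved activity comes back on the session, so the
        // app can rebuild the screen the user was on instead of the default one.
        func scene(_ scene: UIScene, willConnectTo session: UISceneSession,
                   options connectionOptions: UIScene.ConnectionOptions) {
            guard let windowScene = scene as? UIWindowScene else { return }
            if let saved = session.stateRestorationActivity,
               let itemID = saved.userInfo?["itemID"] as? String {
                currentItemID = itemID  // re-open that item rather than the default screen
            }
            window = UIWindow(windowScene: windowScene)
            window?.rootViewController = UIViewController()  // placeholder root
            window?.makeKeyAndVisible()
        }
    }

It isn't much code, which makes it all the more frustrating how many apps never bother to wire it up.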
Another example of this is scrolling views with buttons. Apple put out a whole whitepaper on how to get this right: it's a subtle thing where you need to figure out whether the touch input is intended as a scroll or a button tap.
But after the latest redesign (a few years ago, now) "App Store" doesn't even get this right.
Try scrolling around on the "Today" tab, picking various random points on the screen to scroll. You'll notice sometimes your scroll input is completely disregarded. Totally buggy.
That's because if your touch starts on a button, you can't scroll. The button just steals it. Apple needs to read their own whitepaper!
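For anyone curious, the usual UIKit fix is to let the scroll view cancel touches that begin on a control. A rough sketch of that pattern (the subclass name is mine, and this is the common workaround rather than whatever the App Store app actually does):

    import UIKit

    // Scroll view that lets a drag which starts on a button turn into a scroll.
    class ButtonFriendlyScrollView: UIScrollView {
        override init(frame: CGRect) {
            super.init(frame: frame)
            configureTouches()
        }

        required init?(coder: NSCoder) {
            super.init(coder: coder)
            configureTouches()
        }

        private func configureTouches() {
            delaysContentTouches = false   // deliver touches to buttons immediately...
            canCancelContentTouches = true // ...but keep the right to cancel them
        }

        // By default UIScrollView refuses to cancel touches on UIControls, which is
        // exactly the "button steals the scroll" behaviour described above.
        override func touchesShouldCancel(in view: UIView) -> Bool {
            if view is UIControl {
                return true  // a drag that began on a button becomes a scroll
            }
            return super.touchesShouldCancel(in: view)
        }
    }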
I don't know if it's just that all the original people left, or new leadership that doesn't have a feel for UI, or what, but things are pretty rough.
(I'm tempted to jump ship to Android/Linux, but despite Apple falling off, I know things will actually still be worse there. Unless a miracle has occurred.)
Apple should hire me as Czar of Software Quality. ;-)
I'm glad I'm not the only one seeing these problems. App flushes have felt more aggressive and arbitrary, and apps are hit-and-miss at remembering state. Podcasts seems to be one of the worst for me, relaunching into the useless "Listen Now" state, or just hanging for 10 seconds before crashing. Apple really needs to spend a year just squashing bugs.
I'm surprised Andy didn't counter Jobs' offer of $100K, even if it didn't go anywhere. I have to think Jobs was savvy enough to expect him to counter, in which case he would've met him somewhere in between.
Perhaps he knew AH well enough to know that he wouldn't counter...?
Fascinating story. Especially interesting how he still felt compelled to make Microsoft's apps work without any input from them. If it was important to them, you'd think they'd have agreed to let him have access to the actual source code for debugging. Or maybe I'm misunderstanding what pseudo-code means in this context.
Early versions of Word ran in a custom interpreter Microsoft wrote. Think Java or WASM bytecode. Debugging it properly would require writing a special version of that interpreter with debugging features built in.
Debugging an interpreted program by debugging the emulator it's running inside of is a special kind of hell. Even more so if you don't have debugging symbols for that emulator either. Instead of "step to next instruction", it's "step to next bytecode fetch, step through the interpreter's jump table, step through the implementation of that instruction, figure out what it does, repeat". Each one of those is multiple machine-code instructions, and you will lose track of things very quickly.
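To make that concrete, here's a toy byte-code interpreter in Swift (the opcodes are invented for illustration, not Microsoft's p-code). A native debugger only ever sees this fetch/dispatch loop; the "program" is just data flowing through it:

    // Toy stack-based byte-code interpreter.
    enum VMError: Error { case badOpcode(UInt8) }

    func run(code: [UInt8]) throws -> Int {
        var stack: [Int] = []
        var pc = 0                      // program counter into the byte-code
        while pc < code.count {
            let op = code[pc]; pc += 1  // fetch: a breakpoint here fires for every instruction
            switch op {                 // decode: the "jump table"
            case 0x01:                  // PUSH <byte>
                stack.append(Int(code[pc])); pc += 1
            case 0x02:                  // ADD
                let b = stack.removeLast(), a = stack.removeLast()
                stack.append(a + b)
            case 0x03:                  // RET
                return stack.removeLast()
            default:
                throw VMError.badOpcode(op)
            }
        }
        return stack.last ?? 0
    }

    // PUSH 2, PUSH 3, ADD, RET -> 5. Without symbols you'd be reconstructing that
    // meaning by watching raw bytes move through the loop above.
    print(try! run(code: [0x01, 2, 0x01, 3, 0x02, 0x03]))

A single breakpoint on the fetch line is effectively every breakpoint at once, which is roughly the hell described above.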
> the instructions comprising their applications were encoded in pseudo-code to save space, in the tradition of the byte-code interpreters from Xerox, which Charles Simonyi advocated.
I had never heard of that; Simonyi is usually associated with his "Hungarian" style of prefixing variable names with type indicators (sz for a zero-terminated string, psz for a pointer to one, etc. - https://en.wikipedia.org/wiki/Hungarian_notation ).
Writing Office in interpreted bytecode must have slowed things down quite a lot, and JIT compilation wasn't available back then. Does anyone have specifics on that VM (e.g. an opcode table)?
I'm sure they would have - but in the heat of the moment you don't have time to learn someone else's code when you can just fiddle some bytes, throw it in the debugger, aha!
It’s pretty amazing that these programs used 128KB of memory. Nowadays, it is rare to find a program that uses less than 128MB.
I know I’m being a bit hyperbolic. There are definitely apps that use less than that, but my point is more that the actual capabilities of the software have not always increased at the same pace as the resource usage.