Quotes from 1992 (apenwarr.ca)
230 points by panic 11 days ago | 143 comments





I enjoy the quote referencing Gary Kildall:

Gary Kildall's ambition was limited, something that is not supposed to be a factor in American business. If you hope for a thousand and get a million, you are still expected to want more, but he didn't.

I would very much have liked to have known Gary Kildall. Everything I've read about him gives me the impression he'd be an immensely interesting person to know. The persona that comes across in his co-hosting of the Computer Chronicles show (something that, in itself, is worth looking up if you're not familiar with it) is one of great technical competence combined with a gentle humility.

(Edit: If you're into learning more about Kildall there's a good prior discussion of an unpublished biography on HN: https://news.ycombinator.com/item?id=12220091)


I don’t think it’s true that he had limited ambition. His later turn to alcoholism, which led to his death in a bar fight, was a direct result of losing to DOS. That’s not the reaction of someone who was content with their accomplishments.

I don't get the sense that the true details of the latter part of his life are publicly known, and perhaps aren't even documented. That makes me somewhat uncomfortable to speculate about it. It fits my view of his character and public persona that being known as "the guy who lost to DOS" (or "the guy who went flying and blew off IBM"), rather than the actual "loss" itself, would be the biggest source of discontent for him. (For too many people that's all of Kildall's legacy, and that makes me very sad.)

I read a lot into the fact that the half-released autobiography ends where the IBM PC is released. His family said they censored it because it didn’t reflect his true self.

FWIW I’ve researched this time period extensively and I think Gary was much kinder than Bill and a great guy. But I don’t think it’s accurate to say he wasn’t ambitious. And certainly inaccurate to say that he was content having had a good run and not winning.


It takes new ideas a long time to catch on - time that is mainly devoted to evolving the idea into something useful. This fact alone dumps most of the responsibility for early technical innovation in the laps of amateurs, who can afford to take the time. Only those who aren't trying to make money can afford to advance a technology that doesn't pay.

This is how I feel about Hacker News: people who have the time to advance technology. And I'm grateful to witness so much of it here.


Obligatory Feynman quote: "Science is like sex: sometimes something useful comes out, but that is not the reason we are doing it."

That is one of my favorites, although I often substitute 'learning' for 'science'.

This also describes NASA (sadly) and university research teams.

>On big things failing to scale down:

It occurs to me that the same applied in the transition to mobile. Microsoft was obsessed with making hand-helds and phones into little PCs, keyboard and all. It actually required a radically new interaction model and interface technology to unlock the potential of the form factor.


Story I was told by someone who worked on the early-2000s "Tablet PC":

Microsoft's team originally produced a new touch-first interface that wasn't entirely different from the eventual iPad. They took it to various corporate customers, and the response was unanimous: it needs to run Real Windows or it has no future. Facepalm. So they went back to Redmond and changed it to be Windows on a touchscreen. It met all stated customer requirements, technically, but of course it was clumsy and people didn't really see the benefit over a laptop, and it never took off.

Sounded to me like the classic "faster horse". It's not so much that they needed to make everything a PC, and more that they were such experts at enterprise sales that they didn't have confidence in their own new ideas.


> Microsoft's team originally produced a new touch-first interface that wasn't entirely different from the eventual iPad. They took it to various corporate customers, and the response was unanimous: it needs to run Real Windows or it has no future. Facepalm. So they went back to Redmond and changed it to be Windows on a touchscreen. It met all stated customer requirements, technically, but of course it was clumsy and people didn't really see the benefit over a laptop, and it never took off.

This is not so obviously wrong. It's fairly well acknowledged that the biggest barrier to the iPad becoming a general computer for most people is iOS. It's unclear that the answer is 'Windows on a touchscreen', but it's equally clear that Apple hasn't done anything like enough to make iOS work for the larger form factor.


My work issued me a pre-iPad tablet computer, the NEC Versa Litepad.

I agree with what you said, but part of the problem was resistive touchscreens just weren’t very good.


Funny how these days it's the inverse, isn't it? Everyone, including Microsoft with Windows 10, seems to be obsessed with making PCs into large hand-helds and phones as far as user interfaces are concerned.

I think there are useful lessons that can be learned from mobile that might be applicable to desktop, but you're right, Microsoft made exactly the same mistake again in the opposite direction. I'd say they did that most egregiously with Windows 8, though.

Microsoft keeps looking at the few data points it has now, extrapolating those into the future, and then trying to jump directly to that future. Currently it's an attempt to 'catch up' with Apple, but they keep extrapolating straight lines when the actual track of innovation is in curves. They did the same sort of thing with Longhorn, trying to re-invent the PC platform with object-oriented database file systems and the whole OS running on .NET, because Java looked like it was the wave of the future. They're trying to out-compete rival technologies without really thinking through what they're doing.


I don't think they're doing that anymore under Nadella.

I don't think he sees Apple as their main competitor, but is really competing with AWS for the future of business computing, and not doing that badly.


That’s a good point. Microsoft saw what Amazon was doing with AWS, projected that into the future, and got the right answer. It’s an example of where this strategy paid off. To be fair to them, it’s not just random chance that it worked out either; execution matters too, and they seem to have hit the nail on the head with Azure.

A quote from the post calls this pattern "Microsoft takes three tries to get anything right": Microsoft's entry into most new technologies follows this same plan, with the first effort being a preemptive strike, the second effort being market research to see what customers really want in a product, and the third try is the real product.

I didn't use a Windows computer in all of 2018 and 2019. MS's "let's make PC GUIs more like a 2006 flip phone" strategy, and other random instabilities and anti-user designs, have inverted the difficulty: it used to be hard to switch to Linux, and now it's hard to stay on Windows.

Windows 10 doesn't come with the tablet experience on by default; you have to put it into tablet mode to get that. That's been true for several years now.

Win10 is a lot less user-hostile than 8, but it still has the legacy of having two different UIs in parallel. Finding a specific configuration setting has always been hard on Windows, but now you also have the dumbed-down don’t-call-it-Metro interface getting in the way as well.

I've never understood that problem, whenever I'm looking for a Windows setting my process is always the same.

Hit windows key, start typing until the name of the setting I'm looking for pops up, arrow keys and enter.


One key aspect of this was that on the "big iron", software was under the control of the administrator and generally trusted, while the users need to be protected from one another. On "small iron", there is one user, but the applications come from a variety of sources which may not be trustworthy, so the user needs sandboxing between the apps.

On interaction design: the iPhone may not have been quite the first consumer device with a capacitive touchscreen that worked brilliantly without a stylus, but it certainly felt like it was. Styluses were the bane of pre-iPhone touch devices and I'm surprised that Samsung tried to bring them back.


Two things that styli do better than fingers are writing and drawing. That was the point of Samsung's use of it, and even Apple now has their Pencil. Yes, using them for everything, like in the Palm Pilot era, was annoying.

Towards the very end of the '90s, some introductory CS course had a nice diagram of how general architecture features like memory management, multitasking, networking, multi-CPU support, multi-user OSes, virtualization, and maybe some others had slowly trickled down from research prototypes to mainframes to personal computers to handheld devices to embedded controllers. Everybody in the lecture hall was mentally continuing the lines, and I remember prodding my neighbor, joking about when a successor of the Palm III he was using for taking notes would run a multi-user OS. Android, with its full-blown Linux kernel underneath, came just in time.

If we trace personal computers to their root origins, we'll see the LINC. The LINC was designed to be used by a single person. This was before time-sharing and interactive computing happened and, due to the differences in their use, the two kinds of computers diverged into what we saw in the 70's until today in every way - from the hardware to the software to the way they are used.

OTOH, my laptop has more in common with a VAX running Unix than it has with a Commodore PET. Or a LINC.


> OTOH, my laptop has more in common with a VAX running Unix than it has with a Commodore PET.

This was my thought upon reading the "Big computers and little computers are completely different beasts" quote, and it's also something I've been thinking about lately. Personal computers definitely represented a distinct development track from the multi-user/multitasking industrial/academic computing systems of the time. However, as PCs became more capable and merged key innovations from the higher-end computing branch (e.g. virtual memory, protected execution environments, and the operating systems that can leverage such features), we seem to have arrived at the same point -- a phone in my pocket that's more like a VAX 11/750 available in 1981 than an IBM PC from the same year. The "microcomputer" development track that included the Commodore PET and 8086-based PC systems was greatly useful for jumpstarting the PC industry, but was ultimately set aside.

So maybe it's not so much that small computers are completely different beasts than large computers, as their initial development required a unique approach to fit the constraints of that particular era.

As someone who has personally installed UNIX on a VAX 11/750, I do appreciate the attention to human factors that the microcomputer era has contributed to computing. :)


Desktop PCs of today evolved past the level of minis of the 90's, gaining many features minis never had, but, while minis are no more (our current servers that replaced them are overgrown versions of PCs and RISC workstations), mainframes didn't stand still and continued to evolve hardware and software features that reflect the demands of their segment: reliability, throughput, IO, security, and so on.

Their evolution is pretty cool. And their software is very alien to people who grew up on personal computers.


It depends how you look at it. For example the PET had a single chip CPU and recognisably PC like hardware architecture, while the VAX processor was built from individual TTL logic units. The point the book was making was that the two types of systems have different lineages, but he wasn’t ignorant of the fact that influence went back and forth between them.

"Big computers and little computers are completely different beasts"

Are we living in a period in which many things seem to be going backwards? With ARM trying to go into servers/desktops, or the (somewhat) failed attempts to bring the tablet/phone interface to the PC (Ubuntu and/or ugly full-screen-only apps)?


From Accidental Empires, which the article quotes: "Then there was Flight Simulator, the only computer game published by Microsoft". Even at the time (long before XBox), this wasn't true. Microsoft had published several games before Flight Simulator -- a version of the classic Colossal Cave adventure called "Microsoft Adventure", and "Microsoft Decathlon", an Olympic sports game.

If you're ever near the Microsoft Redmond campus, building 16 has an open courtyard with tiles on the ground, each corresponding to some Microsoft product that was released. I'd be surprised if most people here (or in the industry in general) recognized even half of the product names on the tiles.

https://i.imgur.com/meVHgYm.jpg


E.g. have you ever heard of Microsoft Dinosaurs?

https://twitter.com/ThreddyTheTrex/status/101214445833740697...


Somehow donkey.bas always gets overlooked :)

Not to mention some of the most popular games of all time - Minesweeper, Solitaire, and the earlier Reversi. And a pinball game in Windows 95.

Windows 95 also had Hover! https://en.m.wikipedia.org/wiki/Hover%21

A while back they made an online version: http://www.hover.ie/


what about Fury 3?!

At a higher level, Minesweeper, Solitaire, and Reversi weren't games. They were educational tools.

This was how Microsoft taught people to do things like double-click, click-and-drag, and a number of other things that we take for granted today but were completely foreign back then.


https://blogs.msdn.microsoft.com/oldnewthing/20121218-00/?p=...

The pinball game was licensed, not developed, by Microsoft, AFAIK.


And Flight Simulator was from subLOGIC in Champaign, Illinois, IIRC.

Monster Truck Madness was a similar situation too. I remember there was still a logo from the original company in there somewhere, on a license plate maybe?

This is from memory, mind you.

I still exclaim "roll over, Beethoven!" whenever appropriate.


... and Microsoft Tinker in Windows Vista Ultimate Edition. (-:

Those were all after Flight Simulator.

There was also ... the MSX.

https://en.wikipedia.org/wiki/Msx


The Z-80 computer I had in the '80s had an entire cassette (K7) tape of games from Microsoft.

Also my absolute favorite game as a kid: SpaceSim

https://en.wikipedia.org/wiki/Microsoft_Space_Simulator


Not forgetting they licensed Flight Simulator from subLOGIC.

"A little-known partnership between IBM and Apple to try to make a new OS (Pink) that would finally beat DOS:

"IBM has 33,000 programmers on its payroll but is so far from leading the software business (and knows it) that it is betting the company on the work of 100 Apple programmers wearing T-shirts in Mountain View, California."

Quote from a co-worker who spent a lot of time at Taligent: "They don't realize mice can have more than one button."

(I worked for the IBM Taligent Project Office in Austin. We were an AIX (mostly) and OS/2 (to an extent) shop.)

(Advice: Don't work for IBM.)


And another one of those weird tidbits of obscure knowledge...

When the Java group was trying to come up with a name for the browser that was going to ship with the Alpha 1 code, it was called "WebRunner". That was the name we used up until we discovered, through a trademark search, that it was currently registered to Taligent. Sun couldn't get anyone at Apple or IBM to acknowledge it, much less talk to us about licensing or transferring it, so we renamed the browser 'HotJava' (at the time I wasn't aware of the sexual innuendo there, but it was what it was). I still have my jacket that has Fang on it and says "WebRunner." I have never had the heart to throw it out.


Q: What do you get when you cross Apple and IBM?

A: IBM.

https://instantrimshot.com/index.php?sound=rimshot&play=true

(I worked for Kaleida!)


Robert X. Cringely is often incredibly observant and accurate. That is why his writing stands the test of time: he saw things accurately and wasn't just a hype peddler or mindless trend follower. I feel he doesn't get enough credit for that.

> Robert X. Cringely is often incredibly observant and accurate.

Quite the opposite: http://www.digibarn.com/friends/jef-raskin/writings/holes.ht...

Cringely is a good writer and storyteller. The stories he tells are often hearsay or wildly inaccurate recounting of events by third parties. His writings should in no way be treated as being accurate, and should not be referenced as historical sources or anything like that.


Oh, he's certainly sometimes observant (and indeed sometimes accurate), it's just that he's far from being particularly reliable.

Indeed, I recall the mid-00's when Cringely's predictions would show up on Slashdot to be roundly mocked. I think it was the first time I understood the practice of making outrageous claims just to drive traffic.

Thus, I only remember Cringely as a bloviating laughingstock. It's interesting to see where his authority came from, this book seems very interesting.


https://www.youtube.com/watch?v=sX5g0kidk3Y

(1996 documentary by Cringely. Excellently presented.) There's a part 2, too.


One of the most interesting characters from the documentary "Triumph of the Nerds" was Kildall. His company created CP/M, and depending on who's telling the story either lost the deal with IBM to create the operating system for the PC, or just didn't care about it.

Here's the quote about him:

Let's say for a minute that Eubanks was correct, and Gary Kildall didn't give a shit about the business. Who said that he had to? CP/M was his invention; Digital Research was his company. The fact that it succeeded beyond anyone's expectations did not make those earlier expectations invalid. Gary Kildall's ambition was limited, something that is not supposed to be a factor in American business. If you hope for a thousand and get a million, you are still expected to want more, but he didn't.

If this is true, he was the polar opposite of almost everyone else in the story. It reminds me of that line in Breaking Bad when Jesse asks Walter how much is enough. Kildall knew the answer, and stuck with it. I didn't get that sense from any other character, except maybe Woz.

Kildall spent the later part of his career hosting The Computer Chronicles, a PBS weekly news show that ended up documenting the entire early history of personal computers. The shows are on YouTube and are fascinating to watch:

https://www.youtube.com/channel/UCkJ6eQKpHZgsZBla4JgKj3A


I'm convinced that, with few exceptions, succeeding in business requires you to destroy that part of your brain that understands the word "enough". What salary next year is enough? How many assets are enough? Business leaders will never say. How much profit this year is enough? Shareholders will never give you a figure--more is always better than less. These people might have goals, but they are meaningless because when they reach those goals, they don't stop. Unending accumulation is the only way. Anyone who does have a concept of "enough" gets out-competed by the ones who don't.

Ahhh 1992....the year I discovered QNX and Mosaic.

This post brings back so many memories... Novell! The first damn network system that was ever worth a shit on an IBM PC-compatible. Installing that software finally put an end to the nightmare of "sneaker-netting" all my office PCs.

OS/2 failure! Who would have thought that shit-DOS would withstand the challenge of such a superior (see QNX!) and deep-pocketed competitor? What were we all smoking back then??

Jobs and NeXT!... man did I covet one of those ridiculously priced machines, but with no market penetration and my company focused on cheap compatible hardware, it made no sense to develop with it.

Linux!...see QNX!

I could go on and on...it's rather amazing...without me knowing it I've become my own history book.


I obsessively read I, Cringely when I was in college in the late 90s. It was the Stratechery of its time, and taught me an immense amount about the tech business.

I don't know of anyone who is popular, yet respected enough, to be an equivalent today. I'd say Leo Laporte had a similar impact circa 2010. Maybe Kara Swisher? Maybe Marco Arment? Some YouTuber?

At any rate, none of those exactly fit the bill, because Cringely's two "Nerds" documentaries† hold a special place in popular tech reporting history.

https://youtu.be/sX5g0kidk3Y
https://youtu.be/Pk2BWphDfvc


I read it too in high school. I even found a Dover art book containing a copy of that suited-frog spot-art sketch he used and cribbed it for my blog. Wonder what old Cringely is up to these days.

You can find out on his blog (cringely.com). The TL;DR is that he mostly does consulting for startups and writes his own columns on the tech industry at his own pace. A few years back he and his teenage kids did a Kickstarter for a $100 Minecraft server, which failed. The blowback from the failure and his poor communication about it has at times taken over the blog comments.

For a while he did a series of columns exposing the decline and mismanagement at IBM; he eventually turned these into an ebook that I purchased and thought was a reasonably interesting read.

More recently he had cataract surgery and had to move to a new home after his old one was damaged in the California wildfires.


The book's Wikipedia page mentions that Cringely made it available online for free, complete with link:

https://www.cringely.com/tag/accidental-empires/


On the value of limiting yourself to standards:

...which was why the idea of 100 percent IBM compatibility took so long to be accepted. "Why be compatible when you could be better?" the smart guys asked on their way to bankruptcy court.

It sounds a bit like a jab at Apple (or maybe even NeXT, which I believe was struggling in 1992 with its hardware releases). AFAIK neither company ever went bankrupt.

Or was it referring to the crop of other personal computing platforms from the late 70s/early 80s? (Vic20/C64, TRS-80, Sinclair, Heathkit, etc.)

Or something else? Big SGI workstations, etc.


There were tons of almost-PC-compatible manufacturers back then, selling their proprietary, very slightly better, but incompatible systems. They'd have a slightly faster but incompatible bus system, slightly higher-capacity floppy disks with a custom format, a proprietary interface that was quicker but only worked with their own printers, or a higher-resolution display with more colours that required special software to take advantage of it.

The Unix workstation market that NeXT, Sun and SGI competed in was a bit different and didn't really overlap much with the PC market.


Many included "better" features that depended on a somehow non-compatible BIOS. Quite a few of those had dual modes so you could get PC compatible but slower or worse, or incompatible but better. They eventually died out.

One company I remember that was pretty successful for a while was Apricot: https://en.wikipedia.org/wiki/Apricot_PC


I remember buying an LS-120 drive. To be fair, the drive was backwards compatible with 720KB/1.44MB floppies. But the 120MB 3.5" floppy could only be used in another LS-120 drive, and I was the only one in my area with one. As the point of the floppies was to allow me to take my work with me, the larger disk capacity became essentially useless (except as a general backup).

https://en.wikipedia.org/wiki/SuperDisk


It's referring to 8088/8086 machines that ran MS-DOS but were not 100% IBM PC compatible. Stuff like the Amstrad PC 1512 (an 8086 machine with proprietary graphics) and the DEC Rainbow 100 (a very expensive and versatile dual-CPU machine that could run both MS-DOS and CP/M)

The PC1512 was very IBM compatible, except for stuff that reprogrammed the 6845 into odd modes (IIRC); it also had an odd 16-colour CGA mode available.

The only other issues I can remember, I think, are that the PSU was in the monitor (so the video connectors were odd) and that the keyboard and mouse were Amstrad custom designs, so you couldn't swap them with a regular PC or clone.

I worked at Borland in the UK at the time and I don't remember any real issues.

The other major vendor in the UK was Apricot, which started out rebadging Victor machines and had a similar level of compatibility to the DEC Rainbow (I think it used an 80186 as the CPU).


Yes, probably a jab at Apple... although funny how the tides have turned (and likely the reason for this pull quote).

>And the old blood is getting tired - tired of failing in some cases or just tired of working so hard and now ready to enjoy life.

I think that this is, perhaps, the most apt quote, one that will live on in a state of permanence throughout the generations.


> lend it out digitally, one person at a time

Think about how much better off we'd be if books were available digitally like Wikipedia articles are, and social media posts were restricted to "borrowers" one-at-a-time.


Cringely was involved in adapting this book into a 3-part documentary called Triumph of the Nerds. It's a fun glimpse into the early days of computing. Cringely managed to interview Bill Gates, Steve Jobs, Larry Ellison, and a whole host of figures from those days (too many to list).

https://en.wikipedia.org/wiki/Triumph_of_the_Nerds


The quote about reorgs practically gives me PTSD. Certainly irrational anger:

> The rest of the company was as confused as its leadership. Somehow, early on, reorganizations - "reorgs" - became part of the Apple culture. they happen every three to six months and come from Apple's basic lack of understanding that people need stability in order to be able to work together. [...] Make a bad decision? Who cares! By the time the bad news arrives, you'll be gone and someone else will have to handle the problems.

I worked for 6 years at STMicro, and the group I was in (Nomadik SoC) did reorgs every 18 months, practically like clockwork. This was a huge multinational company, and it was my first job, so I didn't have any context to weigh it against, but in hindsight it was a huge sign of dysfunction and leadership that didn't know WTF it was doing. As expected, the group shut down a few years down the line.

The place I currently work hasn't had a reorg since I joined, almost 4 years ago. But I'm sure that's largely attributable to a steady and successful product line.

It can be either a vicious or a virtuous cycle...


And Apple's 3-6 month reorg cycle is insane! That sounds like organizational panic. Makes it all the more impressive that Steve Jobs pulled them out of that nosedive.

It's weird seeing 1992 associated with some kind of far off distant time.

It was 26 years ago. End of the Road by Boyz II Men came out and topped the charts. I saw Home Alone 2 in the theaters. The little blond munchkin who met the future President in that movie is now 38. Cartoon Network premiered. Bill Clinton became President. Windows 3.1 and Microsoft Works were released.

It's a long-ass time ago.


I tend to think of the "modern world" as starting with the fall of the Berlin Wall in 1989, but now that I am beginning to work with adults with graduate degrees that weren't even born then, it's beginning to be a bit absurd.

Berlin wall? Imagine my distress at having tuned in to the modern world during the Watergate fiasco. My father had to explain to me that it was the name of a hotel and not some sort of earthworks.

But it feels like that to me, even though I lived through it. Sometimes even a year ago can feel so far away.

I found this quote the most interesting:

"Microsoft's entry into most new technologies follows this same plan, with the first effort being a preemptive strike, the second effort being market research to see what customers really want in a product, and the third try is the real product." Robert Cringely in "Accidental Empires"


> I was recently recommended to read the book Accidental Empires by Robert X. Cringely, first published in 1992 (or was it 1991?) and apparently no longer in print...

You can buy a new copy on Amazon.

https://www.amazon.com/dp/0887308554/


Hey look, the audio cassette (what's that??) is only $869!

That's right everyone, we live at a pivotal moment in history where trading algos are making wild-arsed guesses at the market value of low-sales-volume prerecorded compact cassettes.

Is it Robert X. Cringely's voice on the tape?

I used to listen to his podcast during my morning commute. I'll miss him when he retires.

Got a link to the podcast? I’m unable to find it in Overcast or Google.

I'm sorry, but I couldn't find it. He may have taken them down. Not even the PBS years seem to be available.

> Big computers and little computers are completely different beasts created by radically different groups of people. It's logical, I know, to assume that the personal computer came from shrinking a mainframe, but that's not the way it happened.

This seems off. Those little computers in your phones run a flavor of Unix, just like the mainframes did (besides VMS, etc). Also, those little computers connect to the cloud, which is just like a mainframe. If anything, history is repeating itself.


The idea that big computers and little computers were fundamentally different goes back at least as far as 1976, when I ran into Steve Jobs at Country Sun Natural Foods in Palo Alto.

Both of us being hippies, we got to talking and Steve told me he'd just started a company with a friend and explained the name: "Take a byte out of the apple, get it?"

He mentioned that they needed a 6502 disassembler, and I said that was right up my alley. I'd been programming for eight years and had done low level assembly and binary machine code on several different architectures, so this looked like a fun little project.

We didn't talk money, not that he had any, it just seemed like an interesting weekend diversion.

Steve called me a couple of days later and said, "Mike, I've been thinking about this. Your experience is all with mainframe computers. The 6502 is a microcomputer. It works on fundamentally different principles than those mainframes. There's no way you could possibly write this disassembler, so forget it."

I tried to explain that I'd gone through the 6502 reference, and it was just another instruction set with concepts similar to the others I'd worked with. Steve wouldn't have any of it: "I'm sorry, I've made up my mind. You just don't know anything about microcomputers, all you know is mainframes. Goodbye."

Naturally I said to myself, "Who is this Steve Jobs guy telling me I can't program?" So I went ahead and wrote enough of the disassembler to show that I did indeed know how to do it.

I was about to call Steve back to tell him the good news, but then I thought, "The last phone call didn't go so well. Maybe I'd better drop by his office and show him the code."

So I looked up the address for Apple Computer and found it at 770 Welch Road in Palo Alto. I walked into the building and looked around. It didn't look like a computer company, all I saw was a row of telephone switchboards - the old kind with plugs and jacks - with an operator at each one.

I asked one of the operators where Apple Computer was. She hesitated a moment and said, "Uh, this is their answering service."

Well, of course no successful business used an answering service. I turned around and walked out the door, saying to myself, "These guys are flakes. They're never going to make it."

To this day I don't know if I missed out on becoming a billionaire, or if I dodged a bullet.


Great story - thanks for sharing :) In hindsight, do you think it was any different? (Microcomputer vs mainframe, for the disassembler you were talking about building)

Thank you! I went into more detail in another comment in the thread, but going from one instruction set to another just felt like having more than one way to say the same thing. Every instruction was different, but the concepts and kinds of instructions were pretty much identical. Once you've seen an overflow flag and a carry flag and registers and bit shifts and memory load and store on one CPU, you'll certainly recognize them on another.

I guess Steve thought the CPUs really were different, like, say, the difference between a CPU and a GPU.


I guess Steve thought the CPUs really were different, like, say, the difference between a CPU and a GPU.

There's a lot of revisionist history on the internet about Steve Jobs not being technical, but he had an incredibly detailed understanding of the 68000 by the time the Macintosh project was launched. Perhaps your encounter encouraged him to delve into that area of study.


Where did you hear he had such a deep understanding of the 68000? (Not questioning whether he did, just curious because I've never heard that.)

Reading old computer magazines from the library.

You'd be a billionaire with a bullet in you.

So what you are saying is that big computers and little computers were fundamentally the same?

Definitely the same, at the level Steve and I were talking about. The 6502 has instructions that manipulate registers and memory, logical and arithmetic operations, branches and calls, status flags, and so forth. Just like all the "mainframe" CPUs I'd been programming.

The word size is different, the special areas of memory are different, it has different opcodes, and so on. But any competent machine language programmer of the day - regardless of which CPUs they'd programmed on before - would have had no trouble understanding this:

http://www.obelisk.me.uk/6502/instructions.html

That's what Steve didn't understand at the time. Not being a programmer himself, he didn't know how familiar all this would look to any machine language programmer.
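To make that point concrete: a table-driven 6502 disassembler really is just an opcode lookup table plus a loop. Here is a minimal sketch in Python - only a handful of opcodes and an illustrative byte string, nothing to do with the original 1976 code - but the shape is the same for any CPU of that era:

    # Sketch: opcode -> (mnemonic, addressing mode, instruction size in bytes).
    # Only a few opcodes shown; the real 6502 table has roughly 150 documented entries.
    OPCODES = {
        0xA9: ("LDA", "imm", 2), 0xAD: ("LDA", "abs", 3),
        0x8D: ("STA", "abs", 3), 0xAA: ("TAX", "impl", 1),
        0xE8: ("INX", "impl", 1), 0x69: ("ADC", "imm", 2),
        0xD0: ("BNE", "rel", 2), 0x4C: ("JMP", "abs", 3),
        0x20: ("JSR", "abs", 3), 0x60: ("RTS", "impl", 1),
        0x00: ("BRK", "impl", 1),
    }

    def disassemble(code, origin=0x0600):
        pc = 0
        while pc < len(code):
            mnemonic, mode, size = OPCODES.get(code[pc], ("???", "impl", 1))
            operands = code[pc + 1 : pc + size]
            if mode == "imm":
                arg = f"#${operands[0]:02X}"
            elif mode == "abs":
                arg = f"${operands[0] | (operands[1] << 8):04X}"       # little-endian address
            elif mode == "rel":
                off = operands[0] - 256 if operands[0] >= 0x80 else operands[0]
                arg = f"${origin + pc + size + off:04X}"               # branch target
            else:
                arg = ""                                               # implied operand
            raw = " ".join(f"{b:02X}" for b in code[pc : pc + size])
            print(f"{origin + pc:04X}  {raw:<8}  {mnemonic} {arg}".rstrip())
            pc += size

    # Example bytes: LDA #$C0 / TAX / INX / ADC #$C4 / BRK
    disassemble(bytes([0xA9, 0xC0, 0xAA, 0xE8, 0x69, 0xC4, 0x00]))

The same registers, carry flags, and addressing modes show up under different opcodes on every machine; only the table changes.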


Steve always had a problem with heavy big blue computers:

"Unaccustomed as I am to public speaking, I'd like to share with you a maxim I thought of the first time I met an IBM mainframe: NEVER TRUST A COMPUTER YOU CAN'T LIFT." -Macintosh

https://youtu.be/pptb1Uzn7SQ?t=3m04s


Amazing how he was so astoundingly wrong about so many things and still managed to luck and cheat his way to the top.

When he talks about "little computers," he's talking about the microcomputers of the '70s and '80s, like the IBM PC, Apple II and Macintosh, Commodore 64 and Amiga, etc.

And he wasn't wrong. Those machines were completely different than the "big iron" of the day -- they were single-user, single-tasking systems with no meaningful memory protection or other features to protect the system against poorly written or malicious applications. Microcomputers were supposed to be personal computers, with "personal" in this context meaning that the operator had complete control over 100% of the system's resources, up to and including the ability to shoot themselves in the foot. The big iron world was all about multi-user paradigms like time-sharing (https://en.wikipedia.org/wiki/Time-sharing), so the micro mindset was definitely a radical departure.

The irony, of course, is that the micros ended up eating the world, and then the Internet came along and completely demolished their entire philosophy. It turned out that there's no way to connect a single-user, single-tasking computer to a global public network without it running into fundamental performance and security problems. The two microcomputer OS vendors left standing by the 1990s, Microsoft and Apple, both ended up having to throw out their old systems and spend enormous amounts of time and money building new ones (Windows NT and OS X) that could actually be used practically in the new internetworked world. And, not coincidentally, those new systems looked a lot more like the systems that ran on the old big iron than they looked like the systems that had made the microcomputer revolution. The only way for the "little computers" to survive was to become the "big computers" they had so loudly put up against the wall.


> Those machines were completely different than the "big iron" of the day -- they were single-user, single-tasking systems with no meaningful memory protection

The PDP-7 and PDP-11 were also like that. (And yes, they were used for multi-user scenarios, but even an early micro could be so used, if you e.g. ran a BBS on it.) Memory protection and the like are important features to be sure, but they don't define a segment class. And the micros quickly gained these features anyway, with machines like the Motorola 68020 and the Intel i386. The early OS's like MS-DOS/Win 9x and Classic Mac OS didn't make use of them, but this was entirely due to performance concerns, combined with running a CPU-intensive GUI. The transition to a memory-protected, secure-by-default OS was entirely foreseeable, even aside from the sensible concerns about an "internetworked world".


Unix wasn't a mainframe OS to begin with. It was a minicomputer OS which eventually scaled up to mainframes, but it was still born on minicomputers and retained a minicomputer design: Character terminals as opposed to block terminals, no channel I/O, and batch support sidelined as opposed to front and center.

VMS also wasn't a mainframe OS: It ran on the VAX, a high-end minicomputer.

Looking at what early microcomputers looked like and what their OSes were based off of (CP/M in particular) it would be more reasonable to say that early microcomputers were scaled-down minicomputers, which were still in a different league from mainframes in terms of number of concurrent users, I/O design, and focus on interactivity.


UNIX is not a mainframe OS. It was developed for minicomputers. As was VMS, by the way.

Minicomputers are just smaller and cheaper versions of their mainframe counterparts. It's only a small step in evolution, and not an important distinction in this context (imho).

From what I understand, the greatest difference between mainframes and minicomputers (and later, microcomputers), is that mainframes were batch-oriented machines, while minicomputers were interactive-oriented.

Mainframes (pretty much "IBM only" today - I don't know of any other actual mainframe computer manufacturer; to be honest, I wasn't even sure if IBM is still making mainframes. Ok - just looked, I guess they do) are still batch oriented; what "interactive piece" they have is just a batch process with a "run forever" run time.


> From what I understand, the greatest difference between mainframes and minicomputers (and later, microcomputers), is that mainframes were batch-oriented machines, while minicomputers were interactive-oriented.

Actually, a lot of early interactive computing was done on time-shared mainframes. I learned to program in 1968 when I saw a Teletype machine in our high school math classroom and found out it could dial into a timesharing system where you could write and run programs interactively.

At first we were dialing into a General Electric mainframe, and partway through the year switched to an SDS Sigma 5 at a local Phoenix timesharing company, Transdata. I got my first job there that summer.

The next year I moved to the Bay Area and started working at Tymshare, where we used a variety of mainframes to provide interactive services, along with some minicomputers for network routing. It was the biggest timesharing company of the day, and there were many others.

Timesharing was a big business from the late 1960s through the 1970s. Of course there were many mainframes doing batch processing too, but interactive computing certainly wasn't exclusive to minicomputers or microcomputers.


pretty much "IBM only" today - I don't know of any other actual mainframe computer manufacturer

Does Cray count?


I heard a lot of Western European governments run on Siemens BS1000/BS2000/BS3000.

Humans and chickens share 65% of their DNA. Both have a lot of similar anatomical features: bones, brains, feet. And yet a human is not just a larger chicken.

What I took from this quote was that IBM and even DEC were shipping enormous devices that were engineered like digital tanks, with redundant data paths, redundant power supplies, somewhat hot-swappable component cards, and a guarantee of several nines worth of uptime, when combined with their first-party operating systems. Breakthrough home computers, by contrast, tended to emerge from "good enough to play games and play with balancing your bank account," having a strong emphasis on arcade graphics, but not so much on reliability.

> Breakthrough home computers, by contrast, tended to emerge from "good enough to play games and play with balancing your bank account," having a strong emphasis on arcade graphics, but not so much on reliability.

That was true after 1977, certainly, but going earlier, the S-100 bus computers which ran CP/M had text terminals (and front panels with lights and switches!) and were cheaper than any other kind of real computer, but were more in the mold of the small-scale business machines of the era, called minicomputers, than the Apple II or Commodore-64 were.


By that standard there’s no difference between your computer and most of the consumer electronics market.

You’d be astonished at how many things run flavors of Unix.


If you're interested in older Silicon Valley history,

In Search of Stupidity: Over Twenty Years of High Tech Marketing Disasters ( https://amzn.to/2EJjvHG )

is pretty good, although he gets some things pretty wrong himself, like the rise of open source.


Apenwarr podcasts are really good, recommended to everyone.

Been reading his blog for a long time, but didn’t know there was a podcast. I’m unable to find it though — do you have a link?

He definitely doesn't have podcasts (though there are a few recorded talks). His blog posts seem popular though. :)

This guy sounds pretty ungrateful.

He was able to read Robert X. Cringely's book "Accidental Empires" thanks to Archive.org's "virtual library card" service.

This is a service where Archive.org sources a physical copy of a book for you, scans it, and lends it out to you digitally, one person at a time.

Sounds like a useful service. But all this monumentally ungrateful guy can do is moan about how "aggressively user-hostile" it is because you have to double-tap to open a book from the list.


A few of the local library systems I've used have the same Adobe Editions lending scheme, and it's terrible in so many ways that I just gave up on e-books from the library.

On those rare occasions I want an e-book these days, I go to iBooks or Nook because the library programs (including Bibliowhatsits) are so bad.

Mostly I just buy the physical book, or order a physical copy from the library. It might take a few weeks to get from the other side of the country, but that's less hassle than dealing with the current state of library lending DRM.


He said you need to be extremely precise with your finger to have a double tap register. This sounds like a legitimate usability complaint and reasonable feedback.

Having UI feedback doesn't mean you don't appreciate what was done well. If the reply to a bug report is "but look at all I've done for you!" you're not going to fix many bugs and no one improves. (I have definitely been there, though.)


Oh please. He sounded quite grateful to archive.org. He sounds frustrated because the mobile application developers didn’t consider that a double tap on a single pixel is extremely difficult to register on a touchscreen. The app dangles content in front of you, tells you that you can have it, but then makes it seem impossible to actually get to it. That’s the epitome of frustrating software, and it has nothing to do with gratitude to archive.org.

Presumably, he would prefer the publishers' option of not having Archive.org online at all.

... has this person never checked out an ebook from their local library? That Adobe DRM is the industry state-of-the-art.

Libby app makes it trivially easy.

* not affiliated in any way, just a happy user.

https://meet.libbyapp.com/


Does it run under wine?

https://help.libbyapp.com/6059.htm

> If you have an Adobe-compatible ereader (like a NOOK or Kobo), you can download Libby ebooks on a computer, then use Adobe Digital Editions (ADE) to transfer them to your device:

> 1) Open libbyapp.com on a computer and go to Shelf > Loans.

> 2) Select Send To A Device.

> 3) Click select your device, then choose Adobe-compatible ereader. Note: If you've sent books to Kindle in the past, you may need to click your Kindle device at the top of the screen instead of select your device.

> 4) Select Download DRM File.

> 5) Open the file in ADE and transfer the book to your ereader. (Learn how using this article from OverDrive Help.)

trivially easy


Oh. I never actually looked into what it would take to read on a laptop, I read on my phone or tablet. This is definitely a lot more painful :(

Or just use your phone.

The more punishing the process of getting shared access to the book, the more units it'll sell. At least, that seems to be what publishers believe in.

In the USA you have the right to format-shift, so can libraries just 'rip' books into an open format and lend those versions, as long as they internally reserve a paper copy?

You can change formats, but they can't lend more than a single copy at any given time or they will be violating copyright (and making more copies of a book than they actually have).

Ha, so banks can have fractional reserves but libraries can't. Typical.

Gotta keep the wealthy bankers happy

They'd keep librarians happy if librarians donated to political campaigns.

The library could lend in a useful format though. I wasn't suggesting they lend tortuously.

Also, if they limited loans to 1 hour (or any time short of the regular loan) but allowed reservations ahead, then they could probably have the book available to many more people. The majority of the time I have a library book checked out, it's sitting on my shelf; if the checkout were virtual, the library could loan it to someone else while it sits on my virtual shelf. Either I wait (if I want to access it and all copies are currently in use), or they take a payment from me to access the book immediately and buy access to an extra copy.

Of course publishers would probably just increase the loan charge.
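A rough sketch of that scheme, assuming a single licensed copy, hour-long loans, and a first-come reservation queue (the class, times, and patrons are all hypothetical, just to make the scheduling idea concrete):

    from collections import deque
    from datetime import datetime, timedelta

    class SingleCopyLender:
        """One licensed copy, short loans, and a first-come reservation queue."""
        LOAN_LENGTH = timedelta(hours=1)

        def __init__(self):
            self.borrower = None      # who currently holds the copy
            self.expires = None       # when their loan ends
            self.queue = deque()      # patrons waiting for the copy

        def request(self, patron, now):
            # Reclaim the copy if the current loan has run out.
            if self.borrower is not None and now >= self.expires:
                self.borrower = None
            # Hand out the copy if it's free and this patron is next (or nobody is waiting).
            if self.borrower is None and (not self.queue or self.queue[0] == patron):
                if self.queue and self.queue[0] == patron:
                    self.queue.popleft()
                self.borrower, self.expires = patron, now + self.LOAN_LENGTH
                return f"{patron} has the copy until {self.expires:%H:%M}"
            # Otherwise, reserve a spot in line.
            if patron != self.borrower and patron not in self.queue:
                self.queue.append(patron)
            return f"{patron} is waiting ({len(self.queue)} in the queue)"

    lender = SingleCopyLender()
    t0 = datetime(2019, 1, 1, 9, 0)
    print(lender.request("alice", t0))                     # alice has the copy until 10:00
    print(lender.request("bob", t0))                       # bob is waiting (1 in the queue)
    print(lender.request("bob", t0 + timedelta(hours=2)))  # bob has the copy until 12:00

One copy still serves only one reader at a time, so it stays within the single-copy rule mentioned above; it just stops an idle checkout from blocking everyone else for weeks.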


has this person never checked out an ebook from their local library?

To be fair I don't know anybody who has either. I know it's possible and I know my local library offers this, but I've never met anybody who's actually used this service.


I use it all the time, and it is incredibly convenient. I use Libby, and it works basically the same as the Kindle app, but with fewer features.

Check it out, it's a great resource!


Definitely check it out (pun not intended). Libraries are amazing institutions that many younger people assume are antiquated but are really keeping up with the times. Also, besides ebooks, many of them also have digital magazines you can read on your phone or tablet. I read the New Yorker and The Economist that way.

My county library has a poor electronic collection. Still, I tried to check out a book and got put on a waiting list with 30 other people. That was before Christmas. I haven't heard anything else about it.

Waiting lists happen with popular new books with physical copies as well. If you look for books that are a year or so old, or are something other than popular fiction (science or history for example), normally you can get a book immediately in my experience.

I'm surprised. To me it seems like the only reasonable way to use eBooks.

The only reasonable way to use ebooks is to buy the non-DRMed epub, drop it on your web server in a password-protected directory, and read it from all the devices you want to read it from.
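For what it's worth, a "password-protected directory" doesn't have to mean much infrastructure. A toy sketch using only Python's standard library (HTTP Basic auth, hypothetical credentials, no TLS - an illustration of the idea, not a hardened setup): run it from the directory holding your epubs and point any reader app at it.

    import base64
    from http.server import HTTPServer, SimpleHTTPRequestHandler

    # Hypothetical credentials; the handler serves whatever directory you launch it from.
    CREDS = base64.b64encode(b"reader:correct-horse-battery-staple").decode()

    class BasicAuthHandler(SimpleHTTPRequestHandler):
        def do_GET(self):
            if self.headers.get("Authorization") == f"Basic {CREDS}":
                super().do_GET()      # serve the requested file, e.g. an .epub
            else:
                self.send_response(401)
                self.send_header("WWW-Authenticate", 'Basic realm="bookshelf"')
                self.end_headers()

    if __name__ == "__main__":
        HTTPServer(("", 8000), BasicAuthHandler).serve_forever()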

I tried it once. You had to wait like 3 weeks for Overdrive to let you download one of the few books in their collection (mostly kids books and just pop garbage).

Going to libgen is a much better experience.


Ehm, I haven't even seen a library in the past 10 years. Pretty sure the number of people who have checked out an ebook from a library here is very small.

[From the tech scene], not just general quotes about the state of the world or life.


