Gary Kildall's ambition was limited, something that is not supposed to be a factor in American business. If you hope for a thousand and get a million, you are still expected to want more, but he didn't.
I would very much have liked to have known Gary Kildall. Everything I've read about him gives me the impression he'd be an immensely interesting person to know. The persona that comes across in his co-hosting of the Computer Chronicles show (something that is itself worth looking up if you're not familiar with it) is one of great technical competence combined with a gentle humility.
(Edit: If you're into learning more about Kildall there's a good prior discussion of an unpublished biography on HN: https://news.ycombinator.com/item?id=12220091)
FWIW I’ve researched this time period extensively and I think Gary was much kinder than Bill and a great guy. But I don’t think it’s accurate to say he wasn’t ambitious. And certainly inaccurate to say that he was content having had a good run and not winning.
This is how I feel about Hacker News: people who have the time to advance technology. And I'm grateful to witness so much of it here.
It occurs to me that the same applied in the transition to mobile. Microsoft was obsessed with making hand-helds and phones into little PCs, keyboard and all. It actually required a radically new interaction model and interface technology to unlock the potential of the form factor.
Microsoft's team originally produced a new touch-first interface that was not unlike the eventual iPad. They took it to various corporate customers, and the response was unanimous: it needs to run Real Windows or it has no future. Facepalm. So they went back to Redmond and changed it to be Windows on a touchscreen. It met all stated customer requirements, technically, but of course it was clumsy and people didn't really see the benefit over a laptop, and it never took off.
Sounded to me like the classic "faster horse". It's not so much that they needed to make everything a PC as that they were such experts at enterprise sales that they didn't have confidence in their own new ideas.
This is not so obviously wrong. It's fairly well acknowledged that the biggest barrier to the iPad becoming a general computer for most people is iOS. It's unclear that the answer is 'Windows on a touch screen', but it's equally clear that Apple hasn't done anything like enough to make iOS work for the larger form factor.
I agree with what you said, but part of the problem was resistive touchscreens just weren’t very good.
Microsoft keeps looking at a few data points they have now, extrapolating those into the future and then trying to jump directly to that future. Currently it's an attempt to 'catch up' with Apple, but they keep extrapolating straight lines when the actual track of innovation is a curve. They did the same sort of thing with Longhorn, trying to re-invent the PC platform with an object-oriented database file system and the whole OS running on .NET, because Java looked like it was the wave of the future. They're trying to out-compete rival technologies without really thinking through what they're doing.
I don't think he sees Apple as their main competitor, but is really competing with AWS for the future of business computing, and not doing that badly.
Hit the Windows key, start typing until the name of the setting I'm looking for pops up, then arrow keys and Enter.
On interaction design: the iPhone may not have been quite the first consumer device with a capacitive touchscreen that worked brilliantly without a stylus, but it certainly felt like it was. Styluses were the bane of pre-iPhone touch devices and I'm surprised that Samsung tried to bring them back.
OTOH, my laptop has more in common with a VAX running Unix than it has with a Commodore PET. Or a LINC.
This was my thought upon reading the "Big computers and little computers are completely different beasts" quote, and it's also something I've been thinking about lately. Personal computers definitely represented a distinct development track from the multi-user/multitasking industrial/academic computing systems of the time. However, as PCs became more capable and merged key innovations from the higher-end computing branch (e.g. virtual memory, protected execution environments, and the operating systems that can leverage such features), we seem to have arrived at the same point -- a phone in my pocket that's more like a VAX 11/750 available in 1981 than an IBM PC from the same year. The "microcomputer" development track that included the Commodore PET and 8086-based PC systems was greatly useful for jumpstarting the PC industry, but was ultimately set aside.
So maybe it's not so much that small computers are completely different beasts from large computers as that their initial development required a unique approach to fit the constraints of that particular era.
As someone who has personally installed UNIX on a VAX 11/750, I do appreciate the attention to human factors that the microcomputer era has contributed to computing. :)
Their evolution is pretty cool. And their software is very alien to people who grew up on personal computers.
Are we living in a period in which many things seem to be going backwards? With ARM trying to go into servers/desktops, or the (somewhat) failed attempts to bring the tablet/phone interface to the PC (Ubuntu and/or ugly full-screen-only apps)?
A while back they made an online version: http://www.hover.ie/
This was how Microsoft taught people to do things like double-click, click-and-drag, and a number of other things that we take for granted today but were completely foreign back then.
The pinball game was licensed, not developed, by Microsoft AFAIK.
This is from memory, mind you.
I still exclaim "roll over, Beethoven!" whenever appropriate.
"IBM has 33,000 programmers on its payroll but is so far from leading the software business (and knows it) that it is betting the company on the work of 100 Apple programmers wearing T-shirts in Mountain View, California."
Quote from a co-worker who spent a lot of time at Taligent: "They don't realize mice can have more than one button."
(I worked for the IBM Taligent Project Office in Austin. We were an AIX (mostly) and OS/2 (to an extent) shop.)
(Advice: Don't work for IBM.)
When the Java group was trying to come up with a name for the browser that was going to ship with the Alpha 1 code, it was called "WebRunner"; that was the name we used up until we discovered through a trademark search that it was currently registered to Taligent. Sun couldn't get anyone at Apple or IBM to acknowledge it, much less talk to us about licensing or transferring it, so we renamed the browser 'HotJava' (at the time I wasn't aware of the sexual innuendo there, but it was what it was). I still have my jacket that has Fang on it and says "WebRunner." I have never had the heart to throw it out.
(I worked for Kaleida!)
Quite the opposite: http://www.digibarn.com/friends/jef-raskin/writings/holes.ht...
Cringely is a good writer and storyteller. The stories he tells are often hearsay or wildly inaccurate recountings of events by third parties. His writings should in no way be treated as accurate, and should not be referenced as historical sources or anything like that.
Thus, I only remember Cringely as a bloviating laughingstock. It's interesting to see where his authority came from, this book seems very interesting.
(A 1996 documentary by Cringely, excellently presented.) There's a part 2, too.
Here's the quote about him:
Let's say for a minute that Eubanks was correct, and Gary Kildall didn't give a shit about the business. Who said that he had to? CP/M was his invention; Digital Research was his company. The fact that it succeeded beyond anyone's expectations did not make those earlier expectations invalid. Gary Kildall's ambition was limited, something that is not supposed to be a factor in American business. If you hope for a thousand and get a million, you are still expected to want more, but he didn't.
If this is true, he was the polar opposite of almost everyone else in the story. It reminds me of that line in Breaking Bad when Jesse asks Walter how much is enough. Kildall knew the answer, and stuck with it. I didn't get that sense from any other character, except maybe Woz.
Kildall spent the later part of his career hosting The Computer Chronicles, a PBS weekly news show that ended up documenting the entire early history of personal computers. The shows are on YouTube and are fascinating to watch:
This post brings back so many memories... Novell! The first damn network system that was ever worth a shit on an IBM PC-compatible. Installing that software finally put an end to the nightmare of "sneaker-netting" all my office PCs.
OS/2 failure! Who would have thought that shit-DOS would withstand the challenge of such a superior (see QNX!) and deep-pocketed competitor? What were we all smoking back then??
Jobs and NeXT!...man did I covet one of those ridiculously priced machines, but with no market penetration and my company focused on cheap compatible hardware, it made no sense to develop with it.
I could go on and on...it's rather amazing...without me knowing it I've become my own history book.
At any rate, none of those exactly fit the bill, because Cringely's two "Nerds" documentaries hold a special place in popular tech reporting history.
For a while he did a series of columns exposing the decline and mismanagement at IBM; he eventually turned these into an ebook that I purchased and thought was a reasonably interesting read.
More recently he had cataract surgery and had to move to a new home after his old one was damaged in the California wildfires.
...which was why the idea of 100 percent IBM compatibility took so long to be accepted. "Why be compatible when you could be better?" the smart guys asked on their way to bankruptcy court.
It sounds a bit like a jab at Apple (or maybe even NeXT, which I believe was struggling in 1992 with its hardware releases). AFAIK neither company ever went bankrupt.
Or was it referring to the crop of other personal computing platforms from the late 70s/early 80s? (Vic20/C64, TRS-80, Sinclair, Heathkit, etc.)
Or something else? Big SGI workstations, etc.
The Unix workstation market that NeXT, Sun and SGI competed in was a bit different and didn't really overlap much with the PC market.
One company I remember that was pretty successful for a while was Apricot: https://en.wikipedia.org/wiki/Apricot_PC
The only other issues I can remember, I think, are that the PSU was in the monitor, so the video connectors were odd, and that the keyboard/mouse were Amstrad custom designs, so you couldn't swap them with a regular PC or clone.
I worked at Borland in the UK at the time and I don't remember any real issues.
The other major vendor in the UK was Apricot, which started out rebadging Victor machines and had a similar level of compatibility to the DEC Rainbow (I think it used an 80186 as the CPU).
I think that this is, perhaps, the most apt quote, one that will live on in a state of permanence throughout the generations.
Think about how much better off we'd be if books were available digitally like Wikipedia articles are, and social media posts were restricted to "borrowers" one at a time.
> The rest of the company was as confused as its leadership. Somehow, early on, reorganizations - "reorgs" - became part of the Apple culture. They happen every three to six months and come from Apple's basic lack of understanding that people need stability in order to be able to work together. [...] Make a bad decision? Who cares! By the time the bad news arrives, you'll be gone and someone else will have to handle the problems.
I worked for 6 years at STMicro, and the group I was in (Nomadik SoC) did reorgs every 18 months, practically like clockwork. This was a huge multinational company, and it was my first job so I didn't have any context to weigh it against, but in hindsight it was a huge sign of dysfunction and leadership that didn't know WTF it was doing. As expected, the group shut down a few years down the line.
The place I currently work hasn't had a reorg since I joined, almost 4 years ago. But I'm sure that's largely attributable to a steady and successful product line.
It can be either a vicious or a virtuous cycle...
It's a long-ass time ago.
"Microsoft's entry into most new technologies follows this same plan, with the first effort being a preemptive strike, the second effort being market research to see what customers really want in a product, and the third try is the real product."
Robert Cringely in "Accidental Empires"
You can buy a new copy on Amazon.
This seems off. Those little computers in your phones run a flavor of Unix, just like the mainframes did (besides VMS, etc). Also, those little computers connect to the cloud, which is just like a mainframe. If anything, history is repeating itself.
Both of us being hippies, we got to talking and Steve told me he'd just started a company with a friend and explained the name: "Take a byte out of the apple, get it?"
He mentioned that they needed a 6502 disassembler, and I said that was right up my alley. I'd been programming for eight years and had done low level assembly and binary machine code on several different architectures, so this looked like a fun little project.
We didn't talk money, not that he had any, it just seemed like an interesting weekend diversion.
Steve called me a couple of days later and said, "Mike, I've been thinking about this. Your experience is all with mainframe computers. The 6502 is a microcomputer. It works on fundamentally different principles than those mainframes. There's no way you could possibly write this disassembler, so forget it."
I tried to explain that I'd gone through the 6502 reference, and it was just another instruction set with concepts similar to the others I'd worked with. Steve wouldn't have any of it: "I'm sorry, I've made up my mind. You just don't know anything about microcomputers, all you know is mainframes. Goodbye."
Naturally I said to myself, "Who is this Steve Jobs guy telling me I can't program?" So I went ahead and wrote enough of the disassembler to show that I did indeed know how to do it.
I was about to call Steve back to tell him the good news, but then I thought, "The last phone call didn't go so well. Maybe I'd better drop by his office and show him the code."
So I looked up the address for Apple Computer and found it at 770 Welch Road in Palo Alto. I walked into the building and looked around. It didn't look like a computer company, all I saw was a row of telephone switchboards - the old kind with plugs and jacks - with an operator at each one.
I asked one of the operators where Apple Computer was. She hesitated a moment and said, "Uh, this is their answering service."
Well, of course no successful business used an answering service. I turned around and walked out the door, saying to myself, "These guys are flakes. They're never going to make it."
To this day I don't know if I missed out on becoming a billionaire, or if I dodged a bullet.
I guess Steve thought the CPUs really were different, like, say, the difference between a CPU and a GPU.
There's a lot of revisionist history on the internet about Steve Jobs not being technical, but he had an incredibly detailed understanding of the 68000 by the time the Macintosh project was launched. Perhaps your encounter encouraged him to delve into that area of study.
The word size is different, the special areas of memory are different, it has different opcodes, and so on. But any competent machine language programmer of the day - regardless of which CPUs they'd programmed on before - would have had no trouble understanding it.
That's what Steve didn't understand at the time. Not being a programmer himself, he didn't know how familiar all this would look to any machine language programmer.
"Unaccustomed as I am to public speaking, I'd like to share with you a maxim I thought of the first time I met an IBM mainframe: NEVER TRUST A COMPUTER YOU CAN'T LIFT." -Macintosh
And he wasn't wrong. Those machines were completely different than the "big iron" of the day -- they were single-user, single-tasking systems with no meaningful memory protection or other features to protect the system against poorly written or malicious applications. Microcomputers were supposed to be personal computers, with "personal" in this context meaning that the operator had complete control over 100% of the system's resources, up to and including the ability to shoot themselves in the foot. The big iron world was all about multi-user paradigms like time-sharing (https://en.wikipedia.org/wiki/Time-sharing), so the micro mindset was definitely a radical departure.
The irony, of course, is that the micros ended up eating the world, and then the Internet came along and completely demolished their entire philosophy. It turned out that there's no way to connect a single-user, single-tasking computer to a global public network without it running into fundamental performance and security problems. The two microcomputer OS vendors left standing by the 1990s, Microsoft and Apple, both ended up having to throw out their old systems and spend enormous amounts of time and money building new ones (Windows NT and OS X) that could actually be used practically in the new internetworked world. And, not coincidentally, those new systems looked a lot more like the systems that ran on the old big iron than they looked like the systems that had made the microcomputer revolution. The only way for the "little computers" to survive was to become the "big computers" they had so loudly put up against the wall.
The PDP-7 and PDP-11 were also like that. (And yes, they were used for multi-user scenarios, but even an early micro could be so used, if you e.g. ran a BBS on it.) Memory protection and the like are important features to be sure, but they don't define a class of machine. And the micros quickly gained these features anyway, with chips like the Motorola 68020 and the Intel i386. The early OSes like MS-DOS/Win 9x and Classic Mac OS didn't make use of them, but this was entirely due to performance concerns, combined with running a CPU-intensive GUI. The transition to a memory-protected, secure-by-default OS was entirely foreseeable, even aside from the sensible concerns about an "internetworked world".
VMS also wasn't a mainframe OS: It ran on the VAX, a high-end minicomputer.
Looking at what early microcomputers looked like and what their OSes were based off of (CP/M in particular) it would be more reasonable to say that early microcomputers were scaled-down minicomputers, which were still in a different league from mainframes in terms of number of concurrent users, I/O design, and focus on interactivity.
Mainframes (pretty much "IBM only" today - I don't know of any other actual mainframe computer manufacturer; to be honest, I wasn't even sure if IBM is still making mainframes? OK - just looked, I guess they do) are still batch oriented; what "interactive piece" they have is just a batch process with a "run forever" run time.
Actually, a lot of early interactive computing was done on time-shared mainframes. I learned to program in 1968 when I saw a Teletype machine in our high school math classroom and found out it could dial into a timesharing system where you could write and run programs interactively.
At first we were dialing into a General Electric mainframe, and partway through the year switched to an SDS Sigma 5 at a local Phoenix timesharing company, Transdata. I got my first job there that summer.
The next year I moved to the Bay Area and started working at Tymshare, where we used a variety of mainframes to provide interactive services, along with some minicomputers for network routing. It was the biggest timesharing company of the day, and there were many others.
Timesharing was a big business from the late 1960s through the 1970s. Of course there were many mainframes doing batch processing too, but interactive computing certainly wasn't exclusive to minicomputers or microcomputers.
Does Cray count?
That was true after 1977, certainly, but going back earlier, the S-100 bus computers that ran CP/M had text terminals (and front panels with lights and switches!) and were cheaper than any other kind of real computer, but they were more in the mold of the small-scale business machines of the era - the minicomputers - than the Apple II or Commodore 64 were.
You'd be astonished at how many things run flavors of Unix.
In Search of Stupidity: Over Twenty Years of High Tech Marketing Disasters ( https://amzn.to/2EJjvHG )
is pretty good, although he gets some things pretty wrong himself, like the rise of open source.
He was able to read Robert X. Cringely's book "Accidental Empires" thanks to Archive.org's "virtual library card" service.
This is a service where Archive.org sources a physical copy of a book for you, scans it, and lends it out to you digitally, one person at a time.
Sounds like a useful service. But all this monumentally ungrateful guy can do is moan about how "aggressively user-hostile" it is because you have to double-tap to open a book from the list.
On those rare occasions I want an e-book these days, I go to iBooks or Nook because the library programs (including Bibliowhatsits) are so bad.
Mostly I just buy the physical book, or order a physical copy from the library. It might take a few weeks to get from the other side of the country, but that's less hassle than dealing with the current state of library lending DRM.
Giving UI feedback doesn't mean you don't appreciate what was done well. If the reply to a bug report is "but look at all I've done for you!" you're not going to fix many bugs and no one improves. (I have definitely been there, though.)
* not affiliated in any way, just a happy user.
> If you have an Adobe-compatible ereader (like a NOOK or Kobo), you can download Libby ebooks on a computer, then use Adobe Digital Editions (ADE) to transfer them to your device:
> 1) Open libbyapp.com on a computer and go to Shelf > Loans.
> 2) Select Send To A Device.
> 3) Click select your device, then choose Adobe-compatible ereader.
> Note: If you've sent books to Kindle in the past, you may need to click your Kindle device at the top of the screen instead of select your device.
> 4) Select Download DRM File.
> 5) Open the file in ADE and transfer the book to your ereader. (Learn how using this article from OverDrive Help.)
Also, if they limited loans to 1 hour (or any time short of the regular loan) but allowed reservations ahead, then they could probably have many more people with the book available. The majority of the time I have a library book checked out, it's just sitting on my shelf; with a virtual checkout, while it sits on my virtual shelf the library could loan it to someone else. Either I wait (if I want to access it and all copies are currently in use), or they take a payment from me to access the book immediately and they buy access to an extra copy.
Of course publishers would probably just increase the loan charge.
To be fair I don't know anybody who has either. I know it's possible and I know my local library offers this, but I've never met anybody who's actually used this service.
Check it out, it's a great resource!
Going to libgen is a much better experience.