Xerox’s influence on Steve Jobs and Bill Gates (2018) [video] (youtube.com)
80 points by thunderbong on Jan 3, 2023 | 30 comments



Hmm, the usual confusion about what the Alto was and what the Xerox Star was.

(The Alto was an experimental development system and did not have a consistent user interface. Many of the innovations at PARC were eventually consolidated by Xerox SDD into the Xerox Star, which provided a consistent interface, metaphor, and 'persona' of the computer. The Star was a commercial computer and highly impressive for its time, but not a success. The famous Apple visit comprised a tour of the Alto, but not of the Star. According to their own words, Apple developers like Bill Atkinson had not seen or experienced the Star. On the other hand, they had read papers from PARC even before the visit and had been influenced by them.)

It may also be worth mentioning that Xerox hired a substantial amount of the talent floating around the ARPANet community. (Bob Taylor, founder of PARC's CS department, had previously headed ARPA IPTO.) Rather few projects emerged genuinely at PARC; many ideas already existed, and people were hired because of those ideas. (Possible video, "How Xerox Ripped Off the ARPANet Community"? /s)


Excellent, I can't find anything to correct, and I was there :) This is also an old video.

Maybe a note expanding on "Bill Atkinson had not seen or experienced the Star." That's because in Dec. 1979 there was nothing he could see. It was introduced in May 1981 and barely worked even then.

I have all the facts about Inventing the Future, including the lightning storm that took down the Ethernet: Jan. 5, 1978. Maybe on the 45th anniversary this Jan. 5 we'll have another networking disaster with the big storm.

https://www.albertcory.io/inventing-the-future

I knew about the ARPANet connection, but I don't have a list of names. Eric Harslem, my boss, was on RFC 39 and 40.


Regarding "Bill Atkinsons had not seen or experienced the Star": This was more aimed at the use of icons (esp. desktop icons) and drag&drop interaction on the Lisa, which emerged only late in development, when the Star was already released (introduced in April 1981). However, this was apparently a parallel development, compare this talk [0] by Bill Atkinson at 48:22. (Similarly, just before this, Atkinson mentions the very same background raster alignment problem for desktop icons, the Star had to solve. Compare "Designing the Star User Interface", cited in another comment.)

[0] Bill Atkinson, video talk, Carnegie Mellon University, Feb 04, 2019: https://scs.hosted.panopto.com/Panopto/Pages/Viewer.aspx?id=...

P.S.: I guess realistically there would have been a chance to demo some preview of what became the Star interface in December 1979 (16 months before the official introduction), but this is not what was shown at the (in)famous Apple PARC tours. I guess I'll have to buy your book, though… :-)


> I'll have to buy your book

Why, yes, I guess you will. You're an excellent scholar.

Actually, in my book I mention the Smalltalk prototype of Star, which was done in 1978 or so. It was not kept up to date, and I'm sure he did not see that.


At the end of a talk for the EE380 course at Stanford with L Peter Deutsch, Allan Schiffmann tells a story about being at the launch of the Xerox Star. As more and more things were demoed, the guy behind him kept cursing very loudly. Allan got fed up with this and turned to tell the guy to be quiet; it turned out it was Steve Jobs.

The April 1982 issue of Byte magazine had a very detailed article about the Xerox Star: "Designing the Star User Interface" by Dr. David Canfield Smith, Charles Irby, Ralph Kimball, Bill Verplank, and Eric Harslem.


I have this in my book (not the cursing part). I was worried that only Dave Liddle even knew about Jobs being there, so now I feel better.

It's possible I talked to Jobs without knowing who he was! I talked to a lot of people.

Dave Smith wrote the foreword to my book, and that's his son on the cover, "playing" MazeWar. There's a guy in Seattle with two working Altos in his basement, and that's the real software.


> The April 1982 issue of Byte magazine had a very detailed article about the Xerox Star: "Designing the Star User Interface" by Dr. David Canfield Smith, Charles Irby, Ralph Kimball, Bill Verplank, and Eric Harslem.

Ooooh. Found it here: http://billverplank.com/CiiD/XeroxStar.pdf



I'm kinda disturbed by the notion that Xerox took anything from the ARPANet. By the time I was involved, people like Lixia and Steve were really core members of the design group, and I always considered their participation and willingness to host meetings a huge boon to the community.


This was more a snarky remark aimed at those "how X stole from Y" narratives, not a serious criticism.

(Progress is always a combination of ideas floating around and brilliant people who sharpen these ideas and make them happen. And one of the ways to identify those people is listening to their ideas, which is, in turn, why and how these ideas spread…)


People always seem to forget that Xerox borrowed a lot of ideas themselves.

Douglas Engelbart at Stanford Research Institute gets credit for many innovations, including the mouse. The infamous "mother of all demos" can be seen here:

https://www.youtube.com/watch?v=B6rKUf9DWRI

Going back to the early 1960s, there was an equally famous GUI demonstration at MIT, Ivan Sutherland's Sketchpad, which used a light pen instead of a mouse as the input device for rudimentary CAD.

https://www.youtube.com/watch?v=6orsmFndx_o

Finally, anybody who claims that the Mac or Windows was nothing but a copy of the Alto has never seen an Alto in action.

https://www.youtube.com/watch?v=9H79_kKzmFs


Almost all ideas can be traced to something else.

Icons were Dave Smith's invention. He came from SRI, so one could easily find someone there who inspired him.

Apple and Microsoft both made significant contributions and Mac/Windows were in no way a "copy of the Alto." In fact, Star itself was not a copy of the Alto; that was the problem it had to solve! The Alto's programs all had different interfaces.

"Direct manipulation with the mouse" was what wowed Jobs, and what he saw was Smalltalk, which had a different interface than any of the stuff I mentioned.

Interestingly, Bill Atkinson was sure he'd seen something cool at the demo that he liked, so he implemented it. It turned out that he hadn't actually seen it.


> Interestingly, Bill Atkinson was sure he'd seen something cool at the demo that he liked, so he implemented it. It turned out that he hadn't actually seen it.

In a little more detail: the Smalltalk MVC (model-view-controller) GUI had an overlapping-windows scheme, but only the topmost window was active. Since it was topmost, it wasn't partially covered by any of the others, and updating it was trivial. Clicking any other window brought it to the top, and only then was it updated.

Switching between windows was a lightweight operation, which gave Bill the wrong impression that even background windows were being updated. That would have required clipping graphics operations by arbitrary shapes, which Smalltalk didn't do.

Bill spent a while figuring out how to do that, and as a side effect could easily implement non-rectangular windows, like the rounded corners that Steve Jobs wanted.
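
To make the clipping point concrete, here's a minimal sketch (my own illustration in Python, not Smalltalk's or the Lisa's actual code) of restricting a drawing operation to a window's visible region, modeled here as a list of non-overlapping rectangles:

    # Toy framebuffer; a "region" is a list of (x0, y0, x1, y1) half-open rects.
    W, H = 40, 12
    fb = [[' '] * W for _ in range(H)]

    def rect_minus(a, b):
        """Subtract rect b from rect a, returning the visible sub-rectangles."""
        ax0, ay0, ax1, ay1 = a
        bx0, by0, bx1, by1 = b
        if bx1 <= ax0 or bx0 >= ax1 or by1 <= ay0 or by0 >= ay1:
            return [a]  # no overlap: a is fully visible
        out = []
        if ay0 < by0: out.append((ax0, ay0, ax1, by0))    # strip above b
        if by1 < ay1: out.append((ax0, by1, ax1, ay1))    # strip below b
        mid0, mid1 = max(ay0, by0), min(ay1, by1)
        if ax0 < bx0: out.append((ax0, mid0, bx0, mid1))  # strip left of b
        if bx1 < ax1: out.append((bx1, mid0, ax1, mid1))  # strip right of b
        return out

    def fill(region, ch):
        """A drawing operation clipped to a region."""
        for x0, y0, x1, y1 in region:
            for y in range(y0, y1):
                for x in range(x0, x1):
                    fb[y][x] = ch

    back, front = (2, 1, 30, 10), (10, 3, 38, 11)  # front partially covers back
    fill(rect_minus(back, front), '.')  # back window: draw only what is visible
    fill([front], '#')                  # topmost window: fully visible
    print('\n'.join(''.join(row) for row in fb))

Real region machinery (QuickDraw's, for instance) handles arbitrary shapes rather than just rectangle lists, which is also what makes things like rounded corners cheap once it exists.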


Thanks. I do recall beginning GUI programmers being flummoxed by "repaint this rectangle" calls. All they knew how to do was repaint the entire window, which was not at all what was wanted.
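
For what it's worth, the pattern they were missing looks roughly like this (a toy sketch with hypothetical names, not any real toolkit's API): the system hands the paint handler a "dirty" rectangle, and the handler redraws only the overlap of its content with that rectangle.

    class Window:
        def __init__(self, w, h):
            self.w, self.h = w, h
            self.cells_repainted = 0

        def on_paint(self, dirty):
            """Repaint only the dirty rect, clipped to the window bounds."""
            x0, y0, x1, y1 = dirty
            for y in range(max(0, y0), min(self.h, y1)):
                for x in range(max(0, x0), min(self.w, x1)):
                    self.cells_repainted += 1  # stand-in for actual drawing

    win = Window(640, 480)
    win.on_paint((100, 100, 120, 110))  # 200 cells repainted, not all 307,200
    print(win.cells_repainted)          # -> 200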


The book "The Dream Machine" by M. Mitchell Waldrop was a great read if you're interested in learning more about the inspirations behind PARC itself. It covers the history of personal computing from the early 20th century to the end of the 20th century, and does a great job covering a lot of the personalities behind the tech that is so ubiquitous today.


I learned of Doug Engelbart while doing my thesis in the early '90s and was struck dumb when I was finally able to actually watch the demo on YouTube. Rumor has it Stewart Brand was the A/V tech.


More interesting to me is Douglas Engelbart's influence on Xerox:

https://dougengelbart.org


By the way, Jerry Morrison and I wrote a paper revisiting all this from the point of view of, "OK, we know all that history. So what should they have done?" You might find it entertaining:

https://www.albertcory.io/lets-do-have-hindsight


The video opens with the claim that Apple 'stole' from Xerox, but students of history will know that Apple took a $1M investment from Xerox, and the visit to Xerox PARC was part of that arrangement.

I know Xerox sold some stock at the IPO in 1980 but still owned a chunk; it would be interesting to see what their potential upside would be today.

https://en.wikipedia.org/wiki/History_of_Apple_Inc.


More surprising to me: Xerox's patents were allowed to expire. What I draw from this is that when patents are allowed to work as intended[0], corporations are much more likely to invest in new technologies and push for new innovations. Xerox was still able to reap billions off the printer market they invented, but this seems to have kept them on their toes, founding PARC, which inspired / led to tons of innovation in its own right (even if it wasn't owned by Xerox).

I know this isn't the direct point of the video, but that is my ultimate takeaway here: that the patent system, when applied as intended[0], actually drives innovation and reasonably protects innovators long enough for them to make enough to fund their next thing, etc.

[0]: Approximately what I mean by this: giving the inventor(s) a monopoly on a technology / innovation for a finite period of time, after which it is released into the public domain.

edit: apparently I got my facts wrong (I took the video at its word that the patents expired), but that is, in fact, not the whole story. Though I think my broader point is still sensible: a functioning patent system, working as intended, would foster innovation & investment that otherwise wouldn't be fostered.


Xerox did not 'allow' their monopoly on copiers to expire. They regularly filed large numbers of patents and engaged in monopolistic commercial practices in an attempt to lock up one or more elements of the supply chain. The FTC then forced them into a consent decree in the 1970s that required them to license the patent portfolio at a reasonable royalty fee. This allowed the Japanese market entrants to compete and therefore innovate for the benefit of the consumer.

Edit: arguably, because Xerox did not have to compete with the Japanese until legal actions were taken, they were NOT actually on their toes and lost ground rapidly once the market was opened. Had they been forced to compete by the late 1960s instead of mid 1970s, they may have been in a better position to sustain themselves against foreign competitors.


I should have read this before my comment, sorry. You're correct.

Laser printers didn't create PARC -- it was the other way around.


They're not "allowed" to expire; they have an inherent time limit. Of course, smart patent holders make "improvements" and get new patents before that happens.

On the printer market: maybe you mean the laser printer? The 9700 came out in 1977, and the other laser printers a few years after. PARC was founded in 1970.

There was also an antitrust settlement that Xerox made, which forced them to license the patents to competitors before they expired.


That's how the insulin business works... the patent expires, a new version comes out, and the old version is removed from production.


Where would you go to 'see the future of computing' today?


You can find lots of labs claiming they have it. Determining which ones really do is the hard part - otherwise everyone would be investing in it.


AI, quantum computing.

The first is becoming a bit of a buzzword, sometimes with great (excessive) expectations not in line with what the actual thing is: statistical algorithms fed a very large amount of data in order to build a model, consisting of serialized data saved on disk, that serves as a filter to classify input or to generate new output, such as pictures (Midjourney), text (ChatGPT), or actions (braking, for a self-driving car).
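
That parenthetical, boiled down to a toy sketch (a nearest-centroid classifier in Python, stdlib only; entirely my own illustration with made-up data): fit a statistical model on data, serialize it to disk, reload it, and use it to classify new input.

    import pickle

    def fit(samples):
        """samples: {label: [(x, y), ...]} -> model mapping label to centroid."""
        return {lbl: (sum(x for x, _ in pts) / len(pts),
                      sum(y for _, y in pts) / len(pts))
                for lbl, pts in samples.items()}

    def classify(model, p):
        """Pick the label whose centroid is nearest to point p."""
        return min(model, key=lambda l: (model[l][0] - p[0]) ** 2 +
                                        (model[l][1] - p[1]) ** 2)

    model = fit({'cat': [(0, 0), (1, 1)], 'dog': [(8, 8), (9, 9)]})
    with open('model.pkl', 'wb') as f:  # "serialized data saved on disk"
        pickle.dump(model, f)
    with open('model.pkl', 'rb') as f:
        print(classify(pickle.load(f), (2, 1)))  # -> cat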

Quantum computing seems to be progressing quickly, but according to comments on an article published yesterday, it's still prone to errors, which makes it unreliable for actual commercial use; for now it's mostly used as training for future quantum computing software devs.

The problem is that the people reporting on these things, for example on YouTube, don't always understand how these technologies work or what they do, and have excessive expectations about them. The other problem is that people with skin in the game (scientists, startup CEOs, investors, paid commenters looking for clicks) do the same in order to make a financial gain.


If you ask me, I think integrated circuits are going to get bigger. I find them too small and disposable right now.

I'm trying to figure out how to get the most value of large circuits.



…nice little video, but then completely ruined by the ad at the end. Ironically, it kind of shows how we have evolved: from fresh, new, and exciting technologies to the blistering reality of miles of tabloid ads at the end of every article, and ads incorporated even into a tech video about Xerox and the old days…




