Larry Tesler Has Died (gizmodo.com)
1346 points by drallison on Feb 19, 2020 | 149 comments

I knew Larry Tesler as a colleague, friend, member of my research group, manager, etc. for more than 50 years, almost as long as I knew Bert Sutherland.

There is an excellent obit for Larry at: https://gizmodo.com/larry-tessler-modeless-computing-advocat...

... and I expect another one from John Markoff -- who was a friend of his -- in the NYTimes.

In many ways, Larry did so many interesting things and had so much influence in so many areas that there is no chance to characterize him technically in a note like this. In short, he was a superb wide-spectrum (real) computer scientist who was also a very talented and skilled programmer.

His passing was sudden and unexpected, and I may return later to this note to add more details of his rich career.

For now, I remember him as great to work with in all aspects of his life. He was a great guy, and perhaps that sums him up as best as can be.

Is anyone from Parc still involved in research these days? Or is everyone retired or just chugging along at regular corporate jobs?

Most of the people who were at Parc who are still alive are still doing research.

Pretty much only one wanted to get rich (and did). Several were more or less forced into becoming rich. Money has its own dynamics and none of these folks wound up doing further research.

But Butler Lampson (the "Oppenheimer" of Parc) is still going strong, as am I and many others.

It was a calling, never a job.

When I was 17-18, I read 'Dealers of Lightning', which clued me into the foundations laid for modern computing in the '60s and '70s in the Bay Area (between SRI, SAIL and PARC), and markedly changed my own approach to information and computing.

I've always wondered: how did those who were featured in the book feel about it?

The best book about the ARPA/Parc research community (Parc sprouted from ARPA) is "The Dream Machine" by Mitchell Waldrop: it is both the most complete and most accurate.

"Dealers of Lightning" is at the next level but far from the bottom. Its flaws are too much "Hero's Journey" and a very complex, confusing timeline that jumps around (I had trouble orienting myself in some spots). But it also has a lot of good stories, of which a reasonable number are "true enough".

"Fumbling The Future" is extremely inaccurate.

Thanks! I'll add that one onto my reading queue.

I actually didn't have too much trouble following Dealers, because it (more or less) mostly followed each project separately, which while creating an interspersed timeline in the book, was mostly coherent within each section.

I did read at least part of Fumbling, but I found it a hard book to thread the needle on, and it was in such stark factual disagreement with what else I'd read that I don't recall finishing it.

I know too many people who draw the same conclusions from the Alto and related technologies that Fumbling does, so I try to get people to read more about PARC, in the hope that once they know more they'll draw much the same conclusions I have. My frustration with much of the traditional criticism of Xerox for failing to commercialize the PARC innovations is that it completely ignores both the high cost of the technology (it was literally the technology of the future) and the sales culture of technology at the time.

I don't think any large technology company (which Xerox broadly was) could have made something wondrous out of the innovations PARC created, because the people who could recognize the value (and use) of this kind of technology were not the people being sold to, or for that matter doing the buying; nor did they have the budgets to buy an Alto, as was later seen with the Star when it came out.

It took direct-to-consumer sales (allowing department managers to buy stuff) and lower-cost products for personal computing to penetrate the home and corporate America. The traditional criticism also completely ignores that the 9700 (and follow-on products) paid for the money spent at PARC several times over.

(Incidentally, I believe that this sales-culture issue is a prime reason why DEC no longer exists as a company: they failed to see that their market was shifting and could no longer be sold to through the same mechanisms they always had used.)

In the end, basic research and the undirected applied science that flows from it are important, even if they have no direct tie to your line of business, because they drive a company forward, and frankly drive humanity forward. I wish more people knew that the modern interconnected world we live in was built on billions spent with no firm idea of what would result from them, and how much of a debt we owe to PARC, Bell Labs and others.

Also, thank you for taking the time to respond to my question!

Yes, the separate timelines of "Dealers" quite missed the cross-fertilization and synergies of Parc (which were its main unusual features), so to me this is a real drawback of the book. On the other hand, "systems" don't parse well into sequences, and Parc was a system, and thus needs something more like a 2D or 3D or 4D chart to do a decent explanation.

There are at least two big issues regarding cost that many people miss: (a) the first is the difference between what should be spent on "prototypes for learning and vetting" and what can be done when designing for manufacturing, and (b) the second is the difficulty most people had with valuing what personal computing might be for them.

In the first public paper I wrote about the Dynabook I pointed out that Moore's Law meant that powerful tablet sized personal computers would likely wind up costing what a color TV set would cost (they would have pretty similar components, and most of the cost in electronics is in packaging).

But we also had another analogy that we thought could work via education: that of the personal automobile in the US. People value cars enough to be willing to pay quite a bit more for them than for most consumer devices. This was very interesting because the ARPA dream of an interactive personal computer connected to a world-wide network was a kind of "information and intelligence vehicle".

If people could see this, then they might be willing to pay what they would pay for a car. Certainly most computer people and most scientists and engineers would be able to assign value in this way. We thought most knowledge workers would eventually be able to see this also, and that there would be an intermediate phase before getting to the TV set kind of technologies.

An analysis of what happened to eventually quash this idea is beyond the scope of this note. (But, to make a point in talks, I've tried to get people to think about what "a car's worth" of personal computer could be like. The average car in the US a few years ago was a Toyota Avalon at $28K, so about 10 times what most personal computers go for.)

This is a different slant than the problem that DEC and similar companies had (which was to not be able to understand personal computing in any reasonable form).

"It was a calling, never a job." - This is so heart-warming to read, in a world where the lust for more and more money seems to be so depressingly high at times.

> Pretty much only one wanted to get rich (and did).


I think Alan Kay is referring to Charles Simonyi. According to Forbes, his net worth is $4.3 billion: https://www.forbes.com/profile/charles-simonyi/

I was thinking of Eric Schmidt. A quick check shows a net worth of $14.7 billion.

Looking at all of the work that came out of Parc during your time there... What drew you to that company to begin with? I find it amazing that so many of the Big Names in computer science/software engineering came from the same place.

Parc was an "extension" or "outgrowth" of the ARPA (before the "D") sponsored computer research in the 60s that was catalyzed by Congress overreacting to Viet Nam protests and changing ARPA's charter for the worse.

ARPA/Parc as a community had the best and most enlightened funding for computing research starting in 1962 (Parc started in 1970), and a very large percentage of the familiar technologies of today -- including personal computing, tablets, dynamic OOP, the GUI, the Internet, etc -- were invented by it.

The best (and pretty accurate) book about this remarkable group is "The Dream Machine" by Mitchell Waldrop.

Bob Taylor, who had been a director of the ARPA computing research, looked for a way to fund some of the "ARPA Dream" projects that Congress was curtailing, and found Xerox (which wanted to set up a longer range research lab).

Taylor was particularly interested in recruiting a number of the young Phds that ARPA had funded, and I was one of them.

We've put a link to that article in the title above. (This submission was originally a short text post.)

Larry kindly traded letters with me when I was a young man attempting to learn programming via Object Pascal. Eventually, my mom made me write him a check for all the postage he had spent. In addition to sending me at least two letters a week for just around a decade, he shipped me dozens of books and manuals. One year for the holidays, someone sent me 4 large FedEx boxes filled with networking gear I desperately needed for a “M”MORPG game I was building. The return label read “53414e544120414e442048495320574f524b53484f50”. In the game, players were elves scrambling to defeat a corrupted workshop. The final boss was S̶a̶t̶a̶n̶ Santa himself.

It was only when I was older that I appreciated that he had probably sent me thousands of dollars worth of gear (and not in 2020 dollars!) in addition to the invaluable advice he provided, sometimes (frankly, often) unsolicited but always direct and always thought provoking.

While I never did become an extremely competent commercial developer, to this day I enjoy programming for programming's own sake. Larry's push for me to fix my own headaches, rather than simply giving me a metaphorical aspirin, resulted in my developing solutions for small hobby problems that, it often appeared, only I and perhaps a few others shared.

As it turns out, in spite of (or thanks to) my niche interests, my curiosity and the method of targeted problem solving Larry fostered set me on a path I remain on today. Frankly, his contributions helped mold me as a man more than those of any other mentor of mine; that is absolutely meant as a compliment to his prescient pedagogy, rather than a slight at my life’s many other wonderful influences.

I’ve sold a few businesses thanks to Larry’s problem solving approach. The rest I founded are running profitably - and somehow I’ve never lost an investor money. My customers have always, above all else, been happy because they had their problems fixed. (Or, perhaps thanks to his influence, their happiness stemmed from my teams simply providing them with the tools they needed to solve their own problems!)

And because I followed Larry’s personal advice, I have been able to spend every day for nearly two decades doing what he encouraged and what has consistently engaged me: finding, isolating and destroying problems.

Thank you for everything.

Cute... 53414e544120414e442048495320574f524b53484f50 = SANTA AND HIS WORKSHOP

Ha, I was wondering how you managed to reverse the hash and then had a facepalm moment.

I still don’t get it.

I use this one a lot; it has various decoding tools and you can apply "recipes": https://gchq.github.io/CyberChef/#recipe=From_Hex('Auto')&in...



Those numbers represent the ASCII codes in hex.

To get you started.


53 = S

41 = A

4e = N


It's hexadecimal ASCII.

Python, decoding and encoding the message:

a = '53414e544120414e442048495320574f524b53484f50'

"".join([chr(int(a[i:(i+2)], 16)) for i in range(0, len(a), 2)])

=> 'SANTA AND HIS WORKSHOP'

"".join([hex(ord(c)) for c in 'SANTA AND HIS WORKSHOP']).replace('0x', '')

=> '53414e544120414e442048495320574f524b53484f50'

An easier version and IMO more true to what's actually going on:

>>> bytes.fromhex('53414e544120414e442048495320574f524b53484f50')

b'SANTA AND HIS WORKSHOP'



Thanks, nice. bytes.fromhex is really reading bytes from a hex string, so perhaps a better name would be bytes.fromhexstring, but since Python names are short I understand the trade-off. Likewise, my first impression is that hex should be called bytes.tohexstring, but for those using Python daily I understand the need for shorter names.

It's a hexadecimal encoding of the ASCII character codes: 53 41 4e 54 41 20 41 ... etc.

Wow, that is quite the gesture.

Great story, thanks for sharing.

Great story to honor him. Maybe it is now time for you to be the 'Larry' in others' lives.

> doing what he encouraged

Did he encourage you to be you?

Are you you because of him? Maybe not; it's impossible to grade.

But here you are paying respect to a man that you met, so I would say: he had an impact on you. Even at that moment.

We could play high-school politics and ask what you learned from him.

But from your message it's clear.

I never met him personally, but I certainly felt his impact.


I met Larry in about 1992 when I went to work on the Newton project. I had seen him around Apple before, and I knew who he was and what he was known for, but I didn't actually meet him until I joined the Newton team. I found him friendly, modest, smart, shrewd, compassionate, full of interesting knowledge and ideas, and interested in other people and their ideas.

I got to know him better when John Sculley ordered him to have the Newton team ditch its Lisp OS and write one in C++. Larry approached me and a couple of other Lisp hackers and asked us to make a fresh start with Lisp and see what we could do on Newton. We wrote an experimental OS that Matt Maclaurin named "bauhaus".

Larry had a sabbatical coming up right about then. He took it with us. He crammed into a conference room with three or four of us and hacked Lisp code for six weeks. He was a solid Lisp hacker. He stayed up late with us and wrote AI infrastructure for the experimental OS, then handed it off to me when he had to, as he put it, "put his executive hat back on." He hung around with us brainstorming and arguing about ideas. He had us out to his house for dinner.

A little later, when things were hectic and pressure was high on Newton, one of our colleagues killed himself. Larry roamed the halls stopping to talk to people about how they were doing. I was at my desk when he came by, next to another colleague that I considered a friend. Larry stopped by to check on us. My friend had also been a good friend of the fellow who had died, and he lost his composure. Larry grabbed a chair, pulled it up close and sat with him, an arm around him, patting him gently while his grief ran its course.

After Newton was released, Larry moved on to other projects. I worked on the shipped product for a while, but I was pretty burned out. Steve Jobs persuaded me to go to work for NeXT for a little while.

Steve is infamous for being, let's say, not as pleasant as Larry. In fact, he sat in my office once trashing Larry for about half an hour, for no good reason, as far as I can see. I politely disagreed with a number of his points. Larry made important contributions to the development of personal computing, and he didn't have to be a jerk to do it.

Larry was extremely smart, but I never knew him to play I'm-smarter-than-you games. I saw him encourage other people to pursue, develop, and share their ideas. I found him eager to learn new things, and more interested in what good we could do than in who got the credit for it.

We weren't close friends, except maybe when we were crammed in a conference room together for six weeks. I didn't see him much after Newton, though we exchanged the occasional friendly email over the years.

I was just thinking lately that it was about time to say hello to him again. Oops.

Larry Tesler was one of the best people I met in Silicon Valley. He was one of the best people I've met, period. I'll miss him.

This is the sort of environment I had always envisioned growing up dreaming of being a software professional.

I've lost some colleagues along the way too. You never know when it's going to happen; every chance to speak should be treated with the respect of knowing it could very well be the last chance to make a connection.

(I'm having trouble with this seeming to be a feel-good anecdote about a high-pressure working environment in which people are burning out and killing themselves.)

I suppose that, in part, it's exactly what you say it is.

If it's a feel-good story, I think that must be because I feel good to have had the opportunity to meet and work with Larry Tesler. He impressed me with his intelligence, his generosity, and his compassion. I feel that I'm better for having known him, and I suppose that comes through in my account.

You're right: Newton was a pressure cooker. Larry didn't put that pressure on us, though. We put it on ourselves. We got the idea that there was an outside chance of making something great, and we pursued that dream as hard as we could. Some of us--I include myself--were intemperate in that pursuit, and it cost us.

Now, the pursuit of greatness is a species of vanity, and vanity is a cruel and fickle master. But in Newton's case, at least, I think those of us who were seduced by that vanity have only ourselves to blame.

Thanks for the context!

I guess the feel-good part is remembering that Larry Tesler was a good person inside of and despite that environment.

And I think it's a polite way of reminding everyone that being an asshole is not required to do great, world-changing work.

Your second point may be one reason why, whenever I think of Larry, I also think of Steve Jobs.

Steve was infamous for riding roughshod over employees in pursuit of making something great.

Larry's example shows that such treatment is not necessary to get people's best work. If you can't get great work out of people without abuse, that's your own limitations showing; it's not a law of nature.

I suppose I also think of Steve because I met and worked for Larry and Steve in the same fairly short period of time, and because both projects were high-risk, low-percentage attempts to create something great. Also, perhaps, because Steve was opinionated about Larry and I argued with him about it.

This is a great story, definitely made things a little dusty for me. Thanks for sharing. My own experience with him was also special - he was such an amazing kind, generous, and insightful person. And he would easily qualify for my "ten smartest people I've ever met" list if I was the sort of person who made such lists :-)

Having worked on the Newton you might get a kick out of this - when I was in high school there was a guy still using one with a WiFi PCMCIA card, probably straight up to when the iPhone launched. I imagine he jumped to a smartphone eventually, but he was still on the Newton in 2006.

Seemed like a neat little device.

Larry sounds like the kind of guy I'd want to know. Also, I didn't know Newton was written in Lisp, that's so cool.

The released one wasn't. Its OS was written in C++ and its apps were written in NewtonScript, an interpreted language that loosely resembled JavaScript (but well before JavaScript was created).

The first Newton OS was written in Lisp (specifically in a dialect called Ralph, which was basically Scheme plus CLOS) over a C++ microkernel.

By the way, that Lisp was written by Apple Cambridge, a group in Apple that was created when Larry arranged for Apple to purchase the assets of Coral Software and hire some of its best hackers to design a new programming language.

Apple later spun out that group to create Digitool, which took with it the code and the rights to Macintosh Common Lisp. Digitool did not prosper, and one of its employees, Gary Byers, negotiated the rights to turn the MCL compiler into an open-source implementation that later became Clozure Common Lisp. So Larry is also responsible for the creation of Clozure Associates and Clozure Common Lisp.

There were some issues with the first Newton OS that led to two developments:

1. John Sculley ordered Larry Tesler to redo the OS in C++. That's the version that shipped.

2. Larry asked me and a few other Lisp hackers to see what we could do on Newton with Ralph. That resulted in the bauhaus OS, which didn't ship.

The authors of the first Newton OS were smart programmers, but they weren't Lisp hackers. Larry speculated that that had something to do with some of the early issues, and maybe it did. I think our version offered some improvements.

Another set of issues had to do with the UI design of the initial OS. Larry was critical of it on the grounds that it was basically trying to be a desktop UI in a handheld device. He thought we should try for a UI experience more suited to a new kind of device, and, in the end, both the shipping OS and the bauhaus OS did what he asked, and were better for it.

(In a final evaluation meeting for bauhaus, our managers told us that we had met and exceeded all goals of our experimental project, but Apple was of course going to ship the OS that the CEO told us to ship, and it was of course not going to ship two OSes for the device. None of this was a surprise to the bauhaus team. We were grateful to have had the support to work on the project for as long as we did, and disappointed that it was over.)

Wow, thanks for sharing that history. It's super interesting. Now I know where CCL came from too - so cool!

This breaks my heart. I used to work next to Larry—literally sat next to him—on Yahoo’s central design team. We were in frequent meetings together, but didn’t talk one-on-one often. One evening commuting from work, during one of many Caltrain failures, he noticed me as I waited outside the train and offered me a ride home. I remember sitting nervously in the car, a bit awestruck, and I finally got up the courage to ask him “Did you really invent copy and paste?!”


From then on the ice was broken and we chatted more freely: fun discussions about the (then) up-and-coming voice recognition UIs (I compared them to CLIs which he liked), wearables, design, and cycling.

I consider him a friend. Didn’t expect us to lose him so soon.

To clarify, as the dialogue could be construed otherwise: Larry was actually very humble. While he was not as famous as he should have been, he had so much influence on the industry that it could easily have gone to his head. He was very approachable and helpful, and overall a generous and kind person. He will be sorely missed.

It’s ok, I think anyone would realise that the fact he gave you a lift meant he was a pretty nice guy. I really enjoyed reading your story. I’m sorry you lost a friend.

> he noticed me as I waited outside the train and offered me a ride home

Ctrl-X + Ctrl-V

Oh my, Larry was only 74? That is far too young.

We were friends, off and on. Perhaps somewhat "off" after I stole his girlfriend. (In my defense, it was her idea!) But that was 35 years ago, and all was forgiven (and hopefully forgotten) in more recent years.

Here is Larry's Smalltalk article from the August 1981 BYTE, complete with a photo of the famous T-shirt that a mutual friend made for him:


A couple of other good articles:



He gave us so much more than cut, copy, paste. It's clear from all the design history books that I've read that he's a legend.[1]



[1] One of the more rare sources for Larry Tesler's contributions is his interview for Bill Moggridge's Designing Interactions (http://www.designinginteractions.com/interviews/LarryTesler)

The Wikipedia article for Cut, Copy, and Paste[1] seems to have this bit that's cited to that book:

> Inspired by early line and character editors that broke a move or copy operation into two steps—between which the user could invoke a preparatory action such as navigation—Lawrence G. Tesler (Larry Tesler) proposed the names "cut" and "copy" for the first step and "paste" for the second step. Beginning in 1974, he and colleagues at Xerox Corporation Palo Alto Research Center (PARC) implemented several text editors that used cut/copy-and-paste commands to move/copy text.

I imagine those "early line and character editors" refer to vi's delete, yank, and put, and Emacs's kill, copy/"save as if killed", and yank. I wonder what other editors had back then, before the names he came up with became standardized.

I also wonder how the idea of the operations developed before Larry Tesler contributed to it.

Looking at POSIX[2], it seems ex has delete, yank, and put, but I can't see similar functionality in standard ed (GNU's ed does have yank, but I guess it's an extension).

[1] https://en.wikipedia.org/wiki/Cut,_copy,_and_paste#Populariz...

[2] https://pubs.opengroup.org/onlinepubs/9699919799/

TECO from the 1960s had more than cut and paste: multiple storage slots called "q registers" (they were named with letters) into which you could put text from the main buffer, and from which you could retrieve it back. Text regions in the buffer were referred to by numeric ranges, some number of characters forward or back relative to the current cursor position, with additional notions such as "here to beginning/end of line" or "n lines back/forward from cursor".

TECO commands were typed in as code, essentially, and that code could also be saved and run as macros.

It was extremely powerful, such that Emacs was originally written as a set of TECO Editor MACroS.

Ah! That's why vi's letter-named registers are written to with the q command.

I wonder if they're supposed to be "q(uick) registers".

EDIT: Maybe not:


I was confused by the ‘q’ command in vi, so I did some quick testing (in nvi(1)) and scanned vim cheat sheets. I still don’t know what ‘q’ does in the named-buffer context - help?

Recording macros, I think.

Looks like you’re correct![0]

In nvi, I just key my commands on a scratch line (testing occasionally), then:

"xyy

to load the whole line (yy) into buffer ‘x’ (“x). Execution (@x) thereafter looks ~same.

[0] https://vim.fandom.com/wiki/Macros

FWIW, Larry also contributed some of his historical knowledge to the Talk page [1] of the Wikipedia article and also edited the article himself.

[1] https://en.wikipedia.org/wiki/Talk:Cut,_copy,_and_paste

Nice catch! And it was on a discussion on the question of "Does anyone know where this idea came from?"

Here's a nice bit:

> I chose Z/X/C/V when I was in charge of the user interface design of Apple's Lisa. In addition to their adjacency on the keyboard, I wanted them to have mnemonic value: "X" a cross-out; "V" an inverted caret or proofreader's arrowhead; "C" the first letter of "copy"; the strokes of "Z" tracing a reversal followed by a new path forward. -- Larry Tesler

> Correction: Apparently, my memory was incorrect. The Lisa user interface seems to have used "U" for Undo. On the Mac (as in Gypsy), "U" was and is for Underline. I do not know who chose "Z" for Mac Undo, or why. I suspect that its proximity to the Command key was the reason. -- Larry Tesler

There's more, but it's too much to insert here.

I'm just hoping this isn't someone posing as Larry Tesler.

Douglas Engelbart demonstrated copy/paste with a mouse in 1968 [1], however I'm not sure what he called the process. It was also very much an experimental system and not something for sale.

[1] https://www.youtube.com/watch?v=yJDv-zdhzMY

> I imagine those "early line and character editors" refers to vi's delete, yank, and put, and emacs's kill, copy/"save as if killed", and yank.

No, 1974 predates both vi and Emacs.

You're right, I missed that. Vi's predecessor ex, which also had these operations, is from '76, so it couldn't be it either. However, TECO (on which Emacs was originally built), which also had yank, is from '62/'63, so that might've been one.

Ed doesn't seem to have those operations. GNU ed's documentation includes yank, but it's not described in POSIX.

There's more than emacs or vi in "editor history" e.g. https://en.wikipedia.org/wiki/Bravo_%28software%29 and follow-ups at Xerox PARC.

The inspiration for my username.

RIP Larry Tesler

I have a .vimrc file that sets C-C/C-X/C-V to work in each mode; that gets me the best of both worlds: fast text navigation in normal mode, since I can switch to the next/previous word with w/b, but I can still copy and paste (it limits us to one buffer, though; nothing is perfect).


If you want to watch the interview and have trouble with Flash, you can download http://www.designinginteractions.com/fla/LarryTesler.flv and then ffplay it.

Is there somewhere he elaborates on his No Modes philosophy?

Is it a blanket rule for him for all interfaces, or just text editors?

I don't know if he wrote anything formal, but I remember from talking with him, from his critiques of particular interfaces, and from projects that he was interested in, that he favored UI that made operations as concrete and manifest as possible, that made it as easy as possible to discover operations by experimenting, that made mistakes as low-cost and painless to recover from as possible, and that featured direct manipulation.

See his article "The Smalltalk Environment" in the August 1981 issue of Byte magazine: https://archive.org/details/byte-magazine-1981-08/mode/2up

The basic idea is that modes make the same action (pressing the "D" key, for example) do different things. They make things easier for programmers who want many operations on machines that only have a few possible actions, but they make things hard for users, who have to pay attention to the current mode and know how to navigate from it to the mode where the desired operation is possible.

It is easy to overlook the biggest source of modes: applications!
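The point about modes can be sketched in a few lines of Python (a hypothetical toy, not modeled on any real editor): the same keypress dispatches to different operations depending on hidden state the user must keep track of.

```python
# Hypothetical sketch of the mode problem: what a key does depends on
# hidden editor state, not just on the key itself.
class ModalEditor:
    def __init__(self):
        self.mode = "command"  # hidden state the user must track
        self.bindings = {
            "command": {"d": "delete line", "i": "switch to insert mode"},
            "insert":  {"d": "type the letter d"},
        }

    def press(self, key):
        # Look up the key in the current mode's bindings.
        return self.bindings[self.mode].get(key, "beep")

ed = ModalEditor()
print(ed.press("d"))  # command mode: delete line
ed.mode = "insert"
print(ed.press("d"))  # insert mode: type the letter d
```

A modeless design, by contrast, would give each operation its own unambiguous gesture, so the user never has to ask "what state am I in?" before acting.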

Here is a fascinating talk where he describes the approach for designing Apple's Lisa:


Fascinating, because that exploratory process produced essentially what we recognize as the "graphical user interface" today, and the whole industry has committed itself to that particular design, to the point that exploring how to build interfaces from any other set of principles feels like a titanic task.

Larry was a great thinker. I got to discuss "vi vs. emacs" at one of the Fellows induction ceremonies held at the Computer History Museum. He could easily articulate counter cases and keep the discussion both productive and quite civil!

I first met him while I was visiting my wife at her office in Xerox Business Systems (XBS). He came over to discuss some suggestions to improve the protocol she was working on. I thought he was one of her co-workers because the discussion was very peer-to-peer as opposed to top-down. She corrected me and pointed out that he was one of the movers and shakers at PARC. That left a very positive impression on me.

He was also "the other Larry" at Xerox. Larry Garlick, who was also "Larry" to most people, was also at XBS (as was Eric Schmidt) and later followed Eric over to Sun.

Only tangentially related but I loved reading this interview with him from Computer History Museum https://archive.computerhistory.org/resources/access/text/20...

It’s not the same impact as the video interview in Designing for Interactions but covers a lot of ground from Larry Tesler’s perspective.

Eric Schmidt was, as I recall from when I was there, at PARC proper, doing his PhD thesis research and writing his thesis. But as Larry Tesler's interaction showed, there were fluid interactions between at least some people at PARC and the XBS and Xerox Star teams.

Which side of the debate was he on personally?

He was very much in favor of modeless design, which emacs is much closer to than vi is. My argument was that emacs still has modes, operationally. You can make it a debugger, a mail reader, or a text editor by invoking code that puts it in that "mode." So the discussion quickly becomes what is meant by 'mode', and how operational modes differ from semantic modes, which differ from presentation modes. If you haven't guessed, it was pretty memorable for me; it helped me see some insights into the difference between design and engineering.

I'm not sure that debate was really important since it's clear that his real answer was 'none of the above'.

I'm guessing Emacs, because vi has modes and he was not a fan of modes.

I was just eating lunch across the street from Apple's headquarters in Cupertino when I read the news.

The John Sculley era of Apple has received a lot of criticism. With that being said, one of the aspects of this era that I'm most impressed with is the work that came out of Apple's Advanced Technology Group. During this time period Apple was serious about advancing the state of research in the areas of programming languages, systems software, and human-computer interaction. There were many great people who were part of this group, including Larry Tesler and Don Norman. I completely understand why Steve Jobs shut down this group in 1997; times were rough for Apple, and the company couldn't afford to do research when its core business was in dire straits. But I wish Apple had revived this group when its fortunes changed, and I also wish Apple still had the focus on usability and improving the personal computing experience that it had in the 1980s and 1990s.

Larry was influential in the development and the missions of both ATG and the Human Interface Group, both of which are gone now. He believed in conducting practical experiments with users and collecting objective measurements of how well a UI worked. He wanted to find general principles that could be used to make all software better for everyone.

Steve Jobs killed both ATG and HIG. I think your point about times being rough and money being tight is valid, but three years earlier Steve Jobs sat in my office at NeXT and told me that if it were up to him, Apple would kill ATG and HIG--not because they were expensive, but because, in his words, they had too much influence.

Sure enough, when he took over Apple again, he wasted no time in killing them and replacing them with himself.

You're probably right that cutting those expenses was important to Apple's recovery. I think your other point is right, too, though: we'd be better off if Apple--or somebody--reconstituted something like HIG to show the industry what's possible if you take user experience and human-computer interaction seriously.

Unfortunately, Larry can't help us with it this time.

"he wasted no time in killing them and replacing them with himself."

ooh ouch that's a sharp, and understated, line!

Was Dylan created by the ATG?

Dylan was indeed created by a group at ATG Cambridge run by Ike Nassi

Bagel Street Cafe?

In the late 1960s/early 1970s, before he went to Xerox PARC and Apple, Larry Tesler wrote an early document system (page formatter) called PUB, for use by other computer programmers at the Stanford AI Lab (SAIL). He has put up the old manual online at http://www.nomodes.com/pub_manual.html with some modern annotations. This PUB was influential in at least two ways:

- It was Donald Knuth's first introduction to computer typesetting, and he used to use this program as a convenient way to prepare errata for The Art of Computer Programming on a computer, and hand out the resulting printouts. (At that time he was thinking of computer tools as something like typewriters and in no way related to "real book printing", until he saw the result of a "real" digital typesetter in 1977, which inspired him to write TeX.) In 2012 when he learned of this manual he wrote in TUGboat strongly recommending it to others: https://www.tug.org/TUGboat/tb33-3/tb105knut.pdf

- Another of its users was Brian Reid, who went on to develop Scribe, which itself was influential in two ways: (1) It was a strong influence on Leslie Lamport's LaTeX (in fact LaTeX can be viewed as bringing Scribe syntax/ideas to TeX), and (2) It seems to have been influential in the development of markup languages in general, e.g. from GML to SGML (this part I'm not sure of and there are conflicting accounts), which eventually led to HTML and XML. In fact it would have been better than XML according to Douglas Crockford here: https://nofluffjuststuff.com/blog/douglas_crockford/2007/06/...

Here's an 8-minute video of him accepting an award at a SAIL reunion (I think) for his work on PUB: https://exhibits.stanford.edu/ai/catalog/sj202sv1949 (with some audience comments by John McCarthy and a joke by Knuth).

I am sorry to hear that. I once had lunch with him and John Koza (a pioneer in genetic programming) around 1994.

Larry had the first book I wrote (a Common Lisp book for Springer-Verlag) and in a good-natured way was trying to talk me into writing a book on Dylan. We kept in touch, but I didn't write a Dylan book. Talking with him and John for an hour was like getting a year's worth of good ideas tossed at you, all at once.

Two tech legends left us this week: Larry Tesler and Bert Sutherland. Both played key roles at PARC, the research center Xerox started that sparked large chunks of what we use today.

Regarding Tesler: I sat next to him when I flew back from interviewing at Microsoft. He was in the last row on the plane. I saw his BlackBerry and assumed he was a nerd. He had just left Apple and had been on the committee that hired Steve Jobs. He had his fingers in so much of the tech that we use today, from object-oriented programming to the Newton that set the stage for the iPhone.

Sutherland participated in the creation of the personal computer, the tech of microprocessors, the Smalltalk and Java programming languages, and much more.

Huge losses for our industry.

Also Peter Montgomery.

Legend in cryptography who created many algorithms for fast and secure elliptic curve cryptography.


Yes. I discovered Montgomery Multiplication from the book Hacker's Delight. Potentially very useful for me.
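For anyone who hasn't run into it, here is a minimal sketch of the idea in plain Python (not taken from Hacker's Delight; the function names and toy parameters are my own, and it assumes Python 3.8+ for the modular-inverse form of `pow`):

```python
def montgomery_multiply(a, b, n, k):
    """Compute a*b*R^-1 mod n, where R = 2**k, n is odd and n < R,
    and a, b < n are already in Montgomery form (x*R mod n)."""
    R = 1 << k
    # n_prime satisfies n * n_prime ≡ -1 (mod R)
    n_prime = (-pow(n, -1, R)) % R
    t = a * b
    m = (t * n_prime) & (R - 1)      # t * n' mod R (cheap: R is a power of two)
    u = (t + m * n) >> k             # exact division by R
    return u - n if u >= n else u    # u < 2n, so one conditional subtract suffices

def to_mont(x, n, k):
    return (x << k) % n              # enter Montgomery form: x*R mod n

def from_mont(x, n, k):
    return montgomery_multiply(x, 1, n, k)  # leave Montgomery form: x*R^-1 mod n

# Toy example: 7 * 5 mod 13, with R = 2**4
n, k = 13, 4
a, b = to_mont(7, n, k), to_mont(5, n, k)
print(from_mont(montgomery_multiply(a, b, n, k), n, k))  # 35 mod 13 = 9
```

The payoff is that the expensive reduction modulo n is replaced by shifts and masks modulo a power of two, which is why it shows up so often in big-number and elliptic-curve code.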

Yesterday I looked at the wiki page for that, followed the link to Peter Montgomery's wiki page, and thought I'd send him a little thank-you just for that (I had no idea of his crypto work, none at all). Then I noticed he had died that very day, yesterday, Feb 18th 2020, age 72. I wish I'd just been able to send him that little thank-you. I missed that window by a few hours.

Keep studying for him, cheers

Rest in peace. It's incredible how young computing is, many of his contributions are so fundamental.

From a guest lecture Larry Tesler gave at CMU in 2014 [1]:

Click to select an insertion point. Double click to select a word. Click and drag to select a passage. All of those were new. Type to replace the selection by new text [...] I think that was not unprecedented, but it wasn’t common. Cut to move the selection to a buffer, Pentti Kanerva had done it, tvedit. Paste to replace the selection by the buffer. Again, Pentti had done that [...] Control B to bold the selection, I and U and so on [...] All of these things were introduced in Gypsy, 1975.

Gypsy: https://en.wikipedia.org/wiki/Gypsy_(software)

[1] https://scs.hosted.panopto.com/Panopto/Pages/Viewer.aspx?id=...

A career in full from his own CV:

"Board director for a FTSE 250 company, vp in three Fortune 500 corporations, president of two small software firms. 32 years building and managing teams of software and hardware engineers, designers, researchers, scientists, product managers and marketers to deliver innovative customer-centered products."


Honest but overly modest summary. I picture hiring managers or AI throwing his resumé away because he didn't have enough experience and was "out of date."

The Apple and Xerox segments are nothing short of astonishing.

ARM (Advanced RISC Machines) Holdings, ltd Cambridge, England (co-founder)

Championed the spinout of Advanced RISC Machines (ARM) from Acorn plc and served on ARM’s board for 13 years.

That turned out fairly well....

I remember his mention in Bret Victor's talk, Inventing on Principle. He is known for cut, copy, and paste, but he did far more than that. He introduced a single 'mode' for working with docs, as opposed to having separate typing, editing, and formatting modes. He lived his entire life by the 'nomodes' ethos; that's his twitter handle to boot. I really respect that.

I'm bad at names, so I didn't know the name, but I followed the Wikipedia link and saw he was the "modeless" advocate, which I learned about from this awesome Bret Victor talk: https://vimeo.com/36579366. Someone commented that you can skip ahead to the 38-minute mark to hear it.

That Bret Victor talk still influences me today, so please watch it; I think it'll help you if you haven't seen it already.

Wow serious Baader-Meinhof effect!

I recently downloaded all the PDFs from http://worrydream.com/refs/, and was reading Larry Tesler's A Personal History of Modeless Text Editing and Cut-Copy-Paste on my flight Monday.

It's a good paper, you can find it at the link above if you're in the mood to read it in memoriam.

I was in college, and attended a career fair for Amazon back in 2002. I was not in my last year, so I really went just for the free food. It was quite sparse, and after a while one guy with a friendly smile just walked up to me and said "Hi, I'm Larry". We talked a little bit about my background, about Amazon and opportunities there. I felt like I was a hotshot at the time, and quite frankly didn't pay him much attention. Only after I went back to my dorm and looked him up did I realize who I had been talking to.

He was super humble, super nice. I acted like a jerk, thinking I was hot stuff and everybody was there to court me and was there just for free pizza. Despite being infinitely more accomplished than I could ever be, he was nice, engaging and never treated me in kind.

Every so often I think back to that time and kick myself at the lost opportunity to have a conversation with one of the legends of Silicon Valley.

Thank you Larry.

A reminder that our industry is very young still, and many who laid its foundations are still with us today, but won’t be forever.

There is no better time than now for collecting oral history, interviewing people, asking them about their stories, etc. All of this knowledge and stories can get lost very very fast.

And, pretty basic, but thank them for their contribution. Plenty of the stuff we take for granted should not be taken for granted at all; it took a lot of dedicated people working on severely limited machines to give us the luxuries we have today. It's easy to believe it has always been so. It wasn't.

Larry Tesler was really convinced of the merits of modeless interfaces. He even got “NOMODES” on his license plate.


I was at 23andMe when he was there, too. We (engineering) had no idea who he was initially, but I did notice his license plate "NO MODES." Only after we looked him up and found that he had invented copy-paste did we realize he was a living legend. Sad to see he's passed on.

Very sad, Larry Tesler was brilliant and an inspiration.

Smalltalk, copy-and-paste, the Apple Lisa/Macintosh and Newton, Object Pascal (predecessor of Delphi), Stagecast Creator... NO MODES. ;-)

Pretty sure there isn't any bit of personal computing that Larry Tesler couldn't (or didn't) help make better somehow.

Sad news; total legend. folklore.org has a lot of stories featuring Larry — short stories on an era of computing that has faded around the edges. RIP.

This posting and Larry's obit take me back to the math applications class we took at Bronx Science our senior year. Writing a program for the IBM 650 (yes, the 650), punching the IBM cards ourselves, and then loading the program cards into the reader was a scary experience. Larry's program apparently did primes while mine did the Fibonacci series. Then the 650, the size of a modern SUV, chugged away all night, and the next day it generated voluminous printouts, lots and lots of paper. What fun!

After graduation he went off to Stanford to continue math - and programming - and I went elsewhere for liberal arts. But no matter, I couldn't have done what Larry did or created what he created.

I remember him as a great guy, very mellow, and if memory serves he was on the track team as well. Perfect for Stanford. Apparently he affected those he worked with the same way he affected me - an easy-going guy, friendly, positive, ready to smile. I know he'll be missed greatly.

I'm very sorry to hear this. He was the instigator of Dylan and it's not wrong to say there would be no Clozure Common Lisp (my preferred development platform) today if not for Larry Tesler.

RIP Larry Tesler. I ran into him a few times at meetup events in the Bay Area; few people knew who he was, as he kept a very low profile.

I met him at a meetup I went to out in SF while traveling for work. I didn't know anyone there, just going to kill time. I had no idea who he was just someone willing to chat with me for an hour or so. I started asking him what he did, and it was clear he wasn't there to talk about himself. Just struck me as a really cool, really humble, really approachable guy. With a lot of good ideas, and a passion for spreading curiosity. World needs more people like this, not less.

A very nice remembrance by John Markoff in the NYTimes: https://www.nytimes.com/2020/02/20/technology/lawrence-tesle...


One of the strangest moments in my career was Larry asking me for career advice when we were both at Yahoo; I think because I had been an IC for really long time and had figured out how to leverage that into strategic positions. Great man with incredible accomplishments.

I've been meaning to watch one of his presentations for a while: "Origins of the Apple Human Interface"


RIP Larry Tesler. I rather feel like a worm for not being aware of him till today. Nevertheless, hearing the many personal stories here on HN of his work, humility, and graciousness makes me thankful for him and his tribe, who have made the software profession so much richer.

Wow. I didn't realize how influential the Bronx High School of Science was.


From his resume:

1962-1964 Stanford University Departments of Genetics and Computer Science programmer

I didn’t know of Larry Tesler, but reading the comments makes me want to lead like he did. Looking forward to learning more about his life. My condolences to his friends here.

So long, our dear friend Larry. You have been such an inspiration. Amongst other achievements, the Newton is still such an incredible machine (and I love Dylan).

There's one thing on his wikipedia page which I think is probably wrong: I don't think Wirth had any involvement with Object Pascal.

I think he did, as a consultant. The main reference I found was an article in MacTech, written by an Apple employee:

"Object Pascal is an extension to the Pascal language that was developed at Apple in consultation with Niklaus Wirth, the inventor of Pascal."


I was (distantly) there, and I recall Wirth being there and visiting.


I loved Larry Tesler’s work. No modes!

Is it possible for the HN staff to put a rollover on the black bar, or to link it to threads like this? I see an uptick in the "why is there a black bar?" threads every time it goes up, and it would be nice to acknowledge the people who have contributed to technology and spread awareness of what they've done for people who might not know.

Don't mode me bro.

Never heard of him here in the EU.

rest in peace

Thank you, Larry.


Wow, reading the comments, it sounds like he was actually a great human being. It helps me realize that being a jerk is an anti-pattern, even though I am sometimes tempted to think otherwise.

Surprised I have not heard of him. He seems quite significant.

Tangentially related question: is this why HN currently has a black bar above the navigation? To commemorate his death?

Yes, HN does this to commemorate people who had an impact on the technology world when they die.

Yes, people of significance in computing are honored this way.

I couldn't find any corroboration of this. What happened?

Tesler spent 17 years at Apple, rising to chief scientist.

He went on to establish an education startup and do stints in user-experience technology at Amazon and Yahoo.


Some of us actually did know him. I did, albeit not as well as others here, and I see no harm and much good in people celebrating his accomplishments in his chosen career.

You don't necessarily need to know someone personally to feel a sense of loss. You can feel a "bond" with someone based on all sorts of things: being members of a common community, sharing a common occupation, etc., etc.

To illustrate one case that hits close to home for me... when I think about the 343 firefighters who were killed on 9/11, I find it difficult not to tear up at times. Even though I never met any of them, and couldn't tell you any of their names. But we shared a common bond, by virtue of being firefighters. My sense of loss at their death is rooted in how deeply I admire all of them for the bravery and courage they displayed on that day, putting their lives on the line in the name of saving others. Do I feel that as strongly as if I had been the literal biological sibling of one them? Possibly not, but they were still my brothers, and the sense of loss is still real.

Without piling on: sometimes this is how you get to know someone.

It used to be interesting to scan the obit section of newspapers, just to see the parade of characters and achievements that I had missed or not known enough of.

Does anyone else find it strange that there’s rarely any mention of cause of death in Wikipedia? Is it uncouth to ask how someone passed away?

Markoff's obituary [1] in the New York Times says, "The cause was not known, his wife, Colleen Barton, said, but in recent years he had suffered the effects of an earlier bicycle accident."

[1] https://www.nytimes.com/2020/02/20/technology/lawrence-tesle...

Sometimes it's mentioned, sometimes the family prefers privacy. It's not uncouth to ask, but it's good to respect that privacy if the family does not share the cause of death.
