There is an excellent obit for Larry at: https://gizmodo.com/larry-tessler-modeless-computing-advocat...
... and I expect another one from John Markoff -- who was a friend of his -- in the NYTimes.
In many ways, Larry did so many interesting things and had so much influence in so many areas that there's no real chance to characterize him technically. In short, he was a superb wide-spectrum (real) computer scientist who was also a very talented and skilled programmer.
His passing was sudden and unexpected, and I may return later to this note to add more details of his rich career.
For now, I remember him as great to work with in all aspects of his life. He was a great guy, and perhaps that sums him up as best as can be.
Pretty much only one wanted to get rich (and did). Several were more or less forced into becoming rich. Money has its own dynamics and none of these folks wound up doing further research.
But Butler Lampson (the "Oppenheimer" of Parc) is still going strong, as am I and many others.
It was a calling, never a job.
I've always wondered how those who were featured in the book felt about it.
"Dealers of Lightning" is at the next level, but far from the bottom. Its flaws are too much "Heroes' Journey" and a very complex and confusing jumping-around timeline (I had trouble orienting myself in some spots). But it also has a lot of good stories, of which a reasonable number are "true enough".
"Fumbling The Future" is extremely inaccurate.
I actually didn't have too much trouble following Dealers, because it (more or less) mostly followed each project separately, which while creating an interspersed timeline in the book, was mostly coherent within each section.
I did read at least part of Fumbling, but I found it a hard book to thread the needle on, and it was in such stark factual disagreement with what else I'd read that I don't recall finishing it.
I know too many people who draw the same conclusions from the Alto and related technologies that Fumbling does, so I try to get people to read more about PARC, hoping that once they know more they'll draw much the same conclusions I have. My frustration with much of the traditional criticism of Xerox for failing to commercialize the PARC innovations is that it completely ignores both the high cost of the technology (it was literally the technology of the future) and the sales culture of technology at the time.
I don't think any large technology company (which Xerox broadly was) could have made something wondrous out of the innovations PARC created, because the people who could recognize the value (and use) of this kind of technology were not the people being sold to, or for that matter doing the buying - nor did they have the budgets to buy an Alto, as was later seen with the Star when it came out.
It took direct-to-consumer sales (allowing department managers to buy stuff) and lower-cost products for personal computing to penetrate the home and corporate America. The traditional criticism also completely ignores that the 9700 (and follow-on products) paid for the money spent at PARC several times over.
(Incidentally, I believe this sales-culture issue is a prime reason why DEC no longer exists as a company: they failed to see that their market was shifting and could no longer be sold to through the same mechanisms they always had been.)
In the end, basic research and the undirected applied science that flows from it are important, even if they have no direct tie to your line of business, because that innovation is what drives a company forward, and frankly drives humanity forward. I wish more people knew that the modern interconnected world we live in was built on billions spent with no firm idea of what would result from them, and how much of a debt we owe to PARC, Bell Labs and others.
Also, thank you for taking the time to respond to my question!
There are at least two big issues regarding cost that many people miss: (a) the first is the difference between what should be spent on "prototypes for learning and vetting" and what can be done when designing for manufacturing, and (b) the second is the difficulty most people had with valuing what personal computing might be for them.
In the first public paper I wrote about the Dynabook I pointed out that Moore's Law meant that powerful tablet sized personal computers would likely wind up costing what a color TV set would cost (they would have pretty similar components, and most of the cost in electronics is in packaging).
But we also had another analogy that we thought could work via education: that of the personal automobile in the US. People value cars enough to be willing to pay quite a bit more for them than for most consumer devices. This was very interesting because the ARPA dream of an interactive personal computer connected to a world-wide network was a kind of "information and intelligence vehicle".
If people could see this, then they might be willing to pay what they would pay for a car. Certainly most computer people and most scientists and engineers would be able to assign value in this way. We thought most knowledge workers would eventually be able to see this also, and that there would be an intermediate phase before getting to the TV set kind of technologies.
An analysis of what happened to eventually quash this idea is beyond the scope of this note. (But, to make a point in talks, I've tried to get people to think about what "a car's worth" of personal computer could be like; the average car in the US a few years ago was a Toyota Avalon at $28K, so about 10 times what most personal computers go for.)
This is a different slant than the problem that DEC and similar companies had (which was to not be able to understand personal computing in any reasonable form).
ARPA/Parc as a community had the best and most enlightened funding for computing research starting in 1962 (Parc started in 1970), and a very large percentage of the familiar technologies of today -- including personal computing, tablets, dynamic OOP, the GUI, the Internet, etc -- were invented by it.
The best (and pretty accurate) book about this remarkable group is "The Dream Machine" by Mitchell Waldrop.
Bob Taylor, who had been a director of the ARPA computing research, looked for a way to fund some of the "ARPA Dream" projects that Congress was curtailing, and found Xerox (which wanted to set up a longer range research lab).
Taylor was particularly interested in recruiting a number of the young Phds that ARPA had funded, and I was one of them.
It was only when I was older that I appreciated that he had probably sent me thousands of dollars worth of gear (and not in 2020 dollars!) in addition to the invaluable advice he provided, sometimes (frankly, often) unsolicited but always direct and always thought provoking.
While I never did become an extremely competent commercial developer, to this day I enjoy programming for programming’s own sake. Larry’s push for me to fix my own headaches, rather than simply giving me a metaphorical aspirin, resulted in my development of solutions for small hobby problems that it appeared often only myself and perhaps a few others shared.
As it turns out, in spite of (or thanks to) my niche interests, my curiosity and the method of targeted problem solving Larry fostered set me on a path I remain on today. Frankly, his contributions helped mold me as a man more than those of any other mentor of mine; that is absolutely meant as a compliment to his prescient pedagogy, rather than a slight at my life’s many other wonderful influences.
I’ve sold a few businesses thanks to Larry’s problem solving approach. The rest I founded are running profitably - and somehow I’ve never lost an investor money. My customers have always, above all else, been happy because they had their problems fixed. (Or, perhaps thanks to his influence, their happiness stemmed from my teams simply providing them with the tools they needed to solve their own problems!)
And because I followed Larry’s personal advice, I have been able to spend every day for nearly two decades doing what he encouraged and what has consistently engaged me: finding, isolating and destroying problems.
Thank you for everything.
Those numbers represent hex ASCII codes.
To get you started.
53 = S
41 = A
4e = N
"".join([chr(int(a[i:(i+2)],16)) for i in range(0,len(a),2)])
=> 'SANTA AND HIS WORKSHOP'
a = 'SANTA AND HIS WORKSHOP'
"".join([hex(ord(c)) for c in a]).replace('0x','')
b'SANTA AND HIS WORKSHOP'
>>> b'SANTA AND HIS WORKSHOP'.hex()
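Putting the snippets above together, a self-contained round trip might look like this (a sketch assuming Python 3; the variable names are mine). Note that `format(ord(c), '02x')` zero-pads, which the `hex(ord(c))` one-liner above would not do for code points below 0x10:

```python
# Decode: each pair of hex digits becomes one character.
encoded = '53414e544120414e442048495320574f524b53484f50'
decoded = ''.join(chr(int(encoded[i:i + 2], 16)) for i in range(0, len(encoded), 2))
print(decoded)  # SANTA AND HIS WORKSHOP

# Encode: each character becomes two hex digits, zero-padded.
reencoded = ''.join(format(ord(c), '02x') for c in decoded)
print(reencoded == encoded)  # True

# Or, more simply, with the bytes methods shown above:
print(bytes.fromhex(encoded).decode('ascii'))  # SANTA AND HIS WORKSHOP
print(decoded.encode('ascii').hex() == encoded)  # True
```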
Did he encourage you to be you?
Are you you because of him? Maybe not, because it's impossible to grade.
But here you are paying respect to a man that you met, so I would say: he had an impact on you. Even at that moment.
We could play high-school politics and ask what you learnt from him.
But from your message it's clear.
I never met him personally, but I certainly felt his impact.
I got to know him better when John Sculley ordered him to have the Newton team ditch its Lisp OS and write one in C++. Larry approached me and a couple of other Lisp hackers and asked us to make a fresh start with Lisp and see what we could do on Newton. We wrote an experimental OS that Matt Maclaurin named "bauhaus".
Larry had a sabbatical coming up right about then. He took it with us. He crammed into a conference room with three or four of us and hacked Lisp code for six weeks. He was a solid Lisp hacker. He stayed up late with us and wrote AI infrastructure for the experimental OS, then handed it off to me when he had to, as he put it, "put his executive hat back on." He hung around with us brainstorming and arguing about ideas. He had us out to his house for dinner.
A little later, when things were hectic and pressure was high on Newton, one of our colleagues killed himself. Larry roamed the halls stopping to talk to people about how they were doing. I was at my desk when he came by, next to another colleague that I considered a friend. Larry stopped by to check on us. My friend had also been a good friend of the fellow who had died, and he lost his composure. Larry grabbed a chair, pulled it up close and sat with him, an arm around him, patting him gently while his grief ran its course.
After Newton was released, Larry moved on to other projects. I worked on the shipped product for a while, but I was pretty burned out. Steve Jobs persuaded me to go to work for NeXT for a little while.
Steve is infamous for being, let's say, not as pleasant as Larry. In fact, he sat in my office once trashing Larry for about half an hour, for no good reason, as far as I can see. I politely disagreed with a number of his points. Larry made important contributions to the development of personal computing, and he didn't have to be a jerk to do it.
Larry was extremely smart, but I never knew him to play I'm-smarter-than-you games. I saw him encourage other people to pursue, develop, and share their ideas. I found him eager to learn new things, and more interested in what good we could do than in who got the credit for it.
We weren't close friends, except maybe when we were crammed in a conference room together for six weeks. I didn't see him much after Newton, though we exchanged the occasional friendly email over the years.
I was just thinking lately that it was about time to say hello to him again. Oops.
Larry Tesler was one of the best people I met in Silicon Valley. He was one of the best people I've met, period. I'll miss him.
I've lost some colleagues along the way too. You never know when it's going to happen; every chance to speak should be treated with the respect of knowing it could very well be the last chance to make a connection.
If it's a feel-good story, I think that must be because I feel good to have had the opportunity to meet and work with Larry Tesler. He impressed me with his intelligence, his generosity, and his compassion. I feel that I'm better for having known him, and I suppose that comes through in my account.
You're right: Newton was a pressure cooker. Larry didn't put that pressure on us, though. We put it on ourselves. We got the idea that there was an outside chance of making something great, and we pursued that dream as hard as we could. Some of us--I include myself--were intemperate in that pursuit, and it cost us.
Now, the pursuit of greatness is a species of vanity, and vanity is a cruel and fickle master. But in Newton's case, at least, I think those of us who were seduced by that vanity have only ourselves to blame.
And I think it's a polite way of reminding everyone that being an asshole is not required to do great, world-changing work.
Steve was infamous for riding roughshod over employees in pursuit of making something great.
Larry's example shows that such treatment is not necessary to get people's best work. If you can't get great work out of people without abuse, that's your own limitations showing; it's not a law of nature.
I suppose I also think of Steve because I met and worked for Larry and Steve in the same fairly short period of time, and because both projects were high-risk, low-percentage attempts to create something great. Also, perhaps, because Steve was opinionated about Larry and I argued with him about it.
Seemed like a neat little device.
The first Newton OS was written in Lisp (specifically in a dialect called Ralph, which was basically Scheme plus CLOS) over a C++ microkernel.
By the way, that Lisp was written by Apple Cambridge, a group in Apple that was created when Larry arranged for Apple to purchase the assets of Coral Software and hire some of its best hackers to design a new programming language.
Apple later spun out that group to create Digitool, which took with it the code and the rights to Macintosh Common Lisp. Digitool did not prosper, and one of its employees, Gary Byers, negotiated the rights to turn the MCL compiler into an open-source implementation that later became Clozure Common Lisp. So Larry is also responsible for the creation of Clozure Associates and Clozure Common Lisp.
There were some issues with the first Newton OS that led to two developments:
1. John Sculley ordered Larry Tesler to redo the OS in C++. That's the version that shipped.
2. Larry asked me and a few other Lisp hackers to see what we could do on Newton with Ralph. That resulted in the bauhaus OS, which didn't ship.
The authors of the first Newton OS were smart programmers, but they weren't Lisp hackers. Larry speculated that that had something to do with some of the early issues, and maybe it did. I think our version offered some improvements.
Another set of issues had to do with the UI design of the initial OS. Larry was critical of it on the grounds that it was basically trying to be a desktop UI in a handheld device. He thought we should try for a UI experience more suited to a new kind of device, and, in the end, both the shipping OS and the bauhaus OS did what he asked, and were better for it.
(In a final evaluation meeting for bauhaus, our managers told us that we had met and exceeded all goals of our experimental project, but Apple was of course going to ship the OS that the CEO told us to ship, and it was of course not going to ship two OSes for the device. None of this was a surprise to the bauhaus team. We were grateful to have had the support to work on the project for as long as we did, and disappointed that it was over.)
From then on the ice was broken and we chatted more freely: fun discussions about the (then) up-and-coming voice recognition UIs (I compared them to CLIs which he liked), wearables, design, and cycling.
I consider him a friend. Didn’t expect us to lose him so soon.
Ctrl-X + Ctrl-V
We were friends, off and on. Perhaps somewhat "off" after I stole his girlfriend. (In my defense, it was her idea!) But that was 35 years ago, and all was forgiven (and hopefully forgotten) in more recent years.
Here is Larry's Smalltalk article from the August 1981 BYTE, complete with a photo of the famous T-shirt that a mutual friend made for him:
A couple of other good articles:
One of the rarer sources for Larry Tesler's contributions is his interview for Bill Moggridge's Designing Interactions (http://www.designinginteractions.com/interviews/LarryTesler)
> Inspired by early line and character editors that broke a move or copy operation into two steps—between which the user could invoke a preparatory action such as navigation—Lawrence G. Tesler (Larry Tesler) proposed the names "cut" and "copy" for the first step and "paste" for the second step. Beginning in 1974, he and colleagues at Xerox Corporation Palo Alto Research Center (PARC) implemented several text editors that used cut/copy-and-paste commands to move/copy text.
I imagine those "early line and character editors" refer to vi's delete, yank, and put, and Emacs's kill, copy ("save as if killed"), and yank. I wonder what other editors had back then, before the names he came up with became standardized.
I also wonder how the idea of the operations developed before Larry Tesler contributed to it.
Looking at POSIX, it seems ex has delete, yank, and put, but I can't see similar functionality in standard ed (GNU's ed does have yank, but I guess it's an extension).
TECO commands were typed in as code, essentially, and that code could also be saved and run as macros.
It was extremely powerful, so much so that Emacs was originally written as TECO Editor MACroS.
EDIT: Maybe not:
In nvi, I just key my commands on a scratch line (testing occasionally), then:
Here's a nice bit:
> I chose Z/X/C/V when I was in charge of the user interface design of Apple's Lisa. In addition to their adjacency on the keyboard, I wanted them to have mnemonic value: "X" a cross-out; "V" an inverted caret or proofreader's arrowhead; "C" the first letter of "copy"; the strokes of "Z" tracing a reversal followed by a new path forward. -- Larry Tesler
> Correction: Apparently, my memory was incorrect. The Lisa user interface seems to have used "U" for Undo. On the Mac (as in Gypsy), "U" was and is for Underline. I do not know who chose "Z" for Mac Undo, or why. I suspect that its proximity to the Command key was the reason. -- Larry Tesler
There's more, but it's too much to insert here.
I'm just hoping this isn't someone posing as Larry Tesler.
No, 1974 predates both vi and Emacs
I have a .vimrc file that sets Ctrl-C/Ctrl-X/Ctrl-V to work in each mode; that gets me the best of both worlds: fast text navigation in normal mode, since I can switch to the next/previous word with w/b, but I can still copy and paste (limits us to one buffer, though; nothing is perfect).
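For anyone who wants to try something similar, here is a minimal sketch of such mappings (my guess at the setup, not the commenter's actual .vimrc; these variants use the system clipboard register "+):

```vim
" Visual mode: Ctrl-C copies, Ctrl-X cuts, into the system clipboard.
vnoremap <C-c> "+y
vnoremap <C-x> "+d
" Normal and insert mode: Ctrl-V pastes from the system clipboard.
nnoremap <C-v> "+p
inoremap <C-v> <C-r>+
```

One trade-off worth knowing: remapping Ctrl-V shadows Vim's blockwise visual selection, so you'd need another key for that.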
Is it a blanket rule for him for all interfaces, or just text editors?
The basic idea is that modes make the same action (pressing the "D" key, for example) do different things. They make things easier for programmers who want many operations on machines that only have a few possible actions, but they make things hard for users, who have to pay attention to the current mode and know how to navigate from it to the mode where the desired operation is possible.
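The point about one key meaning different things per mode can be sketched in a few lines (a toy illustration of the idea, not any real editor's code):

```python
# Toy modal editor: the same keypress is dispatched differently per mode.
class ModalEditor:
    def __init__(self, text):
        self.text = list(text)
        self.cursor = 0
        self.mode = 'command'  # hidden state the user must keep track of

    def press(self, key):
        if self.mode == 'command':
            if key == 'd':            # in command mode, 'd' deletes
                if self.cursor < len(self.text):
                    del self.text[self.cursor]
            elif key == 'i':          # 'i' switches to insert mode
                self.mode = 'insert'
        else:                         # insert mode
            if key == '\x1b':         # Escape returns to command mode
                self.mode = 'command'
            else:                     # any other key, including 'd', inserts itself
                self.text.insert(self.cursor, key)
                self.cursor += 1

ed = ModalEditor('dog')
ed.press('d')             # command mode: deletes the 'd' -> 'og'
ed.press('i')             # enter insert mode
ed.press('d')             # insert mode: types a 'd' -> 'dog'
print(''.join(ed.text))   # dog
```

The same keystroke, 'd', destroyed text in one mode and typed it in the other; Tesler's argument was that this hidden state is exactly what users trip over.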
Fascinating, because that exploratory process is essentially what we recognize today as the "graphical user interface", and the whole industry has committed itself to that particular design, to the point that exploring how to build interfaces from any other set of principles feels like a titanic task.
I first met him while I was visiting my wife at her office in Xerox Business Systems (XBS). He came over to discuss some suggestions to improve the protocol she was working on. I thought he was one of her co-workers because the discussion was very peer to peer as opposed to top down. She corrected me to point out he was one of the movers and shakers at PARC. That left a very positive impression on me.
He was also "the other Larry" at Xerox. Larry Garlick, who was also "Larry" to most people, was also at XBS (as was Eric Schmidt) and later followed Eric over to Sun.
It’s not the same impact as the video interview in Designing for Interactions but covers a lot of ground from Larry Tesler’s perspective.
The John Sculley era of Apple has received a lot of criticism. With that being said, one of the aspects of this era that I'm most impressed with is the work that came out of Apple's Advanced Technology Group. During this time period Apple was serious about advancing the state of research in the areas of programming languages, systems software, and human-computer interaction. There were many great people that were part of this group, including Larry Tesler and Don Norman. I completely understand why Steve Jobs shut down this group in 1997; times were rough for Apple, and the company couldn't afford to do research when its core business was in dire straits. But I wish Apple had revived this group when its fortunes changed, and I also wish Apple still had the focus on usability and improving the personal computing experience that it had in the 1980s and 1990s.
Steve Jobs killed both ATG and HIG. I think your points about times being rough and money being tight are valid, but three years earlier Steve Jobs sat in my office at NeXT and told me that if it was up to him, Apple would kill ATG and HIG--not because they were expensive, but because, in his words, they had too much influence.
Sure enough, when he took over Apple again, he wasted no time in killing them and replacing them with himself.
You're probably right that cutting those expenses was important to Apple's recovery. I think your other point is right, too, though: we'd be better off if Apple--or somebody--reconstituted something like HIG to show the industry what's possible if you take user experience and human-computer interaction seriously.
Unfortunately, Larry can't help us with it this time.
ooh ouch that's a sharp, and understated, line!
- It was Donald Knuth's first introduction to computer typesetting, and he used to use this program as a convenient way to prepare errata for The Art of Computer Programming on a computer, and hand out the resulting printouts. (At that time he was thinking of computer tools as something like typewriters and in no way related to "real book printing", until he saw the result of a "real" digital typesetter in 1977, which inspired him to write TeX.) In 2012 when he learned of this manual he wrote in TUGboat strongly recommending it to others: https://www.tug.org/TUGboat/tb33-3/tb105knut.pdf
- Another of its users was Brian Reid, who went on to develop Scribe, which itself was influential in two ways: (1) It was a strong influence on Leslie Lamport's LaTeX (in fact LaTeX can be viewed as bringing Scribe syntax/ideas to TeX), and (2) It seems to have been influential in the development of markup languages in general, e.g. from GML to SGML (this part I'm not sure of and there are conflicting accounts), which eventually led to HTML and XML. In fact it would have been better than XML according to Douglas Crockford here: https://nofluffjuststuff.com/blog/douglas_crockford/2007/06/...
Here's an 8-minute video of him accepting an award at a SAIL reunion (I think) for his work on PUB: https://exhibits.stanford.edu/ai/catalog/sj202sv1949 (with some audience comments by John McCarthy and a joke by Knuth).
Larry had the first book I wrote (Common Lisp book for Springer Verlag) and in a good natured way was trying to talk me into writing a book on Dylan. We kept in touch but I didn’t write a Dylan book. Talking with him and John for an hour was like getting a year’s worth of good ideas tossed at you, all at once.
Regarding Tesler: I sat next to him when I flew back from interviewing at Microsoft. He was in the last row on the plane. I saw his Blackberry, assumed he was a nerd. He had just left Apple, was on the committee that hired Steve Jobs. He had his fingers in so much of the tech that we use today from object oriented programming to the Newton that set the stage for the iPhone.
Sutherland participated in the creation of the personal computer, the tech of microprocessors, the Smalltalk and Java programming languages, and much more.
Huge losses for our industry.
Legend in cryptography who created many algorithms for fast and secure elliptic curve cryptography.
Yesterday I looked at the wiki page for that, followed the link to Peter Montgomery's wiki page, and thought I'd send him a little thank-you just for that (I had no idea of his crypto work, none at all). Then I noticed he had died that very day, yesterday, Feb 18th 2020, age 72. I wish I'd just been able to send him that little thank-you. I missed that window by a few hours.
From a guest lecture Larry Tesler gave at CMU in 2014 :
Click to select an insertion point. Double click to select a word. Click and drag to select a passage. All of those were new. Type to replace the selection by new text [...] I think that was not unprecedented, but it wasn’t common. Cut to move the selection to a buffer, Pentti Kanerva had done it, tvedit. Paste to replace the selection by the buffer. Again, Pentti had done that [...] Control B to bold the selection, I and U and so on [...] All of these things were introduced in Gypsy, 1975.
"Board director for a FTSE 250 company, vp in three Fortune 500 corporations, president of two small software firms. 32 years building and managing teams of software and hardware engineers, designers, researchers, scientists, product managers and marketers to deliver innovative customer-centered products."
The Apple and Xerox segments are nothing short of astonishing.
Championed the spinout of Advanced RISC Machines (ARM) from Acorn plc and served on ARM’s board for 13 years.
That turned out fairly well....
That Bret Victor talk still influences me today, so please watch it; I think it'll help you if you haven't watched it already.
I recently downloaded all the PDFs from http://worrydream.com/refs/, and was reading Larry Tesler's A Personal History of Modeless Text Editing and Cut-Copy-Paste on my flight Monday.
It's a good paper, you can find it at the link above if you're in the mood to read it in memoriam.
He was super humble, super nice. I acted like a jerk, thinking I was hot stuff and everybody was there to court me and was there just for free pizza. Despite being infinitely more accomplished than I could ever be, he was nice, engaging and never treated me in kind.
Every so often I think back to that time and kick myself at the lost opportunity to have a conversation with one of the legends of Silicon Valley.
Thank you Larry.
There is no better time than now for collecting oral history, interviewing people, asking them about their stories, etc. All of this knowledge and stories can get lost very very fast.
Smalltalk, copy-and-paste, the Apple Lisa/Macintosh and Newton, Object Pascal (predecessor of Delphi), Stagecast Creator... NO MODES. ;-)
Pretty sure there isn't any bit of personal computing that Larry Tesler couldn't (or didn't) help make better somehow.
After graduation he went off to Stanford to continue math - and programming - and I went elsewhere for liberal arts. But no matter; I couldn't have done what Larry did or create what he created.
I remember him as a great guy, very mellow, and if memory serves he was on the track team as well. Perfect for Stanford. Apparently he affected those he worked with the same way he affected me - an easy-going guy, friendly, positive, ready to smile. I know he'll be missed greatly.
1962-1964 Stanford University Departments of Genetics and Computer Science
He went on to establish an education startup and do stints in user-experience technology at Amazon and Yahoo.
"Object Pascal is an extension to the Pascal language that was developed at Apple in consultation with Niklaus Wirth, the inventor of Pascal."
To illustrate one case that hits close to home for me... when I think about the 343 firefighters who were killed on 9/11, I find it difficult not to tear up at times. Even though I never met any of them, and couldn't tell you any of their names. But we shared a common bond, by virtue of being firefighters. My sense of loss at their death is rooted in how deeply I admire all of them for the bravery and courage they displayed on that day, putting their lives on the line in the name of saving others. Do I feel that as strongly as if I had been the literal biological sibling of one them? Possibly not, but they were still my brothers, and the sense of loss is still real.
It used to be interesting to scan the obit section of newspapers, just to see the parade of characters and achievements that I had missed or not known enough of.