Lisa Source Code Release (computerhistory.org)
387 points by bitsavers on Jan 19, 2023 | 113 comments



The QuickDraw code (LISA_OS/LIBS/LIBQD) is just glorious. While reading it, I am overcome by the creative genius of Bill Atkinson. [1]. Bill went on to become quite accomplished at cooking lunch for the team at General Magic. [2]

[1] https://en.wikipedia.org/wiki/Bill_Atkinson

[2] https://en.wikipedia.org/wiki/General_Magic


One of the great stories:

In early 1982, the Lisa software team was trying to buckle down for the big push to ship the software within the next six months. Some of the managers decided that it would be a good idea to track the progress of each individual engineer in terms of the amount of code that they wrote from week to week. They devised a form that each engineer was required to submit every Friday, which included a field for the number of lines of code that were written that week.

Bill Atkinson, the author of Quickdraw and the main user interface designer, who was by far the most important Lisa implementer, thought that lines of code was a silly measure of software productivity. He thought his goal was to write as small and fast a program as possible, and that the lines of code metric only encouraged writing sloppy, bloated, broken code.

He recently was working on optimizing Quickdraw's region calculation machinery, and had completely rewritten the region engine using a simpler, more general algorithm which, after some tweaking, made region operations almost six times faster. As a by-product, the rewrite also saved around 2,000 lines of code.

He was just putting the finishing touches on the optimization when it was time to fill out the management form for the first time. When he got to the lines of code part, he thought about it for a second, and then wrote in the number: -2000.

I'm not sure how the managers reacted to that, but I do know that after a couple more weeks, they stopped asking Bill to fill out the form, and he gladly complied.

https://www.folklore.org/StoryView.py?project=Macintosh&stor...


The story was that Atkinson saw PARC and its GUI, said to himself "holy shit, they're doing their graphics this way," and wrote QuickDraw based on that.

PARC didn't do its graphics that way at all, and Atkinson (and Apple) got a lot of patents on his region code. That code is also why the Mac's graphics performance was so much better than everyone else's (until discrete GPUs came out). It's that fast.

Apple's software guys were really good back in the day. Their software-based floating point stuff (SANE) was faster and more accurate than Intel's hardware FP stuff for a long time.


The QuickDraw source code has been available for some time:

<https://news.ycombinator.com/item?id=2285569>

Submitted several times to HN since then. Most recently:

<https://news.ycombinator.com/item?id=16519132>


Oh, yes.

>{ RoundRect Routines }


Oof, emotions.

My uncle took receipt of a Lisa in the 80s but didn't have much use for it, so he gave it to our family. Lisa was my first computer. I used to use LisaDraw as a very primitive city builder: rectangles with circles to represent cars; boxes for buildings. I'd use the arrow keys to move things around. I can remember seeing the Lisa redraw pixels, top to bottom, as I pressed the keys.

By the time I moved on to my first Mac (Performa 638CD with SoftFPU installed), I learned more about Apple and understood the Lisa to be a failure in the market, but it will always hold a special place. I assume this is Pascal and I can't really follow it very well, but it's quite a special thing to read the inline comments written by engineers who paved the way for me and inspired me.

Writing this many, many years later on a 16" MacBook Pro with M1 Max.


> I used to use LisaDraw as a very primitive city builder: rectangles with circles to represent cars; boxes for buildings.

Oooh you just reminded me that I used to do the same a decade later on my dad's Macintosh. Can kids have fun with super basic toys on computers these days? Or are they too used to seeing more advanced things on screens to even be interested?


My nephew (5 years old) loves text-to-speech software. He likes to type in weird stuff and then starts laughing, and that makes me start to laugh. We have a lot of fun with it.

I still think the best gift you can give to a young child is a big cardboard box.


Yes, turtle graphics can't attract people when they're taught Python.


"Took receipt of" ?! I wonder when in the 80's this was ? At launch in 1983 the Lisa cost $10,000 (about $50,000 in 2023 dollars)... Not exactly the sort of thing one casually gets handed!


He had a very successful business and Apple sales talked to him about modernizing his office and workflows. And then he promptly continued to use what he already had. Not an uncommon story even in 2023. :) But it did create a truly remarkable opportunity for me, because certainly, to your point, my family could absolutely not afford this computer.


I remember walking into the Computer Factory in Grand Central station in NYC and playing with the Lisa. (Later I did some Smalltalk on the Lisa and the Tektronix.)

I still remember the layout ca. 1979-83 like it was yesterday - on the right side when you walked in was a separate room with some kind of P-Code machine which was the "business mini". The Lisa and later the Mac were set up on a table on the right, and the Apple][ in the back. In the center were the Kaypro and some other CP/M machines, the Commodore Pets (the OG chiclet keyboard version and the green-screen one w/the real keyboard) (incl. some IEEE-488? accessories), the Sol II, and then the gfx show-offs (making my TRS-80 jealous), the CompuColor II and Exidy Sorcerer. The other wall had the printers - the Anderson-Jacobson daisywheel printer, an adapter w/solenoids to turn your Selectric typewriter into a printer, and a few 9-pin matrix printers like the IDS Paper Tiger (right before the Epson MX-80 took over).

Good times. Sigh...


Very neat code and somewhat legible for a non-Pascal speaker.

Curiously, it doesn't credit authors in the headers as I've seen in sources from the same period.

I also wonder how much of the core OS code made it into Macintosh System 1.

Also, nice styling here: http://revontulet.org/2023/01/19/lisadesk.png. They didn't have to do ASCII art in the comments but they did!


Pascal was designed to be a legible language. It eschews brevity for clarity, e.g., preferring BEGIN…END over {…} and tends to avoid footguns, although I remember writing a significant program in Pascal back in the 80s where I really wanted to be able to pass a function in to another function and was bummed that there was no way to do so (or at least not in Pascal/VS in 1987). Even though I’ve not written a line of Pascal since 1991 or maybe 1992, I still have no difficulty reading and understanding Pascal code when I find myself looking to see how, e.g., TeX manages some task.
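
To give a feel for it, here is a trivial made-up fragment (illustrative only, not from the Lisa sources): wordy keywords, BEGIN…END blocks and := assignment, but very little that needs decoding.

    PROGRAM ReadableDemo;
    { Illustrative only; not from the Lisa sources }
    TYPE
      IntList = ARRAY [1..100] OF INTEGER;
    VAR
      nums: IntList;
      i: INTEGER;

    FUNCTION SumFirst (VAR a: IntList; n: INTEGER): INTEGER;
    VAR
      j, total: INTEGER;
    BEGIN
      total := 0;
      FOR j := 1 TO n DO
        total := total + a[j];
      SumFirst := total
    END;

    BEGIN
      FOR i := 1 TO 5 DO
        nums[i] := i;
      writeln('Sum of 1..5 = ', SumFirst(nums, 5))
    END.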


> Also, nice styling here: http://revontulet.org/2023/01/19/lisadesk.png.

That site blocks my ISP.


Some of it does (credit authors), some of it doesn't. Some has changelog entries, some a single author (e.g. Bill Atkinson's stuff), some nothing.


Releasing the sources of old software to the public, especially from platforms which are gone, is very cool!

I would die to see the original sources of Norton Commander 3.0 written by John Socha. They say he used what was (back in the day) an unorthodox approach, a mashup of C and assembler, which not many programmers used at the time, when "real programmers" always wrote in assembler :-)


Yay! Finally! Thanks so much for this. Have been excited about this for years since the original announcement.

Had today's date bookmarked, and was browsing around your twitter etc earlier looking to see if this had happened.

Can't wait to look inside.



I didn't turn on JS for the form to be shown, either.


Man, I wish beige was still an option when buying a new desktop computer these days. I know I hated that color when I was a teenager, but it just looks so nice to me now.

Like it would be a perfect fit for a bland old boho styled apartment.


> Man, I wish beige was still an option when buying a new desktop computer these days.

Oh those grandpas. We evolved. Now everything is black: computer cases, power supplies, motherboards, keyboards, mice, RAM modules, even the GUI^w UX. /s


Now the question is: Can I build a LISA in 2023? I know a lot of the parts are common with the Macintosh, but being an earlier model, maybe it has fewer custom ASICs? I can hope?


Thanks for posting this. awesome article.


The A/UX 0.7 source code has been floating around for a while. If you search eMule for 'various operating system source', you will find it in a (700 MB, I think) archive that also has NextStep, SunOS, Ultrix, and some other sources.



Link to the actual source code download page: https://info.computerhistory.org/apple-lisa-code

Released under APPLE ACADEMIC LICENSE AGREEMENT

Lisa OS Software version 3.1


What Pascal compiler did Apple use to build all that?

Looks like Turbo/Borland Pascal, but I'm not sure that it existed at that time and/or was ported to the 6502.


Apple Lisa Computer Pascal

Apple’s Lisa Pascal developments began from scratch for Apple when it licensed in 1981 a Motorola 68000 native code Pascal compiler from Silicon Valley Software in California. This compiler was based upon the older P4 compiler from Niklaus Wirth of ETH in Switzerland and consisted of two general passes. Pass 1 produced I-Code, a low-level representation of the high-level Pascal constructs. Pass 2, the code generator, converted the I-Codes to optimized 68000 object code. Apple even considered early in the Lisa’s development using a custom Apple processor which would execute P-Code directly, but the expense of developing such a chip was too much for Apple’s accountants and this project was dropped.

http://pascal.hansotten.com/ucsd-p-system/apple-pascal/


Clascal, which eventually grew to become Object Pascal.

Turbo Pascal came much later. Borland adopted UCSD units in Turbo Pascal 4 and Apple's Object Pascal in Turbo Pascal 5.5; its OOP capabilities grew from there based on what was happening in C++, influenced by Borland's work on their own C++ compilers.


Actually Turbo Pascal and the Apple Lisa were both released in 1983, although of course Clascal (which I never heard of until this thread) must have existed earlier to be used to develop Lisa.


> which I never heard of until this thread

Apple Lisa Pascal (started back in 1979) and Clascal (an early-1980s extension of Lisa Pascal) were well known projects of Apple. The evolution went from Lisa Pascal, to Clascal, and then Object Pascal. Of course, back in the 1980s information didn't get around as fast as it does now. Also, Apple played games with naming, which created confusion about whether what was used was actually Pascal. You will see "Apple Lisa" or just "Lisa" a lot, omitting the Pascal name/description, so people could get the impression it's some entirely different language.

When those employed by AT&T started attacking and disparaging Pascal in the early 1980s, to push their investments in C and Unix, Lisa Pascal and Clascal were arguably overlooked. Though this is partially also because of how Apple promoted it.

Unfortunately, that bad habit continued with Borland/Embarcadero, where people don't know that Delphi is a dialect of Object Pascal, in the same way people didn't know that Lisa Pascal and Clascal were dialects of Pascal.

https://en.wikipedia.org/wiki/Object_Pascal#Clascal_and_Appl... (Clascal and Apple's early Object Pascal).

http://bitsavers.informatik.uni-stuttgart.de/pdf/apple/lisa/... (An Introduction to Clascal).



The Lisa used the Motorola 68000, not the 6502.

Turbo Pascal didn't exist yet. The leading Pascal of the early-1980s era was UCSD Pascal [1] but it used a cross-platform bytecode approach which wasn't very fast and wouldn't have been suitable for the Lisa.

I suspect Apple built their own 68k Pascal compiler? But I'm sure someone on HN knows the actual facts.

[1] https://en.wikipedia.org/wiki/UCSD_Pascal


Probably Apple Pascal which existed for the Apple II and Apple III, with a later implementation for the Lisa itself (which then got the name Lisa Pascal).


Downloaded the code and I'm not really sure what I'm looking at. What language is this, and how would someone build/load it?


I see assembly and pascal files. According to this[0] article, the development was done on Apple II's and Mac development was done on Lisa!

[0] https://macgui.com/news/article.php?t=518


I can confirm this regarding Mac development. I was part of the team that built FullPaint (https://en.wikipedia.org/wiki/FullPaint) back in 1985. We had just a few Lisas to work with, and they were considered sacred.

Testing code required copying it onto a floppy disk and popping it into a Mac. We probably would have been a lot more productive if any of us had ever heard the words "unit test". (We were a young team, everyone in their early 20s or even late teens.)


Oh, how I wish I could've been involved in the halcyon days of the computer industry. I'm sure it all looks rosy from the perspective of someone who wasn't there, but the scrappy young development teams, humble small organizations, mail-order software, and general naive optimism of the preceding era seem like so much fun. Probably very challenging, and I'm sure frustrating, but I can only imagine the satisfaction of having worked on the Macintosh and breaking so much new ground.

I was born in time to experience the internet when it was a lot more fun and experimental, which I definitely am grateful for. I hope there are more "moments" like this in the future, though I can't help but feel they're mostly over when it comes to computers and technology. Things just can't stay scrappy and experimental forever.


It was pretty rosy for me, even with all the challenges, emotions and anxiety. I have experienced similar challenges, emotions and anxiety in jobs that were much less satisfying.

I think the moments are out there still; you just have to seek them out. I wouldn't say working on the Apple ][, Lisa, early Mac or other early PC efforts was as obvious and groundbreaking as it seems now. I think about a lot of parallel efforts going on at the time that I wish I could go back in a time machine to work on: Symbolics workstations, early Wavefront/Alias software, Self, Smalltalk, Dylan and other languages that didn't become mainstream.

There is so much out there that isn't over in computers and technology. It might be helpful to think about things that seem impossible, crazy and totally economically unviable. Sometimes we can let finding markets, solving scalable problems and generating 10x returns for our investors cloud our vision.


If you haven't read Masters of Doom before, I highly recommend it! Your comment reminded me of it.


How did you all at Ann Arbor (I assume) get into Mac development? Did you have Lisa versions of software and decide to move to the Mac? Did Mike Boich or Guy Kawasaki convince you to start developing? I am always interested in what makes someone decide to pour resources into a new platform. I have done it more than once myself and it is usually very high risk.


I don't think we ever developed any software targeted at the Lisa (or if we did, it was in the brief period before I joined). We were always developing for the Mac, initially cross-compiling from Lisa.

We did move to Mac-based development after not too long. I have zero memory of why/how that happened, but I'm sure it was clear even at the time that the Mac was the more sustainable long-term platform (also the Lisas were insanely expensive). Still, for a while there it was a real stretch to squeeze a reasonable dev environment into Mac hardware. I remember – and this feels insane as I'm typing it, but it happened – at some point we actually paid someone to fab an add-on circuit board that we somehow glommed into our Macs (these were probably 512KB or 1MB models, I can't remember) that doubled the RAM size. The additional RAM couldn't be used for normal applications, it somehow manifested as a RAM drive. We had one RAM drive in the add-on memory, a second RAM drive partitioned from the standard motherboard memory, and the rest of the standard memory was used to run dev tools and/or the application under test. Putting all of the source code into a RAM drive was necessary to make the development experience tolerable (I can't remember whether the concern was source code navigation, build times, or both... EDIT now that I think about it, it might have been the object code rather than the source code that needed to be on the RAM drive; or perhaps both).

Different times...


In the mid-late 1980s we compiled and tested Amiga software on the Amiga entirely from RAM drives using rather expensive 2 MB RAM add-ons, and then stored the updated source code on 3.5" floppy disks that had to be rotated out regularly because they frequently went bad.

Only towards the end of the 1980s were hard drives inexpensive enough to become commonplace. My first 65 MB drive cost $949 (ca 1989), and that is why we were using floppies and RAM drives instead. The RAM drives did survive reboots and crashes most of the time, which helped quite a bit as you may imagine.


Fun story, development of Digital Research's CP/M68k and GEMDOS/GEM GUI on the 68k was also done on the Apple Lisa (and I think some VME 68k boxes from Motorola?). So, a competitor to the Mac (albeit not a successful one) developed on the Lisa, too.

And in fact, fun people have gotten GEM/GEMDOS to boot and run again on Lisa emulators and real Lisas.

Computer archeology.


> What language is this, and how would someone build/load it?

In a Lisa emulator running Lisa Workshop, Apple's UCSD Pascal/Clascal-based development system for the Lisa.

Technically you could potentially port Pascal and/or Object Pascal code to another platform like Delphi, but any low-level or hardware specific code would have to be rewritten.

Fortunately the binary images are available on the internet and you can run them in a Lisa emulator.


It's Pascal but not UCSD. UCSD compiles to a virtual machine called p-code. Lisa's Pascal compiled down to the metal.

(I used to use a Lisa to write Mac programs. In the early days Macs didn't have enough RAM to run the Pascal compiler.)


I'm not an expert on the Lisa or Pascal, but my understanding is that Lisa Pascal did compile natively (vs. p-compilers) and was based on the UCSD Pascal dialect (which introduced units), and that Clascal (and later Object Pascal) extended it with objects and classes. The Lisa Workshop's text menu-based command interface appears to resemble that of the UCSD p-System, though it provided access to a mouse-based editor with a Lisa-style GUI.
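
For flavor, a class declaration in the later Object Pascal style looked roughly like the sketch below. This is written from memory in the Turbo Pascal 5.5-ish dialect discussed upthread, so treat it as an approximation rather than actual Clascal or Lisa code.

    PROGRAM ObjectDemo;
    { Rough sketch of the object extension; not actual Clascal or Lisa code }
    TYPE
      TShape = OBJECT
        fWidth, fHeight: INTEGER;
        PROCEDURE SetSize (w, h: INTEGER);
        FUNCTION Area: LONGINT;
      END;
    VAR
      box: TShape;

    PROCEDURE TShape.SetSize (w, h: INTEGER);
    BEGIN
      fWidth := w;
      fHeight := h
    END;

    FUNCTION TShape.Area: LONGINT;
    BEGIN
      Area := LONGINT(fWidth) * fHeight
    END;

    BEGIN
      box.SetSize(3, 4);
      writeln('Area = ', box.Area)
    END.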


As with everything in computing, it depends.

Yes, the original UCSD Pascal only did P-Code; however, a few implementations supported AOT compilation as well, like the Constellation OS from Corvus Systems.

https://en.wikipedia.org/wiki/Corvus_Systems


As a tangent, p-code is like an ancient version of wasm or java bytecodes, though its interpreter (the p-machine) was simple and compact enough to fit on 8-bit machines. Smalltalk also used bytecodes.

The nice bit is that once you implement a bytecode interpreter, everything just runs. It may be slower than native code, but no new code generators or compilers are required.
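
To make the "implement the interpreter once and everything runs" point concrete, here is a toy dispatch loop in Pascal. The opcodes are invented and the real p-machine's instruction set was far richer, so this is only a sketch of the shape of the idea.

    PROGRAM TinyVM;
    { Toy stack machine: push constants, add, print, halt.
      Purely illustrative; nothing like the actual p-code. }
    CONST
      opHalt = 0; opPush = 1; opAdd = 2; opPrint = 3;
    VAR
      code: ARRAY [0..15] OF INTEGER;
      stack: ARRAY [0..15] OF INTEGER;
      pc, sp: INTEGER;
    BEGIN
      { "bytecode" for: print(2 + 3) }
      code[0] := opPush; code[1] := 2;
      code[2] := opPush; code[3] := 3;
      code[4] := opAdd;
      code[5] := opPrint;
      code[6] := opHalt;

      pc := 0;
      sp := 0;
      WHILE code[pc] <> opHalt DO
        CASE code[pc] OF
          opPush:
            BEGIN
              stack[sp] := code[pc + 1];
              sp := sp + 1;
              pc := pc + 2
            END;
          opAdd:
            BEGIN
              stack[sp - 2] := stack[sp - 2] + stack[sp - 1];
              sp := sp - 1;
              pc := pc + 1
            END;
          opPrint:
            BEGIN
              writeln(stack[sp - 1]);
              sp := sp - 1;
              pc := pc + 1
            END
        END
    END.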


The concept predates both of those; the Burroughs B5000 was one of the first systems using the idea, and it is still being sold nowadays.

https://en.wikipedia.org/wiki/Burroughs_Large_Systems

Mostly safe systems programming language, no Assembly, all Assembly-like operations available via compiler intrinsics, UNSAFE code blocks, admin clearance required for modules tainted with use of unsafe code.

All of this in 1961.


This is a bit of a tangent, but I've heard the Mac System was written in Pascal with bits of assembly for speed and I've also heard it was written in assembly with a Pascal API. An Ars Technica article today mentioned some Lisa code was taken and modified to work on the Mac during its development. Was this code Pascal?


Bill Atkinson's QuickDraw is one example of that. The original was written for the Lisa in Pascal and then brought over to the Mac project, and I believe portions were rewritten in assembler for compactness/speed.

The Mac version of QuickDraw is here: https://computerhistory.org/blog/macpaint-and-quickdraw-sour...

Now that both are "open", it would be interesting to compare the two and see how they differ.


I'm curious to see how long before someone manages to get it to build from source with the original toolchain and gets it to run under emulation.


> In a Lisa emulator running Lisa Workshop, Apple's UCSD Pascal/Clascal-based development system for the Lisa.

Here is one: https://lisa.sunder.net/, and on github: https://github.com/rayarachelian/lisaem


Clascal, the predecessor of Object Pascal, which was used on Mac OS.

Apple was also an early adopter of safer systems languages for OS development.

Even when MPW later replaced Object Pascal, the major frameworks were based on C++, not C.


Most of the Mac OS ROM was still in assembly (there was a story on the front page last week detailing a time bug owing to a misfeature of the 68k DIVU instruction, even). It wasn't about "safety", just productivity.

And C wasn't chosen because Apple had made an organizational bet on Pascal years earlier with the Apple II. C in the early 80's was still a niche "new vogue" kind of thing from academia[1]. Apple's roots were older, they didn't get the Unix bug until decades later.

[1] c.f. Sun Microsystems, founded by Stanford and Berkeley geeks, launching its very different 68k products concurrently.


Every OS has plenty of Assembly in it, even those written in C.

They never really got the UNIX bug, even with the reverse acquisition by NeXT or the way A/UX was implemented. Steve Jobs's opinion on traditional UNIXes is well known in the Apple community; there is some info on his USENIX session and the NeXTSTEP brainstorming sessions.


Is that USENIX keynote available somewhere? I see the references but not the actual dialogue anywhere.


Back in those days conferences weren't filmed.

https://www.usenix.org/blog/vault-steve-jobs-keynotes-1987-u...

Here are some references,

Chris MacAskill did a full overview of what he had to do to convince Jobs; it was even commented on here:

https://news.ycombinator.com/item?id=17420674

Now gone from the Internet,

https://allaboutstevejobs.com/blog/2018-07-11-chris-macaskil...

However it is saved on the Wayback machine,

https://web.archive.org/web/20180628214613/https://www.cake....

=> "They said a Unix weenie was code for software engineers who hated what we were doing to Unix (the operating system we licensed)—putting a graphical user interface on it to dumb it down for grandmothers. They heckled Steve about his efforts to destroy it. His nightmare would be to speak to a crowd of them."

There is also the interview with UNIX Today!

https://www.tech-insider.org/unix/research/1991/11.html

=> "You know, take SGI's new machines-they have 8-bit color frame buffers in them, you can't even put up a bunch of beautiful color photographs without having one of them look good. The rest of them look blah because you can't do it in 8 bits. You know what I'm talking about. So most computers haven't even gotten past the stage where you could put up a bunch of color photographs, much less other types of media, or moving, dynamic media. And I think that that's an advantage that we'll continue to keep. "

=> "We're bringing Unix to the commercial market where what they care about is designing, creating and deploying mission-critical custom apps alongside those in a multitasking environment. They want to be able to use a suite of productivity apps that are compatible in the data formats with all the ones they use in the PCs. And that's where we have an offering that Motif has nothing to compete with and Sun has very little to compete with. But if you take us into the traditional scientific and engineering marketplace, we are a pretty powerful and very cost-effective Unix box. And to address X, we do run X on our product very well. There are two: one X product, soon to be two, that you can buy from third parties that run X-Window right alongside their Next Step windows, and they're quite good. So we don't have anything against X for what it was designed for, but X has not made the crossover into the commercial marketplace. "

Plenty of other similar remarks on the interview.

And some interesting NeXT footage,

https://www.youtube.com/watch?v=KRBIH0CA7ZU


The files I've looked at are Pascal (the article talks about Apple's object-oriented variant). I suspect there's 68000 assembly language there too, but I'm just getting started.


Under the OS section, there are some files with assembler in them.


I understand the app source code is in there too - it would be nice to know what the directory names refer to — which directory corresponds to which app, etc.


This is fantastic.

Wasn't the original Lisa source code famously lost by Apple?


No? I think you're mixing up the fact that they were supposed to release this almost half a decade ago and so random blogs conjectured that it was lost.


That is not what I was referring to at all.

I seem to recall hearing or reading that Apple apparently lost the original GM source code for core Lisa software, because the backup tapes were corrupted or unreadable; fortunately a contractor had taken home and retained a ProFile drive with an alpha version of the source code, but the final/GM version was lost to the sands of time.

(As always, backups are only good if you can actually read them and restore from them.)

However, if the CHM source code is in fact 3.1 GM (as it seems like it might be?) perhaps this story is inaccurate, or perhaps it was recovered somehow. It would be interesting to see if the source code is complete and an entire working Lisa environment could be built from it.

It would be interesting to hear from someone with firsthand knowledge.


What hardware can this run on?


The Lisa emulator.

https://lisa.sunder.net/


Always good to see :-)

    grep -ri fuck .
    ./Lisa_Source/Lisa_Toolkit/TK Sources 4/LIBUT-UUNIVTEXT2.TEXT.unix.txt: HALT; {Die rather than fuck up}
    grep: ./Lisa_Source/Lisa_Toolkit/TK Sources 4/LIBUT-UUNIVTEXT2.TEXT: binary file matches
    grep: ./Lisa_Source/Lisa_Toolkit/TK3/TK-ALERT.OBJ: binary file matches
    ./Lisa_Source/LISA_OS/APIN/APIN-OFFICE.TEXT.unix.txt: fucked up if he bails out or doesn't ever select a disk}
    ./Lisa_Source/LISA_OS/LIBS/LIBDB/libdb-SCANCODE.TEXT.unix.txt: valid position, thus we don't fuck with pscantable^[scanid]^ until
    ./Lisa_Source/LISA_OS/LIBS/LIBDB/libdb-SCANCODE.TEXT.unix.txt: valid position, thus we don't fuck with pscantable^[scanid]^ until
    ./Lisa_Source/LISA_OS/LIBS/LIBDB/libdb-SCANCODE.TEXT.unix.txt: valid position, thus we don't fuck with pscantable^[scanid]^ until
    ./Lisa_Source/LISA_OS/LIBS/LIBDB/libdb-LMSCAN.TEXT.unix.txt: badCheckPoint = 3424; { check point info is fucked }{##}
    ./Lisa_Source/LISA_OS/Linkmaps and Misc. 3.0/TKALERT.TEXT.unix.txt: if not fcheckhz(hz,i) then writeln('heap big fuckup 1');
    ./Lisa_Source/LISA_OS/Linkmaps and Misc. 3.0/TKALERT.TEXT.unix.txt: if not fcheckhz(hz,i) then writeln('heap big fuckup 2');


> You may not and you agree not to:

> publish benchmarking results about the Apple Software or your use of it

Damn, there goes my Apple Lisa vs Amiga comparison video! I was going to get so many YouTube views! Meh, maybe I'll make it anyway. So sue me!

> the Apple Software may not be exported or re-exported (a) into any U.S. embargoed countries

I'm sure keeping Iran from getting this valuable software from 1983, which runs on a completely dead CPU architecture that Linux doesn't even support anymore, is very important!

(But seriously, as a license geek, and not a lawyer, the wording of this license is really interesting. It seems to have completely different disclaimers than you usually see in open-source licenses...)


> completely dead CPU architecture

Hey, take that back! 68k/ColdFire ISA is actually pretty well supported these days by compilers. Better than it was a few years ago. There are 68k backends for both GCC (again) and LLVM (maybe not as active).

It's probably the best supported "retro" architecture at this point.

Rest of your points are valid though ;-)


The LLVM code is fairly active for a retro arch. It just got accepted into mainline less than a couple years ago. I remember it because the m68k folk are pretty active about maintaining support for their arch in the Linux kernel, and for a while the lack of m68k support in LLVM (and therefore the lack of m68k support in Rust) was seen as a potential blocker for mainline Rust support in the kernel.


> which runs on a completely dead CPU architecture that Linux doesn't even support anymore

The 68k is hardly dead; Linux support isn't a viability indicator of an architecture. It was dropped from Linux because there are no 68k systems powerful enough to run it and its modern userspace, so it was a waste of development and debugging time. But the architecture is thriving in embedded and custom device spaces and fully supported by GCC and LLVM.


"Thriving" is a stronger word than I'd use. NXP still manufactures a limited number of ColdFire microcontrollers, which are loosely based on the 68k architecture. But they're listed on NXP's web site under "legacy MPUs/MCUs", and many of the parts are 10+ years old and NRND. It's pretty obvious that they don't plan on continuing the product line much further than required for support lifecycles.


They're legacy in the same way the z80 and MIPS are. They reached an evolutionary dead end of full feature parity for their use space. The 68k hasn't been developed further because it does everything its target audience needs, and its power-user sphere was supplanted by ARM, x86/amd64 and PPC/Power.

Sometimes, a technology is good enough and doesn't need any more development. That's the entire reason the Cortex-M (and R) series exists: because you don't need a Cortex-A715 to drive a motor and monitor a thermostat.


Even then, there's still some sustaining engineering required to maintain a design and keep it relevant. The (e)Z80s that are being made today aren't the same as the ones that were being built in the 1980s -- they're being extended with new peripherals and ported to modern fabrication technology.

I'm fairly certain that hasn't been happening with the ColdFire series. Every ColdFire part I see listed on NXP's site is from 2010 or earlier, before the Freescale acquisition. This puts a lot of those parts 2/3 or more of their way through their 15-year availability commitment; if NXP intended to keep the line alive, I'd expect to see a lot more new parts, and that isn't evident here.

(MIPS is a weird one to mention because MIPS Technologies actually declared it dead last year and started trying to rebrand themselves as a RISC-V IP core provider. The main niche that architecture was used in was wireless routers, but that's been taken over pretty thoroughly by ARM these days.)

Don't get me wrong -- I cut my teeth on 68k and I loved the architecture. But it's also clear to me that it doesn't have a future.


I’m guessing, although I don’t know, that they probably had an academic license laying around for another product that they cleaned up for this. A “it’s our source, you can use it for fun but not for money” license. Microsoft has shared source licenses for academics/bigcos that need to see the source for Windows or whatever.


It took them 4 or 5 years just to do this release, since it was first announced. I feel like "they just dusted off an old license" is not a likely explanation.

Probably took 5 years just to get through all the lawyers at Apple, and this is what we ended up with.

Remember, the default answer to almost any question posed to a lawyer is "No."

:-(


That's because you're not supposed to ask the lawyer if you should do something. You ask their boss if you can do it and ask them how to do it.


I'm not sure why it took so long, but I think that lawyers dragging their feet probably wasn't it. I'm guessing that they would have never announced the release, unless the lawyers had already signed off on it. I think instead, it took them 5 years to make the landing page (something like this would normally take two years, but the pandemic wasted three years of that time.)


It's a fairly safe bet that the benchmarking clause was not inserted with knowledge of the Lisa and its relevance today as a historical artifact.


I'm still learning Swift (forever) and looking at the source code I wonder if Swift is getting too complicated to read now. I'm having a hard time reading simple snippets of Swift but the Lisa source code seems so comforting to read in comparison. Anyways, I'll soldier on.


I know what you mean regarding Swift.

In recent decades different languages have introduced interesting patterns, interesting notations. Swift seems to have adopted about all of them.

I have been told by teammates that I don't write "Swifty" Swift code, instead writing something more like an "Objective-Swift" style.

Oh, and they say that like it's a bad thing.


> I have been told by teammates that I don't write "Swifty" Swift code, instead writing something more like an "Objective-Swift" style. Oh, and they say that like it's a bad thing

Depending on what they mean by this, I agree that it can be a bad thing.

Years ago, in the days when Swift was still pretty green (around 2.x or 3.x) I knew someone who liked to try to write "Objective-C in Swift" — that is, they'd try to ignore the type system and optionality entirely, with lots of forced casts, forced unwraps, shipping data around in [String:Any] dictionaries, etc… constant fighting with the compiler that made SourceKit very unhappy and crashy and the app we were working on more crash-prone than it would've been had it been written in Objective-C.

On the other hand, if it's just a more verbose style that favors clarity and avoids e.g. unnamed closure arguments and breaks more out into well named variables and functions, I don't see anything wrong with that at all and in fact have been leaning further in that direction as time goes on.


Ha ha, yeah no I would never force a cast, I always unwrap optionals...

As an example, I was slow to adopt guard statements generally, preferring to strictly use them only for parameter checking (thus only at the top of functions).

Then there are crazy ways that case statements can be used in a switch that I still don't understand.


Lovely bit of history, but I laughed out loud at this bit of legal WTF in the license agreement you must agree to before downloading the software:

"You may not and you agree not to: ...publish benchmarking results about the Apple Software or your use of it"

Perhaps 40 years later Apple is still embarrassed about the performance of the Lisa?

So, maybe somebody here can emulate a Lisa and run Geekbench :)


I can't believe they make you sign a fekking license agreement to access 40-year-old source code (in a long-dead language, on top of that). That's so typical Apple.


Yeah I actually haven't downloaded this yet, despite being eager to look, because I actually wonder if I should be cross-checking my employment agreement with my current employer to make sure I wouldn't be violating any clauses in either document! Highly unlikely, but still, jeez.


A license agreement and they want your full name and email. Pass.

(Yes I know you can lie but my point is they are asking for this)


"False!" - Dwight Schrute

>FTFA: a majority of personal computer users interacted with their machines via command-line interfaces...in which users had to type arcane commands to control their computers.

ARCANE!?!? you say!?!?!

    https://en.wiktionary.org/wiki/arcane
    arcane:
    1. Understood by only a few.
The majority of personal computer users (your words, not mine) did understand them; in fact, it was the defining feature of personal computer users.

    2. (by extension) Obscure, mysterious.
    3. Requiring secret or mysterious knowledge to understand.
the commands were well documented and did what the documentation said

    4. Extremely old (e.g. interpretation or knowledge), and possibly irrelevant. 
Not old: it was newly invented, current state-of-the-art, and nothing but relevant to the tasks at hand; command lines were the only way to accomplish them, and nary a byte of the command line lacked morphologic or semantic significance.

The serious point I'm trying to make is that young and up-and-coming programmers who are learning textual programming languages (more or less the definition of a computer programmer at present) should not be scared off from learning command lines. They're very powerful, and not that obscure or hard. Unix had/has a feature that can be confusing: the command line is first processed according to the rules of the shell, and then reprocessed according to the rules of the app you are invoking, making command lines a subtle mix of several languages at a time (by analogy, a bit like html+javascript+css). You need to learn how this works; despite being confusing, it is the innovation necessary for unlocking the power of shells.


Arcane not to computer users but to the general population. Before the advent of GUIs, computer usage tended to be very much a niche thing. Heck, even well into the Windows era, it was still niche. I remember riding the train to work in the 90s and overhearing a conversation where an older man engaged in some sort of business (no idea what) was complaining about having to have a computer on his desk at work.


Arcane by today's standards I guess.

Also, many people in those days never consulted the docs and just memorized commands without understanding them so they probably felt arcane.


the average person doesn't understand the ancient notation used for calculus; that doesn't make "calculus" arcane; and the article was writing in the past tense

Your point about those who copy-pasted command incantations is reasonable, but they did get them from people who knew how to write them, and conditions (like situational pathnames or dangerousness of side effects) are sufficiently variable to limit the scope of this style's applicability.


Not all, but a lot of docs from the era were, for lack of a better term, "written by programmers" instead of "written for users".


The source code of an entire operating system, along with applications of its time, is just 6 MB.

Think about it. 30 years from now, Windows OS source code will feel like that, we'll download its zip in seconds and browse it. It'll feel miniscule. Linux too. Everything will feel so small after the AI Revolution of 2049, which will increase our code writing speed orders of magnitude, will write instant code for a device driver for a hardware by probing its input/output models based on your expectations from the device, even if the device doesn't support it, such as using a printer to measure ambient temperature by observing fluctuations in the ink capacity sensor. It'll figure that out on its own. Printers will have ceased to exist by then, of course, because of the great AI war against HP due to their cruel experiments on printers over decades to make them fail at unexpected times in order to maximize profits.

We'll never be able to comprehend how the humans of 2022 were able to do anything with computers at all.


In the future every application and its dependencies will all be 1PB binary ckpt blobs of some strange loop convnets, served to us with ads over Neuralink Plug Pro via a $1K-a-month subscription service with holo Elon standouts on our Twitter Platinum WristBand Ultra+.

The future is dark


The same way there’s a minority pining for the days of OS X 10.4 to 10.6, there will be a group touting that iOS Xa was the last good OS.

Assuming Apple would have adopted a Roman hexadecimal system for their numbering scheme.


Pretty sure I'll still be running Linux on my PC with my phone on silent in a drawer somewhere and won't have heard about any of this bunkum and balderdash


I look at retiring developers and wonder how they built Windows applications with no internet access, MSDN on CD-ROM and Petzold's books.

The next generation will look at me with their AI tools and wonder how I built anything with primitive internet searches and StackOverflow.


We didn't have internet, but there were active Windows developer forums on CompuServe, BIX, and GEnie. I used to hang out on all of them answering people's questions and learning from other developers.

My usual monthly bill for dial-up access to the three services (mostly CompuServe) was around $300, close to $800/month in today's dollars.

Charles listed me in the acknowledgements for the first edition of Programming Windows as "the indefatigable Michael Geary." That was kind of fun.


>how they built Windows applications with no internet access

Internet access is a massive distraction; instead you used documentation that was pretty good. It meant you needed a bit more of a base in place before you started--you had to buy a copy of the developer SDK/toolkit for whatever you were working on--but you had so many fewer distractions after that.

(not that it played a large role, but inter network communications between companies was pretty standard during the time that Microsoft was writing Windows, starting with microsoft!ucbvax!decvax etc. type addressing, and soon .com/.edu was added)


I agree. We just buckled down and studied the documentation, which (as you note) was usually good and often excellent. Microsoft's small textbook about C++ was the best I ever encountered. While I threw most of my obsolete books away, I kept that one.


Home computers came with printed documentation and programming manuals, and later, SDKs came on floppy discs with their own extensive offline documentation. Also: lots of print magazines and books, and dialup connections into 'Bulletin Board Systems' existed long before the web became popular.

This all means that you often got stuck on a problem for days, only to eventually 'invent' a clever solution that's also been invented by thousands of other programmers ;)


I just noticed this after typing my comment about this very thing.

In addition to those TechNet CDs, we established some rapport via CompuServe or phone with specific Microsoft support people. Occasionally they would fix a bug we reported and send us rebuilt libs, or send us an internal Microsoft tool to try. That was a different era, obviously with a much smaller population of developers with less exposure to those who would support them.


I already look at what I learned in the early '90s with no Internet. We had the massive binder of Microsoft TechNet CDs and an MS support forum on CompuServe to undertake the largest known Visual C++ project at the time (according to Microsoft).

I have forgotten how to do a lot of stuff I knew how to do back then... but does it really matter, when I can look it up in a few minutes? I'm not sure.


I felt way more productive back then, but then coding isn't part of my full-time job description; it's just something I do from time to time for work.

Before that it was the Borland C++ manual set.


Ha, yes I started with Borland too.

Before that we had to chisel our code into wooden tablets and feed them into a steam-powered Computing Engine.

OK, OK; there might have been some Atari BASIC and macro assembler in between those two.

Then there was Andersen Consulting's boot camp, where everyone had to learn COBOL. And yes, that was in the '90s.


[flagged]


And there's also a `.DS_Store` in every folder of that 40-year-old code. How futuristic!


You are the wind beneath my wings


Ironically:

> Macintosh competed with Lisa and ultimately became the favored computer for its lower price and open software ecosystem


It's not ironic at all.

The Mac originated as a cut-down Lisa.

It's not some ironic competitor. It was a later machine, built on the back of lessons learned from the Lisa's failure.

A lot of desirable stuff was lost: multitasking, app development in a high-level language (and a type-safe one at that!), expansion slots etc.

And less desirable stuff: the template-oriented document-creation paradigm. Radical, innovative, but confusing. The Mac went with a much simpler, more conventional, documents-and-apps model, in the traditional CP/M and DOS style.


I think the templates were a brilliant idea. They made somewhat of a comeback on the Mac and didn’t get huge traction, but still have remnants in current MacOS with its stationary pads (a poor implementation, and the Finder doesn’t even change the icon of a file when it is turned into a stationary pad)


* stationery

But you're right. :-)



