not that I've done the LOC calculations, but I would guess Alan Kay et al. include, e.g., Etoys, the equivalent of a paint program, and some games in what they consider to be their personal computing system, and therefore in the ~20 kLOC.
and hardware is one of the crucial points Alan makes in his talks. the way he describes it, hardware is a part of your system and is something you should be designing, not buying from vendors. the situation would be improved if vendors made their chips configurable at runtime with microcode. it doesn't seem like a coincidence to me that a lot of big tech companies are now making their own chips (Apple, Google, Amazon, and Microsoft are all doing this now). part of it is the AI hype (a mistake in my opinion, but I might be completely wrong there, time will tell). but maybe they are also discovering that, while you can optimize your software for your hardware, you can also optimize your hardware for the type of software you are trying to write.
another point is that any general-purpose computer can simulate any other computer, i.e. act as a virtual machine. meaning if software is bundled with its own VM and your OS doesn't get in the way, all you need for your software to run on a given platform is an implementation of the VM for that platform. which I think raises many questions, such as "how small can you make your OS?" and "is it possible to generate and optimize VM implementations for given hardware?"
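to make that concrete, here's a toy sketch (in Python, with made-up opcodes; it assumes nothing about how the actual Smalltalk or Squeak VMs work). the interpreter loop is the only platform-specific piece; the "software" is just data that runs wherever the loop runs:

    # toy stack-based VM: in a system built this way, an interpreter
    # like this (plus I/O primitives) is all you'd reimplement per platform
    def run(program):
        stack, pc = [], 0
        while pc < len(program):
            op = program[pc]; pc += 1
            if op == "push":                # push the literal that follows
                stack.append(program[pc]); pc += 1
            elif op == "add":               # pop two values, push their sum
                b, a = stack.pop(), stack.pop()
                stack.append(a + b)
            elif op == "print":             # the VM's one "device driver"
                print(stack.pop())
            elif op == "halt":
                break

    # the program itself is portable data, not platform code
    run(["push", 2, "push", 40, "add", "print", "halt"])  # prints 42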
also something that came to mind is a general point on architecture, again from Alan Kay's ideas. he argues that biological systems (and the Internet, specifically TCP/IP, which he argues takes inspiration from biology) have the only architecture we know of that scales by many orders of magnitude. other architectures stop working when you try to make them significantly bigger or significantly smaller. which makes me wonder about much of hardware architecture being essentially unchanged for decades (with a growing number of exceptions), and likewise with software architecture (again with exceptions, but it seems to me that modern-day Linux, for instance, is not all that different in its core ideas from decades-old Unix systems).
In this respect, Kay's methods seem to be merging with those of Chuck Moore: the difference is that Moore doesn't seem to perceive software as a "different thing" from hardware - the Forth systems he makes always center on extending directly from the hardware into the application, with no generalization to an operating system in between.
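as a crude sketch of that idiom (in Python rather than Forth, and with hypothetical names like read_sensor; real Forth would compile its dictionary of words down to the metal): the application is a short tower of definitions growing straight out of a hardware primitive, with nothing OS-like in between.

    import random

    # stands in for reading a memory-mapped hardware register
    def read_sensor():
        return random.randint(0, 1023)

    # "application" words defined directly on top of the primitive
    def scaled():
        return read_sensor() * 100 // 1023   # raw reading -> percentage

    def report():
        print(f"level: {scaled()}%")         # the whole application

    # Forth keeps such definitions in a dictionary of named words
    words = {"read-sensor": read_sensor, "scaled": scaled, "report": report}
    words["report"]()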
you can see for yourself, e.g. by looking at the Smalltalk emulators that run in the browser, reading Smalltalk books, etc.
I think it's the "blue book" (Smalltalk-80: The Language and its Implementation) whose VM specification the Squeak team used to revive Smalltalk-80 in the form of Squeak; they wrote the VM in a subset of Smalltalk and mechanically translated it to C. it's well documented, for instance, in the "Back to the Future" paper. I haven't had the fortune of studying Squeak or other Smalltalks in depth, but it seems fairly clear to me that there are very powerful ideas being expressed very concisely in these systems. likewise with VPRI/STEPS.
so although it might be somewhat comparing apples to oranges, I do think that when, e.g., Alan Kay mentions in a talk that his group built a full personal computing system (operating system, "apps", etc.) in ~20 kLOC (iirc, but it's the right order of magnitude either way), it is important to take this seriously and consider the implications.
similarly when one considers Sutherland's Sketchpad, Engelbart's mother of all demos, HyperCard, etc., and contrasts them with (pardon my French) the absolute trash that is most of what we use today (web browsers - not to knock the people who work on them, some of whom are clearly extremely capable and intelligent - generally no WYSIWYG, text and parsing all over the place, etc. etc.)
like, I saw a serious rendering glitch just now while typing this, where some text that came first was being displayed after text that came later, which made me go back and erase text just to realize the text was fine, type it again, and see the same glitch again. that to me seems completely insane. how is there such a rendering error in a textbox in 2025 on an extremely simple website?
and this all connects to a great deal of what Alan Kay points out. some of his quips: "point of view is worth 80 IQ points", "stop reinventing the flat tire", and "most ideas are mediocre down to bad".
guess I misunderstood your question, and also went on a bit of a rant.
morphle said "you can write a complete operating system and all the functionality of the major apps (word processing, graphics, spreadsheets, social media, WYSIWYG, browsers) and the hardware it runs on in less than 20000 lines of (high level language) code. They achieved it a few times before in 10000 lines (Smalltalk-80 and earlier versions), a little over 20000 (Frank) and 300000 lines (Squeak/Etoys/Croquet) and a few programmers in a few years."
to which you replied "did they?"
to which I replied something along the lines of "you can take a look at Smalltalk systems" to answer your question. to clarify, I meant you can look at the extent of what those systems were capable of, and look at their code. which, again, to me is a bit apples to oranges, but is nonetheless something that ought not to be dismissed.