not that I've done the LOC calculations, but I would guess Alan Kay, etc. include, e.g., Etoys, the equivalent of paint, and some games in what they consider to be their personal computing system, and therefore in the ~20 kLOC.
and hardware is one of the crucial points Alan makes in his talks. the way he describes it, hardware is a part of your system and is something you should be designing, not buying from vendors. the situation would be improved if vendors made their chips configurable at runtime with microcode. it doesn't seem like a coincidence to me that a lot of big tech companies are now making their own chips (Apple, Google, Amazon, Microsoft are all doing this now). part of it is the AI hype (a mistake in my opinion, but I might be completely wrong there, time will tell). but maybe they are also discovering that, while you can optimize your software for your hardware, you can also optimize your hardware for the type of software you are trying to write.
another point is that any general purpose computer can be used to simulate any other computer, i.e. a virtual machine. meaning if software is bundled with its own VM and your OS doesn't get in the way, all you need for your software to run on a given platform is an implementation of the VM for that platform. which I think raises many questions, such as "how small can you make your OS" and "is it possible to generate and optimize VM implementations for given hardware".
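to make the "bundle software with its own VM" idea concrete, here's a minimal sketch of a stack-based bytecode interpreter in Python. the instruction set (PUSH/ADD/MUL/PRINT/HALT) is made up for illustration; the point is that porting every program that targets this VM to a new platform means reimplementing only this small loop.

```python
def run(program):
    """Execute a list of (opcode, operand) instructions; return printed values."""
    stack, output, pc = [], [], 0
    while pc < len(program):
        op, arg = program[pc]
        pc += 1
        if op == "PUSH":            # push a constant onto the stack
            stack.append(arg)
        elif op == "ADD":           # pop two values, push their sum
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "MUL":           # pop two values, push their product
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
        elif op == "PRINT":         # pop a value and record it as output
            output.append(stack.pop())
        elif op == "HALT":
            break
    return output

# bytecode for (2 + 3) * 4
bytecode = [("PUSH", 2), ("PUSH", 3), ("ADD", None),
            ("PUSH", 4), ("MUL", None),
            ("PRINT", None), ("HALT", None)]
print(run(bytecode))  # [20]
```

real VMs (the JVM, Smalltalk bytecode, WebAssembly) are elaborations of this same shape, which is why "how small can the platform-specific part be" is a meaningful question.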
also something that came to mind is a general point on architecture, again from Alan Kay's ideas. he argues that biological systems (and the Internet, specifically TCP/IP, which he argues takes inspiration from biology) have the only architecture we know of that scales by many orders of magnitude. other architectures stop working when you try to make them significantly bigger or significantly smaller. which makes me wonder about much of hardware architecture being essentially unchanged for decades (with a growing number of exceptions), and likewise with software architecture (again with exceptions, but it seems to me like modern-day Linux, for instance, is not all that different in its core ideas from decades-old Unix systems).
In this respect, Kay's methods seem to be merging with those of Chuck Moore: the difference lies in that Moore doesn't seem to perceive software as a "different thing" from hardware - the Forth systems he makes always center on extension from the hardware directly into the application, with no generalization to an operating system in between.