I think the more appropriate quote to paraphrase would be one from Dennis Hopper's character in the film Speed (1994): "Oh, no. Poor people are pirates, Jack. We are tech innovators!"
It's the scene where Jack climbs down the hole under the garbage can into the subway, having figured out that the ransom money has been moved. He tries to hold up Payne, who reveals he's holding Annie hostage.
Well, "at the time" is a funny way to put it. That thread is from 2023. The first commit in the repository is from 2013, importing the already existing source tree.
I recall having played around with DSL/TCL (Damn Small Linux & Tiny Core Linux) around 2010 or so, and IIRC they already had their minimalist X11-but-not-Xorg server even back then, which I found fascinating at the time. I'm pretty sure I took a curious peek at the source code back then; IIRC it simply didn't live in git yet (tarball and a few patches?).
The XVesa code base that this is forked off of is of course much older.
> I get redirected to a page telling me to activate javascript...
...which is hosted on a different domain (notion.so), so unless you notice that and temporarily allow that domain in NoScript as well, the actual page (hosted on kranga.notion.site) will still redirect you to the "Please Enable JavaScript" text on notion.so.
Wow, just seeing the domain name alone is quite a nostalgic throwback already. I remember finding Thomas Antoni's website through some German text adventure, written in QBasic, back in 2003 or so.
I think I browsed through this at one point, but I don't have any experience with GW-BASIC myself. I had a copy of QuickBASIC and a printout of Thomas Antoni's "SelfQB" at the time. He also runs http://qbasic.de/ from where I downloaded and (much to the annoyance of our school's librarian) printed out a bunch of other tutorials as well.
Back then, Antoni's website served as a very useful entry point to other sites and tutorials. I remember spending a lot of time following rabbit holes of link lists and web rings from there.
1) System calls need to switch into kernel mode and back, which can be a massive performance hit.
2) This is especially bad if you want to do precise time measurements: a chunk of time is now spent just calling `gettimeofday`. The performance impact (benchmarked and shown in the article) is substantial.
3) Linux (the kernel) at some point added a "vdso", basically a shared library automatically mapped into every process. The libc can call into the vdso version of `gettimeofday` and use the system call as a fallback.
4) A 64-bit kernel that can run 32-bit programs needs both a 64-bit and a 32-bit vdso.
5) On x86, the kernel build process can "simply" use the same compiler to produce 32-bit code for the 32-bit vdso. GCC for 64-bit ARM can't do that, since 32-bit ARM is a separate target rather than an -m32 style multilib. You need two toolchains, a 64-bit and a 32-bit one.
6) If you don't know that, the kernel ends up with only a 64-bit vdso; a 32-bit program will still run, but silently use the system call instead of the vdso, causing unexpected performance issues (see the sketch below).
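To make point 6 concrete, here's a rough sketch of my own (not from the article) comparing the two paths; the plain libc call is served from the vdso when one is mapped, while syscall(2) always takes the kernel-mode route:

    #define _GNU_SOURCE           /* for syscall() */
    #include <stdio.h>
    #include <sys/syscall.h>
    #include <sys/time.h>
    #include <time.h>
    #include <unistd.h>

    #define N 10000000

    /* Elapsed nanoseconds between two CLOCK_MONOTONIC readings. */
    static double elapsed_ns(struct timespec a, struct timespec b)
    {
        return (b.tv_sec - a.tv_sec) * 1e9 + (b.tv_nsec - a.tv_nsec);
    }

    int main(void)
    {
        struct timeval tv;
        struct timespec t0, t1;

        /* Path 1: plain libc call, served from the vdso if one is mapped. */
        clock_gettime(CLOCK_MONOTONIC, &t0);
        for (int i = 0; i < N; i++)
            gettimeofday(&tv, NULL);
        clock_gettime(CLOCK_MONOTONIC, &t1);
        printf("libc (vdso?): %.1f ns/call\n", elapsed_ns(t0, t1) / N);

        /* Path 2: forced system call, always switches into kernel mode. */
        clock_gettime(CLOCK_MONOTONIC, &t0);
        for (int i = 0; i < N; i++)
            syscall(SYS_gettimeofday, &tv, NULL);
        clock_gettime(CLOCK_MONOTONIC, &t1);
        printf("raw syscall:  %.1f ns/call\n", elapsed_ns(t0, t1) / N);

        return 0;
    }

If a 32-bit build of this prints nearly identical numbers for both loops, you're probably looking at exactly that missing-32-bit-vdso case. IIRC on arm64 the compat vdso only gets built when CROSS_COMPILE_COMPAT points the kernel build at a 32-bit toolchain.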
Somehow we manage to piss away computing resources like nothing, requiring more and more to do the same work. So I guess for meaningful comparisons we need a way to inflation-adjust computing resources too?
E.g. 5MB of 1965 disk storage, roughly equivalent to 10 TB in the eyes of 2024 programs?
Maybe we could draw a comparison by how much a base installation of a recent OS would eat away, or how much of the available RAM an at-the-time-modern text editor would gobble up? E.g. 1980s Emacs vs. whatever currently popular Electron behemoth, comparing RAM on computers from those respective epochs.
5MB of 1965 disk storage, as the article helpfully points out, stores 5 million (EBCDIC) characters of text.
5MB of 2024 storage stores 5 million (ASCII) or as little as 1.25 million (UTF-8) characters of text.
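Spelled out (a trivial sketch of my own; the UTF-8 floor assumes its 4-byte worst case):

    #include <stdio.h>

    int main(void)
    {
        const long bytes = 5000000L;  /* "5MB" in the article's sense */
        printf("1 byte/char  (ASCII/EBCDIC):     %ld chars\n", bytes / 1);
        printf("4 bytes/char (UTF-8 worst case): %ld chars\n", bytes / 4);
        return 0;
    }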
There's no inflation to adjust here. There are still people programming microcontrollers with 1KiB or 512 bytes of RAM, 4KiB EEPROM, and maybe 1MB Flash. Those computers can do about the same things that 1965 computers with the same specs could do, just 20-100 times faster. Just because we don't use an AVR to add up Olympic scores doesn't mean we couldn't.
The ones with five orders of magnitude more resources do a lot more. Some of it is squandered, when we can afford to do that, but in applications like AAA gaming or simulation, not so much. They operate near-optimally, just much faster, doing a great deal more than was possible at the time.
Are we really "pissing it away"? For example, the fact that VS Code is an Electron app written in HTML, CSS and TypeScript opened the door for many extension developers. I prefer it this way compared to the older native/Java-based IDEs.
Meanwhile, the reverse is true for me. I’m sick and tired of desktop chat clients and other applications being, or turning into, single-purpose Chromium instances.
The editor I'm using today updates itself from a global network of interconnected computers, using strong cryptography to ensure no one's tampered with it. It can download plugins from that same inter-net and has a sandbox that keeps malicious code from taking over the rest of its process space or accessing the filesystem. Some of those plugins make their own encrypted connections to distant computers. It displays its user interface on a truecolor screen across two 2560x1440 monitors. It understands a huge number of programming languages and has advanced reformatting, error checking, syntax highlighting, search, building, testing, and reference-finding capabilities that can easily handle version-controlled codebases with gigabytes of code. Because all its pointers are twice the size of those in older systems, I can open individual files many gigabytes in size.
Yeah, it's bigger than Emacs on a PDP-11. It does a whole lot more than Pico does, too.
Given the tone and assumptions the article makes, and the things that are explicitly explained, this seems to be one of those articles where a novice learnt something new and then decided to write an article about it, despite not having fully grasped the concept yet.
As a result, the author takes strangely absolute positions, calling it a legacy that should be abolished (while only tangentially knowing some actual use cases), or offering that strange quote about design principles.
Despite all the talk about security, the whole debacle that argc can be 0 (and argv[0] can be NULL) is completely left aside. This has caused actual security issues quite recently[1].
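For illustration (a minimal sketch of mine, not from the article): a program can be execve()'d with an empty argv, so defensive code checks for that explicitly before trusting argv[0]:

    #include <stdio.h>
    #include <stdlib.h>

    int main(int argc, char **argv)
    {
        /* argc == 0 is legal: execve("./prog", (char *[]){ NULL }, NULL)
         * produces exactly this, and then argv[0] is NULL. Code that
         * assumes argv[0] is a valid string -- or worse, reads argv[1]
         * unconditionally, as pkexec did -- can be led astray. */
        if (argc < 1 || argv[0] == NULL) {
            fprintf(stderr, "refusing to run with empty argv\n");
            return EXIT_FAILURE;
        }
        printf("invoked as %s\n", argv[0]);
        return EXIT_SUCCESS;
    }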
... as a private individual, you are toast.