Well, the VAX was there before any of these, but I wouldn't call that a "personal machine."
I suppose the argument could be made that the 68000 was first, as both it and MIPS ended up in gaming consoles (Sega Genesis vs. Sony PS2 and Nintendo 64).
However, MIPS eventually scaled to 64-bit, was well-known and heavily exploited in supercomputing applications, and was used to produce the film Jurassic Park. The 68000 had a far dimmer future.
Yes, the x86 line did supplant them all, but only with AMD's help. Had Itanium been Intel's final answer, MIPS might be much stronger today.
The difference here is that a preprocessor runs before parsing and semantic analysis. In C3, compile-time `$if` runs in the analysis step, after parsing.
So macros and compile-time execution occur after parsing in C3, whereas in C everything happens during preprocessing, before the code is parsed.
That strategy will backfire when the grammar changes in a later release. A pre-parse step is necessary for code that targets different compiler releases.
Imagine v2.0 introduces a new feature that requires a parser change—one that v1.0 wouldn't be able to parse.
$if $defined(C3_V2_PLUS):
    // new feature
    fn f = \ -> foo(); // this won't parse in v1.0
$else
    Callback f = ...;
$endif
This is also an issue if a future version of C3 introduces language-level support, i.e., lets newer compilers compile code as if it were written for an earlier language version. That approach works well when a team standardizes on a specific version's feature set but still wants the performance improvements of the latest compiler.
That said, this is a niche case and not something I’d consider a dealbreaker.
You do; that's the point here. You need ifdefs in this example to selectively compile the same code with different compilers, or with different language levels on the same compiler. In either case, C3 will fail while parsing a newer feature with an older compiler or language level.
Java faced exactly this situation after lambdas were introduced, handled via compiler args: older compilers could not parse lambdas, and newer compilers preserve backward compatibility.
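To make the Java case concrete: the two forms of the same callback below cannot coexist behind any in-language switch, because a Java 7 `javac` fails at parse time on the lambda. A sketch (the class and strings are illustrative; `Runnable` and the `-source`/`--release` flags are real):

```java
public class CallbackDemo {
    static String run(Runnable r) {
        r.run();
        return "ran";
    }

    public static void main(String[] args) {
        // Java 8+ form -- a parse error under javac -source 7:
        Runnable modern = () -> System.out.println("lambda");

        // Pre-8 form -- parses under every compiler version:
        Runnable legacy = new Runnable() {
            public void run() { System.out.println("anonymous class"); }
        };

        run(modern);
        run(legacy);
    }
}
```

Since Java has no preprocessor, supporting both compiler generations means maintaining separate source files or build profiles rather than guarding one file with conditionals.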
It's 155,000 lines of C code across 361 files. Not shown are the nearly 900 lines that make up the dependencies, but using `makedepend` (which came with my system) makes short work of that. I have a more complicated project that compiles an application written in Lua into a Linux executable. It wasn't hard to write, given that you can always add new rules to `make`, such as converting a `.lua` file to a `.o` file:
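The actual rule isn't shown in the comment; one plausible sketch of such a rule, assuming `luac` (the standard Lua bytecode compiler) and GNU `ld`'s binary-embedding mode (the file names and the exact recipe are my guess, not the author's):

```make
# Hypothetical pattern rule: compile a .lua file into a linkable .o by
# first producing bytecode, then wrapping it in an object file whose
# _binary_*_start/_end symbols the C code can reference.
%.o : %.lua
	luac -o $*.luac $<
	ld -r -b binary -o $@ $*.luac
```

With a rule like this in place, `make` treats Lua sources like any other translation unit and links the resulting objects into the final executable.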
There is Mathematics for the Million by Lancelot Hogben, which not only covers math, but the history of math and why it was developed over the centuries. It starts with numbers, then geometry, arithmetic, trig, algebra, logarithms and calculus, in that order. It's a very cool book.
I was going to say the same! I got it years ago; it's hard to top a math book with a quote from a certain Al Einstein on the back cover singing its praises! Morris Kline's "Mathematics for the Nonmathematician" takes a similar approach, as I believe his other books do. I can also recommend "Code" by Charles Petzold and "The Information" by James Gleick; while not comprehensive, they do cover the development of key mathematical insights over time.
Unless "platform engineers" includes "developers", they're not the only ones who can diagnose an issue. Once at my previous job (providing a name based on a phone number to the Oligarchic Cell Phone companies), our service just stopped. I wasn't there during the outage and only heard about it after the fact. The servers were fine (not out of memory, no outrageous load). The network was fine. The program just wasn't serving up data. There had been no recent updates to the code [1], so it shouldn't have been the software. It took the main developer, who knew the code, to realize that a "this should not happen" situation had happened: the name query we used for a health check had been removed, so our software took that to mean the name service was out of commission and shut itself down.
Now, it could be argued that was the wrong thing for our software to do, but it was what it was, and no amount of SREs or "platform engineers" would have solved the issue (in my opinion).
[1] The Oligarchic Cell Phone companies do not move fast, and they had veto power over any updates to production.
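The failure mode described above can be sketched abstractly: a health check that lumps "the backend answered, but the sentinel record is gone" together with "the backend is unreachable" will take the whole service down when someone deletes the record. This is my reconstruction of the logic, not the actual code:

```python
from enum import Enum

class Probe(Enum):
    FOUND = "found"              # sentinel query resolved
    NOT_FOUND = "not_found"      # backend answered, but the record is gone
    UNREACHABLE = "unreachable"  # backend did not answer at all

def naive_health(result: Probe) -> bool:
    # The behavior as described: anything but a successful lookup
    # is treated as "the name service is down".
    return result == Probe.FOUND

def safer_health(result: Probe) -> bool:
    # Distinguish a deleted sentinel record from an unreachable backend;
    # only the latter should trip the shutdown.
    return result != Probe.UNREACHABLE
```

Under `naive_health`, deleting the sentinel record (`NOT_FOUND`) reads as an outage; `safer_health` keeps serving in that case and only reacts to a genuinely unreachable backend.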
You're welcome to your opinions; I don't stop the engineers I work with from having opinions about code formatting. It's still going to be formatted by the opinionated formatter.
Having one gets us to "well, it's not quite what I want, but at least it's consistent," and it gets rid of arguments that don't provide anywhere near the value engineers feel they do. There are almost always significantly better and more productive things to spend time figuring out.
At $PREVIOUS_JOB, the team I was on worked best when we had no manager. Or rather, we had a director (who should not have been managing us directly) meet with us once a week to tell us "here's where we're headed ... good?" and let us work in quiet until the next meeting (or meet with us sooner if something really important cropped up).