This particular April Fool's joke goes back at least 30 years. I remember getting a chuckle out of it a long time ago.
For context for younger readers, it might be worth pointing out that there was for a while a sort of rivalry between C and Pascal adherents. C was the more "modern" and "professional" language, while Pascal was a "teaching" language (or so some of the arguments went). Windows was coded in C, while MacOS -- before it was called MacOS -- was largely Pascal, with a lot of hand-coded 68k assembly for flavor. Pascal devotees would make fun of C in about the same way that a Python programmer might make fun of Perl. C devotees responded by writing an awful lot more code than Pascal programmers did, which eventually shut them up pretty good.
Pascal got a boost out of OOP, but by 1995 or thereabouts Pascal didn't really have much of a future left, which was sort of a shame.
In 1987 I went from a company where most work was done in C (and assembly) to Apple, which was a Pascal shop. The thing is, the version of Pascal used at Apple had been extended to the point where there were really few differences between the two languages.
- C used curly braces, Pascal used BEGIN/END. Whatever.
- C didn't have strings, really; they were broken. Pascal had strings, but of limited length (e.g., Str255), and thus they were pretty broken. Different pain points, but string handling in either language was not much fun.
- C had short-circuit operators; Pascal didn't. This was the most painful thing to deal with (there's a small sketch of what I mean just after this list).
- C had pointer arithmetic. Pascal had been extended to provide it, too (but didn't do type scaling, so you had to do this manually).
- C didn't have nested procedures. Pascal did, but people mostly used them for hiding (the equivalent of a 'static'-scoped function in C).
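To make the short-circuit point concrete, here is a minimal C sketch of my own (first_is_positive and the node type are made up, not anything from Apple's code). The && stops evaluating as soon as the left side is false, so the test is safe on an empty list; classic Pascal's "and" evaluated both operands, so the same test had to be split into nested IFs:

    #include <stdio.h>
    #include <stddef.h>

    struct node {
        struct node *next;
        int value;
    };

    /* Safe in C: && never evaluates p->value when p is NULL.
       In classic Pascal, "A and B" evaluated both sides, so this
       had to become two nested IF statements. */
    static int first_is_positive(const struct node *p)
    {
        return p != NULL && p->value > 0;
    }

    int main(void)
    {
        struct node n = { NULL, 42 };
        printf("%d %d\n", first_is_positive(&n), first_is_positive(NULL));
        return 0;
    }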
After a couple of months it no longer mattered what I was writing code in. Apple's Pascal had been extended to be semantically so close to C that I barely noticed.
A couple years later Apple more or less stopped writing new code in Pascal (everything was in C++, or at least C) and by the early 90s Pascal was all but gone, except for a few holdouts such as the AppleScript group.
I don't have fond memories of the unextended Pascal compilers I had to use in college. To put it mildly.
You could have called me a "Pascal devotee" back then, but I eventually came to realize that verbosity kills creativity and dumped Pascal for good.
This
{
    int a[10];
    ...
}
wins over this
var
    A: array[0..9] of Integer;
begin
    ...
end;
quite simply because the latter takes longer to both read and write. The proof being that none of the more or less serious languages created in the past few decades dared to adopt e.g. begin ... end again.
> The proof being that none of the more or less serious languages created in the past few decades dared to adopt e.g. begin ... end again.
*facepalm* Except Ruby, OCaml, Erlang, Lua, etc.
Frankly, if verbosity is a significant limiting factor on your creativity, a) C is not the solution to your problem, and b) you weren't trying to solve any hard problems anyway.
Seriously, if you're doing anything really hard, stuff like thread management and algorithmic complexity is a way bigger limiting factor than minor syntactic differences.
Syntax limits everything, it only becomes more important as you're doing hard things. Being able to fit 10% more code on screen is a lot like being 10% smarter, which is the kind of thing you need to make the hard problems even possible.
> Syntax limits everything, it only becomes more important as you're doing hard things. Being able to fit 10% more code on screen is a lot like being 10% smarter, which is the kind of thing you need to make the hard problems even possible.
Yes, syntax limits everything, and being able to fit 10% more on the screen is a lot like being 10% smarter. But libraries, available techniques, and built-in features limit everything a lot more. Being able to do 500% more with the same amount of code is a lot like being 500% smarter. If you're piddling around with 10% increases like a switch from Pascal to C, you're wasting everyone's time. Switch to a higher-level language that's more closely suited to the problem you're trying to solve and you'll trivially be writing a fifth of the code you'd be writing in C[1]. So like I said: C is not the solution to your problem.
And that's being generous to your assumption that Pascal -> C actually even does give you a 10% boost: I think given that you'll be writing the same bounds-checking incantation everywhere in C which you wouldn't be writing in Pascal, the semantics probably negate the syntax very easily.
[1] There's one exception: where C is the language most closely suited to your problem. However, this exception is much more rare than most people think.
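For what it's worth, the "bounds-checking incantation" is the kind of guard you end up repeating by hand all over a C program, where a range-checked Pascal array type would trap the bad index for you. A made-up example (BUF_LEN and the index are just for illustration):

    #include <stdio.h>

    #define BUF_LEN 16

    int main(void)
    {
        int buf[BUF_LEN] = {0};
        int i = 20;   /* pretend this index arrived from somewhere else */

        /* The manual check you write everywhere in C; in Pascal,
           array[0..15] of Integer with range checking enabled would
           catch the out-of-range index at run time. */
        if (i >= 0 && i < BUF_LEN) {
            buf[i] = 1;
        } else {
            fprintf(stderr, "index %d out of range\n", i);
        }
        return 0;
    }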
I think this is all somewhat related to how much you can keep in your head at once. More code -> less expression -> less that you can keep in your head -> you can only solve smaller problems.
If a problem requires that you keep all of it in your head then being able to express that problem concisely will aid in solving it.
The poor man's solution is to reduce the scope: chop the problem into sub-problems that you can solve individually.
Divide and conquer is the most powerful tool in a programmer's arsenal because it allows you to use the worst languages to solve some of the most complicated problems.
But there may be a subset of problems where that strategy no longer works and for those problems the more compact languages may have a very strong advantage.
It's a double-edged sword, though: your compact, clever solution is great when you write it but becomes unreadable when you (or worse, someone else) try to maintain it. And then you have to understand both the hard problem and the way it was solved in the first place.
The main way to make code genuinely shorter is to actually make it simpler; replacing logic with a higher-order function call (e.g. map rather than a loop) makes it more readable and maintainable, not less. Or if we're talking about meaningless syntactic clutter like the extra lines of begin/end, they don't make the code any clearer.
> What is a hard thing that requires better syntax?
Anything where, for the implementation you can come up with, those extra lines are the difference between fitting a function on a single screen or not. There's a huge difference in readability between a function or class that can fit on a single screen and one that can't.
It may just be that the learning curve for APL is so steep and so long that most never become truly proficient. I've long held the opinion that much of the criticism of Perl as "line noise" has more to do with people who aren't really proficient in it thinking they are, because a subset of the language is very C-like and familiar, and it can be used for quite a while before encountering anything too foreign to someone who knows C. Once that happens, though, it leaves those people scratching their heads wondering what they are seeing.
Compare and contrast to APL and Lisp, which are obviously different, and Python, which avoids the problem by having a fairly different syntax and a guide for exactly how to do things.
I've seen some pretty amazing things in APL, but I can't really understand them. I don't count that against the language, I just haven't spent the time to actually learn APL.
As for there being an optimum, I think there is, but I think it has a lot to do with how long and for how complex of things you plan to use the language. Front-loading learning time for a more expressive language may pay dividends later.
With regard to complexity theory, I'm inclined to treat syntax as a constant factor in the equation. You only have to extend the language by writing a macro processor once, if you count that toward the power of the language. That's why C is still very powerful, although other languages have half a standard-library toolbox hidden in syntactic sugar.
I agree. I don't understand the programmers who debate syntax all the time. Good syntax is important, but appropriate semantics are more important. Only after you've selected the typing, execution environment, memory model, paradigm support, available platforms, and libraries you need, should you worry about syntax. At that point there are usually 0 or 1 satisfactory languages.
The other day I was surprised to learn that the equals sign was invented in 1557. I wondered what the hell everyone did in algebra before then, and it turned out that a lot of the earliest examples of algebra are essentially word problems.
By the time the equals sign was invented, it was commonplace to write equality out with the abbreviation "aeq.", from the Latin aequalis ("equal").
But the equals sign turned out to be a really important development in mathematics, because it helped to visually distill mathematics down into just its number and equation components, without the visual noise of written words.
So I wouldn't be too quick to discount the value of good syntax. I've never personally cared much about it in the past, but looking at it from the point of view of the equals sign, it's probably fair to guess that syntax does a lot more for code clarity and readability than we think.
When I first came across Python, I was really annoyed at the syntax, specifically at its use of indentation. Nobody tells me how to indent my code!
But then I realized that Python only "forced" me to do things I would have done otherwise. I got used to its syntax rather quickly.
The one thing I like about using braces for delimiting blocks is that Emacs and Vi allow you to jump from the beginning of a block to the end (and vice versa) in one keystroke. But otherwise, I have to agree with Bjarne Stroustrup (sorry if I mis-spelled his name): Programmers can learn to love absolutely any syntax.
I liked Modula-2 much more than Pascal or C. (For those who don't know it:) Modula-2 was Niklaus Wirth's successor to Pascal. It had cleaner syntax and many other good features. Its successor, Oberon, was object-oriented. [1]
But the culprit (and that seems to apply to all Pascal descendants, at least up to Modula-2) was that the libraries were not very helpful. The standard libraries for Pascal (not the extended ones from Borland Pascal and others) and for Modula-2 were a pain to work with and did not cover developers' needs. I guess that was one reason for the downfall of Pascal -- some niches (e.g. Borland Pascal) were successful, but they lacked broad coverage. C had it all: a syntax that was popular with many because it could be typed fast (not everybody will agree that that is an advantage), and a library that covered everything you needed. Also, with printf it had a neat solution (by the standards of the time) for a problem that was really cumbersome to deal with in Pascal or Modula-2. And you could write C code for one machine and port it rather easily to another, because the libraries had nearly the same API.
I attribute a good part of the initial success of C to the standard C library. After that it was just a question of publicity.
Also, one mistake of Wirth's might have been that he gave his languages different names. I guess the languages would have done a little better if he had called them Pascal2 or Pascal++, and after that Pascal++15 or so... but that is the other problem: good engineers are usually bad marketers.
I think it was just one standout feature of the first standard C library. It made printing a lot easier. In standard Pascal or Modula-2 you still had to print every single data type with its own special "WriteX" procedure. There was even a distinction between "WriteString" and "WriteLine"; the second wrote a string and added a newline after it (I am not sure at the moment whether it really took a string or just wrote the NL). When it came to printing to the screen, some languages of those days were really spartan.
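To illustrate with a toy example of my own (the Modula-2 calls in the comment are what I remember from the InOut module, so take the exact names with a grain of salt): one formatted printf call covers what took a sequence of separate Write calls.

    #include <stdio.h>

    int main(void)
    {
        const char *name = "widgets";
        int count = 42;

        /* One formatted call in C ... */
        printf("%s: %d\n", name, count);

        /* ... versus, roughly, the Modula-2 way:
           WriteString("widgets: "); WriteInt(count, 0); WriteLn; */
        return 0;
    }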
That's a really interesting insight. Wirth et al. spent a lot of time delivering really deep libs that solved comp-sci-ish problems (lots of ADTs) and not so much doing the stuff people really needed to do in the real world.
Ah, Free Pascal. It made me remember the first semester at my University where everywhere I looked I found Microsoft. I refused to do assignments and projects on Windows, preferring instead Linux (Slackware, then). In the first semester we studied programming in Pascal and I used the Free Pascal compiler on my Linux to learn and play with the language, while everyone else used Turbo Pascal on DOS/Windows.
At the company I work at, we had this guy who had written a program for our accounting department in Delphi. This spring, he quit, and I was given the task of maintaining that code.
Normally, I tend to avoid using classical IDEs and prefer Emacs, so it took me a while to get comfortable with Delphi (the IDE). But once I had gotten used to that, I was able to jump right in, despite never having touched Pascal before. Pascal is certainly not the sexiest kind of language around, but it does make it easy to write highly readable code. I can see why people would want to stick to it.
Delphi! You're right, I completely forgot about that. I knew a guy that was writing unholy piles of code for some company in Delphi up until at least 2005 or thereabouts. By then I thought it was really odd that someone was still using that, and good-naturedly teased him about it a little.
I learned Pascal when I was in my teens (not my first language, but the first "serious" one after Basic and LOGO) so to me it's always associated with my childish hacking attempts and very bad code IN ALL CAPS.
I remember a couple of my friends bought a C book and their code looked so cryptic and weird in comparison, so I didn't bother learning it, and then I kinda lost interest for a few years before picking up more serious programming in my 20s. In the mid-late nineties everyone was doing C++ and Java, so I never got back to Pascal. I had some good fun with it for a few years.
The perspective I had, having lived through it, is that once the professional/teaching separation happened, Pascal was automatically doomed. There is a tension in American culture between the educational and the vocational, and the vocational crowd disposed of Pascal in a few short years. It is very politically incorrect for adherents of that group to teach anything that isn't directly mentioned by name in job requirements or Ivy League applications, and the educators realized you can teach "big O" and "quicksort" and "what is a finite automaton" in any language, so they had no long-term allegiance. So bye bye Pascal.
(Another interesting analogy to Pascal is cursive handwriting... every educator, ever, always claimed we'd be required to use it elsewhere although they personally found it a complete waste of time; a typical emperor-has-no-clothes situation, or blind conformity to a dead belief system. My own kids are not being taught cursive in school, so progress is possible.)
Nothing in IT is ever new; fundamentally the JVM concept is just a re-implementation of the (famous at the time) Pascal p-code system. Compile Pascal to p-code once, run that p-code on any system. And from memory, just like Java, you ended up needing customized sources for specific machines and specific versions of the p-code interpreter, which eliminated that marketing bullet point in actual practice.
My twelve year old learned cursive, but I think they spent comparatively little time on it compared to what I spent. She's never had to write a paper in it. My nine year old hasn't learned it, at least not yet. They both have markedly better (regular) handwriting than me, mine has deteriorated over the years.
The source code to the original Bourne shell did that and a lot worse. It became one of the inspirations for the IOCCC (International Obfuscated C Code Contest).
(What that FAQ doesn't mention is the real reason the Bourne shell deserved to be in the IOCCC: the way it allocated memory. It trapped SIGSEGV (the signal the kernel sends you when you've provoked a segmentation violation, or segfault, by trying to access memory you don't own) so it would know when to request more RAM from the OS. This later became a problem for people looking to port the Bourne shell, for example to the Motorola 68000 CPUs which powered the first generation of Unix workstations.)
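From memory the mechanism looked roughly like the sketch below. This is my reconstruction, not the actual Bourne shell source: the handler name and chunk size are made up, whether the test write actually faults depends on where the break happens to sit, and calling sbrk from a signal handler is exactly the kind of thing you shouldn't do in new code.

    #define _DEFAULT_SOURCE   /* for sbrk() on glibc */
    #include <signal.h>
    #include <stdio.h>
    #include <unistd.h>

    /* On a fault, hand the process more heap and return; the kernel then
       restarts the faulting instruction, which now succeeds. */
    static void grow_heap(int sig)
    {
        (void)sig;
        if (sbrk(4096) == (void *)-1)
            _exit(1);               /* genuinely out of memory */
    }

    int main(void)
    {
        char *arena;

        signal(SIGSEGV, grow_heap);
        arena = sbrk(0);            /* current end of the data segment */
        arena[100] = 'x';           /* may fault; the handler extends the
                                       break and the write is retried */
        printf("wrote '%c' past the old break\n", arena[100]);
        return 0;
    }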
> Have you ever bothered to read ANSI C specification?
You mean the one that states
> The free function causes the space pointed to by ptr to be deallocated, that is, made available for further allocation. If ptr is a null pointer, no action occurs.
> You didn't say which version of the ANSI C specification.
Which does not matter because all of them clearly define the behavior of free(NULL). The language has been the same since C89[0]:
> The free function causes the space pointed to by ptr to be deallocated, that is, made available for further allocation. If ptr is a null pointer, no action occurs.
I'm guessing pjmlp confused free(NULL), which is defined, with double free, which is UB:
> if the argument does not match a pointer earlier returned by the calloc, malloc, or realloc function, or if the space has been deallocated by a call to free or realloc, the behavior is undefined.
and didn't "bother to read the ANSI C specification" because he was so certain of what he misremembered.
I believe that Linus Torvalds said something[1] in the spirit of this article about how Git was only adopted by the community due to how arcane it looks.
A well-known hoax. For that matter, the quoted source code is part of Carl Shapiro's submission to the 1985 IOCCC [1], which produces a maze. This alone is enough to debunk the story, right?
It seems that the GNU version differs slightly from the OP's. It includes a joke about the IBM RS/6000 and its virtual machine, and has different spellings (Nichlaus vs. Niklaus, AT&&T vs. AT&T). I wonder which one came first.
Wow, it's been a long time since I last looked at this joke! Back then I read a version translated into my mother tongue (Korean); now I've read it in its original form.