
> As the article states, its resurgence is because of restricted small devices, but that's not because it's a great language.

To be fair, it's an amazing language, but if you're coding to get things done with a deadline (or with an unknown set of specs), of course you should go for the nailgun instead of making custom hammers.




Syntactically I agree with you. I love the C style; I pretty much only use C-derived languages for a reason. I think it was great for its time...


Back in the mid 80s, the CS program at my Uni switched from Pascal to C as the standard course language. C always felt like a crippled death trap in comparison, even if real work would have had to be done in Modula rather than actual Pascal. C has a nice data-literal mechanism, and return/break, but think of all the things you gave up: array bounds checking; references that couldn't be null; subrange types; non-int enums; arbitrary array indices; sets; nested subroutines.
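For anyone who never used Pascal, a minimal sketch of a few of those (the identifiers are mine, purely for illustration):

    { Subranges, sets, arbitrary index ranges, var parameters,
      and nested subroutines. }
    type
      Score   = 0..100;                 { subrange: assignments are range-checked }
      Day     = (Mon, Tue, Wed, Thu, Fri, Sat, Sun);
      Weekend = set of Day;             { built-in set type }
      Offsets = array[-5..5] of Score;  { indices need not start at 0 }

    procedure Tally(var total: integer);  { var parameter: can't be null }

      procedure AddOne;                 { nested subroutine, sees 'total' }
      begin
        total := total + 1
      end;

    begin
      AddOne
    end;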

How many people even realize that C++ (formerly "C with Classes") is largely just a rehash of the Simula preprocessor (for Algol) from the 60s?

That said, I think the Unix creators meant for C to be an assembly-alternative bootstrap language, not the primary development tool for almost all applications.

In a sense, the rise of microcomputers was a disaster for programming language development, or at least a mixed bag in the short run.


Having worked in both, Pascal (at least in the 1980s) felt like trying to work inside a straitjacket.

Perhaps the worst example: the size of an array was part of the type of the array. That meant you couldn't access outside the array, which was good. It also meant you couldn't have a variable-sized array, because there was no possible type for it to have! That was really problematic at one point when we needed to deal with a variable-sized 2D array. We wound up declaring the maximum size we had memory for and only using the subset we needed, which to this day I consider a kludge to work around a language limitation.
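Roughly what that looked like (sizes made up; a sketch, not our actual code):

    const
      MaxN = 100;
    type
      { The bounds are baked into the type, so every Matrix is MaxN x MaxN. }
      Matrix = array[1..MaxN, 1..MaxN] of real;

    var
      m: Matrix;
      rows, cols: integer;  { the size actually in use; only        }
                            { m[1..rows, 1..cols] ever gets touched }

A procedure taking 'var m: Matrix' can only ever accept that one size; there is no way to write a type that means "an N-by-M array for some N and M".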

For more, Google for "Why Pascal Is Not My Favorite Programming Language". Despite being written by Kernighan, it's not a Pascal-vs-C evaluation - it's more Pascal-vs-Ratfor, with Pascal coming off rather badly for real-world use.

For university teaching, though, Pascal was probably better...


The fixed-size array limitation of the 1970 CDC Cyber teaching version usually gets an extension in the (Modula or Modula-inspired) versions that people actually use. E.g.:

http://www.freepascal.org/docs-html/ref/refsu69.html

https://www.modula2.org/sb/m2_extensions.htm
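E.g., with Free Pascal's open-array parameters (my sketch, not taken from the linked page), the bound travels with the argument rather than with the type:

    { Accepts a real array of any size; Low/High recover the
      caller's actual bounds. }
    function Sum(const a: array of real): real;
    var
      i: integer;
    begin
      Sum := 0.0;
      for i := Low(a) to High(a) do
        Sum := Sum + a[i]
    end;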

Yes, I've read "... Not My Favorite ..." (and "Worse is Better", and "Unix ... C ... Hoax")


Yes, I know there was an extension. Unfortunately, the one I had to use didn't have it, so I was trapped.


I'd suggest that it's still "its time", but not for daily use for most people.

Hardware is cheap and fast and there are safer languages (I hear Rust is good; I'd use Java if I weren't carpal-tunnel concerned), and often a VHLL is more than sufficient for the time/speed trade-offs, or even obligatory (e.g. Python and client-JS, respectively).

But that said, C is _great_ as a teaching/learning language (not much magic, small), as an "I have very specific/strict goals" tool, and as a very well-tested underpinning for building languages that are safer/more magical (e.g. CPython).

Would I pay 60 people to write C for a .com/.io startup where 10 people writing NodeJS could get something out the door faster? No, but I'd love it if a couple of those 10 had a solid C background so when _shit gets weird_ or a framework's magic starts barfing on systemd's magic, the engineering team doesn't decide it's time to switch to golang.



