If the students are from a different discipline (not aiming to be CompSci grads) or much younger, then I think there are better choices, e.g. Python.
In a CompSci academic setting, I think learning assembly first (that is, as the first programming language, but in parallel with other classes such as computer architecture) is also a valid approach.
Programming from the Ground Up is a very accessible introduction to x86 ASM, and even if you don't go any further than the examples in this book, you'll still feel the benefits.
What I am rather certain of, though, is that just about everything in between is not. You don't really get the best of both worlds, but rather the worst.
(I should probably clarify that I think there is a large disconnect between what makes a good language for teaching and what makes a good language for software development in a corporate environment.)
Whether the Scheme approach is the best theoretical introduction depends in part on how important you think type systems are to modern CS. Much of the PL community thinks the answer is "very important", since they view types as the basis of rigorously specifying program behavior. Robert Harper has taken that view at CMU, for example, using Standard ML in his revamped intro course. (I'm not strongly opinionated on that subject myself.)
My friend said that UC Berkeley's undergraduate CS program followed this introduction. (In the 1990s, at least; I don't know about now.)
"If you are constantly looking for new and better ways of doing and thinking, you will make a successful programmer. If you do not seek to enhance yourself, 'A little sleep, a little slumber, a little folding of the hands to rest - and poverty will come on you like a bandit and scarcity like an armed man.' Proverbs 24:33-34"
People say x86_64 is much cleaner than x86, but I'd prefer ARM, MIPS or 68K. I learned assembly on an Apple II and the 6502, while anachronistic by today's standards, is a nice place to start.
Maybe one day I'll find the time to learn enough of VHDL to implement a 6502-inspired 64-bit processor and build a reasonable computer around it.
However, once a student starts grokking C (or C++ for that matter), especially pointers, bit manipulation, and stacks, I think assembly is invaluable for really understanding what the computer and your code are doing. This is especially true when paired with the aforementioned systems course.
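To make that concrete, here's a tiny sketch of my own (not from any course) where each C operation corresponds almost one-to-one with a machine instruction, which is what makes comparing your C against its disassembly such a good learning exercise:

```c
#include <stdint.h>

/* Each of these operations compiles to one or two machine instructions,
   so it is easy to match a C source line against the disassembly. */
uint32_t set_bit(uint32_t flags, unsigned n)   { return flags | (1u << n);  }  /* OR           */
uint32_t clear_bit(uint32_t flags, unsigned n) { return flags & ~(1u << n); }  /* AND with NOT */

/* Pointer arithmetic maps onto a scaled addressing mode, e.g. [base + 4*i]. */
uint32_t third_element(const uint32_t *p) { return *(p + 2); }
```

Compiling this with `-S` (or running it through a disassembler) and reading the output line by line is a nice bridge between the two levels.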
I guess I am agreeing with you in the end...
Most code can stay fairly high-level, but the tools are there and a standard part of the language if you need them.
Lisp aside, having this point of view seems to be a good goal to reach.
ps: There's also one very important lower layer, the memory/cache subsystem.
And, by the way, I think the comment at the bottom of page 27 is not correct:
The qualifier const can be added to the left of a variable or parameter type to declare that the code using the variable will not change the variable. As a practical matter, use of const is very sporadic in the C programming community. It does have one very handy use, which is to clarify the role of a parameter in a function prototype...
Actually, the use of const is encouraged as it helps the compiler to catch more errors as well as to enable some optimizations.
"The qualifier const can be added to the left of a variable or parameter type"
If you have the declaration

    const char* str;        // mutable pointer to immutable string

the const is indeed on the left. But it can also qualify the pointer itself, to the right of the *:

    char* const str;        // immutable pointer to mutable string
    const char* const str;  // immutable pointer to immutable string
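And as a sketch of why const is worth using routinely (my own example, not from the book): it turns a whole class of accidental writes into compile-time errors.

```c
#include <stddef.h>

/* Declaring the parameter const documents the contract ("I only read
   this string") and lets the compiler reject accidental writes. */
size_t my_strlen(const char *s) {
    size_t n = 0;
    while (s[n] != '\0') {
        /* s[n] = 'x';   <- would not compile: s points to const char */
        n++;
    }
    return n;
}
```

The commented-out line is the point: without const it would be a silent bug, with const it never builds.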
The key issue for my project was being cross platform. Since Microsoft's compiler does not implement C99 and never will, my hands were tied.
Now that's just my library. Other programmers might be willing to decide case by case which features of C99 to use. Still, if the document dates back to 2003, then C99 support everywhere would have been even more primitive. C89 would thus be the right choice for an introduction to C, to avoid confusion.
C89 is primitive but charming in its simplicity. For instance, while I disliked being forced to declare a function's variables before any code, I now see that it encouraged me to keep functions very small. I would find ways to use fewer variables. Instead of 'for(int i = 0; i < max; i++)' I would use 'while(--max >= 0)', anything to avoid moving my cursor to the top of the function.
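For illustration, the two styles side by side (function names are mine; the second version needs a C99-or-later compiler):

```c
/* C89: all declarations must precede the statements in a block,
   so the counter doubles as the loop variable. */
int sum_c89(int max) {
    int total = 0;
    while (--max >= 0)
        total += max;
    return total;
}

/* C99 and later allow declaring the loop variable in the for statement,
   keeping its scope to the loop itself. */
int sum_c99(int max) {
    int total = 0;
    for (int i = 0; i < max; i++)
        total += i;
    return total;
}
```

Both compute 0 + 1 + ... + (max-1); the C89 version just destroys its argument to avoid an extra declaration.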
On the other hand, I miss C++'s exceptions. Checking for errors all over the place gets ugly and bloats the codebase. What would otherwise be a two-line copy operation between two objects gets two if blocks added, which handle the minuscule chance of memory allocation failure in some other deeper frame.
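A sketch of what that bloat looks like in practice (the struct and function names are made up for illustration): the copy itself is two lines, everything else is error plumbing that an exception would have handled implicitly.

```c
#include <stdlib.h>
#include <string.h>

struct buf { char *data; size_t len; };

/* Without exceptions, every allocation needs an explicit check and the
   failure has to be propagated by hand up through each frame. */
int buf_copy(struct buf *dst, const struct buf *src) {
    char *data = malloc(src->len);
    if (data == NULL)
        return -1;                     /* caller must check this, too */
    memcpy(data, src->data, src->len); /* the actual two-line copy... */
    free(dst->data);                   /* ...plus cleanup of the old buffer */
    dst->data = data;
    dst->len = src->len;
    return 0;
}
```

Multiply that pattern across every fallible call in a call chain and the point about bloat becomes clear.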
- Telling students to make up types like Int32 or Int16 instead of using the fixed-width types defined in <stdint.h>.
- Claiming that // comments are "not technically part of the language".
- "C does not have a distinct boolean type."
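For reference, the fixed-width and boolean types that C99 actually provides, so nothing needs to be made up:

```c
#include <stdint.h>
#include <stdbool.h>

/* <stdint.h> gives exact-width integer types; <stdbool.h> gives
   bool, true, and false (backed by the built-in _Bool type). */
int32_t  x  = -5;      /* exactly 32 bits, signed   */
uint16_t y  = 65535;   /* exactly 16 bits, unsigned */
bool     ok = true;
```

(Pre-C99, a typedef per platform was the usual workaround, but a 2003-era document could already have pointed at these headers.)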
This 'take' (more formally, the language's semantics) determines the kinds of runtimes, libraries, and tools that can be implemented for the language.
With this mindset, it doesn't matter which language you start with. When comparing languages, you go beyond simple syntax differences and weigh the implications of their semantics. This will let you make more informed language selections for your projects.
tl;dr: learn multiple languages and compare them effectively to profit!
Hello World or my CD collection manager is no fun for anyone who regularly just glances at the obvious first chapter of any language book (variables) for basic syntax.
I don't hate new languages. I hate the fact that I need 27 different programming and scripting languages on a weekly basis to accomplish my tasks. And for contracts in the Microsoft realm, it's a moving target that changes every two years.
As for project ideas, I usually port smaller projects that I've done for home, work or charity into the new language, then try to refine them using the appropriate patterns/style of the language. I've rewritten some semi-complex utilities in C#, JScript, PowerShell, Python and Ruby.
Since I deal mainly with databases, one idea that I've been playing with recently is grabbing public domain data from the FCC or the Census Bureau and building desktop or mobile apps with the results. Example: take the Amateur Radio License information from FCC.gov and build an application.
I would love this. The biggest hurdle for me learning any new language is what to write. I mean, I can only rearrange my iTunes library so many times. :)
Except when it comes to cache, pipelining, registers, non-uniform memory access in general (not just cache), out-of-order execution and opcode pairing rules, SIMD architectures, and the existence of the processor status word. Other than that, yeah, C exposes you to a lot of worrying about memory allocation when algorithm design would be a better use of your time.
> other languages are easy peasy compared
The only way you could possibly say this is if the only languages you know are imperative Algol-derived ones with minimal type systems and no support for logic or declarative programming. Learning C doesn't make Prolog meaningfully easier. Learning C doesn't even make learning a mainstream language like SQL easier.
I'm not sure what you mean by "the existence of a status word", since much of the expression syntax of C is a mapping of the status bits.
You have to have a mental model of the cache and memory hierarchy in your head to write really efficient code in any language.
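A classic illustration of that mental model in C (assuming ordinary hardware cache lines; both functions compute exactly the same sum, yet traversal order alone changes performance dramatically on large matrices):

```c
#define N 1024

static double m[N][N];

/* Row-major traversal walks memory sequentially, so each cache line
   fetched is fully used before moving on. */
double sum_rows(void) {
    double s = 0.0;
    for (int i = 0; i < N; i++)
        for (int j = 0; j < N; j++)
            s += m[i][j];
    return s;
}

/* Column-major traversal strides N*sizeof(double) bytes per access,
   touching a fresh cache line almost every iteration. */
double sum_cols(void) {
    double s = 0.0;
    for (int j = 0; j < N; j++)
        for (int i = 0; i < N; i++)
            s += m[i][j];
    return s;
}
```

Nothing in the C semantics distinguishes the two; only the cache model in your head does.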
When I'm writing a bigint package in assembly, I can see whether the previous addition set the carry flag. In x86, I even have an adc opcode. There's nothing like that in C.
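In portable C the usual workaround is to detect the carry after the fact, exploiting the fact that unsigned addition wraps: if the sum came out smaller than an operand, a carry occurred. A sketch (the function name is mine):

```c
#include <stdint.h>

/* Without direct access to the carry flag, C detects unsigned overflow
   by observing the wrap-around. A good x86 compiler can often turn a
   chain of these back into add/adc anyway. */
uint64_t add_with_carry(uint64_t a, uint64_t b, unsigned *carry_out) {
    uint64_t sum = a + b;
    *carry_out = (sum < a);   /* wrapped -> the carry would have been set */
    return sum;
}
```

It works, but it's exactly the kind of idiom that only makes sense once you already know the flag exists at the machine level.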
I'm biased since C is my first language. Do you recommend another better suited for the purpose?
Python and irb are nice starting points because you can start out by claiming they are just a calculator, where you have to press return instead of =.
From there, you can go to variables (prevent you from having to type, e.g. the gravitational constant or a VAT percentage over and over), then to looping (print multiplication tables), to arrays (store them for later use, or as input for a loop to print year lengths for the planets, computed from their distance to the sun), and then to functions.
And all of that without having to teach people the difference between source, object code, and executable.
You can learn C syntax for assembly-language idioms later. Starting out by learning the concepts and the syntax at the same time is inefficient: when you're grappling with a concept like pointers, it doesn't help to be fighting confusing syntax simultaneously.
This does not apply to languages that cleanly abstract the machine like Scheme/Python/Haskell/what have you, but C lets the low level stuff shine through so much that you end up having to learn that anyway; you can't really learn it as an abstraction.
I wasn't actually trying to refute that statement above; I was responding to the specific arguments used to support it.
Anyway, I agree with the other poster who replied to you: Pick a language with a REPL, as instant reinforcement of concepts is essential to ingraining them into the mind. Having a longer turnaround time means the lesson gets diluted by being interleaved with too much process (save the file, build the program, run it, look at the output, consider it, etc.).
Which language would you recommend using if you want to learn more about the machine?
It completely hides some of the most important aspects of any modern hardware from you. The only thing it really exposes you to is manual memory management, and even then the view of memory C gives you is grossly simplified compared to how memory actually works on any modern hardware.
> Which language would you recommend using if you want to learn more about the machine?
Pick a machine and learn that machine's assembly language.
I've never become comfortable with C's pointer/reference/dereference syntax -- I much prefer Pascal in this sense. I recently had a look at Ada and was pleasantly surprised -- it has a very straightforward syntax.
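A few declarations that illustrate the trouble (C's "declaration mirrors use" rule is what makes these hard to read left to right, which is exactly what Pascal's and Ada's syntax lets you do):

```c
int  *a[10];        /* array of 10 pointers to int                      */
int (*b)[10];       /* pointer to an array of 10 ints                   */
int (*f)(int);      /* pointer to a function taking int, returning int  */
int *(*g[3])(int);  /* array of 3 pointers to functions returning int*  */
```

The difference between the first two is just a pair of parentheses, yet the types are entirely unrelated.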
As others mention here, I think C should be learnt together with assembler (and computer architecture, especially the bit about cache hits/misses).
Another thing I find frustrating with C is that it's still a bit of a pain to work with Unicode/wide strings -- on my todo list is writing a short post on "Hellø wørld (with Unicode)" -- with some examples of wide strings in C (possibly Pascal) and Ada -- along with assembler output and a "pure" assembler version.
There is surprisingly little good material on the web for this (that I managed to find, anyway).