
The C Programming Language - Part 0 - sekm
http://oncoding.posterous.com/the-c-programming-language-part-0
======
halayli
> C is an easy language to write code in. Especially if you are competent in
> some other block structured language - such as Javascript, Python, Ruby and
> (maybe) Java.

I disagree. You can be competent in Python, Ruby, and Java and still have no
clue about how memory works internally, and that knowledge is a hard
requirement for writing C. C is easy to learn but takes time to master, to
reach a point where you are not shooting yourself in the head left and right.

> Pointers are bad only if you're an idiot.

You don't have to be an idiot to make a pointer mistake, an off-by-one error,
a buffer overflow, etc... The people who wrote nginx, apache, bind, the linux
kernel, and freebsd are not idiots, yet such mistakes happen.

------
tptacek
_But what about buffer overflows? Buffers overflow when a fixed size buffer is
allocated, some data is copied to the buffer and the idiot who wrote the code
didn't check to see if the data was too big for the buffer!_

Researchers have found memory corruption flaws in both Daniel J. Bernstein and
Wietse Venema's code. You want to try this "de-myth-ification" thing again?

~~~
tomjen3
Why? These two people are clearly idiots.

It is a simple rule. Never trust user input, the issue has been known for
decades and everybody should be aware of it by now.

~~~
irahul
> Why? These two people are clearly idiots.

If you aren't deliberately trying to miss the point: TFA claimed "only idiots
make buffer overflow errors", and the OP is pointing out that it's not just
idiots but very smart people (look up the names) who make buffer overflow
errors as well.

> It is a simple rule. Never trust user input, the issue has been known for
> decades and everybody should be aware of it by now.

It's a simple rule. Don't write buggy code. The issue has been known for
decades and everybody should be aware of it by now.

/s

I am trying to find the buffer overflow in DJB's code. I am more interested in
whether it was a simple error (give it too long an input and it goes kaboom)
or a complicated one. My intuition is that it's the complicated kind. DJB is
known to actively use his own implementations of stdlib and string routines,
coded from scratch to be more secure than the standard counterparts.

~~~
shrughes
Your post's parent post is a joke.

~~~
irahul
How do you know that? I see no indication that it is a joke.

~~~
shrughes
Now that he's replied again, I think I might have been wrong.

------
mfincham
"Wrong: use Assembler Language."

I'd be interested to see more research on this. I would hope that by now
modern compilers can generate better-optimised code than hand-written assembly
in most cases.

Does anyone have any pointers to research on this topic?

~~~
ori_b
In general, compiler-generated assembly is on par with naive hand-written
assembly. It's not going to beat something that someone has put real time
into, but if time to market matters at all, you're not going to be able to put
large amounts of time into hand-coding assembly.

A good, clever human can generally beat a compiler.

There's a reason that the core of nearly any high performance multimedia,
graphics, or encryption library is written in assembly, or compiler intrinsics
(which are pretty much assembly instructions written in C syntax).

~~~
peapicker
Agreed. I have used C almost every day for the last 25 years... And not only
is the C more maintainable, it is more easily ported to new CPU architectures
than assembly (which requires a complete rewrite in that case). And yes, it is
a fast, low-overhead language, especially when using the CPU vendors'
compilers over GCC, since the vendors already know the best way to optimize
for their own architectures.

------
rcfox
> C generates compiled code. Code which is very efficient, but not as
> efficient as hand coded machine instructions.

... _written by a perfect programmer_.

> C is an easy language to write code in.

I think the word he was looking for was "simple". The field of land mines
known as "undefined behaviour" disqualifies C as being an easy language.

> So you have to use pointers - which are nothing more than variables which
> contain addresses.

Except that they have their own semantics and syntax as well as their own fun
set of undefined behaviour.

> This means you can define local variables which shadow external variables
> quite freely. This makes your code easier to read and safer to develop ...

Because everyone loves to keep track of which version of the 'count' variable
you're referring to! (Seriously, there is almost never a reason to define the
same variable name twice in the same function.)

> imperative: C statements are tasks to be executed in sequence. Hence
> imperative.

The word imperative doesn't imply a sequence...

> C doesn't support objects

Pretty much anything that's not a function is considered to be an object. C
doesn't support polymorphic classes. (Well... let's not get into that!)

Overall, this might be useful to the author if he finds that he can't remember
the details of the language for very long. Anyone not already familiar with
the concepts he's talking about will be lost at best, or damaged in the
average case.

~~~
andolanra
Minor semantic point:

    Pretty much anything that's not a function is considered to be an object.

In general, the definition of _object_ is incredibly wishy-washy and the word
really ought to be retired, but as it stands, I don't think many people would
agree that just any old piece of non-function data is an object. Different
languages/people/formal systems have different definitions; for example, in
Python, functions are themselves objects, while Java makes a clear distinction
between objects and non-object primitive types like ints.

Mind you, this is as much a criticism of the original article's usage as
yours; from a certain point of view (e.g. C++'s), C of course supports
'objects' inasmuch as you can store function pointers in structs and
consequently have run-time selection of functions, after a fashion. From
another point of view (e.g. Alan Kay's), even C++ does not have objects,
because it lacks true message passing. The real criticism is that the term
'object' is so vague as to be essentially meaningless without clear context.

------
codgercoder
I hope one of the parts deals with people's aversion to C declarations. The
declaration syntax allows vastly complicated data definition, and the fact
that declaration looks like use simplifies things immensely.

(I also have a personal style quibble -- lots of people seem to think that
putting spaces around every lexical element clarifies things; I think it just
makes gassy code.)

~~~
comex
As a fan of C, I have to disagree. To quote OS X's `man signal`:

    void (*signal(int sig, void (*func)(int)))(int);

    or in the equivalent but easier to read typedef'd version:

    typedef void (*sig_t) (int);
    sig_t signal(int sig, sig_t func);

In, say, Go, this would be

    func signal(sig int, newFunc func(int)) func(int)

which imo is cleaner (it's clear that the second argument and return value
have the same type, without having to use a typedef) without sacrificing
power.

------
comex
> So a Java program which is compiled to byte code is actually run as an
> interpreted program using the jvm as an interpreter.

Of course, this is not quite true; large parts of the program are JITted, and
the result is sometimes faster than C.

------
spullara
Anyone who thinks that Java byte code is just interpreted at this point
clearly has no right to talk about such things. This is about the third
article by a C programmer in the last month in which the author believes that
Java (and C#) are merely interpreted and are not ultimately running compiled
machine code. Why has this misconception lasted so long? The first JITs were
released last century...

------
donniezazen
I really need to learn some C to get my hands dirty with DWM on my Arch
laptop.

~~~
luriel
DWM is going to be rewritten in Go.

~~~
donniezazen
That's the first I've heard of it.

------
Ideka
_1\. Use C if you want something to go as fast as possible.

Wrong: use Assembler Language._

That's ridiculous. By the same logic I could say "Wrong: use machine code.",
then "Wrong: use wire and a soldering iron.", and keep going on and on.

