Real men program in C (embedded.com)
15 points by Anon84 on Aug 2, 2009 | 36 comments



There are two extreme approaches to "Real programmers do X".

One approach is rooted in electrical engineering, and declares that real programmers are the ones who get as close as possible to the hardware.

The other approach is rooted in mathematics, and declares that real programmers are the ones who get as close as possible to formal models of computation.

Both are narrow-minded and flat-out wrong. Real programmers understand that context is king, and that every choice they make is a trade-off driven by the demands of the project they're working on.


I agree with this. However, I would like to mention a point that is somewhat implicit here: REAL programmers (TM) are usually ready to change their work context when necessary.

This observation struck me quite hard, especially now, during a software project where I need to interact with lots of other students at the code level. Usually, when there is a technical problem with the build scripts, library dependencies or other hard issues, the team divides into two camps, one small and one large.

The large group just stops doing anything and goes "well, someone will fix it. I don't know anything about this, I never touched this, I will just stop until someone fixes it."

The smaller group, on the other hand, becomes highly active in such a situation. They ask around to see if anyone is working on the problem, and if no one is, they just go "Bleh, I hate this, I know nothing about this, where are the Ant documentation, git's hard reset and the coffee?"

More abstractly, the better programmers (let's call them developers :) ) accept that their work context extends beyond simple code-cutting into fixing the build script, getting dependencies to work and handling other technical matters, while the mere coders don't accept this and go "not my context, stop work".

The funny thing is, very often the larger group starts to complain about the smaller group, because the smaller group already knows so many things and just assumes that everyone else knows them too (hey, a programmer with a master's degree who does not know the decorator pattern, despite having Java as a primary language...), but no one from the larger group actually starts learning things outside their context... meh.


I'm a mathematician, yet I'm still part of the do-most-things-in-C camp. I don't think this contradicts your point, though -- I find that C provides a far better-defined model of computation than scripting languages like Perl.

I use C more because I want to know what my code is going to do than because I want to know how it is going to do it.


Out of curiosity, have you looked into Lisp at all? I've seen (through HN) examples of things like derivatives expressed quite elegantly in Lisp.

Using C for maths programming sounds fun and all, but I'm sure there's got to be a better way of doing things.


I've tried Lisp, but for one-time hacks where I'm not aiming for performance or robustness I usually go to the extreme and use Maple.


Lisp is also nice for testing argument preconditions, like 0 < a0 < a1 < ... < an < 999, each ai an integer:

(and (< 0 a0 a1 ... an 999) (every integer? (list a0 a1 ... an)) (foo bar))

(Note that and is a special form, so it can't be passed to apply; SRFI-1's every does the job directly.)


IMHO there's one more thing that's often missing from articles about "embedded systems": what kind of embedded systems are they talking about?

Your microwave controller is an embedded system. So is your mobile phone. And so is a crazy industrial programmable controller with hundreds of I/O ports running an assembly line. It takes about the same time to write the microwave program in C as in asm, but you'd be insane to program something like the iPhone only in assembler.

Also, it's a bit suspicious that there's no Verilog or similar languages on those graphs...


I agree somewhat - always use the best tool for the job. But if you CAN'T use C, then you are not a "real programmer".


I usually draw the distinction between "programmers" and "engineers" as the difference between people who know how to use a programming language and people who know how to build a computer system.


Long, long ago I worked for a bank. We had essentially two classes of programmers, application-level programmers and systems-level programmers, and the barrier between them was pretty formidable.

These days the lines are much more blurred, and that's why this argument is even possible. The 'real men' of old were the systems programmers; they used assembly.

I remember reading something similar in Dr. Dobb's years ago that had exactly the same argument going: 'real men' write assembly, everybody else uses C (which was still considered a high-level language at the time).

Our definition of 'high level' is shifting, and apparently so is what we consider to be 'real programming'.

C is more like (old-style) Lego: the bricks are small and simple, but you can build very complex stuff out of them once you master the principle. It will do very little 'out of the box', but once you get it, your imagination is the only limit; there are no 'domain-specific' bricks.

I wonder how much of these preferences can be traced back to what you played with as a kid!


The reason C is so big in the embedded/realtime world is that there are no 'layers' underneath you that can screw up the timing. Deterministic behaviour is one of the key goals in that sphere (and it should be in every other sphere too, but that doesn't seem economically feasible).
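To make that concrete: the sketch below is the kind of code this buys you (the register address and the cycle count are made up for illustration; a real part's datasheet would give the actual values). Every instruction is visible, so the timing can be reasoned about directly.

    #include <stdint.h>

    /* Hypothetical memory-mapped GPIO output register; the address is
     * illustrative, not from any real part. */
    #define GPIO_OUT (*(volatile uint32_t *)0x40020014u)

    /* Busy-wait delay: no scheduler, allocator or runtime underneath,
     * so the loop's duration is a direct function of the clock rate. */
    static void delay_cycles(uint32_t n)
    {
        while (n--)
            __asm__ volatile ("nop");
    }

    void blink_forever(void)
    {
        for (;;) {
            GPIO_OUT ^= 1u << 5;    /* toggle pin 5 */
            delay_cycles(1000000);  /* timing depends only on the CPU clock */
        }
    }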

On the web, where programmer time is expensive and most software is throw-away (sorry, but that's the way I see it, even if that is not a popular attitude here), the equation is completely different and skewed towards scripting languages, with Java (basically a safer version of C++) taking up a good portion of the compiled segment.

Hardly anybody directly programs their web aware stuff in C.

Most entry-level jobs are with companies that do web development; there you learn scripting and quick-and-dirty development, usually with lots of loose ends in the final product.

When you program in C, such undisciplined coding is punished with lots of segmentation faults and data corruption issues, and before long you become a more seasoned - and more careful - programmer.

Even seasoned and careful programmers mess up though, witness the endless stream of buffer overflow exploits.
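(To illustrate the failure mode - this is a generic sketch, not any specific exploit: the classic pattern is an unbounded copy into a fixed-size buffer, with a bounded write as the fix.)

    #include <stdio.h>
    #include <string.h>

    /* Unsafe: if name is longer than 15 characters plus the NUL,
     * strcpy writes past the end of buf -- the classic stack overflow. */
    void greet_unsafe(const char *name)
    {
        char buf[16];
        strcpy(buf, name);              /* no bounds check */
        printf("hello, %s\n", buf);
    }

    /* Safer: snprintf truncates instead of overflowing. */
    void greet_safe(const char *name)
    {
        char buf[16];
        snprintf(buf, sizeof buf, "%s", name);
        printf("hello, %s\n", buf);
    }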

Another great thing about C is that it is not such a moving target: once you've learned it, you're 'current' for the next decade or two.

The proliferation of languages and environments is quite bewildering at times. This year's hottest technology is next year's garbage. In the C world things move at a more relaxed pace, and the technology you're interfacing with is generally speaking more mature; it doesn't feel so much like a work in progress.


> Hardly anybody directly programs their web aware stuff in C.

I do. :-)

> Even seasoned and careful programmers mess up though, witness the endless stream of buffer overflow exploits.

I'd dispute that. There were lots of buffer overflow vulnerabilities five years ago, but I can't remember seeing any buffer overflows in the past few years which weren't either (a) written by novices, or (b) in code which was written 5+ years ago but sufficiently obscure that nobody looked at it until recently.

I'd say that, especially given the proliferation of automated analysis tools, buffer overflows are pretty much a solved problem where seasoned and careful programmers are concerned.


Interesting! What is it that you do in C that talks to the web?

I'm serving up a very large number of jpegs as image streams to a piece of JavaScript, and found that the only way to get enough performance was to write a multi-threaded special-purpose webserver in C (I call it yawwws, yet another www server). That's been running for a decade or so now; if I had to do it today I'd possibly use some other technology.
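yawwws itself isn't public, but a minimal sketch of the general shape - one thread per connection, with a canned response standing in for the real jpeg streaming - would look something like this (all names and the port are illustrative):

    #include <arpa/inet.h>
    #include <netinet/in.h>
    #include <pthread.h>
    #include <stdint.h>
    #include <string.h>
    #include <sys/socket.h>
    #include <unistd.h>

    /* One thread per connection; a real image server would parse the
     * request and stream jpeg data instead of this canned response. */
    static void *handle(void *arg)
    {
        int fd = (int)(intptr_t)arg;
        const char *resp =
            "HTTP/1.0 200 OK\r\nContent-Type: text/plain\r\n\r\nhello\n";
        (void)write(fd, resp, strlen(resp));
        close(fd);
        return NULL;
    }

    int main(void)
    {
        int s = socket(AF_INET, SOCK_STREAM, 0);
        struct sockaddr_in addr = { 0 };
        addr.sin_family = AF_INET;
        addr.sin_port = htons(8080);
        addr.sin_addr.s_addr = htonl(INADDR_ANY);
        bind(s, (struct sockaddr *)&addr, sizeof addr);
        listen(s, 64);
        for (;;) {
            int c = accept(s, NULL, NULL);
            pthread_t t;
            if (c >= 0 && pthread_create(&t, NULL, handle,
                                         (void *)(intptr_t)c) == 0)
                pthread_detach(t);
        }
    }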


> What is it that you do in C that talks to the web?

I've written lots of client side code (a command-line pipelined HTTP client; library code for callback-driven HTTP for the benefit of web services). On the server side, the tarsnap website is a combination of static HTML and CGI programs written in C.
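(Not the actual tarsnap code - just the general shape of a CGI program in C: the server hands the request over in environment variables and expects headers, a blank line, then the body on stdout.)

    #include <stdio.h>
    #include <stdlib.h>

    int main(void)
    {
        /* The web server passes request data via environment
         * variables such as QUERY_STRING. */
        const char *q = getenv("QUERY_STRING");

        /* Headers first, then a blank line, then the body. */
        printf("Content-Type: text/plain\r\n\r\n");
        printf("query: %s\n", q ? q : "(none)");
        return 0;
    }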


> I can't remember seeing any buffer overflows in the past few years which weren't either (a) written by novices, or (b) in code which was written 5+ years ago but sufficiently obscure that nobody looked at it until recently.

Which category is the iPhone SMS vulnerability?


> Which category is the iPhone SMS vulnerability?

I don't know -- I haven't seen any concrete details about what the bug was.


Sorry, maybe you're right. I was not able to confirm that from a source I would trust.



Yeah, I found those... Yahoo and CNET are nice but not really specialized in security, and the absence of such sites in the Google results made me suspicious.


Well, it seems that Apple provided a patch for it, so I'm guessing it's real. Also, everywhere the details of the bug were discussed, it was cited as a buffer overflow.

That said, this is only one example; I'm personally aware of tens of examples, ranging from character sets to configuration files and protocol buffers, so one more or less isn't going to make much difference.

For instance, and this is just one of many, mod_rewrite for Apache had a buffer overflow issue in 2006.


> There are no 'layers' underneath you that can screw up the timing

Well, there are, but they're also written in C. You can still have stuff that trips you up in your libc implementation or the syscall implementations in the kernel, but if you're working on an OSS OS and are familiar with low-level stuff anyway, you can just keep debugging all the way down to the hardware.


Adding to your "no 'layers' underneath you that can screw up the timing": it is also valuable that if there is a performance problem in your application, you can actually address it.

You won't ever end up having to just blame the garbage collector, the string buffer implementation, or even the memory allocator. For software which works with huge quantities of data, or which is interactive, every microsecond often counts.

That said, C++ offers the same advantage, and I am surprised that it doesn't see more use. Maybe C programmers think like Linus Torvalds: http://thread.gmane.org/gmane.comp.version-control.git/57643...


I think that a big part of it is that C is simple. The thread of execution is dead obvious; it is almost - but not quite - like looking at assembler. In C++ you have more options for creating interesting bugs, since every operator can have a side effect depending on the context.

Sure, that can be done in C too, but you have to work a little harder at it.

When reliability - or lives - are at stake, and plenty of systems software written in C controls mission-critical stuff, it helps to be able to see what is actually going on.

By sticking to a very simple subset you make it easier to find problems before they cause nasty accidents.

I know of plenty of situations where even an OS would be a detriment; these systems run on totally stripped-down kernels that the majority of programmers would probably no longer recognize as an operating system proper, but would instead call a loader or something to that effect.

C++ offers the same advantages that C does, but it also has a single disadvantage that C does not: it's more expressive. In other words, you can make your C code do lots of stuff, but there is roughly only one way to say it.

In C++ there are endless ways to do the same thing. That means there is more to remember, which in turn means that programmer expertise is a much bigger factor in quality. I think systems-level guys have learned the hard way that the KISS principle is the most important programming guideline there is.


> Hardly anybody directly programs their web aware stuff in C.

More than you might think. A lot of Fortune 500 companies in particular tend to have their own frameworks programmed in C. They're generally not interested in rapid prototyping, but rather in performance, tooling and, most of all, code stability (i.e. you don't want your website to depend on the security/updates of an interpreter). Gradually this is moving to Java (and .NET) for the reasons you mentioned, but it's a slow process.


One point that the article fails to highlight is that, while there is great demand for embedded programmers, that demand is concentrated in a smaller set of companies than, say, web development or enterprise systems. That, coupled with a lot of hardware companies moving more and more of their operations to China and Eastern Europe, does not paint a rosy picture for heading into the embedded space, unless you plan to design and market your own products. The mobility of an embedded development career is not as good as that of a web or enterprise developer, and the returns (pay, benefits, security) on some occasions tend to be less than those of other development opportunities.


Somehow Smalltalk did not make his list. Nor did Forth.

It would be a more interesting debate if there were some hard evidence of relative efficiency, measured in total development/maintenance/opportunity cost, comparing C with other useful languages/environments.

The discussion seems to be limited to "C is the only choice because it is close to the metal" and "so what".


I think the principal problem is: who the hell wants to work with embedded systems? I'm a CS graduate student and I don't see any reason why you would want to go into that field. As far as I'm concerned, let the Chinese do it.


I love working with embedded systems. The nice thing is that it's usually embedded for some reason - it's talking directly to some hardware, and that makes it interesting: motors, switches, LEDs, accelerometers, DACs, ADCs, etc. Quite often those are hooked up to physical things, so the program you end up with is controlling, or is controlled by, cool tangible things.

Then add to the mix that, since embedded things are custom, you generally get to drill all the way down to the processor itself to make it boot up correctly. When you are done you've generally got an app layer, a hardware layer and a processor interface layer, and it's all code you wrote yourself (since everything can be completely customized, it's really hard to write reusable generic code and keep it small and fast).
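For a flavour of what 'making it boot up correctly' means, here's a rough sketch of a bare-metal startup file (Cortex-M-style; the section and symbol names come from a hypothetical linker script, not any particular vendor):

    #include <stdint.h>

    extern uint32_t _sbss, _ebss, _estack;  /* from the linker script */
    extern int main(void);

    void Reset_Handler(void)
    {
        /* Zero the .bss section before main runs -- on bare metal
         * there is no C runtime startup to do this for you. */
        for (uint32_t *p = &_sbss; p < &_ebss; p++)
            *p = 0;
        main();
        for (;;) ;  /* main should never return here */
    }

    /* Vector table: initial stack pointer first, then the reset
     * vector. The cast is the conventional startup-file idiom. */
    __attribute__((section(".isr_vector")))
    void (* const vectors[])(void) = {
        (void (*)(void))&_estack,
        Reset_Handler,
    };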

There's something really satisfying about standing back and realizing that nearly every piece of code running on the processor is yours.


Damn straight.

If the project is small enough, you might even design the hardware and write all the code. Nowhere to hide and all the bugs (and glory) are yours.


Because there's lots of good, interesting work available, primarily because of attitudes like yours?


Yes, but from what I've seen the pay is bad. Also, the large initial investment required makes it hard to create a startup. Sure, it might be fun to tinker around with that kind of stuff, but it's not a field I see a promising career in. I might be wrong, though.


Please provide some examples of job postings for low paid embedded system dev jobs. Or was that speculation?


I do.

Spend your time working with black boxes like I do, and you might dream of throwing away the technology stack to start talking to a processor directly.

Why do you feel the Chinese should do it? Because they're smarter than everyone else? Or because they should be doing the jobs no one else wants?


Comparative advantage?


Ugh, not the "real men" argument again.

Real programmers use the tool that is best for the job, rather than one that appeals to their bravado.


"Quiche-like phrase"? I couldn't read any further.



