Hacker News

> Your tendency to undervalue modularization and code-sharing is one symptom. Another is your refusal to use systematic version-control or release-engineering practices. To you, these things seem mostly like overhead and a way of needlessly complicating your life. And so far, your strategy has worked; your natural if relatively undisciplined ability has proved more than equal to the problems you have set it. That success predisposes you to relatively sloppy tactics like splitting drivers before you ought to and using your inbox as a patch queue.

I hang my head in shame at this. I'm lucky, though, that the other members of the D team talked me into using modern tools like GitHub and Bugzilla. I'm grateful for that.

P.S. yes, I really did use email as a patch queue and database.

I've seen this sort of version control done twice in a professional setting, and both were by programmers much more gifted than me.

I once saw a guy who wrote a robust e-store web app in C. Not C++, but C. Tons of complexity, and it worked fine, but when I dived in to work on it, it was a nightmare. He told me, "You want main_app.version3.old.c. That's the one that gets compiled and pushed to the production server. Not main_app.c, that one's old and we don't use it anymore." He built a build system from scratch, did his own version control in his head, and kept everything in one monolithic C file.

Of course, he was the shittiest software developer I've ever seen. Just a brilliant programmer. He needed a manager the way a jet needs a pilot. Sadly, he was 30 years my senior, and not amenable to being told what to do, so after a long talk with our CEO, we decided to let him go. He was the only one who knew how our system worked, but I built a comparable one using best practices and open-source software in less than a month.

C isn't a terrible language to write a web framework in. (Not using version control is a terrible idea). About 15 years ago I wrote a simple high level framework in C for writing web apps:

http://git.annexia.org/?p=c2lib.git
http://git.annexia.org/?p=pthrlib.git
http://git.annexia.org/?p=rws.git
http://git.annexia.org/?p=monolith.git

It actually does stuff that even current frameworks cannot achieve, such as providing long-lived stateful web apps entirely in HTML (no JS required), although you are entitled to doubt whether that matters these days, since JS has won and is available pretty much everywhere.

The best thing about it was that it was so fast, because the code + data of a small web app would fit entirely in the L1 cache of the CPU.

C is a terrible language to write a web framework in. The kind of high-level constructs modern programming languages provide are just not possible in C. It is also very easy to write buggy code and is too low level for the use case.

You should try looking at the code before declaring that. The C used a pool allocator, had some high level structures (vectors and hash tables, basically the same as Perl), loads of safe string functions, try/catch, coroutines, cooperative threads, and other stuff I've forgotten[1].

It was possible to write safe code in that framework, and in fact we did write a lot of code used daily by tens of thousands of UK schoolchildren until quite recently.

[1] Invoking https://en.wikipedia.org/wiki/Greenspun's_tenth_rule although I like to think our code was very minimal, elegant and not too buggy.

> You should try looking at the code before declaring that.

Attempting to write a web framework in C is great!

Even if you end up building the high-level programming constructs yourself, C web dev is still just too hard compared to the alternatives. Because of this it is not possible to build a large community around it, which I would say is the most crucial part of today's web frameworks like Django and Rails.

C doesn't have tagged unions, which makes it a terrible language to write pretty much anything in. Most of the other pieces you need for good programming you can write yourself in C, but you can't make a safe tagged union out of thin air, and if you rely on helper functions or macros then your tools won't understand them.

It's also so bad at expressing literal values that I've seen a serious recommendation to write literal strings and parse them instead ( http://www.tedunangst.com/flak/post/string-interfaces ).

The type system doesn't even do generics, never mind higher kinds. Maybe you could keep a web framework small enough that that wouldn't be too big a deal.

These are all valid reasons against using C for anything at all, but they apply to many other languages that people use for web apps too.

It's worth noting that another web framework I wrote a couple of years later was written in OCaml[1].

[1] https://savannah.nongnu.org/projects/modcaml - also dead upstream, and there are now much better choices for OCaml web apps.

People still use untyped languages for some things, but for the rest I don't think so? I mean if you have virtual functions then you can effectively implement (inefficient) tagged unions, and every language I've seen used on the web has a reasonable syntax for basic values.

> C isn't a terrible language to write a web framework in.

Hey, I wrote my first CGI script in Pascal, so I know it's not impossible. And nothing I've written since then has been so fast. ;-)

But so much stuff like GZIP delivery, POST parsing, cookies, etc will require you to write enormous functions, with countless "gotcha" edge cases, just to reinvent the wheel.

Using it in production for anything complex (like a web store selling digital downloads), is just nuts.

Agreed. This was 15 years ago (although it was in service until quite recently). We used to front the thing with an Apache proxy, so that it could deal with more modern web standards.

> It actually does stuff that even current frameworks cannot achieve, such as providing long-lived stateful web apps entirely in HTML

What? This was the norm back in the day. It's not like all those Perl programmers writing CGIs or later PHP programmers writing PHP apps were using JS to tie it all together.

Or am I misunderstanding what you mean?

It had two levels of persistence. There was a database layer for long term, persistent storage. But the clever(?) part was that the C threads persisted (in memory) beyond each single HTTP request. e.g. you could create a calculator web app where you would press buttons and have the calculator display numbers, and all that would be handled (across multiple requests) by a single C thread. The calculator display might be stored in a stack variable.

Of course you can imagine ways for this to fail - restarting the webserver wasn't very nice for your users for example. Also you needed a smart front end which would direct user sessions back to the same web server instance. (And since it used cooperative threads, each core would be running a separate web server -- but this was 15 years ago when multicore machines were not too common).

Ah, so this was possible, and occasionally done back then, but the reason it wasn't common had less to do with capability and more to do with everyone wanting to ride on top of Apache. Apache handled a lot of stuff you wouldn't want to handle in a CGI (at that time), so it made sense to use it. Apache forked (and/or threaded, depending on era), so sharing variables in memory wasn't common. That said, people set up shared memory regions, or used file-based sessions and /dev/shm, which is functionally the same, and got much the same thing.

This is a ground-up web server rewrite, so it didn't use Apache, although we did (sometimes) front it with an Apache-based proxy.

Oh, I understand that. I just pointed to Apache as the reason why most people in that era weren't using memory-based sessions.

Also I think it's worth saying that I'm not trying to belittle your accomplishment, though it may have come across that way. Writing your own framework in the language you used for application logic proved somewhat prescient; it's what numerous other projects (Rails, Mojolicious, etc.) ended up doing, for multiple reasons, which I'm sure you were able to take advantage of. I remember my own tests using ApacheBench ca. 2005 to compare simple CGI performance vs. mod_perl vs. PHP vs. Perl's HTTP::Daemon with forking. When the workload is small and the routing needs are simple, it's amazing how much overhead the webserver adds.

I don't know about that statement. If someone can't understand basic source control, and the benefits it brings, what else in their software are they not thinking of?

You are not alone... during my first internship, we were transferring code on USB keys. I just can't believe I used to work that way at some point.

At least you didn't store the only copy on a floppy in your desk drawer next to your lunch.

We walked a 3.5" disc down to a 'build box' in the basement.

Happy days.

Scarily, I have single, hand-written source files that would barely fit onto a 3.5" disk.

Since that was downvoted for some reason, here's the source file I was thinking about. It's 762K, so about half of a 1.44 MB 3.5" disk. The total for the whole directory (itself a fraction of libvirt) is 9.2 MB.


Technically, that file wouldn't fit on the 3.5" disks I used in college (400K Mac and 720K PC).

Suppose you change one line, how long does it take to recompile that file?

That single file takes 0.56 seconds to compile on my i7 laptop. However deleting and recompiling that file triggers a relink of the library, and then a relink of all the utilities using the library, and that takes a bit longer.

Ahem. That brings back some shameful memories.

BTW, there are over 14,000 commits just on the dmd compiler itself, and 120 contributors. Not using a version control system would have caused a complete collapse.

The CTO of a project I worked on a couple of years ago refused to install a version control system. I basically threatened to abandon ship if he wouldn't relent and eventually he did, but until today I don't think he's fully convinced that it was a good move.

Minor usage correction, "until today" should be "to this day". "Until today" implies that something happened today that changed his mind, while "to this day" is saying "even now, as of this day".

(My apologies if this isn't of interest - I'm not trying to be pedantic, but the misusage suggests that English is not your first language. Most people I know who have learned English as a secondary language are interested in these kinds of usage corrections.)

> the misusage suggests that English is not your first language

It isn't. I'll try to remember, thank you.

He's correct, but it wasn't something I noticed until he pointed it out. Your intent was understood.

The intent was understood, and it's a pretty minor quibble. But I agree that line threw me for a second as well, and I had to read it twice. I was expecting the next line to be "and then today our version control totally saved our butts!"

In my "big corporate job" of a few years back I introduced a policy of doing due diligence on suppliers who were writing applications for the company - was amazed to find that some people not only don't use version control systems but can argue with a straight face that they are a bad idea.

I just did DD somewhere and the CEO said with a straight face that 'testing is old fashioned' in response to me flagging their test procedures as inadequate. Some of the stuff you come across just borders on the incredible.

Fortunately most companies have their stuff in order.

We did have a salesperson at one company who complained about us testing stuff as "all it did was find bugs". I think he thought that testing actually created bugs and that if you didn't test the product it wouldn't have any bugs.

Sometimes small children will cover their eyes with their hands and say "You can't see me!"

Manual test cases are old fashioned. Type systems and generative tests are more efficient ways of reaching the same defect rate, IME.

(I assume that isn't what you/that CEO mean though)

I honestly cannot understand that mindset. What were his reasons?

Another great advance was the autotester. DMD has always had a test suite, but Brad Roberts wrote a plugin for GitHub that essentially prevents merging pull requests unless they pass the test suite. I highly recommend it.
