This Is Not Your Daddy's OS (dadhacker.com)
24 points by comatose_kid on June 18, 2008 | 23 comments



If you are afraid that today's crop of programmers doesn't understand the basics, there are open source tools that let you do something about it:

Short summary video: http://www.youtube.com/watch?v=JtXvUoPx4Qs

Google Tech Talk: http://video.google.com/videoplay?docid=7654043762021156507

You implement a virtual CPU starting from NAND, then an OO language and an operating system. This sort of comprehensive information should be an essential part of any programmer's "mental furniture." The nice thing is that the only prerequisite is that you know how to program in the first place.
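
For a taste of the first step, here's roughly what building gates out of NAND looks like. This is plain C rather than the course's own HDL, just to show the idea:

    /* Every gate derived from the single NAND primitive. */
    #include <stdio.h>

    int nand_g(int a, int b) { return !(a && b); }            /* the only primitive */
    int not_g(int a)         { return nand_g(a, a); }         /* NOT from one NAND  */
    int and_g(int a, int b)  { return not_g(nand_g(a, b)); }  /* AND from two NANDs */
    int or_g(int a, int b)   { return nand_g(not_g(a), not_g(b)); } /* OR, De Morgan */
    int xor_g(int a, int b)  { return or_g(and_g(a, not_g(b)), and_g(not_g(a), b)); }

    int main(void) {
        /* half adder: sum and carry of two bits, built only from the gates above */
        int a = 1, b = 1;
        printf("sum=%d carry=%d\n", xor_g(a, b), and_g(a, b)); /* sum=0 carry=1 */
        return 0;
    }

From there the course stacks adders into an ALU, the ALU into a CPU, and so on up to the OS and compiler.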


I love the gratuitous glasses removal at the beginning of that first video.


Maybe we should put James Bond theme music behind the vid?


People simply confuse "operating system" with "platform" because in the past the platform has always been tied to the OS. Not any more.

When people say "the browser is the new OS" they don't really mean "OS", they mean "platform". Emacs is also a platform (though not a very popular one).

I believe the browser is the most important platform, but it still requires a real OS to run on.

Also, his rant kind of reminded me of this: http://xkcd.com/378/


I clicked that link expecting it to be http://xkcd.com/435/, picturing an EE making a similar rant.


Thanks for the link, Daniel. Excellent bulletin board material - for the far right hand limit of my bulletin board.


Personally, I think this is a very dated position to take.

Every computer language, save exactly what the processor reads, is completely synthetic: a fantasy that we all mutually support to make our lives easier. We've seen machines built on a variety of underpinnings that directly model the languages above them, so even that underlying representation is somewhat fluid and subject to interpretation.

I agree that sometimes people forget about how much infrastructure surrounds them, but I also think that people working in the lower levels of the code sometimes forget that their job is ultimately to make it so no one else ever has to do what they are doing. Ideally, every layer completely fades away into the layer beneath it and you end up with a unified front which we call a runtime environment.

So maybe this complaint is like a Matryoshka Doll. As he hacks on his OS-level code in assembly and lectures people about how they shouldn't take his work for granted or steal his names, a lower level software or hardware engineer is saying exactly the same thing about his post.


If the age of a position negatively impacts its correctness, we should re-examine Newton's Laws :)

I guess that most anything is 'somewhat fluid and subject to interpretation', but I still find it difficult to understand how a software engineer can conflate the idea of an OS (which essentially manages computer resources) with a browser.


"Dated" doesn't mean old, it means old-fashioned or somewhat anachronistic by modern standards.

All he's really talking about is a "runtime environment" when he says OS. What we consider the bare bones of a modern OS is, by standards set 30 years ago, irreparably full of excess and waste. Even if you look back only 10 years, features have been added that were once impractical. The feature creep all software experiences means that over time, what's considered the "bottom" of the software hierarchy is constantly reaching up toward the top.

And it gets really complex when you start talking about emulation. Is an OS suddenly not an OS when it's run in software emulation? What about virtualization? If the resources it manages are fictions, does that change what it is? And then you get into insular environments that certainly could be made to run on bare metal environments if you wanted to, like Lisp and Smalltalk. Where do they sit?

Maybe "Operating System" is a misnomer, but it's not far from the truth from the perspective of a web programmer deploying applications to a target over the network.


You have some interesting points. But just because something runs on bare metal doesn't make it an OS (many Amiga video games tossed the OS, for example). And an OS running in software emulation is still an OS: it manages resources for the apps, and it doesn't matter what lies beneath - same goes for virtualization.


They were re-examined. Remember Lorentz and his transformations?


I think his thought process relates to the same people who think you should know everything... I will never know exactly how a microprocessor works - but that doesn't impede my programming skills. I might not know how electricity works (down to the electron), but I still use it every day.

If you spend all your time trying to know everything about everything, you will know a little about a lot; I think I would rather be a specialist instead.


"I will never know exactly how a microprocessor works - but that doesn't impede my programming skills"

Yes it does.

To pick just a single example: if you don't understand how your CPU's cache architecture works, and how it interacts with hardware paging (and ultimately swapping), you aren't qualified to do performance analysis at all. But you will anyway, and you will get it wrong, and you will forever be wondering why your MySQL database doesn't scale properly. Or why adding a bunch of RAM to the box didn't work the way you expected. Or you'll write dumb "optimized" code with local copies of things that could be computed, and fail to understand the benchmarks showing why it doesn't work.
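
If you want to see the cache effect for yourself, here's a trivial C sketch (sizes are arbitrary): two loops that do identical work, where the only difference is whether the traversal order matches the memory layout.

    #include <stdio.h>

    #define N 2048
    static double m[N][N];

    /* walks memory sequentially: cache lines are used fully */
    double sum_row_major(void) {
        double s = 0.0;
        for (int i = 0; i < N; i++)
            for (int j = 0; j < N; j++)
                s += m[i][j];
        return s;
    }

    /* jumps N*8 bytes between accesses: most of each cache line is wasted */
    double sum_column_major(void) {
        double s = 0.0;
        for (int j = 0; j < N; j++)
            for (int i = 0; i < N; i++)
                s += m[i][j];
        return s;
    }

    int main(void) {
        /* time these with your favorite profiler: both touch exactly the same
           N*N doubles, but the row-major walk is typically several times faster */
        printf("%f %f\n", sum_row_major(), sum_column_major());
        return 0;
    }

Nothing in the algorithm changed; only the relationship between access order and the hardware did.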

As a programmer, you are a user of a tool. If you want to be an effective user of that tool, you need to have some understanding of how it works. No, you don't need to be a fully trained ASIC design architect, or semiconductor process engineer, or solid state physicist. But yes: you should understand how a circuit works, and what the parts of your computer are, and how they interact.

And your statement that "specialists" can get by without knowing "a little about a lot" just confuses me. Every talented "specialist" I've ever known has an encyclopedic knowledge of all sorts of things outside their specialty. A better word for a "specialist" that only knows their specialty is probably just "worker", or "lackey".


You can do a lot of useful performance analysis with only a profiler and trial-and-error. Also sometimes mathematics seems to help.


You can only really get the low-hanging fruit that way. There are so many different methods of optimization that you have to know where to start. You have to have a good model of the underlying system. Otherwise you won't understand why boxed integers are inherently slow. Or why an array of tuples can be much slower than a tuple of arrays.
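
Rough C sketch of the array-of-tuples point (the struct and field names are made up, just to show the layout difference):

    #include <stdio.h>

    #define N 1000000

    /* "array of tuples": each record's fields sit next to each other */
    struct particle { double x, y, z, mass; };
    static struct particle aos[N];

    /* "tuple of arrays": each field gets its own contiguous array */
    static double soa_x[N], soa_y[N], soa_z[N], soa_mass[N];

    double total_mass_aos(void) {
        double s = 0.0;
        for (int i = 0; i < N; i++)
            s += aos[i].mass;   /* drags 32 bytes through the cache to use 8 */
        return s;
    }

    double total_mass_soa(void) {
        double s = 0.0;
        for (int i = 0; i < N; i++)
            s += soa_mass[i];   /* touches only the bytes it actually sums */
        return s;
    }

    int main(void) {
        printf("%f %f\n", total_mass_aos(), total_mass_soa());
        return 0;
    }

Both loops are O(n) and look equivalent in a high-level language, but the second one moves a quarter of the data.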

Modern languages completely abstract away fundamental things like indirection. Consequently, it becomes almost impossible to accurately estimate the performance of algorithms in high-level programming languages.

Waiting for people with experience optimizing Prolog programs to disagree with me...


Optimizing Haskell seemed to require quite deep knowledge about the way the compiler worked.


Sure, you can analyze, but there are cases where you will not be able to do much about your findings without an understanding of computer architecture.


I judge my skill level as a programmer by looking at the things I use then asking myself, "could I have created this?". Any time the answer is "no", I take that as a red flag and find some spare time to learn how it works or create a prototype implementation.

Knowledge is not a zero-sum game. Learning more in one area does not make you worse in another. In fact, I believe it's usually an advantage.


I have used a closed-source framework pretty heavily for a few years, and when I need to really understand how to use some part of the API, I will go home and recreate the observed behavior from scratch.

While it's sort of inefficient (why continue using a framework when I have many core pieces sitting around at home?), I think it's fun to approach the problem this way and I've learned plenty that applies to other problems.


I disagree. While you certainly don't need to know everything (for example, a programmer doesn't need to understand how electrons flow across the junction of a transistor), not understanding how the CPU works does impede a programmer's abilities. For example, how would you write performance critical code without this knowledge? Sometimes, just choosing the best algorithm isn't enough.


If you've never read the book "Code", do yourself a huge favor and go out and grab it tonight. It is in my top 10 computer books of all time.

I've never seen a guide so eloquently walk a person from a basic understanding of turning light switches on and off all the way to how assembly language works.

Regardless, it will prove very helpful if you don't yet understand the basics of the hardware you are running on.


And what about knowing a reasonable amount about a few topics? In this case, knowing a little bit about operating systems, computer architecture and computer science (the more theoretical things such as algorithms, complexity...) will make you a better programmer, if only because you will be able to think about your program at different levels when needed.


Awesome rant; however, the movement to make the OS 'browser based' is more about the old http://en.wikipedia.org/wiki/Network_PC idea than it is about Web 2.0/Ajax or XML.

It's like a movie sequel with different actors and a crappy writer.



