
What other than computer science concepts can save us from slow software? Computer science concepts (Amdahl and Moore, maybe Dennard, maybe-maybe Landauer) have proven that hardware improvements cannot. Is there some aspect of software performance that can only be understood through another domain of human endeavor?



I am not trying to be anti-intellectual, but software currently has the opposite problem, where people decide some idea will Make Everything Better and it turns out that this idea does nothing of the kind. In fact some of these ideas have set software engineering back by decades (example: Object-Oriented Programming).

There are, of course, computer science concepts that are very smart. But we don't need these to save us from slow software, because today's slow software problem is just the result of people doing bad things in layer upon layer. We have to stop doing all the bad stuff and dig ourselves out of the hole we're in, just to get back to neutral. Once we are back at neutral, then we can try thinking about some computer science smarty stuff to take us forward.


See? Now you're sounding like an old fart :-)

You give Object-Oriented Programming as an example of an idea that has set software engineering back by decades, but really, isn't it the misapplication of that tool that does the damage? Consider how, time and again, software engineers have developed a code base of functions which all need a bit of shared state, and the technique of calling all those functions while passing a reference to the 'context' was more simply expressed as calling a function "from" the context itself.
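To make that pattern concrete, here is a minimal sketch in Rust (with a hypothetical Editor context, not taken from any codebase discussed here) of the two ways of writing the same operation: a free function that takes the shared state as a parameter, and the same code expressed as a method on the context.

    // Hypothetical shared state, for illustration only.
    struct Editor {
        buffer: String,
        cursor: usize, // byte offset into `buffer`
    }

    // Style 1: a free function that threads the context through explicitly.
    fn insert_char(ctx: &mut Editor, c: char) {
        ctx.buffer.insert(ctx.cursor, c);
        ctx.cursor += c.len_utf8();
    }

    // Style 2: the same operation expressed as a method "from" the context.
    impl Editor {
        fn insert_char(&mut self, c: char) {
            self.buffer.insert(self.cursor, c);
            self.cursor += c.len_utf8();
        }
    }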

The tools help with the cognitive burden of understanding the entire system. And tools that allow one to make durable assumptions about part of the system allow the engineer to 'free up' space in their brain for other parts of the system. That desire to add abstraction in order to 'move up' the conceptual tree and get a wider view of the overall system has been part of how humans think since they first started collecting into tribes[1].

Certainly "over abstracting" is a huge issue. There was a great story about the Xerox Star system (first word processor that was multi-lingual and multi-fonted) that Dave Curbow used to tell about how the 'call stack' to get a character on the screen had reached insane levels (and that made things very slow). All due to abstraction. And yet there are good examples too where at Sun the adding of support for the 3b2 file system was accomplished quickly, and everything still worked, due to the Virtual File System (VFS) abstraction.

My point is that it isn't the tools that set computer science back, it is the misapplication of them that does that. And what I liked about Raph's discussions on Xi is the exploring and testing whether or not a tool he had available was applicable to writing text editors.

[1] Can you imagine the challenge of having to talk to someone in the tribe for 10 minutes to determine what their role was and capabilities? So much easier to say "You're a hunter right?" and when they say yes just assume various hunter capabilities are available.


This just points to a problem that computer science concepts need to address: the conflation of subroutines (reusable components of programs) and runtime calls/returns. Or, stated more directly, programmer control over inlining and other costs, on the way to enabling more zero-cost abstractions.

One big problem is that one can't deploy the existing methods of zero-cost abstraction (templates/monomorphized generics) across process boundaries or the kernel/userspace boundary. Let's work on this, not act as if abstractions themselves are the problem.
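As a sketch of that distinction (hypothetical types, in Rust, assuming the point is about dispatch cost rather than any particular library): inside one compiled program, a generic function is monomorphized and can be inlined away, so the abstraction is zero-cost; once the same call has to cross a process or kernel/userspace boundary, you're forced back to something shaped like the dynamic version, plus marshalling.

    trait Render {
        fn draw(&self) -> String;
    }

    struct Glyph(char);

    impl Render for Glyph {
        fn draw(&self) -> String {
            self.0.to_string()
        }
    }

    // Monomorphized generic: the compiler emits a specialized copy per
    // concrete type and can inline `draw`, so the abstraction costs nothing
    // at runtime.
    fn render_static<T: Render>(item: &T) -> String {
        item.draw()
    }

    // Dynamic dispatch: a single copy of the function, with every call going
    // through a vtable -- roughly the shape you are stuck with today whenever
    // the call crosses a process or kernel boundary.
    fn render_dyn(item: &dyn Render) -> String {
        item.draw()
    }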

(As an aside: I think the "implementation inheritance" aspect of OOP is at least as harmful as jblow suggests. "Associate related pieces of data with the code that operates on them as a unit" is a pretty reasonable idea on its own.)
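For what it's worth, a small sketch of that split (hypothetical types, not anyone's actual design): the "data plus the code that operates on it" part survives as a plain struct with methods, and reuse happens by composition rather than by inheriting another type's implementation.

    // Data and the code that maintains its invariant, kept together.
    struct Counter {
        count: u64,
    }

    impl Counter {
        fn bump(&mut self) {
            self.count += 1;
        }
    }

    // Reuse by composition: embed a Counter rather than inheriting from it.
    struct RequestStats {
        hits: Counter,
        misses: Counter,
    }

    impl RequestStats {
        fn record_hit(&mut self) {
            self.hits.bump();
        }
        fn record_miss(&mut self) {
            self.misses.bump();
        }
    }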


> In fact some of these ideas have set software engineering back by decades (example: Object-Oriented Programming)

I don't want to sound like I'm trying to oppose you here, but I would be genuinely interested in a write-up on how OOP set engineering back by decades. Could you elaborate, please?


> today's slow software problem is just the result of people doing bad things in layer upon layer.

You can make a decent argument that's the only realistic way extremely large systems like the web can possibly come about. It's not pretty but the track record of the alternatives is worse.


The context of this discussion is about "a high quality text editor".

Of course, in the name of not using the same golden hammer for all problems, extremely large systems like the web and text editors should each be considered in their own right to find the best solution for each.


I know. But you can also substitute 'long-lived' for 'extremely large', etc. 'People have lost track of the importance of efficiency/performance' is a recurring point of jblow's. There's something to it, no doubt, but I think it also merits some pushback.


I don't think "long-lived" is a good substitute for "extremely large". The longer something lives, the better it should be, for multiple reasons -- more time to work on the code, more design iterations, more-thorough understanding of the problem gained over time. If the code is just getting more messy and decayed and hard-to-deal-with over time, then we are doing something wrong. (And we almost always are).

I'm not just saying that people have lost track of the importance of efficiency. I am saying they've lost track of how to actually do it. I think at least 95% of the programmers working in Silicon Valley have no practical idea of how to make code run fast. Of the remaining 5%, a very small number are actually good at making code run fast. It's a certain thing that you either get or don't. (I didn't really get it when I started in games, even though I thought I did ... it took a while to really learn.)


I think it's a decent substitute because what you say should happen is generally the opposite of what actually does happen, often for reasons other than ineptitude. But more generally, my wanky counter-point is 'you say all these things as if it's a given they're unequivocally bad and it's not at all obvious to me that they are'. There's an awful lot of room above 16ms.


Dude, Photoshop takes many seconds to start up, and as of the most recent redesign it now often takes multiple seconds just to display the new project menu. And this is not atypical of today's software.

Forget 16ms, I would be happy to get to an order of magnitude slower than that for much of today's software ... it would be a massive increase in human happiness.



