Related to this is the importance of deforestation. Some good links:

* http://en.wikipedia.org/wiki/Deforestation_%28computer_scien...

* http://www.haskell.org/haskellwiki/Short_cut_fusion

Deforestation is basically eliminating intermediate data structures, which is similar to what the "int(s.split("-", 1)[1])" versus "atoi(strchr(s, '-') + 1)" slides are about. If you consider strings as just lists of characters, then it's basically a deforestation problem: the goal is to eliminate all the intermediate lists of lists that are constructed. (It's something of a peculiar case though, because in order to transform into the C code you need to not only observe that indexing an rvalue via [1] and throwing the rest away means that the list doesn't have to be constructed at all, but you also need to allow strings to share underlying buffer space; the latter optimization isn't deforestation per se.)
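For a concrete (if simplified) picture of what a deforesting compiler does, here's a Haskell sketch; the function names are made up for illustration:

    -- Naive pipeline: the inner map builds a whole intermediate list
    -- that the outer map immediately consumes and throws away.
    doubleThenSquare :: [Int] -> [Int]
    doubleThenSquare xs = map (^ 2) (map (* 2) xs)

    -- Deforested form: one traversal, no intermediate list, justified
    -- by the rewrite  map f (map g xs) = map (f . g) xs.
    doubleThenSquare' :: [Int] -> [Int]
    doubleThenSquare' = map ((^ 2) . (* 2))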

I don't know if there's been much work on deforestation optimizations for dynamic languages, but perhaps it's an area that compiler research should focus on more.

On another minor note, I do think that the deck is a little too quick to dismiss garbage collection as an irrelevant problem. For most server apps I'm totally willing to believe that GC doesn't matter, but for interactive apps on the client (think touch-sensitive mobile apps and games) where you have to render each frame in under 16 ms, unpredictable latency starts to matter a lot.




Automatic deforestation can remove intermediate results in a pipeline of computation, but it cannot rewrite a program that is built around querying and updating a fixed data structure to use an efficient imperative equivalent throughout.
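A hypothetical Haskell illustration of the distinction: fusion can eliminate the intermediate lists in the pipeline below, but it will not rewrite the persistent map into, say, a mutable hash table:

    import qualified Data.Map as M

    -- A pipeline: fusion can remove the intermediate lists here.
    pipeline :: [Int] -> Int
    pipeline = sum . map (* 2) . filter even

    -- A program organized around querying/updating one data structure:
    -- deforestation has nothing to say about making this imperative.
    tally :: [String] -> M.Map String Int
    tally = foldr (\w m -> M.insertWith (+) w 1 m) M.empty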


Deforestation is easily done in lazy languages like Haskell.
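The foldr/build rule from the haskellwiki link above is the canonical example. A rough sketch (build needs RankNTypes; GHC ships the rule itself as a rewrite rule in its base libraries):

    {-# LANGUAGE RankNTypes #-}

    -- build abstracts a list over its constructors ...
    build :: (forall b. (a -> b -> b) -> b -> b) -> [a]
    build g = g (:) []

    -- ... so a consumer can be handed the pieces directly.
    -- The short-cut fusion rule rewrites
    --     foldr k z (build g)
    -- to
    --     g k z
    -- and the intermediate list is never constructed.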

As for GC, it would be nice to have good real time GCs in runtimes.


> As for GC, it would be nice to have good real time GCs in runtimes.

After decades of GC research, I think the conclusion is, "Yeah, that would be nice." Current state of the art gives us some very nice GCs that penalize either throughput or predictability. One of my favorite stories about GC is here:

http://samsaffron.com/archive/2011/10/28/in-managed-code-we-...


It has nothing to do with laziness. It has everything to do with the guaranteed absence of side effects.

Deforestation is /more useful/ in strict languages, because allocation of temporary structures costs more. So fusion on strict arrays is better than on lazy streams.

You just can't do it unless you can freely reorder multiple loops, and to do that you need a proof that there are no side effects. Haskell just makes that trivial.
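For instance (assuming the vector package), Data.Vector.Unboxed's stream fusion compiles a pipeline over strict unboxed arrays down to a single loop; sumDoubledEvens is just a name for this sketch:

    import qualified Data.Vector.Unboxed as V

    -- Stream fusion collapses filter, map, and sum into one loop;
    -- no intermediate vector is ever allocated.
    sumDoubledEvens :: V.Vector Int -> Int
    sumDoubledEvens = V.sum . V.map (* 2) . V.filter even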


"Deforestation is easily done in lazy languages like Haskell."

You can also do it in stream-based or data-flow-based languages. Or in pretty much any DSL you decide to implement, if the semantics of the language itself is reasonable.


Try telling that to a dedicated C++ guy. Apparently the C calls are dangerous, dirty and just to be avoided.


Fully agree. Only when C goes away will we have a chance at more secure software.


Yeesh.

When incompetent programmers go away, perhaps. C in and of itself is not the issue, and scaring people away from it with horror stories isn't the fix either.

I've run into young engineers recently who think that pretty much any 'C style' system call is necessarily dangerous, because OMG the developer has to remember to pass in the length of the buffer along with the buffer itself. No, you just have to not be a frickin' idiot.


Incompetent programmers are here to stay, as they are cheaper to hire.

We need languages that are not designed to allow security exploits by accident: Ada, Modula-2, or any other language in the same school of thought.



