
> I find this particular argument unconvincing, because those aren't real yet.

While the 'ideal' of writing any half-hearted but correct specification and then superoptimizing it down to the absolute minimal instruction count is infeasible today, I counter that the attitude that these basic technologies are fantasy is the AI effect in play here. Compilers routinely employ techniques that were considered the holy grail of program synthesis and superoptimization only a few years ago, but because of that moving-target mindset, we've discounted how capable these suites have become.
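A small, hedged illustration of the point (exact behavior depends on compiler version, flags, and target): LLVM's loop idiom recognition can collapse the textbook bit-counting loop into a single popcount instruction, something that would have been pitched as 'synthesis' not long ago.

    /* Naive population count. Clang/LLVM's loop idiom
       recognition can replace the entire loop with a
       single popcnt instruction on targets that have one. */
    unsigned popcount(unsigned x) {
        unsigned n = 0;
        while (x) {
            x &= x - 1;   /* clear the lowest set bit */
            n++;
        }
        return n;
    }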

One of the major issues is actually a PEBCAK: programmers don't realize what the standard actually affords.

What in your code is truly undefined behavior, giving the compiler the go-ahead to work around what you wrote entirely?
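A hedged example of the kind of go-ahead I mean (the function name is mine): signed overflow is undefined, so the compiler is entitled to assume it never happens and fold the whole 'check' to a constant.

    /* Because signed overflow is undefined behavior, the
       compiler may assume x + 1 never wraps, so GCC and
       Clang at -O2 typically fold this body to 'return 0'. */
    int will_overflow(int x) {
        return x + 1 < x;   /* UB when x == INT_MAX */
    }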

Or consider how the way you wrote that loop creates a chain of memory dependencies that artificially constrains how the compiler can rearrange memory accesses and vectorize.
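A rough sketch of what I mean by an artificial dependence chain (names are mine; exact codegen varies): accumulating through a pointer that might alias the array forces a store and reload every iteration, while a local accumulator lets the compiler keep the sum in a register and vectorize the reduction.

    /* *out may alias a[i], so the compiler must store and
       reload *out on every iteration: a loop-carried
       memory dependence it cannot break. */
    void sum_slow(int *a, int n, int *out) {
        for (int i = 0; i < n; i++)
            *out += a[i];
    }

    /* A local accumulator carries no memory dependence,
       so the loop is free to be reordered and vectorized. */
    void sum_fast(const int *a, int n, int *out) {
        int s = 0;
        for (int i = 0; i < n; i++)
            s += a[i];
        *out += s;
    }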

And especially important here is optimization coupled with testing. You wrote your algorithm with while loops and no explicit stopping condition. Even though the code could be transformed into a properly bounded for loop and profit, the optimizer and the bounded model checker now can't unroll your loop with certainty to check whether a different implementation is actually functionally identical. The compiler can't make the optimizing transforms it would like to because you failed at your job. And on top of that, you can't refactor while having the machine verify that nothing functionally changed -- a potential regression nightmare.
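A sketch of the contrast (the tool choice and names here are my assumptions): a bounded model checker like CBMC can fully unwind a loop whose trip count is explicit and check it against another implementation, but an open-ended while loop gives it nothing to unwind against.

    /* Open-ended search: no explicit bound, so neither an
       unroller nor a bounded model checker knows how far
       to unwind; it only terminates if key is present. */
    int find_open(const int *a, int key) {
        int i = 0;
        while (a[i] != key)
            i++;
        return i;
    }

    /* Explicitly bounded: the trip count is n, so for a
       given n the loop can be fully unwound and checked
       for functional equivalence with another version. */
    int find_bounded(const int *a, int n, int key) {
        for (int i = 0; i < n; i++)
            if (a[i] == key)
                return i;
        return -1;
    }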

I'm not advocating we forget how to optimize our code; in fact, it's quite the opposite. Human programmers need to fully understand what assumptions and guarantees are built into a language in order to be proficient at providing a specification that makes automated optimization possible.

Sticking to standard C presents a fairly stable, simple 'virtual machine' model (abusing terminology there), rather than dealing with the constant flux of small hardware architecture changes that chaotically render your past wisdom obsolete.


