
> Yeah I'm sick of that obvious strawman being trotted out

So, you think that code should be measured by its size?

So "a = b + c;" is better than the alternative?

Care to explain why?




Either he's wrong about the straw man, or he's asserting that the number of characters used is an important factor in code quality. I'm searching for clarification of which he's asserting.


Sorry, I think my response was sacrificing clarity for rantiness.

I meant that every time somebody refers to the value of controlling code size (like gruseom did in this thread), we have to deal with the inevitable strawman of:

    int a = b + c;
vs

    int hours = timesPerformed + hoursPerPerformance;
But gruseom's comment has nothing whatsoever to do with short vs long variable names. tlrobinson interpreted it correctly: codebase size should be measured in number of tokens. Measured in tokens, both examples above have identical size.
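To make the point concrete, here is a minimal sketch of counting tokens in the two snippets. This is a hypothetical regex-based tokenizer for C-like code (not any standard tool): identifiers, numbers, and punctuation characters each count as one token.

    import re

    # Hypothetical tokenizer: an identifier, a number, or a single
    # punctuation character each counts as one token.
    TOKEN_RE = re.compile(r"[A-Za-z_]\w*|\d+|[^\s\w]")

    def count_tokens(code):
        return len(TOKEN_RE.findall(code))

    short_form = "int a = b + c;"
    long_form = "int hours = timesPerformed + hoursPerPerformance;"

    # Both lines tokenize to: type, name, =, name, +, name, ;
    print(count_tokens(short_form))  # 7
    print(count_tokens(long_form))   # 7

Both lines are seven tokens, even though one is three times as many characters, which is why character counts say nothing about this kind of code size.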

(PG has also said this many many times with reference to arc.)



