
Your example from RethinkDB really hit home for me. The idea that superior technology might lose out because of poor marketing, or (in this case) because the system is optimized for real-world workloads rather than for benchmarks, really disturbs me.

And (this is just my personality) I don't like being disturbed by something without trying to "solve" it. So here's my best thought on how to handle the situation where a team believes it has a superior product that is losing out to a competitor optimized for benchmarks:

> Provide a setting called something like "speed mode". In this mode the system is optimized purely for benchmarks, at the cost of everything else. Default to running with "speed mode" off, but whenever someone is running benchmarks, ask whether they've tried it with "speed mode" on. A truly competent evaluator will insist on testing the system with the options actually used in production, but then a competent evaluator won't be relying on an unreliable benchmark anyway. Anyone running the benchmarks just to see how well the system performs is likely to turn on something named "speed mode", or at least to do so when asked. Eventually the forums will fill up with people advising "for real-world loads, disable 'speed mode'; it doesn't actually speed anything up."
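
For what it's worth, RethinkDB already exposes this trade-off, just per write rather than as a global switch: its drivers accept a `durability` option on write operations. A "speed mode" would essentially flip that default globally. A rough sketch with the (old-style) Python driver; the table name and documents here are made up, and it assumes a server on localhost with an existing 'events' table:

    import rethinkdb as r

    conn = r.connect(host='localhost', port=28015)

    # Default, real-world-safe behavior: the write is committed to
    # disk before the server acknowledges it.
    r.table('events').insert({'id': 1}, durability='hard').run(conn)

    # The "speed mode" equivalent: acknowledge from memory. Much
    # faster in a benchmark, but the write can be lost on a crash.
    r.table('events').insert({'id': 2}, durability='soft').run(conn)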

Hmm... sounds cool, but I'm not so sure it would actually work. The danger is that you would instead develop a reputation for "cheating" on benchmarks. This is why I'm not very good at marketing.
