
1. Well, GPUs crush x86 CPUs in some areas, so there is at least one competing technology that is a clear win. Also, Intel added both a GPU and a video decoder to their CPUs, but neither of them uses anything close to x86.

As to the rest of it, I think you can look at microwaves for a perfect example of terrible software in widespread use. All you need to do is select a cook time, possibly a power level, or set the clock. Yet most microwaves have such a terrible interface that more than a few guests end up embarrassed asking how to get an unfamiliar one to work. And as long as sending it back takes more effort than figuring out the strange design, there is little reason to build a better system.



