There are at least two periods 'before good search engines'.

I guess there was a time in the 1950s/1960s when you asked the guy in the next room, who wrote the software, what the error meant, or when you simply read the source code or disassembled the binary to figure it out.

When computers became more common but weren't yet powerful enough to store electronic documentation, you didn't crack open the tome; you already had it open (and it was more often a tome on the OS than one on the language). You also typically had a printed list of error codes on your desk.

When memory became less scarce, you would see header files with extensive (for the time) documentation, and there were tools to look up API calls, error codes, etc. (I think Windows or Visual Studio still ships with one.)

When CD-ROMs became common, you had documentation on CD. Later, when hard disks outgrew CD-ROMs, that data moved to the hard disk because it was faster, and multi-CD documentation sets became commonplace.

Those CDs later moved to the web. That often made searches even faster: for a while, I remember searching Apple's or Microsoft's site being faster than searching a locally installed MSDN (yes, MSDN was an application before it became a web site) or Apple's local documentation.

Also, I used to read release notes front to back, and almost all API documentation. I think that was fairly common. Without the likes of HN to distract you, there was plenty of time to do that :-)

Since search engines weren't that good, all that reading was also useful: if you didn't know the name of an API call, or at least the name of the relevant technology, finding it was almost impossible.




shudders



