
Maybe the standard should then take the opportunity to define such a thing, since it would be smaller and more useful than what they have today in practice.



Well, yes. Then again, the committee took 10 years to standardize std::mutex, 14 years for std::shared_mutex, and 17 for std::optional. We still don't have a good hash map.

We have to be realistic: the standard library will never be complete, and you'll always have to get basic components from third parties or write them yourself.


One thing that’s nice about C++ is that you end up reimplementing stuff that doesn’t have a decade of battle testing behind it.

That way, such ‘bleeding edge’ features evolve and improve a lot before being set in stone.


Unfortunately, they did standardize the bad hash map.


To be fair, they standardized the hash map you'd probably have been taught 30, and maybe even 20, years ago in a CS class. It's possible that, if your professor is rather slow to catch on, they're still teaching new students bucketed hash tables like the one std::unordered_map effectively requires.
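For anyone who hasn't seen one, the textbook bucketed (separate-chaining) design looks roughly like this. This is my own illustrative sketch, not the standard's wording; the point is that std::unordered_map's guarantees (stable references, per-bucket iteration, erasure that doesn't move other elements) effectively force this shape:

```cpp
#include <functional>
#include <list>
#include <utility>
#include <vector>

// Minimal separate-chaining hash map: an array of buckets, each a
// linked list of key/value nodes. Collisions just extend the list.
template <typename K, typename V, typename Hash = std::hash<K>>
class ChainedMap {
    std::vector<std::list<std::pair<K, V>>> buckets_;
    Hash hash_;

public:
    explicit ChainedMap(std::size_t n = 16) : buckets_(n) {}

    // Returns a pointer to the value, or nullptr if absent.
    V* find(const K& key) {
        auto& bucket = buckets_[hash_(key) % buckets_.size()];
        for (auto& kv : bucket)
            if (kv.first == key) return &kv.second;
        return nullptr;
    }

    void insert(const K& key, const V& value) {
        if (V* existing = find(key)) { *existing = value; return; }
        buckets_[hash_(key) % buckets_.size()].emplace_back(key, value);
        // A real implementation would also track the load factor and
        // rehash into more buckets as the table grows.
    }
};
```

Every collision here costs a pointer chase into a separately allocated list node, which is exactly the cache behavior the modern flat designs avoid.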

I'd guess that while a modern class is probably taught some sort of open-addressed hash map, it isn't being taught anything as exotic as Swiss Tables or F14 (Google's Abseil and Facebook's Folly maps). But that's OK, because standardising all the fine details of those maps would be a bad idea too.
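By contrast, a bare-bones open-addressed table keeps everything in one flat array and probes on collision. This is a linear-probing sketch of my own, far simpler than Swiss Tables or F14, just to show the structural difference:

```cpp
#include <functional>
#include <optional>
#include <vector>

// Minimal open-addressed map with linear probing. All entries live in
// one contiguous array, so probing after a collision usually stays in
// the same cache line -- the core advantage over chaining.
// (Deletion, which needs tombstones, is omitted for brevity.)
template <typename K, typename V, typename Hash = std::hash<K>>
class ProbedMap {
    struct Slot { K key; V value; };
    std::vector<std::optional<Slot>> slots_;
    Hash hash_;

public:
    explicit ProbedMap(std::size_t n = 64) : slots_(n) {}

    V* find(const K& key) {
        std::size_t i = hash_(key) % slots_.size();
        while (slots_[i]) {  // probe forward until an empty slot
            if (slots_[i]->key == key) return &slots_[i]->value;
            i = (i + 1) % slots_.size();
        }
        return nullptr;
    }

    void insert(const K& key, const V& value) {
        if (V* existing = find(key)) { *existing = value; return; }
        std::size_t i = hash_(key) % slots_.size();
        while (slots_[i]) i = (i + 1) % slots_.size();
        slots_[i] = Slot{key, value};
        // A real table would resize well before the array fills; this
        // sketch assumes far fewer insertions than slots.
    }
};
```

Note that the probe sequence depends entirely on the hash spreading keys out, which is why these designs are so sensitive to hash quality.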

On the other hand, the standard doesn't tell you to use a halfway decent hash function, many standard library implementations don't provide one, and so in practice many programs don't use one. The "bad hash map" performs OK even with a terrible hash function, whereas the modern designs require decent hashes or their performance is miserable.



