> What do you mean? Modulo is not a perfect hash function
It's a perfect hash function for the case where you work with ints and know the min-max range beforehand; then you can modulo by the size of the range, as long as it's not too large. In your example, 21..33 spans 13 values --> mod 13 (mod 12 would collide 21 and 33).
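A minimal sketch of that idea, assuming the bounded-int setup described above (the function names are mine): any run of k consecutive ints maps to k distinct residues mod k, so modulo by the range size is collision-free.

```python
def make_perfect_hash(lo, hi):
    """Perfect hash for ints in the known range [lo, hi] (inclusive)."""
    size = hi - lo + 1            # 13 slots for 21..33, hence mod 13
    return (lambda v: v % size), size

# Every value in the range lands in its own slot -- no collisions,
# so a plain array of `size` entries can stand in for a hashtable.
h, size = make_perfect_hash(21, 33)
assert len({h(v) for v in range(21, 34)}) == size
```

Subtracting `lo` instead of taking a modulo gives the same effect and keeps the indices in insertion order; the modulo form just avoids the subtraction per lookup.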
This comes up for me surprisingly often, though obviously it depends a lot on what you work on. It's often tempting to reach for a hashtable, but a plain array indexed this way is a lot more efficient.
The challenge with big-O is that it doesn't tell you how many elements result in what kind of processing time, because you have no baseline for a single element. If processing 1 element of an O(n^2) workload takes 10 seconds, then 10 elements take 100x as long, about 16 minutes.
In practice, n^2 sees surprising slowdowns way before that: in the 10k-100k range you could be spending minutes of processing time (at 10 ms per element, ~77 elements already take a minute).
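A quick back-of-envelope check of both figures above: for an O(n^2) algorithm, total time scales as the single-element baseline times n^2.

```python
def quadratic_seconds(n, baseline):
    """Predicted runtime of an O(n^2) algorithm whose n=1 run takes `baseline` seconds."""
    return baseline * n * n

# 10 s baseline -> 10 elements take 1000 s, roughly 16.7 minutes
minutes_for_ten = quadratic_seconds(10, 10.0) / 60

# 10 ms baseline -> 77 elements already cost ~59 s, about a minute
near_minute = quadratic_seconds(77, 0.01)
```

This is only the asymptotic term, of course; real code has constant overheads and cache effects on top, which usually make the wall at 10k-100k elements arrive even sooner.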
Nobody cares if matplotlib is outdated. It served its purpose here, which was to show how plotting data can lead to a deeper understanding of it. You don't need a fully featured tool to do that.
As far as "fully featured" goes, there is nothing more fully featured than matplotlib. It just tends to take a lot more code to generate a plot compared to some higher-level libraries, so for simple visualizations matplotlib may not be worth the power-vs-ease-of-use tradeoff.
> Each one of these needs to be scored (well, sorta - there are various tricks you can use to avoid scoring some docs, which again I'm not at liberty to discuss)
I suspect they do indeed use such URLs to track which link the user clicks. It's not that they log the IP with the click, but rather, the click allows implicit relevance feedback with respect to the issued query. This can be used for machine learning (to improve ranking, for example).
I also don't understand your first point. We can run n^2 algorithms on massive inputs, since it's just a polynomial. Are you thinking of 2^n, perhaps?