

Why Peter Thiel is wrong about search being a natural monopoly - gentschev
http://www.brekiri.com/blog/405/why-peter-thiel-is-wrong-about-the-search-monopoly/

======
gersh
Monopoly search is bad because it is easier to game. People have gotten so
good at manipulating Google rankings that there is room for competition.
Merely being small has some inherent advantages. Being bigger and more popular
can be inherently bad.

~~~
gentschev
Yeah, I think of it as the Google uncertainty principle (after Heisenberg): as
soon as Google measures a search signal and feeds it into the rankings, that
very fact degrades its quality. People start reverse engineering the ranking
and trying to game it. So it's kind of an arms race between Google and the
text ad industry it has created.
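A toy simulation of that feedback loop (all numbers and distributions here are invented for illustration): a signal that correlates with quality stops being useful once it becomes a ranking target and gets inflated directly.

```python
import random

random.seed(0)

# Each page has a true quality and a measurable signal (think keyword
# density or inbound links). Before anyone games it, signal tracks quality.
pages = [{"quality": random.random()} for _ in range(1000)]
for p in pages:
    p["signal"] = p["quality"] + random.gauss(0, 0.1)

def top_decile_hit_rate(pages):
    # Crude check: of the 100 highest-signal pages, how many are actually
    # good (quality above 0.5)?
    top = sorted(pages, key=lambda p: p["signal"], reverse=True)[:100]
    return sum(p["quality"] > 0.5 for p in top) / 100

before = top_decile_hit_rate(pages)

# Once the signal goes into the rankings, spammers inflate it regardless
# of quality -- the arms race described above.
for p in random.sample(pages, 300):
    p["signal"] += random.uniform(0.5, 1.0)

after = top_decile_hit_rate(pages)
print(before, after)  # the signal degrades once it is measured and targeted
```

The exact numbers don't matter; the point is that `after` is reliably worse than `before` once a third of pages optimize the signal directly.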

------
riffer
This video has Thiel's comments:

<http://bigthink.com/series/62#!selected_item=4845>

He starts at 8 mins in, and although the whole video is 2 hours long, he makes
a bunch of major points in the first 3 to 4 minutes of his talk, after which
he answers questions for 3 to 5 minutes.

His central point is that there is a lot of potential value in getting the
fixed costs of the search business down, and that he would prefer to invest
there, rather than in playing the zero-sum game of trying to take away market
share in the $25 Bln search business.

Also worth noting: anybody who can lower the fixed costs of search by an order
of magnitude is also going to be in a position to make money from businesses
that have similar cost structures.

------
dspeyer
80/20 doesn't really apply here. The whole point of search is to handle the
obscure stuff.

~~~
gentschev
Yes and no. Even basic spelling correction is pretty good, and as Google
makes its spelling operation more sophisticated, I actually find myself
needing to reverse its corrections more often.
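The basic version really is simple. A minimal Norvig-style corrector proposes every string within one edit of the input, keeps the ones that are known words, and picks the most frequent; the tiny vocabulary and counts below are made up for the example.

```python
# Minimal edit-distance-1 spelling corrector. Vocabulary and frequencies
# are invented for illustration.
WORD_FREQ = {"search": 500, "engine": 300, "google": 400, "signal": 100}

def edits1(word):
    """All strings one edit (delete, replace, insert, transpose) away."""
    letters = "abcdefghijklmnopqrstuvwxyz"
    splits = [(word[:i], word[i:]) for i in range(len(word) + 1)]
    deletes = [a + b[1:] for a, b in splits if b]
    replaces = [a + c + b[1:] for a, b in splits if b for c in letters]
    inserts = [a + c + b for a, b in splits for c in letters]
    transposes = [a + b[1] + b[0] + b[2:] for a, b in splits if len(b) > 1]
    return set(deletes + replaces + inserts + transposes)

def correct(word):
    if word in WORD_FREQ:
        return word                              # already a known word
    candidates = edits1(word) & WORD_FREQ.keys()
    return max(candidates, key=WORD_FREQ.get) if candidates else word

print(correct("serch"))  # -> "search"
```

The sophistication Google has added on top (context, query logs, learned error models) is exactly what makes its corrections harder to predict or undo.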

Indexing is more complicated. You could argue that 20% of Google's index
contains at least 80% of the information people need. The problem, as in the
old advertising saying, is figuring out which 20%. So if you have a clever way
to address content quality and uniqueness, suddenly your crawling and indexing
costs plummet.

------
andraz
The idea that costs for a search engine go down with time is false. The
bandwidth, CPU, memory, and disk you have available to index and retrieve
information do get cheaper, but that means they get cheaper for information
producers too. So the need to index an ever-growing amount of data eats away
at the falling cost of hardware.

Naturally this holds true for general search engines. Vertical ones are a very
different beast. And I think Peter Thiel was not talking about specialized
vertical searches like patent search.

~~~
gentschev
It depends on whether you think you need to index the same fraction of web
content over time. I think that's one of the unexamined principles of the
search engine industry. Also, software costs are in some ways decreasing even
faster than hardware costs because of open source infrastructure.

To go back to the Blekko example, I'd say it's pretty clear that a few years
ago you couldn't have built something like that for $10-15 MM (assuming
they've gone through maybe half their funding).

~~~
protomyth
It seems like you really do need to keep the index at the same %, but I guess
dropping all the "designed for ads only" sites would lower your number. I keep
wondering if one of the URL-shortening companies will come up with a search
engine based on their links and some metric on repeats / reputation. Also, it
might be worth trying some manual setup with a person marking good/bad for
some % of your index.

