Yet Google is trying to do it for everyone - and honestly does not seem to be doing a bad job.
So what are the advantages of the old idea of an almost-AI assistant working for me when BFG (Big Friendly Google) does it?
Privacy. Yacy looks like a good start but it has miles to go before it can reach the same usability as popular services.
Is there any difference between the goals of an individual (person, company) vs. the average? Is information a competitive advantage? Today we provide unpaid feedback to improve private algorithms; tomorrow...?
Edit: downvoters, what is a rhetorical question?
> local to you, probably configured and augmented by you.
There is a big difference between explicit goals and the reverse-engineering of intent, e.g. from search history.
We need both explicit goals and auditable algos.
However, I think there is a need for a more granular robots.txt such as:
Limit-display: /events 10words
Limit-display: /blog 30words
Right now, Google's algorithms effectively decide how much copying is legal, and if you disagree your only options are to disallow Googlebot entirely or sue them; a directive like this would provide a middle ground.
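A rough sketch of how a crawler might honor such a directive. The `Limit-display` name and the `Nwords` syntax are hypothetical, taken from the example above; this is not part of any real robots.txt standard:

```python
def parse_limit_display(robots_txt: str) -> dict:
    """Collect hypothetical 'Limit-display: <path> <N>words' lines
    into {path_prefix: max_words}."""
    limits = {}
    for line in robots_txt.splitlines():
        line = line.strip()
        if not line.lower().startswith("limit-display:"):
            continue
        _, _, rest = line.partition(":")
        parts = rest.split()
        if len(parts) == 2 and parts[1].endswith("words"):
            path, words = parts
            limits[path] = int(words[: -len("words")])
    return limits

def allowed_snippet(text: str, path: str, limits: dict) -> str:
    """Truncate a snippet to the strictest cap whose path prefix matches."""
    caps = [n for prefix, n in limits.items() if path.startswith(prefix)]
    if not caps:
        return text  # no matching directive: show the snippet unchanged
    return " ".join(text.split()[: min(caps)])

robots = """User-agent: *
Limit-display: /events 10words
Limit-display: /blog 30words
"""
limits = parse_limit_display(robots)
```

A search engine would then run every candidate snippet through `allowed_snippet` before rendering the result page, so the site owner controls how much text is copied per section rather than facing the all-or-nothing choice above.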
Are the selected facts related to the user's query or search history?
This looks like per-page extracted data: if Google has been able to extract data from that page, it presents it as a structured-data result snippet.
Things like this feel like the 90s and shouldn't even be something worth speaking or caring about.