
It seems, though, that they now surface "authoritative" sites over clear user intent in their search.

For instance, search for "has Infowars ever been correct" on Google and then Duck Duck Go. For Google, you will see results that have nothing to do with the clear intent of the query, such as Wikipedia and mainstream news articles trashing Infowars. For Duck, you will get several results that actually answer the question.

Since Google pioneered natural language processing for search queries, and these sorts of queries worked fine years ago, the only conclusion is that Google is actively burying certain results to manipulate the narrative. Why does their natural language search break all of a sudden when I want to find out something that is unpopular?




Doing that search gave me a Quora link as the first result, and then two links to Infowars... which is an awful result.


I'm seeing 2 out of 10 links as Infowars, links #3 and #4. The top two are Quora. Media Matters and RationalWiki are examples of other front-page results.

They all directly address the query. Why exactly are they "awful"?


You seem in your answers to be prioritizing "direct answer to the question regardless of reliability" rather than "useful information about the question's subject." I suppose there's an argument to be made for that, but it's an argument that leads to prioritizing a search engine that will return the answer "orange pinstripe" to the question "what is the color of the sky" over a search engine that returns a scientific article about how sunlight behaves in the atmosphere.

(In any case, this all seems a bit orthogonal to whether Google is right or wrong to prioritize widely-recognized news outlets over smaller ones when it surfaces searches for news articles, doesn't it?)


I don't believe it to be orthogonal. There is a deeper philosophical question, barely touched on, of what search engines should return in response to user queries, which also applies to surfacing news. In the past, with AltaVista, word matches were the heuristic, and it was extremely obvious how to measure the quality of the result. Now you've got very abstract heuristics, such as the intent of the user, the trustworthiness of the source, whether the source is healthy and good for society, the correctness or honesty of the content, how much revenue will be made with a particular set of results, etc., and the public is being left out of the discussion as to which heuristics are important to apply.


How about, as a general rule of thumb, I tell search engines what to search for instead of the other way around?


That's wild. My results look like this: https://i.imgur.com/AtrMoLs.png


The person you are responding to is referring to DDG.


Infowars won't actually accurately answer whether it's been correct, so Google is following the user's clear intent and giving them good answers.


You will get results that respond to the question. That's not the same as correct answers.


How could a search engine ever know what the "correct" answer is? Relevance is more important.

In any case, Google's results for this particular query are certainly less "correct". They are more of a non sequitur.


> How could a search engine ever know what the "correct" answer is? Relevance is more important.

I strongly disagree, IMO actively surfacing a highly-relevant but incorrect result is worse than not surfacing anything at all.

As for determining what the "correct" answer is, the authority of the source is certainly a good place to start. Obviously nothing is fool-proof, but Yahoo! Answers is certainly less likely to be correct than Wikipedia, for example.


Let's continue with my particular query as an example. I already don't trust Alex Jones, Quora, or Yahoo Answers. I have my own opinion as to which sources I trust and which ones I do not. So first I would internally give different weights to results based on how trustworthy I find them. Then I would look at the actual content and cross-reference the details to assess their validity.

Now with the Google results, there is nothing even remotely relevant to my search, so I don't even have candidates whose veracity I could research further.

Once again I think Google just thinks they are smarter than the average user, but it's making their search engine useless for certain queries.

And I don't like the idea that a search engine should be doing the critical thinking for users. That's more dangerous than the content they are supposedly protecting us from.


So your problem is that Google considers what you consider to be trustworthy sources to be neither relevant nor trustworthy? Have you considered the possibility that what you consider to be trustworthy sources are neither relevant nor trustworthy?


The problem is that I don't trust Google to define what "trustworthy" means. I'd prefer to get results that match on clearer heuristics and make the decision on trustworthiness myself.


The Google algorithm: sites gain reputation through the number of links to them from other high-reputation sites.
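
Roughly, in a toy Python sketch (the graph and damping factor here are made-up illustrations, not Google's production system):

    # Classic PageRank idea: a page's score is fed by the scores of the
    # pages linking to it, so links from high-reputation sites count more.
    def pagerank(links, damping=0.85, iterations=50):
        """links: dict mapping each page to the list of pages it links to."""
        pages = list(links)
        n = len(pages)
        rank = {p: 1.0 / n for p in pages}
        for _ in range(iterations):
            new_rank = {p: (1.0 - damping) / n for p in pages}
            for page, outlinks in links.items():
                if not outlinks:
                    continue  # dangling page; its mass is simply dropped here
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
            rank = new_rank
        return rank

    # Hypothetical toy web: "c" is linked by everyone and ranks highest.
    print(pagerank({"a": ["b", "c"], "b": ["c"], "c": ["a"], "d": ["c"]}))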

I don’t want Google to have an opinion. I don’t want Google to second guess me.

We are a long way, I hope, from, “Hey, Siri, what’s your opinion about....”




The Chinafication of the US is complete. The gatekeepers know better than us and should give us the results that reinforce their echo chamber rather than the ones we are looking for, and the public has been convinced this is the right thing to do.

If this were happening 15 years ago, people would be outraged instead of supporting it. What kind of nonsense is this, that the search engine should reinforce political stances? My political beliefs are my choice, and if I want to reinforce them, that's my prerogative. The search engine should not make that choice for me. Well, unless you are baidu.com, backed by a one-party system.


It's not the responsibility of search engines to engage willful ignorance with a plurality of perspectives or debunk fringe political nutbaggery that stands opposed to the precepts of civilized society, but I'm glad they mostly do.


Regardless, that should be a personal choice to make.


Google has to rank its results somehow, in order to limit the number of results returned to some sensible amount (rather than returning literally hundreds of GBs of links for common queries, if it considered them all to be equally "first").

Other search engines are no different.

You make your "personal choice" of ranking algorithm, by choosing which search engine to use.


"if you don't like it then leave" is not helpful. There's no good reason a search engine couldn't provide an interface to let you choose how you want to rank and filter results. In fact there used to be an advanced interface in the past the provided a crude version of this.

A great benefit to Google would be that rankings would be hard to game with SEO, because everyone would have different filters. That is already sort of the case, but it should be the user, not Google, deciding which narratives and aspects to filter.
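
Concretely, something like this would do: the engine exposes its ranking signals and the user supplies the weights. A toy Python sketch; the Result fields and signal names are invented for illustration, not any real engine's API:

    from dataclasses import dataclass

    @dataclass
    class Result:
        url: str
        keyword_match: float    # how well the page matches the query terms
        link_reputation: float  # e.g. a PageRank-style score
        recency: float          # newer pages score higher

    def rerank(results, weights):
        """Order results by the user's chosen weights, not the engine's."""
        def score(r):
            return (weights.get("keyword_match", 0) * r.keyword_match
                    + weights.get("link_reputation", 0) * r.link_reputation
                    + weights.get("recency", 0) * r.recency)
        return sorted(results, key=score, reverse=True)

    # One user might weight raw keyword relevance heavily; another, reputation.
    results = [Result("https://example.org/a", 0.9, 0.2, 0.5),
               Result("https://example.org/b", 0.6, 0.9, 0.4)]
    for r in rerank(results, {"keyword_match": 0.8, "link_reputation": 0.1,
                              "recency": 0.1}):
        print(r.url)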


The trick is to include a variety of results early. Sources can often be grouped into blocks that return mostly the same results. Better to reduce the quantity of results from the top block in order to include a few from the second and third tiers on the front page (sketched at the end of this comment).

Or maybe a text search equivalent to Yelp’s “re-search in this area” (after narrowing the map). Perhaps to be able to select some results and hit a “more like this” button.
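
A toy Python sketch of the tiered idea above (the tier assignment and caps are invented for illustration):

    from itertools import islice

    def interleave_tiers(tiers, page_size=10, cap_per_tier=4):
        """tiers: lists of results, best tier first. Cap each tier's share
        of the front page so lower tiers still appear early."""
        page = []
        for tier in tiers:
            page.extend(islice(tier, cap_per_tier))
            if len(page) >= page_size:
                break
        return page[:page_size]

    tier1 = ["t1-r%d" % i for i in range(10)]  # dominant block of near-duplicates
    tier2 = ["t2-r%d" % i for i in range(10)]
    tier3 = ["t3-r%d" % i for i in range(10)]
    print(interleave_tiers([tier1, tier2, tier3]))  # 4 + 4 + 2 results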


>For instance, search for "has Infowars ever been correct" on Google and then Duck Duck Go.

I suspect DuckDuckGo's results come from Bing, based on the result similarities.


"For instance, search for "has Infowars ever been correct" on Google and then Duck Duck Go." Wow. Completely different results.



