
Google is improving searches by using BERT and understanding language context - bratao
https://www.theverge.com/2019/10/25/20931657/google-bert-search-context-algorithm-change-10-percent-langauge
======
rkagerer
_The way BERT recognizes that it should pay attention to those words is
basically by self-learning on a titanic game of Mad Libs. Google takes a
corpus of English sentences and randomly removes 15 percent of the words, then
BERT is set to the task of figuring out what those words ought to be. Over
time, that kind of training turns out to be remarkably effective at making an
NLP model “understand” context_

Great example of "explain it to your grandpa" ability.
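The Mad Libs setup described in that quote can be sketched in a few lines. This is a toy illustration, not BERT's actual preprocessing (real BERT works on subword tokens, and only replaces ~80% of selected words with the mask token, leaving the rest unchanged or swapped), but it shows the core idea: hide ~15% of the words and keep the originals as prediction targets.

```python
import random

def mask_words(words, mask_rate=0.15, mask_token="[MASK]", seed=0):
    """Randomly hide ~mask_rate of the words; return the masked
    sequence plus a {position: original_word} map of targets the
    model would have to predict."""
    rng = random.Random(seed)
    masked, targets = [], {}
    for i, w in enumerate(words):
        if rng.random() < mask_rate:
            masked.append(mask_token)
            targets[i] = w  # the model must recover this word
        else:
            masked.append(w)
    return masked, targets

sentence = "can you pick up medicine for someone else at the pharmacy".split()
masked, targets = mask_words(sentence)
```

Training on millions of such fill-in-the-blank examples is what forces the model to use the surrounding context of each blank.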

------
nailer
> In one example Google discussed at a briefing with journalists yesterday,
> its search algorithm was able to parse the meaning of the following phrase:
> “Can you get medicine for someone pharmacy?”

Does that phrase have meaning?

~~~
haggy
It's not correct grammar, but it has meaning. "Can I pick up medicine for
someone else at the pharmacy?" is the proper way to ask, but that's not how a
lot of people end up searching (i.e., with correct grammar).

