No, because they already have LaMDA, which is the same thing: a language model for dialogue.
Google could implement that into Google today, but the reason they don't is that Google is a $200B/year business. You don't want to mess around with a business of that size for the fun of it.
We will probably see it in Google Assistant before they add it to Google.
OpenAI doesn't have this worry, so they can put their model out to the public, and if anything goes bad they don't lose anything. Google does.
That was my thought process too, particularly the business part. I find it hard to believe Google would want to give that up so easily. Will be interesting to see how it unfolds.
I think that could explain exactly why it's a threat to Google (thus yes to deep waters). Google likely has the tech/ability, but they have an existing business to protect which is a strong disincentive to experiment. It creates a strong opportunity for the upstart trying to carve out new territory.
Wonder if Google is skunkworking something like this at arm's length as a backup plan?
Also, some Google employees have mentioned on HN that Google has been analyzing similar things and that the compute is currently too expensive to make it profitable.
Sounds very monetizable. Any question that can be solved with a product... replace that with an affiliate link. Maybe only a fifth of the queries will be monetized, but the ones that are will be so specific/targeted that the ROI/click-through will be great.