Hacker News
Gemini Postponed, "in some respects" as good as GPT-4 (reddit.com)
32 points by behnamoh 5 months ago | 24 comments



“In some respects” is such a weak qualifier that it could be said about anything.

In some respects, this comment is like Pulitzer-winning writing. (Pulitzer writing is in English, too)

In some respects, I am as good as George Clooney. (I am as good a software engineer in my niche, if not better)

In some respects, $1 is worth far more than $10. (Earning the first $1 in business is a bigger milestone than earning $10)

“To the best of my ability” is another funny one. To the best of my ability, I am the rubberhose variant of Mickey Mouse. To the best of its ability, this comment is Diet Coke.

Such statements are completely meaningless, but strictly speaking, unfalsifiable.


In some respects, Phind's model is as good as GPT-4 at coding. (or so they claim)


Strictly speaking, nothing is unfalsifiable once you actually define things.


You should repost this as a link to https://www.theinformation.com/articles/google-postpones-big... with the real title "Google Postpones Big AI Launch as OpenAI Zooms Ahead"

I'd do it myself but don't want to "steal karma" and I see you're active in the comments here (as of 12 mins ago..)


It's only delayed a month; I could believe it's about making sure it's polished.

More generally though, the fact that matching GPT-4 is the marketing blurb as opposed to a GPT-3 -> GPT-4 level transition has me wondering: are we seeing the limits of current "just scale the data" approaches to ML? Like, we've basically scraped everything worth scraping off the web, and when you do that with any of our current techniques, it lands you somewhere around GPT-4?

I don't have the link anymore, but I saw a quote from a LinkedIn connection recently at one of the FAANGs about how data matters so much more than architecture for ML today. To paraphrase, it went something like "it turns out that when your data is large enough, architecture doesn't matter. GANs converge to about the same thing as diffusion models. Convnets converge to about the same thing as ViTs. Data size and quality matter a lot more".


> has me wondering: are we seeing the limits of current "just scale the data" approaches to ML?

This was my thought too. You'd think Gemini would set higher goals (higher than GPT-4) given that it will come out one year after GPT-4, but no, they are just achieving GPT-4-level performance in some respects.

Maybe this says less about Google's lack of AI capabilities and more about OpenAI's first-mover advantage that made them pour billions into what was (like you said) the asymptote of AI intelligence using current arch.


The bulk of the GPT-4 training was concluded back in 2021, before the billions of dollars thrown at them by Microsoft.

It's notable that the scale, design, and architecture of GPT-4 is a closely guarded trade secret of "Open"AI. This strongly hints at some critical discovery, some new technique, or some aspect dialed up to 11.

Whatever it is, the competition hasn't figured it out, or managed to copy it.


> pour billions into what was (like you said) the asymptote of AI intelligence using current arch.

What a wild change in fortune that would be.

OpenAI/Microsoft looked to be forever ahead, but maybe this is the break everyone else needs to catch up.

I'll be really happy if we wind up with more companies owning different parts of the market rather than just FAANG at the top. Or more competition in general.


Apparently that is correct; this did the rounds recently, with Bill Gates suggesting current GPT has hit a plateau:

https://techstory.in/bill-gates-5-cents-on-generative-ai-gpt...


Why does Gates get credibility? He hasn't programmed for decades. His relevant business decisions were 2+ decades ago.

He's a proponent of a plant-based diet - looks very unhealthy.

He's the Oprah Winfrey of technology.


Gates is still one of the major stakeholders of MSFT. Combined with his previous position in MSFT, he probably has a good level of access to its long-term strategy. Predictions of how OpenAI's models will evolve are pretty essential information there, no?


Because he tends to speak about topics on which he is very well informed, like vaccines, climate change and malaria.


He seems to get access to these models way before anyone else. They showed him GPT-4 in August 2022!


The same reason Elon Musk does, they have billion-dollar think tanks advising them.


It turns out that machines trained on literature are only half intelligent, much as human linguists may seem deep at first but prove philosophically barren.


Just link directly to the article. No one wants to be exposed to Reddit comments.

On the article itself, yeah, I don't believe a word from Google. They've clearly delayed it because it's going to be nearly as embarrassing for them as Bard has been. We know it's not because it doesn't handle some non-English queries, because Google has never been shy about releasing products to only some regions or in some languages before.


>Just link directly to the article. No one wants to be exposed to Reddit comments

Humility is a virtue


Google are a formidable force in AI. OpenAI will have to continue to innovate at speed to stay ahead. Q* could be what they need to do just that, hopefully it works out. The more competition in AI the better.


Comments like this make me think the bitter lesson has not been internalized.


I hope they upgrade Duet AI to use this model then, because the current models in Google Workspace are quite lackluster. On the positive side, the UX is integrated quite smoothly.


Doesn't sound good, and it translates to: it sucks.


I can also create a model that only outputs the best response to “what is 1+1?” and still say that in some respects it is as good as GPT-4.
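To make the point concrete, here's a minimal sketch of such a degenerate "model" (all names are made up for illustration, not from any real library): it matches GPT-4 on exactly one prompt, which is enough to satisfy "in some respects as good as GPT-4".

```python
def toy_model(prompt: str) -> str:
    """A 'model' that handles exactly one prompt perfectly and punts on
    everything else. On that one prompt it is indistinguishable from a
    frontier model, hence 'in some respects as good as GPT-4'."""
    if prompt.strip().lower() == "what is 1+1?":
        return "2"
    return "I don't know."

print(toy_model("what is 1+1?"))      # -> 2
print(toy_model("write me a poem"))   # -> I don't know.
```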


For a moment I was confused at the headline about what AI has to do with the Gemini protocol.


A GPT-4-level LLM plus a planning algorithm will lead to AGI.



