
Uh, what I'm saying is that all of that could've begun in 2014, not just now. Clearly they're playing catch-up. Before OpenAI, what language models did they release, exactly?

How does Segment Anything play into their metaverse strategy, exactly?

It's very clear that they're releasing this stuff to make it known that they're not going to fall behind Microsoft (OpenAI) and Google. It clearly was not part of their original strategy around the metaverse, which is why it's all coming out now.



https://ai.facebook.com/blog/roberta-an-optimized-method-for...

As just one example: RoBERTa was (is?) one of the most widely used transformer variants from the BERT era.
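
For what it's worth, RoBERTa is still trivially usable today. Here's a minimal sketch, assuming the Hugging Face transformers library and the public roberta-base checkpoint; the input sentence and variable names are illustrative, not from Meta's release.

    # Minimal sketch: load the public roberta-base checkpoint via Hugging Face
    # transformers (assumed installed); names here are illustrative only.
    import torch
    from transformers import RobertaTokenizer, RobertaModel

    tokenizer = RobertaTokenizer.from_pretrained("roberta-base")
    model = RobertaModel.from_pretrained("roberta-base")

    inputs = tokenizer("RoBERTa came out of Facebook AI in 2019.", return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)

    # Contextual embeddings for each token, shape (1, seq_len, 768)
    print(outputs.last_hidden_state.shape)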


I'm not talking about papers, I'm talking about a product offering, à la ChatGPT. Even Segment Anything isn't a real product; it's a glorified tech demo. Is it integrated into, say, Facebook Photos? No.
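
To illustrate the "research release, not product" point: the way you actually consume Segment Anything today is by downloading a checkpoint and driving it from Python yourself. A minimal sketch, assuming the segment-anything package and a locally downloaded ViT-H checkpoint; the image and point prompt below are placeholders.

    # Minimal sketch of the segment-anything research API (assumes the
    # segment-anything package and a downloaded ViT-H checkpoint file).
    import numpy as np
    from segment_anything import SamPredictor, sam_model_registry

    sam = sam_model_registry["vit_h"](checkpoint="sam_vit_h_4b8939.pth")
    predictor = SamPredictor(sam)

    # image: an HxWx3 uint8 RGB array loaded however you like (placeholder here)
    image = np.zeros((480, 640, 3), dtype=np.uint8)
    predictor.set_image(image)

    # Prompt with a single foreground point; returns candidate masks and scores
    masks, scores, _ = predictor.predict(
        point_coords=np.array([[320, 240]]),
        point_labels=np.array([1]),
        multimask_output=True,
    )
    print(masks.shape, scores)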


idk about LLMs, but pre-GPT-3 they had released multimodal wav2vec, some variations of BERT, TransCoder, and a lot more stuff after that. Whether that's because of GPT or because of general advances in AI and GPU power, we can't say.
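
As a concrete example of one of those releases, here's a minimal sketch of wav2vec 2.0 speech recognition using the public facebook/wav2vec2-base-960h checkpoint via Hugging Face transformers; the audio array is a placeholder and assumed to be 16 kHz mono.

    # Minimal sketch: speech-to-text with the public wav2vec 2.0 checkpoint
    # (assumes Hugging Face transformers; audio is a placeholder 16 kHz array).
    import numpy as np
    import torch
    from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC

    processor = Wav2Vec2Processor.from_pretrained("facebook/wav2vec2-base-960h")
    model = Wav2Vec2ForCTC.from_pretrained("facebook/wav2vec2-base-960h")

    audio = np.zeros(16000, dtype=np.float32)  # one second of silence as a stand-in
    inputs = processor(audio, sampling_rate=16000, return_tensors="pt")

    with torch.no_grad():
        logits = model(inputs.input_values).logits

    predicted_ids = torch.argmax(logits, dim=-1)
    print(processor.batch_decode(predicted_ids))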



