
It could have been strategy instead of insanity. By starting conversations at $7T you anchor high and potentially drive a greater outcome than starting an order of magnitude lower.



That strategy only works if the anchor is in the realm of reality. If I'm selling a 20-year-old Toyota Corolla and initially ask for $900,000, that's not going to help me get a higher price.


Anchoring bias probably works with numbers outside the realm of reality as well. But I doubt it's very useful in these cases (or even any stronger than asking for, say, $300B); otherwise everyone would always start a funding round by asking for a $7T investment, right?


In a negotiation where you have experts with good intuition for costs and practicalities on one side of the table and a non-technical, failing-upwards megalomaniac kid on the other, I doubt that's a sensible strategy.


It's also probably less effective when the other side has its own teams of accountants making projections.


There are millions of Corollas; there's only one OpenAI.


It's become a running theme that there are in fact a lot of OpenAIs, given how frequently their models get matched or leapfrogged by competitors.


Not really that many fabs either. If they build at that scale, they just need the AI to generate enough demand to fill them lol


Usually when anchoring, you want to end up at a result less than what you originally asked for but, crucially, more than zero. OpenAI immediately folded on building any fabs whatsoever, so I don't think it worked.


When you do something that stupid - starting at $7 trillion - you end the conversation before it really begins, because you lose all credibility with the people who matter (e.g., TSMC and other investors).

If he had said $250 billion and six fabs, it would have been a lot to ask, but people wouldn't think he was ignorant or irrational for saying it. Big tech, for example, has that kind of money to throw around, spread out across a decade, if the investment is a truly great opportunity.


Asking for $7T is - seriously - only slightly more absurd than asking for infinity dollars.


I guess he thinks his glorified Markov chain will lead to ASI if scaled up sufficiently. Even if we get ASI, the idea that anybody will ever make any money from it is delusional. This isn't going to be your average brainwashed peasant; crushing these capitalist pigs is probably the first thing it's gonna do.


This comment honestly feels delusional.


I guess a valid criticism is of the idea that ASI would care about the ideologies of literal ants, or that it would be so alien that it doesn't even make sense to anthropomorphize it like that. But I suspect the part you found delusional was either the criticism of capitalism or describing LLMs as glorified Markov chains (which, technically, they are; look it up).
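For anyone curious about the Markov-chain framing: with a fixed context window, the next token depends only on the current window contents, which is exactly the Markov property over "context" states. A toy sketch of that idea (my own illustration with a made-up vocabulary and a fake stand-in for the model, not code from any actual LLM):

```python
# Toy illustration (not an actual LLM): a fixed-context autoregressive
# sampler is formally a Markov chain whose state is the context window.
# The next token depends only on that state, never on anything older.

import random

CONTEXT_LEN = 3  # toy context window; real models just use a much larger k


def next_token_distribution(state):
    """Stand-in for a language model: maps a context-window state to a
    probability distribution over next tokens (here just a deterministic
    toy distribution derived from the state)."""
    seed = sum(ord(ch) for tok in state for ch in tok)
    rng = random.Random(seed)
    vocab = ["a", "b", "c", "<eos>"]
    weights = [rng.random() for _ in vocab]
    total = sum(weights)
    return {tok: w / total for tok, w in zip(vocab, weights)}


def sample(prompt, max_new_tokens=10, seed=0):
    rng = random.Random(seed)
    tokens = list(prompt)
    for _ in range(max_new_tokens):
        state = tuple(tokens[-CONTEXT_LEN:])   # Markov state = last k tokens
        dist = next_token_distribution(state)  # depends only on the state
        nxt = rng.choices(list(dist), weights=list(dist.values()))[0]
        if nxt == "<eos>":
            break
        tokens.append(nxt)
    return tokens


print(sample(["a", "b"]))
```

Whether "glorified" is fair is a separate argument, but the Markov structure itself is just a consequence of the bounded context.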


The technical term for this is “zone of possible agreement”. If your opening is outside the counterparty’s ZOPA, then they’re likely to disengage/exit.



