Hacker News | letitgo12345's comments

I think more is made of this asterisk than necessary. It's quite possible that adding 10x more GPUs would have allowed it to solve it within the time limit.

Very plausible, but that would also be noteworthy. As I've mentioned in other comments here, as far as I know, those of us outside DeepMind don't know anything about the computing power required to run AlphaProof, and the tradeoff between compute requirements and the complexity of problems it can handle is key to understanding how useful it might be.

They found tons of oil

Maybe it is, but it's not the only company that is.


It's an excellent way of getting your best people, who have options, to quit, while the worst ones, who don't, are forced to stick around.


Those "best people" may also be some of the highest paid -- so if you only care about short-term results, that may be more of a feature than a bug.


Does Amazon do anything that requires the best people anymore? They could probably survive on the decidedly mediocre.


This site appears to have been set up by people unaffiliated with OpenAI, and it makes incorrect claims -- e.g., o1 doesn't solve 83% of IMO problems; it solves 83% of AIME problems, which are significantly easier.


One question is how specific the binding is -- what's the level of off-target effects, etc.


This might be the largest seed round in history (note that the $1B is the cash raised, not the valuation). You think that's an indication of the hype dissipating?


At the height of the Japanese economy in the 80s, the roughly two square miles of land on which the Imperial Palace stood were worth more than all the property in California. Clearly a brilliant moment to get into Japanese real estate!


TensorFlow losing has nothing to do with Google getting bored -- it's the other way around.

TensorFlow is a symbolic framework, which most people find less intuitive to work with than PyTorch. Not to mention the errors TensorFlow generates are more annoying to debug (again, more an issue with the fact that it's symbolic than any lack of effort on Google's part).

Google tried to fix it by introducing an eager mode in TensorFlow, but by then it was too late.
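To illustrate the symbolic-vs-eager distinction being discussed, here is a toy sketch in plain Python -- not real TensorFlow or PyTorch code. In a symbolic framework you first build a graph of deferred operations and only execute it later; in an eager framework each line computes immediately.

```python
# Toy illustration of symbolic (deferred) vs. eager execution.
# This is NOT actual TensorFlow/PyTorch code -- just the core idea.

class Node:
    """A deferred operation in a symbolic graph."""
    def __init__(self, fn, *inputs):
        self.fn = fn
        self.inputs = inputs

    def run(self):
        # Nothing computes until run() walks the graph.
        return self.fn(*(i.run() for i in self.inputs))

class Const(Node):
    """A leaf node holding a plain value."""
    def __init__(self, value):
        self.value = value

    def run(self):
        return self.value

# Symbolic style (TF 1.x-like): build the graph first, execute later.
graph = Node(lambda a, b: a + b, Const(2), Const(3))
symbolic_result = graph.run()  # value only exists once the "session" runs

# Eager style (PyTorch-like): each operation runs immediately,
# so intermediate values are inspectable right away.
eager_result = 2 + 3
```

The practical upshot is that in the eager style every intermediate value can be printed or inspected with an ordinary debugger, which is a large part of why it felt more natural to most users.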


> Google tried to fix it by introducing an eager mode in TensorFlow, but by then it was too late.

And the fix was "new major version with a fundamentally different programming paradigm"

But it turns out when your users are irritated with your product, and you tell them to change to a fundamentally different programming paradigm, the new programming paradigm they change to might not be yours.


Intuitiveness has nothing to do with it. Developers tend to prefer things that make their lives easier, and debuggability is a huge part of that. TF 1.0 having a static execution graph was a major pain. No wonder people switched to PyTorch and didn't look back.
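To make the debuggability point concrete, here is a toy sketch in plain Python (again, not real framework code): with a deferred graph, a bad operation only fails when the graph is finally executed, far from the line that built it, whereas an eager call fails immediately at the offending line.

```python
# Toy sketch: where errors surface in deferred vs. eager execution.

def build_deferred_div(a, b):
    # Graph construction succeeds even for a bad operation ...
    return lambda: a / b

bad_node = build_deferred_div(1, 0)  # no error here -- nothing has run yet

# ... the failure only appears later, at "run" time, with a traceback
# pointing at the execution call rather than the line that built the graph.
try:
    bad_node()
    deferred_failed_at = None
except ZeroDivisionError:
    deferred_failed_at = "execution time"

# Eager style: the same bug raises immediately at the offending line,
# so the traceback points straight at the mistake.
try:
    eager_value = 1 / 0
    eager_failed_at = None
except ZeroDivisionError:
    eager_failed_at = "the line with the bug"
```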


Feels like the next generation of models could truly start replacing lower-level ML and software engineers.


It looks like her argument is that the code of conduct is not legally enforceable, and that Amazon itself has argued it cannot be used by employees to sue Amazon. Hard for me to feel sorry for Amazon here, despite Amazon seemingly being morally in the right in this case.

