
Boltzmann machines and associative memories originate in physics.

But the starting point of neural networks in the ML/AI sense is cybernetics plus Rosenblatt's perceptron, research done by mathematicians (who became early computer scientists).

This is a prize in physics, not a prize in neural networks. The starting point of Hopfield's and Hinton's work on recurrent networks was a physics analogy.

Neural networks and physical systems with emergent collective computational abilities https://www.ncbi.nlm.nih.gov/pmc/articles/PMC346238/


Their work does not advance the field of physics in any way, unless you insist on extending physics to each and every discipline out there.

That's why I wrote that it was unexpected. I'm not taking a position on whether this was deserved or undeserved, but this was clearly in the realm of physics and inspired by it.

Accepting wrong arguments in support of positions you hold is not a good way to live your life. It leads to constipation.


Most commenters here don't know that Boltzmann machines and associative memories existed in condensed matter physics long before they were used in cognitive science or AI.

The Sherrington–Kirkpatrick spin-glass model is a Hopfield network with random couplings (weights).

A Boltzmann machine is a Sherrington–Kirkpatrick model with an external field.
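
To make the correspondence concrete, here is a minimal numpy sketch (my own toy code, not the original formulations): all three models share the same quadratic energy over ±1 spins, and differ only in where the couplings and field come from.

    import numpy as np

    rng = np.random.default_rng(0)
    N = 100
    s = rng.choice([-1, 1], size=N)                # spin / neuron states

    def energy(s, J, h):
        # Shared quadratic form: E(s) = -1/2 * s.J.s - h.s
        return -0.5 * s @ J @ s - h @ s

    # Sherrington-Kirkpatrick: i.i.d. Gaussian random couplings, no field
    J_sk = rng.normal(size=(N, N)) / np.sqrt(N)
    J_sk = (J_sk + J_sk.T) / 2
    np.fill_diagonal(J_sk, 0)

    # Hopfield: Hebbian couplings built from stored patterns (memories)
    patterns = rng.choice([-1, 1], size=(5, N))
    J_hop = patterns.T @ patterns / N
    np.fill_diagonal(J_hop, 0)

    # Boltzmann machine: same quadratic form plus an external field (biases);
    # states are sampled with probability ~ exp(-E) rather than minimized
    h = rng.normal(size=N)

    print(energy(s, J_sk, np.zeros(N)),   # SK
          energy(s, J_hop, np.zeros(N)),  # Hopfield
          energy(s, J_sk, h))             # Boltzmann machine (random couplings)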

This is a prize in physics given for the novel use of stochastic spin-glass modelling. Unexpected, but saying this is not physics is not correct.


I'm a condensed matter/statistical physicist and am very aware of the connections to statistical physics, and I still think that the committee has completely lost it with this choice. There is a sharp line for me between things that are inspired by physics and things that are physics (and I really don't buy that physics is anything physicists do) -- and this clearly falls on the "inspired by" side.

I know plenty of physicists who would be very pissed off by all this drama.

There is so much more than fundamental physics, and there is much more to physics than breakthrough discoveries. Medical physics, just to name a fun field everybody always forgets about, has been studying and using neural networks for about forty years. Applied physics, biophysics, atmospheric physics. Even particle physics is mostly data science these days.

This idea that physics should only be about fundamental theories and discoveries is really detrimental to the field and leads to the false idea of stagnation that permeates this whole thread.


So if they used a genetic algorithm, they could have got the prize for biology?

There is no Nobel Prize for biology.

I see lots of potential here.

Agree completely, being in this field.

However, it is weird for the committee to give a prize for theoretical physics without an experiment. It is doubly weird given that they already made this "mistake" in 2021 with Parisi, who was the odd one out among the geophysicists, and are now giving another prize in spin glass/stat phys... why?


In summary, it's definitely related to physics, but kind of weird choice.

Why didn't David Sherrington and Scott Kirkpatrick share the prize for the Sherrington–Kirkpatrick model? Hopfield references their work.

Multiple theoretical physicists working on black holes (Hawking and others) didn't get a Nobel, because black holes were not confirmed or the theory could not be tested.


The methods may be inspired by physics, but they have made no contribution to understanding physical laws or phenomena.

It's mathematical/CS work. The connection to actual physical laws or phenomena is even more tenuous than it was for the exoplanet prize a few years ago.

The Nobel prize physics committee has made itself a joke, and probably destroyed the credibility of the prize.


> and probably destroyed the credibility of the prize.

From now on I'll always see it as just another Nobel Peace Prize.

This is beyond ridiculous.


> but they have made no contribution to understanding physical laws or phenomena.

Neural networks are used in tons of data pipelines for physics experiments, most notably with particle accelerators.

The Nobel Prize is also occasionally awarded to engineers who develop tools that are important parts of experiments. The 2018 prize, for example, was awarded for chirped pulse amplification, which is probably best known for its use in LASIK eye surgery, but it is also used in experimental pipelines.


> Neural networks are used in tons of data pipelines for physics experiments

With this argument you could even say Bill Gates should get an award for inventing Windows and popularizing the desktop computer... Or at least Linus Torvalds, since those pipelines are probably running Linux...


No you couldn't. Windows doesn't have any bearing on outcomes, whereas machine learning methods directly impact the data and probability inference.

Yeah, well, those pipelines are running on HPC clusters that use Linux. Particle physicists kind of hate Windows.

The techniques highlighted in this prize are not really that useful for deep learning.

You mean besides bringing it into existence at all?

Please explain how the Hopfield network influenced modern deep learning models based on supervised differentiable training. All the "impactful" architectures, such as the MLP, CNN, and attention, come from a completely different paradigm, one that is more straightforwardly connected to optimization theory.

They did not bring it into existence. The MLP is older than the Hopfield network. The invention that made it practical was backpropagation, which wasn't used here at all.
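
For contrast, a minimal Hopfield network in numpy (my own toy sketch): the weights are set in closed form by a Hebbian rule and recall is done by sign updates. There is no loss, no gradient, no backpropagation anywhere.

    import numpy as np

    rng = np.random.default_rng(1)
    N = 64
    patterns = rng.choice([-1, 1], size=(3, N))    # memories to store

    # "Training" is one closed-form Hebbian step, not gradient descent
    W = patterns.T @ patterns / N
    np.fill_diagonal(W, 0)

    # Recall: start from a corrupted memory and relax to a stored one
    state = patterns[0].copy()
    state[:16] *= -1                               # flip 16 bits of noise
    for _ in range(5):                             # asynchronous sign updates
        for i in rng.permutation(N):
            state[i] = 1 if W[i] @ state >= 0 else -1

    print("recovered:", np.array_equal(state, patterns[0]))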

Also, double descent was already discovered by physicists in the '80s and '90s.
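
For readers unfamiliar with the term, a minimal toy demonstration (my own sketch, not from that literature): min-norm least squares on fixed random ReLU features. Test error typically peaks near the interpolation threshold p = n and falls again beyond it; exact numbers vary with the seed.

    import numpy as np

    rng = np.random.default_rng(0)
    n, d = 50, 5                                    # training samples, input dim
    X, Xt = rng.normal(size=(n, d)), rng.normal(size=(1000, d))
    w = rng.normal(size=d)
    y = X @ w + 0.5 * rng.normal(size=n)            # noisy training targets
    yt = Xt @ w                                     # clean test targets

    for p in [10, 25, 45, 50, 55, 100, 400]:        # number of random features
        V = rng.normal(size=(d, p)) / np.sqrt(d)    # fixed random ReLU features
        F, Ft = np.maximum(X @ V, 0), np.maximum(Xt @ V, 0)
        beta, *_ = np.linalg.lstsq(F, y, rcond=None)  # min-norm least squares
        print(p, float(np.mean((Ft @ beta - yt) ** 2)))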

I'm curious, what's the context for this?

Your experience is against insurgency and irregular forces. A near-peer enemy with better weapons is a different case.

The Kosovo war, for example. The US deployed two battalions of AH-64s with Task Force Hawk into Albania, but never sent them over the border. Their task would have been exactly what they were designed for, but the risk-reward ratio was too high. The Serbs would have been able to inflict massive casualties on them.

In most cases, combining drones for targeting with field or rocket artillery achieves the same effect.


My experience is not just against insurgents. I fought uniformed Iraqi military, and some black pajamas guys later. I was there in 2003.

The conflict in Kosovo was nothing like the ROE in Iraq, and if you’re on the ground calling for fire you damn sure would know the difference between arty and us.



See my comment https://news.ycombinator.com/reply?id=41774909&goto=item%3Fi...

This is microeconomics 101. Not very controversial.


The basic point of the article is correct, despite everyone here in the comments coming up with ad hoc arguments against it.

This is basic microeconomics 101 (pick up any undergraduate economics text and look it up). There is also a whole subfield of economics called industrial organization that deals with this stuff.

Firm size matters for productivity. Larger firms are on average more productive than smaller ones. Partly this is because of gains from increasing returns to scale, but better access to resources, organizational capabilities, and international reach also matter. Large companies tend to offer higher compensation; the average pay per employee increases with company size. This is good for the economy.
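
A toy numeric illustration of increasing returns to scale (my own example, not from the article): a Cobb-Douglas production function whose exponents sum to more than one, so doubling all inputs more than doubles output.

    # Cobb-Douglas production with exponents summing to 1.2 > 1:
    # doubling every input more than doubles output (increasing returns)
    def output(capital, labor, a=0.6, b=0.6):
        return capital**a * labor**b

    small = output(10.0, 10.0)
    doubled = output(20.0, 20.0)
    print(doubled / small)   # 2**1.2 ~ 2.30 > 2, productivity rises with scale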

Take for example Greece. People in Northern Europe like to think that Greeks are poor because they are lazy. However, they are among the hardest-working people in the EU, with insane hours on average. But Greece has no large-scale industry. It can't compete with the rest of the EU or internationally.


> This is basic microeconomics 101

Interesting. I've only read one intro to microeconomics book, but I remember it having a hand-waved graph with a clear peak at some unspecified point, and an explanation that the peak's position depends on a lot of factors.


> Larger firms are on average more productive than smaller ones

Do you have any references for this that demonstrate it empirically? Theoretically, larger firms have economies of scale, but they also run into the same internal coordination/incentive problems that communist countries do, due to internal resource allocation being driven by internal politics rather than a market. I.e. command economies (and the average corporation is a command economy internally) face diseconomies of scale.


This is super interesting. So many interesting details once he gets going.

There's a part 2 from his VMware days:

https://www.youtube.com/watch?v=MxZe1i8z-8Y


If the car model has a CoC number, it has been type-approved. The EU keeps an approval register. The CoC is not limited to the EU; it works in every European country.

Tesla has announced that they are in the process of making a Cybertruck version that complies with EU regulations, but the Cybertruck is hard to adapt. The lack of crumple zones is a big issue.

Getting it approved as a commercial vehicle should not be a problem, but nobody wants that. Not Tesla, and not the consumers.


The Cybertruck's 3 mm stainless steel panels are an insurance problem. Stainless steel is hard to repair; normal PDR (paintless dent repair) does not work. The cost of repair for even the most minor accidents is very high. In many cases a replacement panel is the cheapest option.

Tesla’s supply chain does not keep up with demand for repair parts.


IIRC, it is also precipitation-hardened stainless, so a weld repair would anneal it, cutting the affected area's strength in half.

Gelman has contributed to Bayesian statistics and hierarchical models, and Stan is great, but that's not even close to what Rubin has done.

ps. Gelman was Rubin's doctoral student.

