
Everyone here seems totally lost on the physics connection. Suppose you have a box of atoms, where each atom can be in one of two states, a low energy E1 and a high energy E2. If the box has a temperature T, then the probability that any atom is in state E1 is e^(-E1/kT) / [e^(-E1/kT) + e^(-E2/kT)], and similarly for E2. As you lower the temperature, most of the atoms gravitate towards the lower energy state E1, and as you raise the temperature they gravitate towards a 50/50 mix of E1 and E2.
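
Concretely (a quick sketch in Python, with units chosen so k = 1):

    import math

    def two_state_probs(E1, E2, T, k=1.0):
        # Boltzmann weights for each state
        w1 = math.exp(-E1 / (k * T))
        w2 = math.exp(-E2 / (k * T))
        Z = w1 + w2  # partition function
        return w1 / Z, w2 / Z

    # Low T: nearly everything sits in the low-energy state E1.
    # High T: the populations approach a 50/50 mix.
    for T in (0.1, 1.0, 10.0, 100.0):
        p1, p2 = two_state_probs(E1=0.0, E2=1.0, T=T)
        print(f"T={T:6.1f}  p(E1)={p1:.3f}  p(E2)={p2:.3f}")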


Since you bring up physics, this has a name: the Maxwell-Boltzmann distribution. You might have trouble getting physicists to describe particles as “gravitating” to it, though — the particles are doing their own thing, and this is the resulting probability distribution.

But this is only for distinguishable particles. If you have a bunch of indistinguishable particles, you get the Fermi-Dirac distribution or the Bose-Einstein distribution, depending on whether they are fermions or bosons.
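
For reference, here are the three occupancy formulas side by side (a sketch, again taking k = 1; mu is the chemical potential):

    import math

    def maxwell_boltzmann(E, mu, T):
        return math.exp(-(E - mu) / T)

    def fermi_dirac(E, mu, T):
        return 1.0 / (math.exp((E - mu) / T) + 1.0)

    def bose_einstein(E, mu, T):
        return 1.0 / (math.exp((E - mu) / T) - 1.0)  # only valid for E > mu

    for f in (maxwell_boltzmann, fermi_dirac, bose_einstein):
        print(f.__name__, f(E=1.0, mu=0.0, T=1.0))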

You can find all of these distributions on Wikipedia.


Fun fact: not only do you get closer to a 50/50 mix of E1 and E2 (by increasing the temperature), but if you continue to add energy to the box you will find yourself with more atoms in the E2 state than in the E1 state.

The temperature will jump from +infinity to -infinity, and as you keep adding energy you approach zero temperature from the left (still increasing the temperature). Zero is reached when the energy of the system can no longer be increased, with all the atoms in the E2 state.
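
You can see the sign flip by inverting the Boltzmann ratio N2/N1 = e^(-(E2-E1)/kT) for T (a sketch with k = 1 and level spacing 1):

    import math

    def implied_temperature(n1, n2, gap=1.0, k=1.0):
        # Solve n2/n1 = exp(-gap / (k*T)) for T
        return gap / (k * math.log(n1 / n2))

    # As the populations invert, T blows up to +infinity, reappears at
    # -infinity, then climbs toward zero from the negative side.
    for n1, n2 in [(0.9, 0.1), (0.501, 0.499), (0.499, 0.501), (0.1, 0.9)]:
        print(f"n1={n1}  n2={n2}  T={implied_temperature(n1, n2):+.2f}")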


How are you defining temperature? I assume it's not the average kinetic energy of the particles. Is it that definition I learned once upon a time, where 1/T = d(entropy)/d(energy)? Is this a useful definition of temperature if it leads to this scenario?


Yes, this is in accordance with defining temperature via 1/T = d(entropy)/d(energy), which is the more generally applicable definition. It's easy to see from this that temperature becomes negative if adding more energy to the system causes the entropy to decrease. In most systems that doesn't happen, because there is always an increasing number of higher-energy states opening up, but in a very carefully constrained system the added energy forces the system into a smaller number of excited states, away from the higher-entropy middle of its energy range. A real-world example of this is a laser's population inversion.

A system at finite negative temperature is actually considered "hotter" than a normal system at any positive finite temperature; if you put them in thermal contact, heat will flow in the direction that increases entropy, which you get by taking energy out of the negative-temperature inverted system and adding it into the ordinary positive-temperature system. This increases the entropy of both systems.

The definition where temperature is the "average kinetic energy of the particles" is a special case, and it only really works when that energy is evenly distributed over all degrees of freedom. For example, you wouldn't consider an icy comet to be at a high temperature just because it's moving quickly, even though its particles have a great deal of kinetic energy on average!


The two-state system was the first example I learned in undergrad stat mech, and really helped me understand the first-principles definitions of entropy, temperature, etc. My parent comment was more for the casual HN reader, but if you really dig into the two state system, negative temperatures aren't all that weird.

The first thing to understand is microstates - the number of ways a system can have a certain energy. Eg in a two-state system with ten particles and energies +/-(E/2), there's one microstate where the energy is -5E (all negative), ten states with -4E (one elevated), etc. Then entropy is just the log of the number of microstates, which is much easier to deal with, since microstates tend to behave exponentially. Eg entropy(-5E) is log(1) = 0, entropy(-4E) is log(10), etc.

Then temperature, like you said, comes from 1/T = d(entropy)/d(energy). Two systems with different temps brought into contact will exchange energy until the temps are equal, since that configuration maximizes entropy.

The two-state system can have negative temperatures, since entropy starts decreasing with energy once more than half the particles are in the higher-energy state. This can't happen in more familiar scenarios (eg ideal gas, blackbody, etc), since in those systems entropy always goes up with energy.
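
Concretely, the whole ten-particle example in a few lines of Python (a sketch with E = 1, k_B = 1, and a finite difference standing in for the derivative):

    import math
    from math import comb

    N = 10  # particles, each contributing -E/2 or +E/2

    # Entropy as a function of how many particles are excited
    S = [math.log(comb(N, k)) for k in range(N + 1)]

    # 1/T = dS/dE; each extra excitation adds one unit of energy
    for k in range(N):
        dS = S[k + 1] - S[k]
        T = math.inf if dS == 0 else 1.0 / dS
        print(f"{k} -> {k+1} excited: dS = {dS:+.3f}, T = {T:+.2f}")

Past the halfway point dS goes negative, and the temperature comes out negative with it.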


The most fun part of learning statistical mechanics was un-learning everything I had learned about temperature in high school!


Just because there is a physics connection doesn't make this a good name for the parameter.

I do understand the benefit of not having 10 different names for the same concept under different scenarios, however, even if that name isn't the best.

But note that, even in the physics scenario, temperature isn't really the name providing the most intuition either. You need to already be aware of the connection between higher temperature and higher excitability/mobility for it to make sense; and temperature isn't the only way to modify this underlying excitability in the first place.
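
For comparison, here's roughly how the sampling temperature knob usually enters a softmax (assuming that's the parameter under discussion - the 1/T sits exactly where it does in the Boltzmann factor):

    import math

    def softmax_with_temperature(logits, T):
        # Low T sharpens the distribution; high T flattens it
        weights = [math.exp(x / T) for x in logits]
        total = sum(weights)
        return [w / total for w in weights]

    logits = [2.0, 1.0, 0.5]
    print(softmax_with_temperature(logits, 0.1))   # nearly one-hot
    print(softmax_with_temperature(logits, 10.0))  # nearly uniform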


Also the novel rulesets make it so much more interesting than vanilla sudoku. Oftentimes the puzzles have really abstract/mathematical/beautiful break-ins. Definitely adds a lot of variety and replayability.


In general relativity space-time is a 4-dimensional (3 space + 1 time) surface that itself can expand, contract, or move around. It's perfectly reasonable in GR to have a distance be L at time t, then L+dL at time t+dt. No extra spatial dimensions needed.
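
A toy version of that statement, assuming the usual FLRW setup where proper distance is a scale factor a(t) times a fixed comoving coordinate:

    def proper_distance(x_comoving, a):
        # Proper distance = scale factor * comoving coordinate
        return a * x_comoving

    x = 100.0                # fixed comoving separation
    a_t, a_t_dt = 1.0, 1.01  # scale factor at t and at t + dt

    L = proper_distance(x, a_t)
    dL = proper_distance(x, a_t_dt) - L
    print(L, dL)  # the distance grew by dL without anything moving through space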


There was a point in time when everything in the visible universe was the size of a potato, but the entire universe (beyond the visible edge) is most likely just more matter in every direction, indefinitely.

Because the universe follows general relativity, space itself behaves like a fluid. When the universe "expands", that's just space itself growing bigger over time.


I would add the connection to GR. In a Newtonian world, there would be nowhere for matter to expand into, and our intuition would be correct. But in General Relativity, space itself can grow or shrink like a fluid.


Something that's always bothered me about many-worlds is that it implies the number of worlds should always be increasing exponentially. Which means, from an anthropic point of view, the most likely time to be born is as the last-ever human, with insanely exponential odds. But many people have been born in my lifetime, ergo many-worlds seems unlikely.


The many-worlds interpretation is badly named: the "worlds" are just particularly classical-looking regions of the wavefunction we've chosen to draw some lines around for our own convenience. They're part of the map, not the territory.

It's true that a decohering system will have exponentially more classical-ish parts over time - but the parts get smaller at the same rate, and the total size of the system remains the same.


This reasoning is fallacious. Assuming the theory, the number of worlds in which humanity doesn't exist or has already died off also increases exponentially, as does the number of worlds where you are not the last human. These exponential terms balance out, with the result that many-worlds cannot affect the probability of conditional events at all once you account for the anthropic bias.
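
A toy model of the cancellation (every number here is made up, just to show the ratio is invariant):

    # Suppose every world splits into B branches per tick, and at some
    # tick a fraction of observers are "the last human" while the rest
    # are not. Branching multiplies both counts by the same factor, so
    # the anthropic ratio never moves.
    B = 2
    last, not_last = 1, 9  # 10% vs 90% to start

    for _ in range(20):
        last *= B
        not_last *= B

    print(last / (last + not_last))  # still 0.1 after 20 doublings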


I've been curious to learn about theories where only a subset of scenarios beget alternative worlds, and many of these ultimately merge into more probable branches of reality, keeping the total number of worlds growing at less than O(2^n) with respect to time.


In the MWI there is an infinite number of worlds (as many as there are real numbers). So the total number of worlds grows at O(1) :)


Well, not quite. You'd actually have an infinity of worlds with the cardinality of real numbers. So it would never increase or decrease, in the same way as there are as many numbers between 0 and 1 as numbers between 0 and 2.


This kind of thing is exactly why you shouldn't take those sorts of anthropic argument too seriously.


Does that argument hold for all of us?


When I was 5 my parents put me in a "normal" kindergarten, where I failed to do literally anything - I wouldn't make any marks on any of the sheets, not even my name. They were worried I had some kind of developmental problem.

Somewhat out of desperation they put me in a Montessori school. By the end of Kindergarten I was reading at a 3rd grade level and understood the basics of multiplication.

We ended up moving by 1st grade and I went back to "normal school", where I did ok since I already knew everything.


I also knew everything in first grade, so I was bored and acted out, and eventually the teacher put a cardboard box on my desk so nobody could see me and I couldn't bother anyone. Some say this explains a lot ...


Yup, boredom makes for a teacher's nightmare. Spitballs, tweeters, paper airplanes, animal noises, the full litany.


I'm guessing he means there are no algorithms deciding what you see and what you don't - you just see everything from everyone you follow. But I don't use Tumblr, so I'm not 100% sure.


They still do. There are algorithms to fetch whatever will be shown on your feed. They may be as straightforward as fetching a user's follow list and displaying their posts chronologically in reverse. That's still an algorithm.
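
Something like this, conceptually (a sketch with made-up data shapes):

    def naive_feed(follows, posts_by_blog):
        # Gather every post from followed blogs, newest first
        posts = [p for blog in follows for p in posts_by_blog[blog]]
        return sorted(posts, key=lambda p: p["timestamp"], reverse=True)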

I'm sure Gaiman means specifically the kind of "smart" feed algorithm that FB etc employ. That said, I really dislike this trend of equating "algorithms" with something wicked. I write them for a living, and I'd rather we kept writing and employing them.


> They may be as straightforward as fetching a user's follow list and displaying their posts chronologically in reverse. That's still an algorithm.

Correct, it's absolutely an algorithm. Crucially, it involves extra steps to make it scalable without needing a materialized inbox. Or at least that's how Tumblr's reverse-chron dashboard worked from 2009ish through at least 2018, maybe still today.

There were a bunch of optimizations in there to keep the queries fast, and minimize the number of rows that need to be examined and sorted. One key step involved cross-referencing the blogs you follow against their latest post timestamp. For example, say you're fetching posts 11-20 on the reverse-chron feed. The worst-case is that posts 1-20 all come from different blogs, so you can sort the list of followed blogs by latest-post-timestamp, and then only examine/sort posts from the top 20. (That's a slight simplification; you actually need to look at the timestamp from post 10, and examine posts from however many followed blogs posted since then, plus 10 more. I spent months of my life tuning this stuff back in 2011-2012...)
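
In sketch form (all names hypothetical - this is just the shape of the simplified optimization described above, not Tumblr's actual code):

    def fetch_feed_page(followed_blogs, latest_post_ts, fetch_posts, offset, limit):
        # Worst case, every post on the page comes from a different blog,
        # so only the (offset + limit) most recently active followed blogs
        # can possibly contribute. Skip everyone else entirely.
        active = sorted(followed_blogs, key=lambda b: latest_post_ts[b], reverse=True)
        posts = []
        for blog in active[: offset + limit]:
            posts.extend(fetch_posts(blog))
        posts.sort(key=lambda p: p["timestamp"], reverse=True)
        return posts[offset : offset + limit]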

That all said, in the context of dashboard feeds, "algorithm" often just means the opposite of reverse-chron. But even then, the "no algorithms" statement is completely incorrect! Tumblr has a "Best Stuff First" setting which controls whether or not your feed is reverse-chron. For a while from late 2017 (iirc) onwards, this setting was even enabled by default... maybe still is?


I'm about 200 hours into a Space Exploration game with some friends, and we just started building spaceships! One thing I really like about this mod is that it forces you to properly learn the circuit network - I never really used it in the base game, but it's an absolute must for interplanetary logistics.


I took this exact class exactly one decade ago (Fall 2012) when it was taught by Jeff Foster! One of my all-time favorite classes. The main language used was OCaml, and we ended the course by writing a compiler for a made-up minimalistic OOP language.

Funny story: I mixed up the due dates for one of the projects and didn't realize the correct date until the day before. Somehow I managed to do the entire project in one sitting, working from about 3pm to midnight. I basically took over an unused room in the math building and scribbled all over the chalkboards until I internalized everything; then the coding wasn't too bad. I even remember ordering DP Dough about halfway through - those were the days, haha.

OCaml was one of those languages where I would stare at the problem for an hour, then realize the solution in a sudden epiphany, and it would only require a couple lines of code.


Got a favorite 'zone from DP Dough? I used to get the Fancy Zone a lot.


That overnight implementation is badass

