Kilogram conflict resolved at last (nature.com)
294 points by ColinWright on Oct 14, 2015 | hide | past | web | favorite | 131 comments



The best part about this batch of changes is that they push the mole and Avogadro's constant out on their own where they belong, not linked to any other units. Now we'll have only a single mass unit (kg) instead of the two (kg and unified atomic mass unit) that we have now. This will knock carbon-12 off its perch as the definition of the "other" mass unit that's been essential for using SI's mole but was never actually SI itself.


But wouldn't the most straightforward definition of a base mass unit (kg) be linked to the mass of an atom?


Yes. I believe Asbostos is commenting that they currently are not linked.

("Yes", with the proviso that the proposed silicon atom definition is a more obviously direct link than the watt balance.)


What I mean is, the proposed new definition defines the kilogram via a relationship to the Planck constant. Instead, I think a more intuitive definition would be something like "the mass of {huge number} of C or Si atoms".


The proposal is to define all fundamental physics constants, and derive all the units from them.

The Avogadro number isn't a fundamental physics constant; it's just a unit conversion constant, and thus didn't get to define any base unit.


> The Avogadro number isn't a fundamental physics constant

That's a pretty arbitrary assertion. Who defines what "fundamental" is? The number of periods of <something> of Caesium atoms used to define the duration of one second doesn't sound very fundamental either.


In fact, the second and candela choices aren't that fundamental, mostly reflecting how those are measured. All the others come from the speed of light, the fundamental charge, Planck's constant, and the gas constant; those are pretty fundamental to physics.

Avogadro's number is used to define the atomic units, which makes a lot more sense than pushing it into metric calculations.

The wikipedia page is great: https://en.wikipedia.org/wiki/Metric_system#Future_developme...


You seem to be assuming that the atom is fundamental. It's not. Why not base things off the mass of an electron instead?

What you want is to go down to the smallest unit possible, which is apparently Planck's constant.


It's an arbitrary physics constant.


The mass of a chunk of atoms depends on its temperature and purity, which are things you can measure less accurately the more atoms you have.

Essentially, that's what they were doing with Le Grand K. It's unstable because molecular properties are unstable in aggregate.


The second depends on a specific isotope at a specific temperature, 0K, so it's all theoretical anyways. How you define something doesn't need to coincide with how you measure it.


It's far easier to measure a single isotope at a specific temperature limit than it is a whole rod of them. Just because they're both theoretical doesn't mean there isn't a huge practicality component to the decision.


How exactly does mass depend on temperature?


E=mc^2

You may be thinking "Oh, you fool--E=mc^2 is for fission and fusion, and certainly doesn't apply to things like the energy in the heat of an object". But weirdly, it does. https://physics.stackexchange.com/questions/87259/does-decre...
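A quick back-of-the-envelope in Python (the specific heat of water, ~4186 J/(kg·K), is an approximate figure I'm assuming here):

```python
# Mass equivalent of the thermal energy added when heating water.
C_LIGHT = 299_792_458.0   # speed of light, m/s (exact by definition)
CP_WATER = 4186.0         # specific heat of water, J/(kg.K) (approximate)

def thermal_mass_gain(mass_kg: float, delta_t_k: float) -> float:
    """Mass increase (kg) from heating `mass_kg` of water by `delta_t_k` kelvin."""
    energy_j = mass_kg * CP_WATER * delta_t_k   # heat added
    return energy_j / C_LIGHT**2                # E = m c^2  =>  m = E / c^2

# Heating 1 kg of water from 0 to 100 C adds on the order of 5e-12 kg:
print(thermal_mass_gain(1.0, 100.0))
```

So the effect is real but about twelve orders of magnitude below the mass being heated.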


It also applies to things like gravitational potential energy, or velocity. Climb a flight of stairs, and you gain energy. Gaining potential energy really means that you've gained mass, and so you're heavier at the top of the stairs than at the bottom. The amount that you are heavier depends on the amount of energy you gained, converted into mass. Similarly, as you gain velocity, you gain kinetic energy, which also makes you heavier. If you wave your hand in front of your face, your hand's motion causes it to gain mass. Velocity also dilates time, so time passes ever so slightly slower than for the rest of your body.

This typically only matters in a relativistic context, and doesn't impact day to day life. The difference in passage of time can be measured by an atomic clock if you put it on a rocket into space, but is otherwise insignificant. It's unlikely any scale could measure your weight-gain from climbing stairs, since the amount of energy you gain is insignificant when converted into mass. You can compute the mass gain by solving for "m" in the formula E=m*c^2, so m=E/(c^2).

The speed of light c is a really big number, and so to compute the mass you gain, you're dividing the energy by c^2, which is a much bigger number. Thus a gain to kinetic or potential energy does not noticeably affect your mass in day to day situations. Conversely, if you can convert any meaningful part of your mass into energy, then it's an absolutely tremendous amount of energy: atomic weapons.
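To put numbers on it, here's a rough sketch (the 70 kg body, 3 m stair height and 1.4 m/s walking pace are made-up illustrative figures):

```python
# Mass equivalents of everyday potential and kinetic energy gains.
C = 299_792_458.0   # speed of light, m/s

def mass_from_energy(energy_j: float) -> float:
    return energy_j / C**2   # m = E / c^2

stairs = mass_from_energy(70 * 9.81 * 3.0)     # potential energy m*g*h
walking = mass_from_energy(0.5 * 70 * 1.4**2)  # kinetic energy (1/2) m v^2

print(f"climbing stairs: {stairs:.2e} kg")
print(f"walking pace:    {walking:.2e} kg")
```

Both come out around 10^-14 kg or less, far below what any scale can resolve.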


> Gaining potential energy really means that you've gained mass

I've never heard this before. Can you link to some further explanation? Intuitively, if anything, you'd lose mass, because you're in a place now where space is less curved than where you were before.


The system composed of you + planet gains mass. You cannot measure a different rest mass for yourself in either situation, so you'll probably attribute the extra mass to the planet (and the planet to you).

And yes, there's also some change due to changes in gravity. I'd expect that to be much smaller.

But IANAP, and not that good with relativity.


Mass is constant in this equation. It is absolutely incorrect to talk about a "rest" mass and a "relativistic" mass; only energy and momentum are relativistic, while mass is always a constant. Algebraically it may make a tiny bit of sense, but there is no physical meaning behind it.


Citation, please. Everything I've ever read says “E = mc2 applies to all processes that release or absorb energy.”


Basically, there are two schools of thought. An outdated one, which merges γ into m, and the current one (e.g., the Landau lineage) which leaves m alone. You won't find any "relativistic mass" in any decent source published since the famous theoretical minimum ( https://en.wikipedia.org/wiki/Course_of_Theoretical_Physics ).

There is a very good reason for this approach. In the simplest, most classic way of deriving special relativity from first principles, the 4-velocity is introduced before the 4-momentum. I.e., γ and m come from different places and are not connected in any way whatsoever.


"In general, relativistic and rest masses are equal only in systems which have no net momentum and the system center of mass is at rest". This is the case of a system being heated. I'm still led to believe that the mass is increased by the increased motion of the particles in that system. https://en.wikipedia.org/wiki/Mass_in_special_relativity
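A toy check of the invariant-mass formula m·c² = √(E² − (pc)²): two massless photons moving in opposite directions form a system with nonzero mass, which is the same mechanism by which a heated box gains mass (the photon energy below is an arbitrary illustrative value):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def invariant_mass(total_energy_j: float, total_momentum: float) -> float:
    # m = sqrt(E^2 - (p c)^2) / c^2; clamp tiny negatives from float rounding
    return math.sqrt(max(total_energy_j**2 - (total_momentum * C)**2, 0.0)) / C**2

eps = 1e-13  # energy of each photon, J (arbitrary)
single = invariant_mass(eps, eps / C)  # one photon: massless (up to rounding)
pair = invariant_mass(2 * eps, 0.0)    # opposite pair: p cancels, m = 2*eps/c^2
print(single, pair)
```

The individual parts are massless, yet the system is not; internal motion contributes to the system's invariant mass.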


Btw., even the Wikipedia article contains a nice explanation of what I'm talking about (and, yes, I'm of the Landau-Kapitsa-Okun school of thought, naturally): https://en.wikipedia.org/wiki/Mass_in_special_relativity#Con...


As I said, algebraically both approaches are equivalent, so there is not much harm in using this notion. But the "relativistic mass" does not have physical meaning and does not make any sense because of the way relativity theory is defined. And I would not count wikipedia as an authoritative source, anyway. 2nd volume of the Course is a bit more legit.


Measured mass is a function of total energy.


It is not a mass (or a "rest" mass, if you want to use this incorrect and unnecessary notion) that you measure.


That's what the new definition does, using Si(28) as the "reference atom". In the current definition the kilogram is not related to atomic mass.


No, the new definition defines kg using Planck constant (according to Wikipedia).


silicon-28 sphere(s?): https://www.youtube.com/watch?v=ZMByI4s-D-Y (yep, they let him palm it)

watt balance: https://www.youtube.com/watch?v=VlJSwb4i_uQ


Surprised that the cotton gloves or particles that might float onto them or be coughed or breathed onto them aren't abrasive.


I was amused to read this:

"They never found the cause for the disagreement, but in late 2014 the NIST team achieved a match with the other two"

at a time when this story, also from Nature, is also on the front page: https://news.ycombinator.com/item?id=10383984


I was a bit more concerned than amused.

Having worked with some folks who do these high-precision measurements, it's concerning that they never found the cause of the disagreement. Pinning down systematic error is really, really, really hard.

As Feynman pointed out in reference to the oil drop experiment (https://en.wikipedia.org/wiki/Oil_drop_experiment):

"We have learned a lot from experience about how to handle some of the ways we fool ourselves. One example: Millikan measured the charge on an electron by an experiment with falling oil drops, and got an answer which we now know not to be quite right. It's a little bit off because he had the incorrect value for the viscosity of air. It's interesting to look at the history of measurements of the charge of an electron, after Millikan. If you plot them as a function of time, you find that one is a little bit bigger than Millikan's, and the next one's a little bit bigger than that, and the next one's a little bit bigger than that, until finally they settle down to a number which is higher.

Why didn't they discover the new number was higher right away? It's a thing that scientists are ashamed of—this history—because it's apparent that people did things like this: When they got a number that was too high above Millikan's, they thought something must be wrong—and they would look for and find a reason why something might be wrong. When they got a number close to Millikan's value they didn't look so hard. And so they eliminated the numbers that were too far off, and did other things like that..."


Regarding the Feynman quote, I suspect that behavior itself is intrinsic to humans. Few people want to be the guy coming up with a wildly different answer from a consensus/authoritative answer. We see the same thing in election polling results exhibiting 'herding'. If a pollster has a result wildly off from the polling average (especially near the election, when the variance is expected to be lower), they'll sometimes 'put a thumb on the scale', as fivethirtyeight[0] put it. This makes some sense from a CYA perspective, even if not from a scientific one. If they publish an outlier and are wrong, they look bad. If they publish close to the average and are wrong, well, at least everyone else missed it too.

[0] - http://fivethirtyeight.com/features/heres-proof-some-pollste...


I don't think it's a human thing; I think it's a problem whenever you rely on previous knowledge or share knowledge, which is exactly what science is about. There is no practical experimental set-up that systematically reestablishes all prior knowledge from scratch, so there has to be trust in other scientists, and reevaluating that trust must be ground out the same way we grind out new scientific results: methodically, reproducibly.

What is problematic when it comes to humans is how our social structures are organized for doing science. They are hierarchical, resources are controlled centrally, and scientists are forced to compete with each other instead of cooperating with each other. Science is a career. We injure science and scientists by tying up their economic prosperity with their ability to convince the rest of the world that their work is worth anything. This creates a huge incentive to push forward and a huge disincentive to reevaluate past results: Your reputation can be damaged because you might be undermining the legacy of a high status scientist, and if you confirm the past result then you haven't done anything new and that reflects poorly on your 'performance'.

Science succeeds in spite of status, institutional monopolies, and hierarchical social organization. It would flourish in a more egalitarian society.

Is it intrinsic to humans to be hierarchical and status based? I want to say no. I don't think so. It is in the interests of the prevailing powers of the world to convince people that it is the case though, because they'd rather we not imagine a world where there isn't power to accumulate and hold on to.


Another thing is if you do work in fields considered "fringe". No matter how diligently you follow the scientific method, you'll be ridiculed if you find the "wrong" results.

I've come to the opinion that scientists are ideologues, but the ideology is based in the current understanding of physics rather than the results of experiments and the scientific method. A famous example is the Arago spot (aka the Poisson spot), where Fresnel was ridiculed by Poisson for his wave-based theory of light despite the latter not even bothering to do an experiment.


> I suspect that behavior itself is intrinsic to humans.

Sounds like Anchoring https://en.wikipedia.org/wiki/Anchoring


so the proposed definition was set by fixing the numerical value of the Planck constant to 6.62606X × 10⁻³⁴ kg·m²·s⁻¹,

and the conundrum was that they still needed a precise enough measurement of that constant, because until it is fixed it remains an experimentally measured quantity.

https://en.wikipedia.org/wiki/Proposed_redefinition_of_SI_ba...

https://www.quora.com/How-is-Plancks-constant-derived
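One way to see the link: fixing h exactly ties mass to frequency through E = mc² = hν. A rough sketch (using the value of h that was eventually fixed in 2019; real watt-balance measurements are far more involved than this):

```python
# With h fixed exactly, a mass corresponds to an exact frequency via
# E = m c^2 = h * nu, i.e. nu = m c^2 / h. This is the conceptual link the
# redefinition exploits.
H = 6.62607015e-34   # Planck constant, J*s (the value fixed in the 2019 SI)
C = 299_792_458.0    # speed of light, m/s (exact)

def equivalent_frequency(mass_kg: float) -> float:
    return mass_kg * C**2 / H

print(f"{equivalent_frequency(1.0):.4e} Hz")  # ~1.4e50 Hz for 1 kg
```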


Are metric measurements all derived from the value 1kg? If so, does this mean that the entire metric weight range can now be officially based on mathematics?


There are seven so-called "base units" defined in the SI: meter, second, mole, ampere, kelvin, candela and kilogram [1].

Of these, the mole and the kilogram are dependent on the kilogram.

edit: this is a good, if maybe a bit misleading, illustration: http://www.nist.gov/pml/wmd/metric/upload/SI_Diagram_Color_A...

[1]: http://www.nist.gov/pml/wmd/metric/si-units.cfm


The NIST chart is interesting. I'm surprised that charge is defined in terms of current, rather than the other way around.


> Of these, the mole and the kilogram are dependent on the kilogram.

Did you make a typo? While it's technically true that the kilogram is defined by the definition of the kilogram, I just wanted to make sure.


Not a typo, but maybe a bit vague. I meant to say that those SI units both depend on the real-world kilogram object.


No, at least the second and the meter are defined with math and elementary physics (e.g. the second is the time it takes for a specific atom to oscillate a specific number of times).


Not the atom, the electromagnetic radiation emitted by a specific quantum transition of the atom. (Cs 133 and 9192631770 Hz)


For everyday applications you can assume that 1 kg = 1 l H2O = 1 dm³, if that's what you mean.


Wouldn't that change though based upon the isotopes of the atoms in it? For example, H2O made with deuterium (1 proton, 1 neutron and 1 electron) or tritium (1 proton, 2 neutrons and 1 electron) vs protium (1 proton, 0 neutrons and 1 electron) could all give different weights, not to mention the different isotopes of oxygen (16, 17 and 18 being the most common/longest-lived).
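They would. A rough comparison using rounded atomic masses (the values are approximate, just to show the size of the effect):

```python
# Approximate molar masses (g/mol) of water made from different hydrogen isotopes.
MASSES = {"H": 1.008, "D": 2.014, "T": 3.016, "O": 15.999}

def water_molar_mass(hydrogen: str) -> float:
    return 2 * MASSES[hydrogen] + MASSES["O"]

for iso in ("H", "D", "T"):
    m = water_molar_mass(iso)
    print(f"{iso}2O: {m:.3f} g/mol ({m / water_molar_mass('H') - 1:+.1%} vs H2O)")
```

Heavy water comes out roughly 11% heavier per molecule, so isotopic composition matters a lot at metrology precision.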


Hence "everyday applications."


In a (much more rational than SI) system where h = c = 1, there is only one fundamental unit left: the kg (or eV, which still depends on the kg definition). So getting rid of the kilogram is absolutely essential.


More completely, ħ = c = G = kB = ke = 1.

That system is mostly irrelevant for cultural and legal purposes. Even in the specific philosophical framework where getting rid of the kilogram is 'absolutely essential', the current importance of the kilogram would simply migrate to the importance of the specific conversion factor to traditional units.


Also known as Planck units

https://en.wikipedia.org/wiki/Planck_units


Any such conversion would not need a physical object - just a set of arbitrary constants (which is fine).


Reminds me of this movie: https://www.youtube.com/watch?v=5dPnFO_JCdc (haven't seen it, but it looks interesting)


The kilogram is still the only base unit that contains an SI prefix in the base unit's name.


Yeah, that's still a bit weird. There's also the centimetre-gram-second system (https://en.wikipedia.org/wiki/Centimetre%E2%80%93gram%E2%80%...) that's used in some areas, which has the prefix on the length unit.

On a related note, the tonne (i.e. the metric tonne) is widely used, though not generally in the scientific arena. This is partly for historical reasons, but I think there's an element of it being a little awkward to apply prefixes to the kilogram, as one must multiply the "kilo". Nonetheless, we should exploit the full power of SI prefixes and use "megagram" instead. With two short syllables rather than one long syllable it takes about the same amount of time to say and is far less ambiguous.

This works well elsewhere; Fat Man wasn't 21 kilotonnes of TNT, it was 21 gigagrams.


As a student in physics, I always used to joke about how we should say "3 megadollars" and "5 gigadollars" instead of "3 million dollars" and "5 billion dollars".


I think it would help if news reports always used the same units. "This project would cost 0.1 Gigabux out of a total agency budget of 18 Gigabux." Switching between millions and billions and trillions can obscure how small, or large, something actually is in relation to the whole.


There's a backup service that lists its pricing in picodollars per byte (or byte-month).


http://www.tarsnap.com/picoUSD-why.html

  There are three major reasons why Tarsnap pricing is defined in terms of picodollars per byte rather than dollars per gigabyte:

    Tarsnap's author is a geek. Applying SI prefixes to non-SI units is a geeky thing to do.
    If prices were listed in dollars per GB instead of picodollars per byte, it would be harder to avoid the what-is-a-GB confusion (a GB is 10^9 bytes, but some people don't understand SI prefixes). Picodollars are perfectly clear — nobody is going to think that a picodollar is 2^(-40) dollars.
    Specifying prices in picodollars reinforces the point that if you have very small backups, you can pay very small amounts. Unlike some people, I don't believe in rounding up to $0.01 — the Tarsnap accounting code keeps track of everything in attodollars and when it internally converts storage prices from picodollars per month to attodollars per day it rounds the prices down.
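The conversion itself is one line; the 250 picodollars/byte-month below is purely an illustrative rate:

```python
# Converting a picodollars-per-byte-month rate into dollars per GB-month.
PICO = 1e-12
GB = 10**9  # SI gigabyte, as the quoted text insists

def dollars_per_gb_month(picodollars_per_byte_month: float) -> float:
    return picodollars_per_byte_month * PICO * GB

print(dollars_per_gb_month(250))  # 250 pd/byte-month is $0.25/GB-month
```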


I had a physics teacher who joked that instead of “gram”, we should say “millikilogram”.


What is the level of accuracy they are aiming for? If it entails some uncertainty over the precise number of atoms in the silicon sphere, then how did they choose this level of accuracy?


The possible accuracy is limited by the uncertainty over the mass of the IPK (International Prototype Kilogram), which is 20 ppb. You cannot measure the number of atoms in a kilogram of silicon better than that, because the kilogram is defined by the IPK.

20 ppb translates to an uncertainty of about 10000 trillion atoms. The number will be set to a round number, because any number within trillions is compatible with the legacy standard.
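The "10000 trillion atoms" figure is the right order of magnitude if you take 20 ppb of Avogadro's number (CODATA-era value, shown here to 6 digits):

```python
# 20 parts per billion of Avogadro's number.
N_A = 6.02214e23  # atoms per mole (approximate)
PPB = 1e-9

uncertainty_atoms = 20 * PPB * N_A
print(f"{uncertainty_atoms:.3e}")  # ~1.2e16, i.e. about ten thousand trillion
```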


I don't know what fundamentally sets the accuracy, but it's definitely a lot coarser than the number of atoms in the silicon sphere. G is known to 4 or 5 significant digits, while atomic accuracy would be 26 significant digits.


G doesn't matter if you use dual-pan balances, and they do.


Ah, they went with the "electric kilogram". The other plan was to build up a regular structure with a known number of silicon atoms. That idea was to make a perfect crystal and count the number of atoms on each face. Apparently that's almost possible, although hard to do.


According to the article, they did both; the definition is in terms of physical constants, and the electric-scale and silicon-sphere experiments were separate ways of measuring according to this definition. This let them verify their work by checking that two separate methods of measurement came out with values that were similar to each other with enough precision.


The article says they did that, too. (Actually, it says they used silicon spheres, but close enough.)


Silicon will oxidise the moment you put it in air, were they going to keep it in ultra-high vacuum or just deal with the oxidisation?


Ultra-high vacuum is way harder to maintain than simply keeping it in nitrogen.


Was that the plan then? You'd need to have one hundred percent purity nitrogen with no oxygen and no water, is it really possible to keep it at the level of purity you need?


I don't know, I was just presenting an alternative solution to the problem you were talking about. I imagine that the sides of the encasement would have to be pretty impermeable to keep oxygen out, but it's probably harder if the oxygen-containing air is pushing in and nothing is pushing out.

Playing around with the chemistry more, it might be possible to use a gas that oxidizes more readily than silicon, so that if any oxygen does get in it will be neutralized before it gets anywhere near the silicon. I'm not sure if that's possible to do with a gas, though.


Here I thought a kilogram was defined by water... oh well, looks like that definition is slightly outdated.

https://en.wikipedia.org/wiki/Litre#Rough_conversions


The [litre] used to be defined as one kg of water, but it is not an official SI unit (it is part of the SI-accepted metric system, which does not require such rigorous definitions).


> The used to be defined as one kg of water

I think you meant 'one litre'


One kg of water is definitely one kg.


One kg and one litre of water are the same thing.


Unofficially, it has been. Technically, it's not quite true. Water has a density of 999.97 kg/m³ (at its densest, around 4 °C).


That's unfortunate. How did it come to be that way?


Water is too hard to keep constant, and too hard to reproduce. Keeping it at exactly 1 dm³ is tricky, for example. Not for "regular" use with a few significant digits, of course; but when you're defining fundamental physical constants, atoms count.


At what temperature?


"water at the melting point of ice" from an old definition


Exactly, so you need to add that or it won't be true most of the time.


Not really. Because of temperature, isotopic composition, etc., the mass of a litre of water is not defined precisely enough for physicists.


That depends on the temperature ...


At 0 °C


The litre is defined as 1dm^3, the mass-based definition was dropped in '64.


I was hoping this was going to explain the kg differences between the original and the copies. Instead it just resolves them by changing the standard. Good for science, I guess; sad for my curiosity.


We actually know why. Check the paper series "Stability of reference masses" published in Metrologia. Summary: mercury contamination.


Before, I was disturbed by the change in apparent mass without explanation.

Now I'm disturbed that the people working in these laboratories were exposed to mercury themselves. Nasty stuff.


Duplicate, very close in time: https://news.ycombinator.com/item?id=10385743.


If you are interested in hearing more expert commentary, NPR Science Friday did a piece on this in July:

http://www.sciencefriday.com/segment/07/17/2015/redefining-t...


A planet money podcast from a few years ago on the kilogram: http://www.npr.org/templates/story/story.php?storyId=1120033...


I wonder: the article states that the SI unit for mass (the kg) was up to this point defined using a single object. Doesn't this definition also involve the fact that it's placed on Earth, thus requiring two objects for its definition?


Nope. Mass != weight. The weight of 1 kg on Earth is about 10 N (and varies by location). The weight of the same 1 kg object on the Moon would be much less, but the mass remains the same.


Mass is constant across all gravitational fields (and anywhere there isn't) for any given object.

People interchangeably use lbs <-> kg but the actual equivalence is lbs <-> newton (N). The difference doesn't matter in the average person's life since we're all down here where gravity is homogeneous enough for most applications and people.
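A minimal sketch of the distinction, using approximate surface gravities:

```python
# Weight (a force, in newtons) of the same 1 kg mass under different gravities.
GRAVITY = {"Earth": 9.81, "Moon": 1.62, "Mars": 3.71}  # m/s^2, approximate

def weight_n(mass_kg: float, body: str) -> float:
    return mass_kg * GRAVITY[body]   # W = m * g

for body in GRAVITY:
    print(f"1 kg weighs {weight_n(1.0, body):.2f} N on {body}")
```

The mass is 1 kg everywhere; only the force a scale reads changes.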


Nothing prevents you from measuring it on Mars.


I didn't understand what then will be used: the Si sphere or the Watt balance?


The 2011 proposal is to define the kg by the Watt balance, and use the Si sphere to define Avogadro's number.


And what, then, was actually the conflict now being resolved? I also couldn't figure out from the fine article whether the experiments have had different goals since 2011. Anybody know?


Not everybody agreed with the decision, because nobody knew if the experiments would agree.

Now, both experiments have agreed to a precision high enough that current metrology best practices will not have to change.


That's the beauty of it! Since they are now yielding the same result you can (in theory) use whichever one you want.


I can't imagine the results of the different methods will always be the same, even if they are "close enough" now. It's an advance in measurement, as far as I understand, but measurements always have some uncertainty range, and how can they know the ranges for the different methods won't change as the experiments advance?


(Forgive me if this is unnecessary review; hazard of message-board communication.)

My understanding is that up until now they've been using a reference kilogram (Si sphere or, for the watt balance, a reference object) to measure Planck's Constant.

Now that they are getting repeatable, converging results for Planck's constant they can turn the process around; define Planck's constant and then use the apparatus to generate a reference kilogram.

And so the beauty of it is that any sufficiently motivated team could assemble the equipment (either apparatus!) and manufacture a reference kilogram that would be just as good as anyone else's.

Unlike the present state of affairs, where no matter how hard anyone wants to try, they simply can't manufacture a reference kilogram better than the one in Paris.
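The principle behind the watt (Kibble) balance can be sketched in a few lines: in the moving phase the coil gives U = BLv, in the weighing phase mg = BLI, so BL cancels and m = UI/(gv). The numbers below are made up; a real balance also needs a precise local measurement of g:

```python
# Toy sketch of the watt-balance principle: electrical power U*I equals
# mechanical power m*g*v, so the mass follows from electrical quantities.
def kibble_mass(voltage_v: float, current_a: float, g: float, velocity_m_s: float) -> float:
    return voltage_v * current_a / (g * velocity_m_s)

# Illustrative values chosen so the result is 1 kg:
print(kibble_mass(1.0, 0.01962, 9.81, 0.002))
```

With h fixed, U and I can be realized via the Josephson and quantum Hall effects, which is what ties the whole thing to the Planck constant.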


You could say the same about multiple measurements using a single method. In the event that we do find the results by different methods diverging, that is a better situation to be in than having picked one method and not being aware that the results are drifting.

However, as the methods are based on very well-established physics, there is good reason to expect that, as more measurements are made and any discrepancies are investigated, the results will continue to converge.


Lived next door to a PhD NPL physicist who was working on this a few years ago. I think they ended up handing the project over to Canada or somewhere like that, IIRC. Fascinating project and guy.


There's a good Radiolab episode related to this.

http://www.radiolab.org/story/kg/


That site is terribly broken until Flash is enabled. There's no obvious way to play the episode, just a tiny "stream" button that doesn't stream. Why does it have to use Flash anyway?


Previously, why didn't they have a reference gram instead of a kilogram? Seems like it'd be easier to create, maintain, and transport.


The base mass unit was originally the gram; wiki has a good explanation of why it was changed.


Now even the U.S. can adopt it.


The U.S. adopted it in the 19th century.

https://en.wikipedia.org/wiki/Metric_Act_of_1866


Unfortunately, this is not the reason the US isn't adopting SI.


SI and metric are actually subtly different. Do any countries actually use SI rather than metric?


At least the whole EU uses SI units by Directive 80/181/EEC (so called "Units of Measure Directive"). From everything I know, most other countries of the world use SI as well. Not surprising, as the system is now 55 years old and explicitly replaced former definitions.

Even US laws explicitly refer to SI, for example the Metric Conversion Act of 1975: http://www.gpo.gov/fdsys/pkg/STATUTE-89/pdf/STATUTE-89-Pg100...


UK uses miles on road signs, pints in pubs, etc. Many people still use stones and pounds for human weight, even if they're in a medical setting.


However, those are explicitly defined in terms of SI units for use in the UK in that directive.

The question of GGP was, whether some countries use other metric systems which are not SI. I don't think such a country exists.


The Imperial volume units used in Britain are vaguely metric, and different from the US customary (i.e. Queen Anne) units. The Imperial gallon is the volume of 10 lb avoirdupois of water at s.t.p. The other similarity to French metric units is they replaced a number of different application-specific volume measures.

At about the same time as the Imperial unit reform Britain introduced the "florin" 2 shilling coin, 10 per pound, another vague attempt at decimalization.


What is?


Inertia and the fact that alternatives are "good enough".


American exceptionalism.


Ronald Reagan


That's from Wikipedia:

https://en.wikipedia.org/wiki/Metrication_in_the_United_Stat...

>In 1981, the USMB reported to Congress that it lacked the clear Congressional mandate necessary to bring about national conversion. Because of this ineffectiveness and an effort of the Reagan administration — particularly from Lyn Nofziger's efforts (http://www.washingtonpost.com/wp-dyn/content/article/2006/03... ) as a White House advisor to the Reagan administration, to reduce federal spending — the USMB was disbanded in the autumn of 1982.

https://en.wikipedia.org/wiki/United_States_Metric_Board

>The metrification assessment board existed from 1975 to 1982, ending when President Ronald Reagan abolished it, largely on the recommendation of Frank Mankiewicz and Lyn Nofziger. Overall, it made little impact on implementing the metric system in the United States.

https://en.wikipedia.org/wiki/Frank_Mankiewicz

>According to Mankiewicz, he prompted Lyn Nofziger's efforts to halt the 1970s U.S. metrication effort, who convinced President Ronald Reagan to shut down the United States Metric Board

http://www.washingtonpost.com/wp-dyn/content/article/2006/03...

> So, during that first year of Reagan's presidency, I sent Lyn another copy of a column I had written a few years before, attacking and satirizing the attempt by some organized do-gooders to inflict the metric system on Americans, a view of mine Lyn had enthusiastically endorsed. So, in 1981, when I reminded him that a commission actually existed to further the adoption of the metric system and the damage we both felt this could wreak on our country, Lyn went to work with material provided by each of us. He was able, he told me, to prevail on the president to dissolve the commission and make sure that, at least in the Reagan presidency, there would be no further effort to sell metric.

>It was a signal victory, but one which we recognized would have to be shared only between the two of us, lest public opinion once again began to head toward metrification.


Apathy


The N in 'NIST' is for "National", and the nation to which "National" refers is the United States. I'm not against switching to metric, but after forty years since I read about it in fourth grade, I am pretty confident that the switch won't make most people's lives better.


I'm with you on most of the metric system: but there's a good argument that Celsius is decidedly inferior to Fahrenheit (of course, neither are SI).


If you like angles or big numbers? 0/100°C trumps 32/212°F in my book; and body temperatures – 37°C vs 98°F – are equally difficult to remember.


Let's turn that on its head. Do you prefer 0/100 F or -18/38 C for measuring typical weather temperatures?

And Fahrenheit's higher unit precision is a distinct advantage in taking and interpreting temperatures of people, particularly children.
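For reference, the two scales side by side (a throwaway conversion sketch):

```python
# Celsius <-> Fahrenheit conversions used in the comparison above.
def c_to_f(c: float) -> float:
    return c * 9 / 5 + 32

def f_to_c(f: float) -> float:
    return (f - 32) * 5 / 9

print(c_to_f(0), c_to_f(100))  # 32.0 212.0  (freezing/boiling of water)
print(f_to_c(0), f_to_c(100))  # roughly -17.8 and 37.8, the "typical weather" span
```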


Celsius vs Fahrenheit seems to be the same as Aluminium vs Aluminum: the one you grew up with is the one that seems to make more sense.


I don't think a good argument can be made for Celsius. Like Fahrenheit, Celsius exists as a measure of temperatures within the typical experience of humans. There is one primary use of such a scale, and that is measurement of air temperature (weather, indoor temperature). A distant secondary use is in cooking.

Fahrenheit's 0-100 range is basically a close match for the extremes in weather temperature experienced by typical human beings. Celsius is not even close. Additionally, Fahrenheit's scale has roughly twice the resolution of Celsius, and humans are good at distinguishing this precision. Unit precision is also helpful for body temperature.

Regarding the distant secondary use: obviously Celsius's range matches freezing and boiling: but because water's phase change is consistent and obvious at these temperatures (modulo altitude), cooking rarely involves measuring temperatures in this range except for unusual cases like candy. Celsius has no advantage at all for the primary use of temperature in cooking: baking.


Interestingly, the pound is already defined as a multiple of the kilogram, and has been for a few decades.


They never would've had this problem if they'd just stuck with the pound.


Strange to see Planck's constant used that way, defining a kilogram. Planck's constant usually only shows up when you're doing quantum mechanics and the things you're working with are really small.


Really? A downvote?



