2019 redefinition of the SI base units (wikipedia.org)
85 points by tony on July 1, 2019 | 92 comments



So I'm glad they fixed the problem of depending on a kilogram sample object for anything, but the definitions are now bizarre if you want to learn them in reference to real-world things. How will they be taught now?

The Ampere went from being the thought experiment of two infinite wires 1m apart creating a certain force to being something nearly unintelligible to the average person.

The Kelvin was just 0C turned into absolute units, carefully formalized as the triple point of water (which was actually 0.01C). Now it's something nearly unintelligible to the public.

Another problem is that I'm not entirely sure atomic mass is even well defined anymore. That number at the bottom of each box of the periodic table of elements may not have a valid unit anymore??


> So I'm glad they fixed the problem of depending on a kilogram sample object for anything, but the definitions are now bizarre if you want to learn them in reference to real-world things. How will they be taught now?

This hasn't changed, really.

The kilogram started as the mass of one litre of water (under defined conditions, etc). Since every time people reproduced this they got a slightly different measurement because of various experimental errors, the definition was changed to a physical artefact, which isn't ideal either.

So now we have a definition purely based on fundamental constants.

The best way to represent and approximate it in daily life is still 1 litre of water. 1 litre is also very simple to visualise if you don't have a measuring jug or bottle handy since that's a cube with 10cm edges.

For teaching purposes that also looks much less random than an artefact.


The old definitions are still essentially correct, so for teaching nothing really changes except the word "approximately" being inserted somewhere.


Right. So in other words, the SI unit definitions are for scientists wishing to affirm the base units, but the old definitions are superior for education purposes (well, except maybe for the prototype kilogram...)


The pre-prototype kilogram is still essentially valid - the mass of a litre of water - but now expressed in terms of physical constants.


> How will they be taught now?

The same as they always have for practical use, since the new definitions are chosen precisely because they make virtually no difference in practice while being more sound.

Most people won't notice, just like they didn't when US customary units were redefined as derived values from SI units.


I must strongly disagree about the ampere; the new definition is much simpler conceptually. The wording is a bit obtuse, but that is a side matter. Practically, they fixed the value of the coulomb (to a constant multiple of the elementary charge), and the ampere now follows from the basic definition that "a current of one ampere is one coulomb of charge going past a given point per second". That is much more concrete than the previous, imho very abstract, definition involving infinities and forces at a distance.
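
To make that concrete, here is a minimal sketch of the arithmetic (Python; the constant is the exact value fixed by the 2019 SI):

    # The 2019 SI fixes the elementary charge exactly; the coulomb,
    # and hence the ampere (one coulomb per second), follow from it.
    e = 1.602176634e-19  # coulombs, exact by definition

    # One ampere means this many elementary charges passing
    # a given point every second:
    print(1 / e)  # ~6.241509e+18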


> How will they be taught now?

Most likely how they've always been taught. For most people, it's enough to know that a kilogram equates to roughly a liter of water (H2O). It's not like the Ampere was ever actually equivalent to "two infinite wires 1m apart creating a certain force" either, since that's not a scenario repeatable in the real world. Teaching science involves lots and lots of simplification. Just like you learn that electrons don't really orbit around the nucleus when studying science in academia, you'll learn the proper definition of the kilogram.


They've definitely improved atomic mass and Avogadro's constant. Chemists and students have been fooled into believing they have some special importance but in fact they're nothing more than ugly accidents of history that somehow survived standardization.

We used to have two independently defined mass units! The unified atomic mass unit, and the kg. Now we have only one - the kg, and the other is defined in terms of it.

Avogadro's constant is now clearly identified as an arbitrary number with no illusion of importance since it's no longer part of such an interconnected web of dependencies. Hopefully this will help students to realize that it's not some important chemical quantity but just a way for old people to count big numbers because that's the only way they ever learnt. Nothing more special than the number of feet in a mile.

As for periodic tables not having valid units: they didn't anyway. Chemists, even textbook writers, often neglect to include units for atomic masses. The numerical values will be identical, though. The difference is that as we measure them more precisely with future technology, they'll diverge from what they would have been under the old system.
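
To make the new dependency concrete, a small sketch (the Avogadro constant is now exact by definition; the dalton is a measured value in kg, here the CODATA 2018 number):

    N_A = 6.02214076e23         # mol^-1, exact by definition since 2019
    dalton = 1.66053906660e-27  # kg, measured (CODATA 2018), no longer exact

    # A periodic-table atomic mass times the dalton gives kg per atom;
    # times N_A it gives kg per mole. Even carbon-12's molar mass is now
    # only approximately 12 g/mol, not exactly.
    A_r = 12.0  # carbon-12
    print(A_r * dalton)               # ~1.9926e-26 kg per atom
    print(A_r * dalton * N_A * 1000)  # ~11.999999996 g/mol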


Avogadro's number is special because it's what relates fundamental physical constants to SI units.


That's a common misconception but all the things that it relates to SI units are themselves redundant. There's an entire parallel system of constants and units built around it which only exists for legacy reasons, as well as people's desire to avoid very big and very small numbers.

If we started from scratch, Avogadro's number would be a simple exact power of 10 or nothing at all and we'd just tolerate extreme numbers the way computer people tolerate terabytes and electronics people tolerate picofarads. There's nothing natural or fundamental about it.


Charge of an electron is a concrete fact, unlike infinitely long imaginary wires inducing an electromagnetic pull.


> Charge of an electron is a concrete fact

It's only a fact in one's imagination, as concrete as a fact about properties of any other thing that can only be measured indirectly.

We have no way to directly measure the charge of one electron, so it's a fanciful claim, or at best an approximation, and not a concrete fact.

Claims about wires can be verified.


> We have no way to directly measure the charge of one electron, so it's a fanciful claim, or at best an approximation, and not a concrete fact.

Are you suggesting the oil drop experiment was invalid?


The elementary charge is impractically small, while it is easy to approximate the "imaginary wires."


So you make the calculations and use some wire length that places the error within your acceptable margin. You use the completely precise, but impractically small, value of the electron charge in those calculations.

You also use the completely impractical value of the cesium emission frequency in your calculations for your tabletop ammeter, as well as the speed of light in vacuum (good luck having any "vacuum" around). So why are people complaining specifically about the electron charge?
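
For reference, the "imaginary wires" calculation under the old definition is a one-liner; a minimal sketch using the force per metre between two long parallel wires, F/L = mu_0 * I^2 / (2 * pi * d), where the pre-2019 SI fixed mu_0 exactly:

    import math

    # Pre-2019, mu_0 was exactly 4*pi*1e-7 N/A^2, so two long parallel
    # wires carrying 1 A at 1 m separation exert exactly 2e-7 N per
    # metre of wire on each other -- that force WAS the ampere.
    mu_0 = 4 * math.pi * 1e-7  # N/A^2
    I, d = 1.0, 1.0            # amperes, metres

    print(mu_0 * I**2 / (2 * math.pi * d))  # 2e-07 N/m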


Both the cesium frequency and the speed of light are practically reproducible and are used to calibrate instruments. Not so with the elementary charge. (Similarly, in chemistry, nobody would count the quantity of stuff in molecules, they use moles instead.)


Well, I do know that infinite wires aren't practically reproducible.

In physics, people talk about measuring single electrons all the time. Electrons can be counted with tabletop equipment built on a hobbyist budget. You can easily count them up into the trillions, which is far from a coulomb, but still a practical amount of them.


> The Ampere went from being the thought experiment of two infinite wires 1m apart creating a certain force to being something nearly unintelligible to the average person.

The ampere is defined by taking the fixed numerical value of the elementary charge e to be 1.602 176 634 × 10^−19 when expressed in the unit C, which is equal to A s, where the second is defined in terms of ∆ν_Cs.

Indeed, I can't even parse the sentence.


"The kilogram is the mass of a body at rest whose equivalent energy equals the energy of a collection of photons whose frequencies sum to [1.356392489652×10^50] hertz."


Glad that is a deprecated formulation. One never has a collection of photons of exactly that summed frequency with a net mass of 1 kg. Also, a 1 kg weight can't realistically be entirely turned into photons.


Your sarcasm was a bit hard to distinguish ;)


There wasn't any sarcasm intended.


Isn't the new definition completely equivalent to the GP's one?


Why is the ampere a base unit instead of charge? Charge seems the more fundamental one to me, like mass is, and the ampere feels more derived, as if volumetric flow rate were made a base unit instead of mass.


Because the standards are made for people who make measurement devices. It is much easier to measure electric current than charge.

They also did make charge more fundamental, as they derive the Ampere by fixing the charge of an electron.


None of them are really fundamental. Some people feel that force is more fundamental than mass since force seems to be a "real" thing that happens while mass is just an abstract quantity that helps us predict forces. It's all human psychology, practical issues with measurement, and legacy. Quite a few of the base units are entirely redundant (candela, kelvin, mole) but we have them because of our human weaknesses.


You cannot change the basic dimensionalities after the fact. Electric current was chosen at the time, rather than charge, hence the ampere is now the unit defined in terms of the electron's charge.


Think of Charge as an accumulation of Amperage over a period of Time, or as a supply of Charge that can yield a particular Amperage of current for a sustained amount of Time (see the sketch below the link).

https://electronics.stackexchange.com/questions/23449/why-is...
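
An everyday instance of that view is battery capacity, quoted in milliamp-hours, i.e. current times time; a small sketch (the 2000 mAh rating is just an illustrative value):

    # Charge = current x time. A battery rated 2000 mAh can nominally
    # supply 2 A for 1 hour; in SI terms that is a charge in coulombs:
    capacity_mAh = 2000
    charge_C = (capacity_mAh / 1000) * 3600  # A * s = C
    print(charge_C)  # 7200.0 C

    # And with the fixed elementary charge, a count of electrons:
    e = 1.602176634e-19
    print(charge_C / e)  # ~4.49e+22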


Seems like a positive change. But somehow (perplexingly) we still use inches, feet and miles in a major part of the world.


My argument is mostly that for day-to-day practical measurements there's no difference between the two systems, so it doesn't affect people.

Temperature – absolutely no difference in _practicality_ in communicating about the daily weather in C or F. People that claim otherwise are usually just arguing for the system they grew up with.

Weight – no difference in measuring most larger weights in pounds vs kgs. (weighing people, furniture, cars, trucks, comparing the weight of laptops, etc)

Distance – Whether DC to NY is 220 miles or 354 kilometers makes no difference.

In most daily use, the differences between units probably don't matter to most people, which is why switching (so that it looks good on paper) just hasn't seemed worth the cost.

However, in the sciences, in cooking, and in other areas where you need precise measurements, grams, kg, etc. all give you nicer units to work with.

When measuring things for DIY and home improvement, God, I wish I could do everything in meters and centimeters rather than 1/8ths or 1/16ths of an inch (mainly for doing calculations).

And when cooking, I cringe with "cups" and "tbsp" – I wish it were all grams and ml.


I can't think of any good reason to use volumetric measurement in cooking. Even if you only need approximate measurement, it's still better to use weight because you can put the cooking container directly on the scales and avoid having to wash your measuring cups. And if you do need accuracy, measuring by weight wins every time.


> I can't think of any good reason to use volumetric measurement in cooking

A good reason is because you rarely need exact measurements in cooking (vs. baking), and it's easy to eyeball approximate volumetric measurements, especially with help from the known volume of a container; also because many recipes need quantities of some ingredients (most frequently water) that would require a much larger and more expensive kitchen scale than most people would like to keep.


Small quantities under a tablespoon or 10ml can be quite challenging to weigh, and it's much easier to fill up a fixed-size tsp/tbsp measure. If your ingredient is a commodity with a known density (water, olive oil, salt, whatever), it's mechanically simpler. There are also times when you can't measure into your cooking container directly - maybe it's hot.


I would disagree about weight and distances; having multiple-of-10 units makes computation easy. It is unfortunate that time is different. Though I agree most people these days would probably avoid calculating things and just use a calculator/software.


The old British system is good for one thing: tea. Want to brew some tea? 1 teaspoon per cup.


Regarding temperature, Fahrenheit offers more granularity. I, for one, can definitely perceive the difference between 22, 22.5, and 23 degrees C (71.6, 72.5, and 73.4 degrees F, respectively), but on a typical metric thermostat, fractions of a degree aren't allowed.


Temperature - at what temperature does water freeze? 0C, 32F. At what temperature does water boil? 100C, 212F. Which ones are easier to remember on a day-to-day basis?

Weight, Distance - conversion is a lot easier in the metric system

You just said that there are no differences; then why would you use units which have strange relations between each other, instead of units that are easy to remember, easy to convert, and based on math instead of historical/common knowledge?


How often are you boiling and freezing water when deciding what clothes to wear?

Ok so when measuring water freezing, you have a nice 0. But how do you measure hot weather? 30C? 35C? Meaningless numbers, correct?

I am talking about daily weather. Celsius is no more intuitive than Fahrenheit when discussing it.

Even with cooking, when you are around a certain system, you're just used to it.

I bake a cake at 350F, I fry things at 350-400F. 175C is no more practical than 350F. I need to cook meat to at least 140F-160F to kill the bacteria.

The equivalents in Celsius are no easier to remember from a practical standpoint.

Even body temperature, from my understanding, almost everyone uses 98.5F, correct? (At least in India, where everything was kg and km, people still measured body temperature in F.)


Well, a fridge should not go below 0 Celsius or you have trouble. Teach your children that.

When thinking about boiling water, it is easier to think about how close to it you are in Celsius: you're 50% of the way there if the temp is 50 Celsius.

Personally, I'm from Europe but live in the US, and I have trouble with weather forecasts in Fahrenheit. They don't make sense, especially when it gets cold.

Even IF Celsius and Fahrenheit were equally good for most purposes, it would be easier if every continent used the same system. Why? Because people travel, often from continent to continent.

So if we (the people of Earth) had to choose, we should most probably choose Celsius, even if it were just slightly better than Fahrenheit.


> Well, a fridge should not go below 0 Celsius or you have trouble

There are things in the world other than fridges.

> When thinking about boiling water

I only really do this about 1% of the time I need to use temperature.

> Even IF Celsius and Fahrenheit were equally good for most purposes, it would be easier if every continent used the same system. Why, because people travel, often from continent to continent.

This is really the only reason to use Celsius: it works well with other things.


> Even body temperature, from my understanding, almost everyone uses 98.5F, correct?

98.6 is the approximation most people memorize in Fahrenheit. That number implies more precision than you can reasonably assume in practice, however.


Sure, my point was that the unit of measurement for body temperature in many places that use metric is probably still Fahrenheit, not Celsius. (And that it doesn't really hurt anything, since the boiling and freezing points of water are irrelevant when checking whether a human is OK or not.)


Seriously, you don't see any difference in defining hot weather as 1/3 of the boiling temperature of water? Try to do that in Fahrenheit and you have to do something crazy like (212 - 32) / 3 + 32, and I'm not even 100% sure that is correct... And I guess for you it's much better to remember that at 32 F it is very dangerous to drive outside, rather than at 0. I really can't understand how someone could advocate for maths gimmicks when you have a much easier solution that is even an international standard (or, to be precise, one that is extremely easy to convert to and from).
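
For what it's worth, that guess does check out; a quick sketch of the arithmetic:

    def c_to_f(c):
        return c * 9 / 5 + 32

    # "Hot weather" as 1/3 of water's boiling point, on the Celsius scale:
    print(c_to_f(100 / 3))      # ~92.0 F

    # The same thing computed directly on the Fahrenheit scale, as above:
    print((212 - 32) / 3 + 32)  # 92.0 -- the formula in the comment is right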


In Fahrenheit, hot is 100 and cold is 0.

In Celsius, hot is 30 and cold is -10.

When talking about it for what you're wearing and subjective experience, Celsius is a little more awkward and compressed.

Sure, scientifically Celsius is more useful, but how often does that overlap with the weather?


No, cold is 0 in Celsius. You risk your life on icy roads at 0.


> Seriously, you don't see any difference in defining hot weather as 1/3 of the boiling temperature of water?

This comparison is utterly meaningless to me, and I say this as someone who prefers to use Celsius.


"1/3 of the boiling temperature of water" is just as arbitrary as "32".

Generally speaking, you get freeze warnings and black ice warnings when necessary instead of checking to see whether or not the current temp is literally below the freezing point of water. I think OP was just saying that it's good enough for a layperson to use day-to-day.


At what temperature does the human body rest? ~100F. At what temperature is frostbite a concern to uncovered skin? ~0F. I feel like these benchmarks are more useful in my day to day life than water temps.


Yup, that's one of my favorite things about the Fahrenheit scale: it's a good approximation of "on a scale of 1-100, how cold/hot is it outside?" where people across a broad range of human climates would agree on the answer.

0F is pretty dang cold out, and 100 is pretty dang hot out. Sure, there are temperatures in human habitats outside that range, but most people would consider them "extreme" temperatures, hence values less than zero or greater than 100.


> 0F is pretty dang cold out, and 100 is pretty dang hot out

That is pretty subjective and no better than Celsius. Anything below 30F is pretty dang cold for me, and anything above 85F is too hot.


You mean at what temperature does water freeze at sea level, right? Pure water, I mean, not normal tap water.

Or, you know, just keep the freezer cold enough to take care of it regardless.


The water freezing/boiling calibration points seem interesting until you realize that you're never actually at 1 atmosphere (due to elevation and weather) or working with pure water.


Until hardware stores in the US start stocking measuring devices longer than a foot with cm and mm markings, I'm going to disagree on the DIY aspect. Yes, I can use a tape measure or buy one online. But discussion of lumber here is always in inches; even when precision is necessary, still inches. It's all very frustrating imo.


Yes it is. I bought a tape measure, maybe as part of a tool kit or something, and lo and behold it is ONLY in inches. It could easily have cm on the other side, but it doesn't.

I guess it's cheaper to produce a one-sided measuring tape.


"day-to-day practical measurement" as distinct from what?

Theoretical physicists might be able to skip the units, but the rest of us - farmers, engineers, cooks - are doing something practical when we measure.

I'm not really sure what you are arguing.


Look at my arguments. I'm saying exactly what you're saying. Areas that need precise measurements (especially the sciences) are already using metric. I also prefer metric when cooking.

However, think of it this way: in conversational transfer of information, it makes zero difference. If I tell you something is 300 miles away, it isn't more useful for me to tell you it's 480 kilometers away. What extra information do you get from that? That it's 480,000 meters away? Humans probably can't properly visualize more than 100 meters. So when calculating driving distance, what practical use is there in knowing the distance in kilometers vs. meters?


You are making a distinction between daily use and specialist use and then suggesting it doesn't matter, I suppose because everyone has a daily use.

Day-to-day, we use miles or inches for distance and gallons for volume, and then we have specialists using miles-per-gallon for fuel efficiency, cubic inches for stroke volume ...


> And when cooking, I cringe with "cups" and "tbsp" – I wish it were all grams and ml.

"tsp", that's tablespoons, right? Teaspoons? Argh!


Tsp is teaspoon; tb/tbl/tbsp is tablespoon


tsp (or just t) is teaspoon, Tbsp (or TB, Tbl, or just T) is tablespoon. The little one has a little t, the big one has a big T.


Except in science, where the inconsistency itself can cause problems:

https://en.wikipedia.org/wiki/Mars_Climate_Orbiter#Cause_of_...

I have no idea how you'd convince a large country to convert to SI aside from just forcing it. There are quantifiable upsides (consistency, conversion, not needing two sets of tools, etc) but momentum is hard to overcome.

Interestingly, metric is "the preferred system of weights and measures for United States trade and commerce" according to the Metric Conversion Act of 1975[0]. But it had no teeth, so everyone just continued using US customary units.

[0] https://en.wikipedia.org/wiki/Metric_Conversion_Act


Yup, this is what I always tell people – in the US, all science is taught in metric (at least in my schooling in the DC area). All physics, chemistry, etc. units were metric. So where it truly makes a difference, it seems to all be metric.

That's why I'm always surprised when I read a story about some scientific endeavor in the US that didn't use metric.


The US uses some metric units for some liquids (e.g. a 2 liter of soda).

For me a big stumbling block is cooking; I like using volume units for cooking, but metric recipes tend to use weight.


Don't you think that weight (or, pedantically, mass) generally makes more sense for many ingredients? 100g of flour is a well-defined quantity, one cup of flour can vary significantly depending on how tightly packed the flour is.


The packing is mostly a function of how you yourself put the flour into the cup and how it was handled and stored before it got to your house. In practice there's not much variation of this sort. And you still want to taste and adjust as needed, so I don't think you really need to-the-gram precision, since there are so many factors, like humidity, that can impact how much of an ingredient you should use anyway.


Why not use liters?


I think the way it'll happen is through America's loss of manufacturing. That's a big user of units, and it's declining. Imported stuff is going to be metric more and more often unless it was also designed in America, so people are going to encounter it more and more. At some point, there might be a price penalty for specifying the less common imperial threads and material thicknesses on Chinese-made parts, so American designers in competitive industries might start preferring metric.

But it might never go away from daily life. The Chinese still use "jin" as a unit of weight/mass. The communists unified the customary definitions to be exactly 0.5 kg, but even they weren't powerful enough to force everybody to use kg directly. If a communist dictatorship is too weak, there's no hope for a democratic government!


Probably something to do with this type of propaganda... https://www.reddit.com/r/cringe/comments/bxugvv/a_fox_news_s...


Fun fact: US customary units are metric by definition. Everything is defined in grams and meters and liters; it just uses weird numbers so the values roughly match the traditional customary sizes.

The switch happened in 1893, according to Wikipedia.

https://en.m.wikipedia.org/wiki/United_States_customary_unit...


When you get right down to it, it's still just arbitrary stuff that amounts to "about yay long", "about yay heavy", and so forth... but with a means of describing what we mean to people (or whatever they choose to call themselves) who aren't present and have no access to an official artifact. The metre is about a cloth yard (a very old "standard") with an excuse (one ten-millionth of the distance from the north pole to the equator through the Paris meridian... or thereabouts; we should really work on a better way of defining that at some point), the litre is close enough to a compromise quart, which can be sort of defended by monkeying with the metre (hey, a decimetre's about a hand, right? Clever of us to have thought of that), and the kilogram's a bit too big and the gram's a bit too small, but you can't get 'em all right all of the time. Defining them all in a way that doesn't rely on artifacts and/or special moments in time is a good idea, but it doesn't make them any less arbitrary. There is nothing fundamentally more scientific about SI.


Base-12 is a wonderful number system. 12 is divisible by 2, 3, 4, and 6. What's 10 divisible by? 2 and 5. In an ideal society, maybe they'd use a kind of metric base-12 system.

That'd offer the best of both inches/feet and the simplicity of the metric system's scaling.


Base-16 would've been more useful imho, from a computer science perspective :)

What does one actually need fractions for? You can measure with digits after the point.

I find 3.5 and 4.6 easier to compare than 7/2 and 23/5.


Now that you mention it, I suppose one of my favorite things about inches is their base-2 subdivisions (e.g. 16ths). The orders of magnitude aren't nearly as drastic as with base 10. They're much more intuitive and easier to estimate visually.

Fractions are great for mental math, especially division, in situations where you might be working with wood for example.

6 months of carpentry made me appreciate feet and inches in a new way.


You have (probably) 10 fingers, thus base 10.


If you ignore the thumbs, each finger has 3 subsections, so in total you have 12 "slots" on your hand. And you can use them to count with the spare thumb. It's even more convenient than counting on fingers because you're only using one hand, and the other one is free!


Wouldn't base 11 make more sense then?


Customs stick around for a long time, but I think metric will gradually find its way in; it will just take generations. When I look into discussions about hiking and outdoor life right now, for example, there is a conventional mix of units - US customary and metric, depending on what quantity is being measured. And this is a good thing: it means the metric units are creeping in, and some people are getting used to them, even if it's not a full conversion.

If I'm placing bets, the liter will be the metric unit to take over next in the USA (winning out over the gallon and fl. oz), maybe just because the liter has a good size, not too small or too large - stuff you buy as liquid is often around 0.5 - 2 liters.


I'd bet a Troy ounce of feathers we just end up dual-stack forever.


What will make the change easier is smartphones; a smartphone measuring app can tell you cm. I haven't seen one, but I assume a smartphone could also be used to measure weight?


AFAIK imperial units are defined in terms of SI units.


Since 1959: https://en.wikipedia.org/wiki/International_yard_and_pound

> The agreement defined the yard as exactly 0.9144 meters and the pound as exactly 0.45359237 kilograms.
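
Those two exact constants are enough to derive the familiar everyday conversions; a minimal sketch:

    # Exact by the 1959 international yard and pound agreement:
    YARD_M = 0.9144        # metres per yard
    POUND_KG = 0.45359237  # kilograms per pound

    print(YARD_M / 36 * 1000)    # 25.4 mm per inch, exactly
    print(YARD_M / 3)            # 0.3048 m per foot, exactly
    print(YARD_M * 1760 / 1000)  # 1.609344 km per mile, exactly
    print(POUND_KG / 16 * 1000)  # ~28.35 g per ounce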


In many private industries where it matters in the US, they've switched over to metric/are increasingly moving over to it/using it side-by-side with imperial units. The biggest hurdles are all levels of government and the massive cost of converting all measurements over to metric. And given how much we already neglect public infrastructure, I don't see changing to metric units happening any time in the next several decades.


I think it depends on how you define major part...

Burma, Liberia, and the US are the only countries in the world that have not adopted the International System of Units


> in a major part of the world

http://blog.sciencescore.com/blogss_uploads/All-about-Metric...

Imperial: USA, Canada, UK, India. Metric: The rest (~190 other countries).


Imperial: USA, Liberia

Burmese: Myanmar

Metric: ~200 other


"Major", i.e. three countries in the world.


Can anyone help me understand why the base SI unit of mass is the kilogram and not the gram? Why is the ×1000 prefix included in the base unit of mass, but not in any other units?

When I asked my science teacher that question, a long time ago, he said it was because it was easier to manufacture a precise prototype kilogram than a prototype gram. That made sense at the time, but what's the reason now?


For an international standard based on universal constants, they sure change a lot.

It's starting to feel like the problem of OS updates changing everything, except here it's the foundations of all scientific measurement constantly changing instead of some OS you don't even have to use.

Some might say, oh, they haven't really changed. Well, if they haven't really changed, why change them?


What is the candela still doing there? It is extremely unfundamental imho, and sort of vaguely based on human physiology, which seems out of place for an SI unit. It is not even fully defined on its own, because you also need a definition of luminous efficacy. Of course, the candela is utterly useless outside the visible spectrum, which should be a huge alarm bell against it.
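
For the record, the candela is pinned down by fixing the luminous efficacy of 540 THz radiation (roughly green light, near the eye's peak sensitivity) at exactly 683 lm/W; every other wavelength gets weighted by the standard luminosity function, which is where the human physiology enters. A minimal sketch, with an illustrative V(lambda) value:

    # The SI fixes K_cd = 683 lm/W for monochromatic 540e12 Hz radiation.
    # Other wavelengths are scaled by V(lambda), an empirical, human-derived
    # weighting; V = 1.0 below is the peak value, used here for illustration.
    K_cd = 683.0          # lm/W, exact by definition
    radiant_power = 0.01  # W, an illustrative 10 mW source
    V = 1.0               # luminosity weighting near 555 nm (~540 THz)

    print(K_cd * V * radiant_power)  # 6.83 lm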


Someone will save money by no longer having to store the prototype artefacts ... unless we're in any danger of running out of caesium atoms.


I am glad the definition of kilogram is corrected; I always thought it was no good, and now it is good.



