The ampere went from being the thought experiment of two infinite parallel wires 1 m apart exerting a certain force on each other to being something nearly unintelligible to the average person.
The kelvin was just 0C turned into absolute units, carefully formalized as the triple point of water (which is actually 0.01C). Now it's also something nearly unintelligible to the public.
Another problem is that I'm not entirely sure atomic mass is even well defined anymore. That number at the bottom of each box of the periodic table may not have a valid unit anymore?
This hasn't changed, really.
The kilogram started as the mass of one litre of water (under defined conditions, etc.). Since every reproduction of that yields a slightly different measurement because of various experimental errors, they changed to a physical artefact, which isn't ideal either.
So now we have a definition purely based on fundamental constants.
The best way to represent and approximate it in daily life is still 1 litre of water. 1 litre is also very simple to visualise if you don't have a measuring jug or bottle handy since that's a cube with 10cm edges.
For teaching purposes that also looks much less random than an artefact.
The same as they always have for practical use, since the new definitions are chosen precisely because they make virtually no difference in practice while being more sound.
Most people won't notice, just like they didn't when US customary units were redefined as derived values from SI units.
Most likely how they've always been taught. For most people, it's enough to know that a kilogram equates to roughly a liter of water (H2O). It's not like the Ampere was ever actually equivalent to "two infinite wires 1m apart creating a certain force" either, since that's not a scenario repeatable in the real world. Teaching science involves lots and lots of simplification. Just like you learn that electrons don't really orbit around the nucleus when studying science in academia, you'll learn the proper definition of the kilogram.
We used to have two independently defined mass units! The unified atomic mass unit, and the kg. Now we have only one - the kg, and the other is defined in terms of it.
Avogadro's constant is now clearly identified as an arbitrary number with no illusion of importance since it's no longer part of such an interconnected web of dependencies. Hopefully this will help students to realize that it's not some important chemical quantity but just a way for old people to count big numbers because that's the only way they ever learnt. Nothing more special than the number of feet in a mile.
As for periodic tables not having valid units. They didn't anyway. Chemists, even text-book writers, often neglect to include units for atomic masses. The numerical values will be identical though. The difference is that as we measure them more precisely with future technology, they'll diverge from what they would have been under the old system.
If we started from scratch, Avogadro's number would be a simple exact power of 10 or nothing at all and we'd just tolerate extreme numbers the way computer people tolerate terabytes and electronics people tolerate picofarads. There's nothing natural or fundamental about it.
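Since 2019 the Avogadro constant really is just a fixed defined count, the same kind of arbitrary ratio as feet per mile. A minimal Python sketch to make the analogy concrete (the function names here are mine, purely illustrative):

```python
# Since the 2019 SI redefinition, the Avogadro constant is an exact defined
# number -- a fixed count, not a measured quantity -- just like the 5280
# feet in a mile.
N_A = 6.02214076e23        # entities per mole, exact by definition
FEET_PER_MILE = 5280       # feet per mile, exact by definition

def entities(moles: float) -> float:
    """Convert an amount in moles to a raw count of entities."""
    return moles * N_A

def feet(miles: float) -> float:
    """Convert miles to feet -- the same kind of fixed-ratio conversion."""
    return miles * FEET_PER_MILE
```

Both are just multiplications by an exact constant; neither number is any more "natural" than the other.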
It's only a fact in one's imagination, as concrete as a fact about properties of any other thing that can only be measured indirectly.
We have no way to directly measure the charge of one electron, so it's a fanciful claim, or at best an approximation, and not a concrete fact.
Claims about wires can be verified.
Are you suggesting the oil drop experiment was invalid?
You also use the completely impractical value of the caesium emission frequency in the calculations for your tabletop ammeter, as well as the speed of light in vacuum (good luck having any "vacuum" around). So why are people complaining specifically about the electron charge?
In physics, people talk about measuring single electrons all the time. Electrons can be counted with tabletop equipment on a hobbyist budget. You can easily count them up into the trillions, which is far from a coulomb, but still a practical amount of them.
> The ampere is defined by taking the fixed numerical value of the elementary charge e to be 1.602 176 634 × 10^−19 when expressed in the unit C, which is equal to A s, where the second is defined in terms of ∆ν_Cs.
Indeed, I can't even parse the sentence.
They also did make charge more fundamental, as they derive the Ampere by fixing the charge of an electron.
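Since the elementary charge is now exact, the quoted definition boils down to "one ampere is a fixed number of elementary charges per second". A small sketch of that arithmetic (the function name is mine):

```python
# Under the 2019 SI, the elementary charge is exact by definition, so a
# current in amperes corresponds to a definite count of charges per second.
E = 1.602176634e-19  # elementary charge in coulombs, exact

def electrons_per_second(current_amps: float) -> float:
    """Number of elementary charges passing per second at a given current."""
    return current_amps / E

# One ampere works out to roughly 6.24e18 elementary charges per second.
```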
Temperature – absolutely no difference in _practicality_ in communicating about the daily weather in C or F. People that claim otherwise are usually just arguing for the system they grew up with.
Weight – no difference in measuring most larger weights in pounds vs kgs. (weighing people, furniture, cars, trucks, comparing the weight of laptops, etc)
Distance – Whether DC to NY is 220 miles or 352 kilometers makes no difference.
In most daily use, the differences in units probably don't make any difference to most people, which is why switching (so that it looks good on paper) just isn't worth the cost and hasn't caught on.
However, in the sciences, in cooking, and in other areas where you need precise measurements, grams, kg, etc. all give you nicer units to work with.
When measuring things for DIY and home improvement, God, I wish I could do everything in meters and centimeters rather than 1/8 or 1/16 of an inch (mainly for doing calculations).
And when cooking, I cringe with "cups" and "tbsp" – I wish it were all grams and ml.
A good reason is that you rarely need exact measurements in cooking (vs. baking), and it's easy to eyeball approximate volumetric measurements, especially with help from the known volume of a container. Also, many recipes need quantities of some ingredients (most frequently water) that would require a much larger and more expensive kitchen scale than most people would like to keep.
Weight, Distance - conversion is a lot easier in the metric system
You just said there were no differences, so why would you use units with strange relations between each other instead of units that are easy to remember, easy to convert, and based on math rather than historical/common knowledge?
Ok so when measuring water freezing, you have a nice 0. But how do you measure hot weather? 30C? 35C? Meaningless numbers, correct?
I am talking about daily weather. Celsius is no more intuitive than Fahrenheit when discussing it.
Even with cooking, when you are around a certain system, you're just used to it.
I bake a cake at 350F, I fry things at 350-400F. 175C is no more practical than 350F. I need to cook meat to at least 140F-160F to kill the bacteria.
The equivalents in Celsius are no easier to remember from a practical standpoint.
Even body temperature, from my understanding, almost everyone measures as 98.5F, correct? (At least in India, where everything was kg and km, people still used to measure body temperature in F.)
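For what it's worth, the exact conversion between the two scales is simple enough that you can check any of these kitchen numbers yourself. A minimal Python sketch (the function names are mine):

```python
def c_to_f(c: float) -> float:
    """Convert degrees Celsius to degrees Fahrenheit."""
    return c * 9 / 5 + 32

def f_to_c(f: float) -> float:
    """Convert degrees Fahrenheit to degrees Celsius."""
    return (f - 32) * 5 / 9

# The familiar oven setting: 175 C is 347 F, i.e. roughly "350 F".
# Boiling water: 100 C is 212 F.
```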
When thinking about boiling water, it is easier to think how close to it you are in Celsius: you're 50% of the way there at 50 Celsius.
Personally I'm from Europe but live in US and have trouble with weather forecasts in Fahrenheit. They don't make sense especially when it gets cold.
Even IF Celsius and Fahrenheit were equally good for most purposes, it would be easier if every continent used the same system. Why? Because people travel, often from continent to continent.
So if we (the people of Earth) should choose we should most probably choose Celsius, even if it were just slightly better than Fahrenheit.
There are things in the world other than fridges.
> When thinking about boiling water
I only really do this about 1% of the time I need to use temperature.
> Even IF Celsius and Fahrenheit were equally good for most purposes, it would be easier if every continent used the same system. Why? Because people travel, often from continent to continent.
This is really the only reason to use Celsius: it works well with other things.
98.6 is the approximation most people memorize in Fahrenheit. That number implies more precision than you can reasonably assume in practice, however.
In Celsius, hot is 30 and cold is -10.
When talking about it for what you're wearing and subjective experience, Celsius is a little more awkward and compressed.
Sure, scientifically celsius is more useful, but how often does that overlap with the weather?
This comparison is utterly meaningless to me, and I say this as someone who prefers to use Celsius.
Generally speaking, you get freeze warnings and black ice warnings when necessary instead of checking to see whether or not the current temp is literally below the freezing point of water. I think OP was just saying that it's good enough for a layperson to use day-to-day.
0F is pretty dang cold out, and 100F is pretty dang hot out. Sure, there are temperatures in human habitats outside that range, but most people would consider them "extreme" temperatures, hence values less than zero or greater than 100.
That is pretty subjective and no better than Celsius. Anything below 30F is pretty dang cold for me and anything above 85F is too hot
Or, you know, just keep the freezer cold enough to take care of it regardless.
I guess it's cheaper to produce a one-sided measuring tape.
Theoretical physicists might be able to skip the units, but the rest of us farmers, engineers, cooks, are doing something practical when we measure.
I'm not really sure what you are arguing.
However, think of it this way: in conversational transfer of information, it makes zero difference. If I'm telling you something is 300 miles away, it isn't more useful for me to tell you it's 480 kilometers away. What extra information do you get from that? That it's 480,000 meters away? Humans can't properly visualize distances much beyond 100 meters. So when calculating driving distance, what practical use is knowing the distance in kilometers versus meters?
Day-to-day using miles or inches for distance, gallons for volume, and now we have specialists using miles-per-gallon for fuel efficiency, cubic inches for stroke volume ...
"tsp", that's tablespoons, right? Teaspoons? Argh!
I have no idea how you'd convince a large country to convert to SI aside from just forcing it. There are quantifiable upsides (consistency, conversion, not needing two sets of tools, etc) but momentum is hard to overcome.
Interestingly Metric is "the preferred system of weights and measures for United States trade and commerce" according to the 1975 Metric Conversion Act. But it had no teeth, so everyone just continued using US Customary Units.
That's why I'm always surprised when I read a story about some scientific endeavor in the US that didn't use metric.
For me a big stumbling block is cooking; I like using volume units for cooking, but metric recipes tend to use weight.
But it might never go away from daily life. The Chinese still use "jin" as a unit of weight/mass. The communists unified the customary definitions to be exactly 0.5 kg, but even they weren't powerful enough to force everybody to use kg directly. If a communist dictatorship is too weak, there's no hope for a democratic government!
The switch happened in 1893, according to Wikipedia.
That'd offer the best of both inches/feet and the simplicity of the metric system's scaling.
What does one actually need fractions for? You can just measure with decimal places.
I find 3.5 and 4.6 easier to compare than 7/2 and 23/5.
Fractions are great for mental math, especially division, in situations where you might be working with wood for example.
6 months of carpentry made me appreciate feet and inches in a new way.
If I'm placing bets, the liter will be the next metric unit to take over in the USA (winning out over the gallon and fl. oz), maybe just because the liter is a good size, not too small or too large: stuff you buy liquid is often around 0.5-2 liters.
> The agreement defined the yard as exactly 0.9144 meters and the pound as exactly 0.45359237 kilograms.
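Because those two factors are exact by definition, customary lengths and masses can be derived from SI with no rounding error at all. A small sketch of the derived values (constant names are mine):

```python
# Exact conversion factors from the 1959 international yard and pound
# agreement quoted above.
YARD_M = 0.9144             # meters per yard, exact
POUND_KG = 0.45359237       # kilograms per pound, exact

# Everything else in the customary length system follows exactly:
MILE_M = 1760 * YARD_M      # 1609.344 m per mile
INCH_MM = YARD_M / 36 * 1000  # 25.4 mm per inch
```

So since 1959 an inch has been exactly 25.4 mm; the customary units are, in effect, SI units wearing different labels.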
Burma, Liberia, and the US are the only countries in the world that have not adopted the International System of Units
Imperial: USA, Canada, UK, India.
Metric: The rest (~190 other countries).
When I asked my science teacher that question, a long time ago, he said it was because it was easier to manufacture a precise prototype kilogram than a prototype gram. That made sense at the time, but what's the reason now?
It's starting to feel like the problem of OS updates changing everything, except it's the foundations of all scientific measurement constantly changing instead of some OS you don't even have to use.
Some might say, oh, they haven't really changed. Well, if they haven't really changed, why change them?