This uses a measure on the completely wrong abstraction level.
An accident of the production process now defines the wire unit, instead of the end result the consumer/engineer/technician wants to know (mm^2 or inch^2, or, oh no, is it pica^2?).
There must be some sociological phenomenon going on that keeps the US from adopting modern units. Maybe it can't admit a shortcoming and fix it?
If you are brought up with metric units, the units are built around the number system.
A good way to frame the problem for the HN audience is the use of IEEE floats where precise numbers matter in programming. The problem with IEEE floats is that we usually think about them in base 10 but they represent something in base 2, and no lossless conversion across the whole domain is possible. American units bridge that sort of gap much better than metric ones do because they usually focus more on divisibility than on quick representation and conversion.
There are certain areas where metric makes sense. If I have prices in kg and I want to know how many g I can buy for a certain amount of money, because the money and weight systems coincide we are well optimized for that problem.
But, imagine calculating the angles of a triangle or a hexagon if your degree system was base 10.
In other words, American units are abstracted around actual application.
There is of course a happy medium. We could change to duodecimal numbers and come up with a duodecimal metric system. Then everyone would get what they want, right? ;-)
Arbitrary divisions of angle aren't very interesting, since the radian is so fundamental. But there is an angle system using multiples of 100: gradians. 100 is a quarter-turn and the internal angles of a triangle add up to 200. A lot of electronic calculators still offer them (the DRG button = degrees, radians, gradians).
There are two cases where linear divisions of arclength-measured angles make sense: (1) the angles needed to make regular polygons with low numbers of sides, and (2) repeated binary divisions, whose cartesian coordinates can be computed easily because we can just add two vectors and renormalize using an inverse square root, which we know how to compute efficiently, and which are then nice for fixed-point representations using binary integers in computers. For the first case, I think rational fractions of a turn work best; for the second case, I like binary angles (“brads”) https://en.wikipedia.org/wiki/Binary_scaling#Binary_angles ; losing the ability to precisely represent a third of a full turn is just one of those trade-offs you sometimes need to make.
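As a rough illustration of why binary angles are convenient in computers (my own sketch, not anything from the linked page): wraparound at a full turn falls out of ordinary integer masking, and the advertised trade-off, no exact third of a turn, is easy to see.

```python
# 16-bit binary angles ("brads"): 0x10000 counts as one full turn,
# so addition wraps around automatically with a bit mask.
TURN = 1 << 16

def brad_add(a: int, b: int) -> int:
    """Add two binary angles; overflow past a full turn wraps for free."""
    return (a + b) & (TURN - 1)

def brad_to_degrees(a: int) -> float:
    return a * 360.0 / TURN

# A quarter turn plus a three-quarter turn is a whole turn, i.e. 0.
quarter = TURN // 4
assert brad_add(quarter, 3 * quarter) == 0

# A third of a turn is NOT exactly representable -- the trade-off above.
print(brad_to_degrees(TURN // 3))  # 119.998..., not quite 120
```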
Otherwise, treating angle as a linear quantity with precise measurements (instead of some kind of approximations) is for most problems less useful than just using a cartesian coordinate representation for an angle (possibly keeping track of the coordinates squared if we want to stick to rational arithmetic.)
Radians are only really useful for solving calculus problems by hand, or writing academic math/science papers where it’s important to stick to established conventions. For practical computation radians are almost always an inferior choice, more computationally expensive and less precise (they’re built into lots of existing libraries, so can often be convenient despite the inefficiency).
Well, just multiply your fraction by τ = 2π and you get radians. E.g. a whole revolution is τ radians (or 360°), half a revolution is ½τ radians (or 180°), etc.
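That bookkeeping is a one-liner in most languages; a minimal Python sketch using the built-in math.tau (the helper names are mine):

```python
import math

def turns_to_radians(t: float) -> float:
    """Convert a fraction of a full revolution to radians."""
    return t * math.tau  # tau = 2*pi, one full turn

def turns_to_degrees(t: float) -> float:
    return t * 360.0

assert turns_to_degrees(0.5) == 180.0                     # half a revolution
assert math.isclose(turns_to_radians(0.25), math.pi / 2)  # quarter turn
```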
Each finger has three segments, starting with the index finger and ending at the pinky, and you count by pointing to the relevant segment with your thumb.
I too wish base-12 were used more. Given how much a penny is worth, getting rid of decimal money and substituting a base-12 coin would solve a lot of pizza / dinner cost split problems.
The other day, right next to the problematic AWG definition, I discovered the kcmil: a thousand circular mils, i.e. a kilo times π/4 of a micro square inch.
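A quick sanity check of that definition, in Python (my own sketch; the constants are just the standard mil-to-mm conversion):

```python
import math

MIL_IN_MM = 0.0254  # one mil = 0.001 inch = 0.0254 mm
CMIL_IN_MM2 = math.pi / 4 * MIL_IN_MM ** 2  # area of a circle 1 mil across

def kcmil_to_mm2(kcmil: float) -> float:
    """Convert kcmil (1000 circular mils) to square millimetres."""
    return kcmil * 1000 * CMIL_IN_MM2

# 250 kcmil, a common cable size above AWG 0000:
print(round(kcmil_to_mm2(250), 1))  # 126.7
```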
four score for some things, six score for others, or for onions, a hundred was 15^2
The real reason why the US hasn't switched is that it is really painful to change something like this.
Hey boss, we seem to be all out of 3.31mm^2 romex, all I got left is 2.08mm^2. And why did you mention cross-sectional area? Why not radius? or diameter? Or better yet, circumference? I'll take awg12 and awg14 please.
There's nothing more natural about an inch than a centimeter, and nice, round, easy-to-remember numbers can be had on both scales for "natural", commonly found distances. Ditto for pounds and kilograms, Celsius and Fahrenheit, acre and hectare etc.
Objectively, metric wins because it uses the same decimal scale as our number system, and because the units are designed to establish the most straightforward relations between different quantities.
Yes, using 12 for a base has some advantages due to more divisors, but not being consistent with decimal wipes them all out (and traditional units don't consistently use 12, either - consider units of volume, for example). In an ideal world, we'd have 6 fingers on each hand, and use base-12 everywhere; alas...
Recently I was building a roof with an old, very experienced builder. Turns out that on construction sites, nails and planks are always discussed in inches, even when they're actually metric. So a 60mm nail would be "a two point fiver" ("kakspuokki" or such in Finnish).
Note the inconsistency in 1/4".
What's the purpose of having mm^2? You can't measure it any more easily (how are you going to measure area without calipers? you're going to need a gauge with holes in it to identify wires), nor does it make the numbers any easier to express.
For manufacturing purposes, expressing the radius or diameter might be good, but for using them, the AWG number is really nice and streamlined.
This doesn't really preclude a logarithmic scale, but it should be the kind that's easy to convert (i.e. increasing numbers denote increasing area). Looking at AWG, it could actually even be decimal, like dB. Consider: 17 gauge is almost exactly 1 mm^2 in area, so if we pick exactly 1 mm^2 as the reference (0) on our hypothetical scale, then 10 would be 10 mm^2, close to 7 gauge, and -10 would be 0.1 mm^2, close to 27 gauge. And there are plenty of industries that already know how to work with the dB scale, and use the shortcuts that it offers.
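A minimal sketch of that hypothetical dB-style gauge (the function names are made up for illustration):

```python
import math

def db_gauge_to_mm2(g: float) -> float:
    """Hypothetical dB-style gauge: g = 10 * log10(area in mm^2)."""
    return 10 ** (g / 10)

def mm2_to_db_gauge(area_mm2: float) -> float:
    return 10 * math.log10(area_mm2)

assert db_gauge_to_mm2(0) == 1.0        # 1 mm^2 -- roughly AWG 17
assert db_gauge_to_mm2(10) == 10.0      # 10 mm^2 -- roughly AWG 7
assert math.isclose(db_gauge_to_mm2(-10), 0.1)  # roughly AWG 27
```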
By the way, while looking up related things, I've discovered the existence of a weird unit called "circular mil" (basically, cross-section of a wire 1 mil in diameter) that is, apparently, already used in US for wires that are out of bounds on AWG gauge scale. Which seems to indicate that cross-section area is, indeed, the preferred metric.
And when I say natural, I don't mean elegant. They often tend to be thoroughly arbitrary. But they match the needs, which is usually reasonably pragmatic. For example, 360 degrees is ideal for simple in-head directional geometry, but radians are by far simpler in algebra or very precise measurements (because Pi can often be neatly factored out).
Once you start using enough decimal points, all units are lousy. I remember using angstroms in astronomy because they fit better into the optics theory, and the distances are already so absurd it didn't matter. So you may as well use the units that are convenient, and just get good at swapping.
(PS: glass sheets are sold by the square foot, but in thicknesses measured in millimeters. Turns out to be pretty convenient that way.)
Another thing I've encountered: in South India, there is a measure of distance & time called a "nazhika". It is 24 minutes. 2 1/2 nazhika is one hour, 60 nazhikas make a day. A nazhika is also roughly the time taken for a normal person to walk 1 mile. Hence a nazhika is also used as approximately one mile. Seems a bit weird until you remember that light-years involve the same identification of time and distance.
Non-metric units like these and others ("foot") probably have very good reasons behind them, and they could be used by people in their day-to-day activities without carrying measuring instruments. The metric system is more systematic, but people "lose touch" with intuitions of quantity.
The Sumerians did, in fact, periodically insert intercalary months to offset the difference, which suggests that the choice was partly a close approximation and partly convenient in application.
Of course, at some point it becomes a huge pain and you buckle down and choose something just as arbitrary but easier to handle. And that's why we don't count angles in the milliseconds since 1970. Or days in a year; though we do sometimes measure 3D angles in solid minutes, which seems like a metaphor taken too far :)
Even in metric-using countries, nobody uses it for time.
For actual applications of spherical trig, though, degree-minute-second makes a lot of sense, primarily because the earth rotates about one degree every 4 minutes, and if you know this then you can do manual navigation via the stars and many other things.
However, for many things I do use metric time, just not for the human aspect of it. For example, for one customer (admittedly in the sciences), we had to help them estimate how much hardware they needed for additional load. So you do the work in seconds, because at that point the math is easiest, and convert to ratios afterwards.
Of course, for practical use the SI second wouldn't be a very good solution. Traditionally the second is derived from the length of the day, and I think that would make sense for metric time too. 1 milliday would be somewhat close to 1 minute, and 50 millidays (or maybe half a deciday) would be close to one hour. Of course the name should probably be something other than "day" to reduce confusion.
So I could be working something like 16 decidays next week.
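A quick back-of-the-envelope check of those figures (the unit names are hypothetical, as the comment says):

```python
SECONDS_PER_DAY = 86_400

def millidays_to_minutes(md: float) -> float:
    """1 milliday = 1/1000 of a day, expressed in minutes."""
    return md * SECONDS_PER_DAY / 1000 / 60

def decidays_to_hours(dd: float) -> float:
    """1 deciday = 1/10 of a day, expressed in hours."""
    return dd * SECONDS_PER_DAY / 10 / 3600

print(millidays_to_minutes(1))  # ~1.44, close to a minute
print(decidays_to_hours(16))    # ~38.4, close to a 40-hour work week
```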
The number of sides depends on the divisibility of 360 by the number of angles, so:
3, 4, 5, 6, 8, 9, 10, 12, 15, 18, 20, 24, 30, and so forth.
For angle measurement where you want a closed geometric figure at the end, degrees are extremely elegant, and 360 = 2^3 × 3^2 × 5.
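The rule above can be checked mechanically; this snippet just enumerates the divisors of 360 (side counts whose central angle is a whole number of degrees):

```python
def polygon_sides_with_whole_degree_angles() -> list[int]:
    """Side counts n >= 3 for which 360/n is an integer number of degrees."""
    return [n for n in range(3, 361) if 360 % n == 0]

print(polygon_sides_with_whole_degree_angles())
# [3, 4, 5, 6, 8, 9, 10, 12, 15, 18, 20, 24, 30, 36, 40, 45, 60, 72, 90, 120, 180, 360]
```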
You'd also lose the small-angle approximations sin(theta) ≈ tan(theta) ≈ theta, which in my field are used extensively to convert nonlinear equations to linear ones.
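A quick numerical illustration of how fast that approximation converges for angles in radians (nothing field-specific assumed):

```python
import math

def small_angle_error(theta: float) -> float:
    """Relative error of the approximation sin(theta) ~= theta."""
    return abs(math.sin(theta) - theta) / math.sin(theta)

for theta in (0.1, 0.01, 0.001):  # radians
    print(f"{theta}: {small_angle_error(theta):.2e}")
# The error shrinks roughly as theta**2 / 6, so each 10x smaller
# angle buys about two more digits of accuracy.
```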
Nonetheless, your proposed unit already exists and is called tau. Or write it as 2pi if you want to be more easily understood.
I've sometimes heard the same remark about temperatures: jeez, how do you guys manage to work with something as unintuitive as degrees Celsius? But really, you just get used to whatever it is you're using.
If you're going to knock the other way, wouldn't it be fair to at least check what it is, rather than making something up as a strawman?
Responding to commenter e2e8: Using 2mm, 3mm, etc. would be substantially less functional.
Once you have a log scale, it hardly matters whether you measure diameter or area, that’s just a constant factor.
In the ideal case, the log scale would have a slightly easier to compute definition for the dilation at each step than the 39th root of 92 (~1.1229).
Perhaps they could use the 6th root of 2 instead (~1.1224). Then every 6 steps you’d get a factor of two. Size 0 could be defined as 1mm diameter (or size 0 could be 1 millimeter squared cross sectional area, with 3rd root of 2 as the step each time).
In practice if you’re wiring a house, it doesn’t much matter.
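Comparing the two step ratios numerically shows how little would actually change (my own sketch of the comparison):

```python
awg_step = 92 ** (1 / 39)  # actual AWG diameter ratio between adjacent sizes
alt_step = 2 ** (1 / 6)    # proposed: diameter doubles every 6 sizes

print(round(awg_step, 4), round(alt_step, 4))  # 1.1229 1.1225

# Even compounded over the full 39-step range, the two scales stay close:
print(round(alt_step ** 39 / 92, 3))  # ~0.984, under 2% drift overall
```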
* * *
One of the reasons the metric system built on a base ten number system is so frustrating for practical purposes is that powers of ten are entirely arbitrary and indivisible, and tenth roots of ten or numbers expressed in terms of natural logarithms are even worse. We’d be much better off with a general-purpose base twelve number system, plus log scales uniformly designed around the twelfth root of 2. [Western music scale, ISO paper sizes, etc. would fit right in.]
For an exponential scale you can just pick some reasonably evenly-spaced round numbers and repeat them at different factors of 10: 1, 2, 5, 10, 20, 50, ...
Anyway, neither way here is right or wrong. One optimizes for uniformity of scaling, the other optimizes for intelligibility with a base ten number system.
However, using something like the E12 series would work pretty well.
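For reference, here is the E12 decade and how it repeats across powers of ten (the values are the standardized preferred numbers for resistors; the helper names are mine):

```python
def e12_decade() -> list[float]:
    """Nominal E12 preferred values for one decade, as standardized."""
    return [1.0, 1.2, 1.5, 1.8, 2.2, 2.7, 3.3, 3.9, 4.7, 5.6, 6.8, 8.2]

def e12_values(decades: int = 3) -> list[float]:
    """Spread the E12 decade across several powers of ten."""
    return [round(v * 10 ** d, 1) for d in range(decades) for v in e12_decade()]

print(e12_values(2))
# [1.0, 1.2, 1.5, ..., 8.2, 10.0, 12.0, 15.0, ..., 82.0]
```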
You can look at almost any land assignment, building or commercial sale ever in the history of the United States and be able to compare / modify / extend it without any conversions. If my great-great-grandfather claimed 160 acres from the Homestead Act, any documentation from that time is just as easily understood today without any conversions.
Differing units only become an issue when globalization factors in (which is a fairly recent phenomenon). Many fields in the states, such as the sciences, have indeed chosen interoperability over backwards-compatibility, but these decisions are done on a field-by-field basis.
I'm in the camp that it'd generally be better to just rip the band-aid off and switch everything, but in general I value interoperability over backwards compatibility with software too. That said, who am I to tell Microsoft that they shouldn't worry about Windows backwards-compatibility? Who am I to tell farmers that they should measure their land in hectares?
What state? Are those 160 international acres, or 160 US survey acres?
What weighs more - an ounce of feathers, or an ounce of gold? (An ounce of gold because gold is measured in Troy ounces...)
For example the energy content of fossil fuels tends to be measured in BTU per (whatever), while electrical energy is measured in kilowatt hours. On the other hand, automobile engines are often rated in horsepower.
I find this sort of thing is ubiquitous in engineering. In metric-only countries, the situation is slightly less bad because that system provides somewhat fewer opportunities for mischief.
When doing manual work, e.g. woodworking or metalwork it's pretty convenient to use fractions. You tend to divide things in half or quarters.
With most things not made by hand any more, using fractions and inconsistent units like inches, feet, yards and miles is a nuisance for engineers.
It's when people inflict mixtures of BTUs with h or Joules with eV on us that I get aggravated. Each of those units has its place, but only in particular domains. Worse, the people most likely to use domain-specific units, are the ones most likely to not name the unit they are using.
Engineers are such people.
The SI system helps reduce the risks a little. But only a little, I have seen German engineers measure velocity in degrees.
Every other country made unit conversions.
The people will use old units for some time to buy cheese at the counter and all is well.
Or you buy fruit at the supermarket in Indonesia in ounces and kg (an ounce there is a hectogram, 100 g).
I cannot remember a single time I have seen a contract specifying mil instead of km.
("Mil" == 10km == 10 000m is used here as well only not in formal or scientific contexts that I can think of but maybe that is a difference between Sweden and here.)
2000 mil (i.e. 20,000 km).
I do woodworking in a metric country. It's terrible. My "quarter inch" chisel is 6mm but my "quarter inch" plywood is 6.5mm so it won't fit in a groove made by the chisel. 3/4" chisel is 20mm but plywood is 18mm. All the measurements are inch based but converted to metric arbitrarily rounding up or down.
3/4" is the only measurement that works well, it's 19.004mm. But a 19 mm drill bit isn't a part of any standard set.
Some automotive applications are similar. The Toyota Hilux pickup truck has an indestructible reputation because it was cloned by the Japanese from a US truck. They rounded every measurement up to nearest millimeter. It's beefier than the original while having a smaller and less powerful engine. You do need some exotic sized wrenches to work on it, though.
Mihm: That's correct. While the anti-metric forces included outright cranks, including people who believed that the inch was a God-given unit of measurement, the most sophisticated and powerful opponents of the metric system were anything but cranks. They were engineers who built the industrial infrastructure of the United States. And their concerns, while self-interested, were not entirely off base. Whatever the drawbacks of the English units, the inch was divided in ways that made sense to the mechanics and machinists of the era: it was built around "2s" rather than "10s," with each inch subdivided in half and in half again, and so forth. This permitted various sizes of screw thread to have some logical correspondence to all the other increments. The same was true of the sizes of other small parts that were essential to modern machinery.
Anyway, the main advantage of metric seems to be its universality between different markets, rather than the somewhat silly idea of ideal relationships, such as water in one arbitrary form having a certain weight and volume, replacing a system derived from a centuries-long process resembling a genetic algorithm.
The whole point of the American unit system is that they are optimized for specific use, but definitely not for conversion.
ISO paper sizes are a great counterexample, actually, because they took an application-centric rather than a conversion-centric approach. I.e. when you print a signature you do so on a larger piece of paper, and then you fold, crop, and bind. Each fold cuts the paper size in half.
So ISO paper sizes are much more like American units than American paper sizes are.
US paper sizes assume that you have specific design sizes to your printing technology (picas) and then give you a grid of an even number of those units.
Graphic designers / typographers I know tend to prefer the US paper sizes.
Regular people who want to make photocopies of enlarged/reduced pages tend to prefer ISO sizes.
This is complete nonsense and sounds like musings of a desperate advocate.
If you want to believe something, you start to accept the lamest arguments.
You'd certainly also want a universal one for conversion, but not for everyday tedium. Fahrenheit is nice because it spans the human experience in a nice range (say damn cold 0 to rather hot 100). But that's bunk if you're in the context of chemistry, where water's properties are far more comparative (0 freezes to 100 boiling). Though absolute scales are always a bit sporky, since they latch to a scale and wonk it sideways (-273 is a silly number no matter what anyone says).
And so on. Scales and units are merely benchmarks. Literally. Pick the right bench for the job and follow the marks.
(Though with enough effort you can make any scale work. Kinda like hammers and threaded carpenter nails.)
Edited; typing on phones ruins grammar.
In practice, it was not quite that simple, because of the half percent error, and wire insulation doesn't follow the same scaling pattern, but it was still quite handy, and much less trouble than metric, where you constantly had to dig out the wire gauge chart.
What is the unit of mass smaller than an Ounce? Some people know that this is a Grain (7,000 in a pound, which means the awfully convenient division of 437.5 per oz - assuming avoirdupois of course)
What if anything is smaller than an inch? Power of two fractions until 1/64 then "thous" - 15.625 thous in 1/64" - although I do like how biblical (KJV) it looks to be writing "thou"
Then we use Pica and Points. You know, like 12-point font. Which is 1/6th of an inch tall. 72-points to an inch, 12-points to a pica.
Get with the program man!
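The arithmetic in that ladder of small units checks out; a few assertions make it concrete (avoirdupois and printer's units only, as above):

```python
# Small US customary units, as quoted in the comment above.
GRAINS_PER_POUND = 7000
OUNCES_PER_POUND = 16
THOU_PER_INCH = 1000
POINTS_PER_INCH = 72
POINTS_PER_PICA = 12

assert GRAINS_PER_POUND / OUNCES_PER_POUND == 437.5  # grains per avoirdupois oz
assert THOU_PER_INCH / 64 == 15.625                  # thou in 1/64 inch
assert POINTS_PER_INCH / POINTS_PER_PICA == 6        # picas per inch
assert 12 / POINTS_PER_INCH == 1 / 6                 # 12 pt = 1/6 inch
```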
The point is that people do have a need for units that are tiny fractions of an inch, even in day to day life.
The units that people need to perform calculations with are the ones used in measuring for cooking, sewing, home improvement projects, and things of that nature.
Having said all that, I'm not sure whether this is really on-topic. The original article is very interesting and an example of American measurement system that improved on an imperial precedent.
Ordinary gains are about 4% a year (long-term returns of broad investments in the stock market, adjusted for inflation). If you buy a piece of equipment for $100 it should return $4/year (on top of maintenance costs, depreciation, and the like) otherwise you'll do better by buying a stock-market index fund.
How much do you think it would cost an arbitrary manufacturing company to switch over all their tools and equipment to metric measures, update and test all their designs and schematics, run them through any applicable regulatory agencies, and such? Include the cognitive up-front cost of switching, maintaining dual toolchains and inventories for any gradual transition, the opportunity cost of the profitable projects you would have to postpone while your company experts see to the metric switchover, and everything else. Consider adjusting the return downward a little to account for appropriate risks in this process.
Do you see savings substantially in excess of 4% a year? If so, that's extraordinary, and a worthwhile investment! If not, the company would do better for itself and its shareholders by spending money elsewhere, or returning that money through dividends or stock buybacks so it can be put in a stock market index fund.
If nothing else, the $N billion of nationwide switching costs could have been used on, like, renewable power, or something.
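The screen described above is trivial to write down; the dollar figures in the example are purely hypothetical:

```python
def switchover_worth_it(cost: float, annual_savings: float,
                        hurdle: float = 0.04) -> bool:
    """Crude capital-allocation screen from the comment above: a project
    should beat the ~4% real return of a broad stock-market index fund.
    All inputs here are hypothetical illustrations."""
    return annual_savings / cost > hurdle

# A hypothetical $10M metric conversion saving $300k/year fails the screen:
print(switchover_worth_it(10_000_000, 300_000))  # False (3% < 4%)
```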
Other things which are better that we'll probably never have include broader gauge railways, Dvorak for everyone, Wankel engines, and Betamax (vs VHS).
Nearly all of the world and most science and engineering measurements in the US have managed the conversion, so why not the rest of the United States?
If the change were performed, the time gained by absolutely everyone using a saner unit system could offset the cost, maybe just by avoiding errors?
These are ideas that have to be transformed in order to be shared with almost everyone else.
Shameful confession, I do use pounds for weight.
The general idea in most of these arguments is that "more sane" is a very hard thing to define. There are cases for which metric is more useful. There are cases where it's less useful. Yet the argument is almost never actually made that way; metric tends to be presented as an ideally perfect system not susceptible to even the most trifling criticism, and any case presented in which it's disadvantageous compared to an alternative is dismissed with an ad hominem attack implying the person making the case is just too much of a stupid American to grasp the heavenly best-in-all-possible-cases-for-all-people perfection of metric.
Truth is it's an arbitrary system optimized for a particular set of use cases. The traditional-ish units used in the US are a different arbitrary system, optimized for a different particular set of use cases.
> Truth is it's an arbitrary system optimized for a particular set of use cases.
Which is a particular set of use cases.
Besides, what are we saying to a future Martian colony by converting to an Earth-centric unit? The metre was originally defined in 1793 as one ten-millionth of the distance from the equator to the North Pole. Do we think our Martian brethren will be ill-educated in history? Would Mars demand its own metre? Are we sending a message of Earth's lasting dominance over the colony?
Perhaps, if we convert, we make the simple change of defining 1 foot as the distance light travels in one nanosecond. It's pretty damn close to that already, and then we don't have issues with Mars.
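The "pretty damn close" claim is easy to verify; both constants below are exact by definition:

```python
C = 299_792_458   # speed of light in m/s (exact, by SI definition)
FOOT_M = 0.3048   # international foot in metres (exact, by definition)

light_ns_m = C * 1e-9       # distance light travels in 1 ns, in metres
print(light_ns_m / FOOT_M)  # ~0.9836 feet, within about 1.6% of a foot
```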
If you can make a B&S Gauge X+1 wire from a B&S Gauge X wire in one drawing step, while making a metric size N wire from a size M wire requires two drawing steps, that actually matters.
The article alludes to this, but doesn't actually explicitly state it.
I much prefer base-12 or base-16 systems, especially when measuring and hanging drywall!
On the other hand, the British waited way too long (until 1971) to decimalize their currency. Before that the pound was made up of 240 pence.
What measure of time do you use? Hour, minute, second, month, day, year? Those are pretty arcane. The first four are pretty arbitrary, and existed way before America.
People in general don't give a [ expletive deleted ] about the source code; they just want the answer.
There's a sociological phenomenon at play here all right... a deeply pathological form of envy that Freud called "denial of USA-number-one-ness."
There's a huge (but not insurmountable) lock-in effect behind wire gauges. All the wire I can buy is sized in it. All the wire strippers, crimpers, pins, sockets, plugs, jacks, terminal strips, insert/removal tools and clamps I can buy are sized around the AWG standards. Bulkhead passages in ships and aircraft are sized for carrying certain numbers of wires of certain gauges in bundles. Tens of millions of engineering drawings specify wire sizes in AWG.
So why should we in the US change all of this? Just for the sake of being "modern"? I'll need a more persuasive argument than that.
Anyway, at least for a while, if you want legal electrical work done, you're going to have an incredibly expensive and dangerous cutover. It's likely that metric-cutover mistakes will cause as much property damage and death as "X" years of not harmonizing under one system, where "X" is probably many more years (lifetimes?) than you'd expect.
Other places saw more benefit to adopting SI units either because they were smaller and closer to lots of other places with a need to harmonise, or playing catch-up. My own country, Britain, falls in the first group - though the population still uses some everyday old units, it didn't manage to be a hold-out despite its historic position and industrial heritage.
It's hard to see the position being static in the longer term though, particularly if world economic growth gradually leaves the USA with a relatively smaller slice (in absolute terms much bigger, but a smaller proportion) of the larger future world economy.
Well temperament wasn't known before 1681; there were predecessors that came close, but none used the twelfth root of two.
Logarithms were introduced by John Napier in 1614, the twelfth root of two was first calculated by Marin Mersenne in 1636, and well temperament was introduced by Werckmeister in 1681.
> Around 1600 Simon Stevin did attempt to calculate numerical values for the pitch intervals by decomposing 12th roots into combinations of square and cube roots; his results were not flawless.
He had the right idea but not the right math.
My source for the statement that "instruments were being tuned to this scale well before the invention of logarithms" is an article by Edward Dunne and Mark McConnell, "Pianos and Continued Fractions," Mathematics Magazine, Vol. 72, No. 2 (Apr., 1999), pp. 104-115. They write: "Guitars in Spain were evenly tempered at least as early as the fifteenth century, two hundred years before Bach. And Hermanus Contractus, born 18 July 1013, invented a system of intervallic notation that anticipated equal temperament."
My source on Simon Stevin's work is Rudolf Rasch, "Tuning and Temperament," published as Chapter 7 in The Cambridge History of Western Music Theory. Rasch gives an interesting account of the state of root extraction before the advent of logarithms. He attributes the errors to "Stevin’s sometimes rather reckless rounding of digits after the decimal period, which are often truncated rather than rounded."
- Wild Thing, Josh Bazell