The 39th Root of 92 (bit-player.org)
217 points by bit-player on Aug 6, 2016 | 118 comments

I continue to be baffled by the arcane units in the US.

This uses a measure on the completely wrong abstraction level.

An accidental production process now defines a wire unit instead of the end result the consumer/engineer/technician wants to know (mm^2 or inch^2? oh no, is it pica^2?).

There must be some sociological phenomenon going on that keeps the US from adopting modern units. Maybe can't admit a shortcoming and fix it?

I am not sure. American units usually focus on divisibility and factoring of integers, while metric units focus on quick base-10 math. So they are different concerns and this is why, I think, people from elsewhere in the world have so much trouble understanding American units.

If you are brought up with metric units, the units are built around the number system.

A good way to frame the problem for the HN audience is the use of IEEE floats where precise numbers matter in programming. The trouble with IEEE floats is that we usually think about them in base 10 but they represent something in base 2, and no lossless conversion across the whole domain is possible. American units bridge that sort of gap much better than metric ones do because they focus more on divisibility than on quick representation and conversion.

There are certain areas where metric makes sense. If I have prices per kg and I want to know how many g I can buy for a certain amount of money, then because the money and weight systems coincide, we are well optimized for that problem.

But, imagine calculating the angles of a triangle or a hexagon if your degree system was base 10.

In other words, American units are abstracted around actual application.

There is of course a happy medium. We could change to duodecimal numbers and come up with a duodecimal metric system. Then everyone would get what they want, right? ;-)

Shout out to another base-12 fan. If only we had 12 fingers it might have happened.

Arbitrary divisions of angle aren't very interesting, since the radian is so fundamental. But there is an angle system using multiples of 100: gradians. 100 gradians is a quarter-turn and the internal angles of a triangle add up to 200. A lot of electronic calculators still offer them (the DRG button = degrees, radians, gradians).

Gradians are absolutely terrible. For everyday use, radians are also pretty impractical. I personally like just writing fractions of a full turn, but degrees/minutes/seconds aren’t too bad.

There are two cases where linear divisions of arclength-measured angles make sense: (1) the angles to make regular polygons of low numbers of sides, (2) repeated binary divisions, whose cartesian coordinates can be computed easily because we can just add two vectors and renormalize using an inverse square root which we know how to efficiently compute, and which are then nice for fixed point representations using binary integers in computers. For the first case, I think rational fractions of a turn work best; for the second case, I like binary angles (“brads”) https://en.wikipedia.org/wiki/Binary_scaling#Binary_angles ; losing the ability to precisely represent a third of a full turn is just one of those trade-offs you sometimes need to make.
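The binary-angle idea can be sketched in a few lines of Python (the function names and the 8-bit width are illustrative choices here, not any standard API):

```python
# Binary angles ("brads"): an angle stored as an integer fraction of a full
# turn, here with 8 bits (256 steps per revolution).
import math

BITS = 8
STEPS = 1 << BITS  # 256 brads per turn

def to_brads(turns: float) -> int:
    """Fraction of a turn -> nearest 8-bit binary angle (wraps modulo 256)."""
    return round(turns * STEPS) % STEPS

def brads_to_radians(b: int) -> float:
    return (b / STEPS) * 2 * math.pi

print(to_brads(0.25))   # 64: a quarter turn is exactly representable
print(to_brads(1 / 3))  # 85: a third of a turn is not (85.33... rounds to 85)
print((to_brads(0.75) + to_brads(0.5)) % STEPS)  # 64: 1.25 turns wraps to 0.25
```

Note how wrap-around is free: adding angles is just integer addition modulo 256, which is why fixed-point brads were popular in games and embedded code.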

Otherwise, treating angle as a linear quantity with precise measurements (instead of some kind of approximations) is for most problems less useful than just using a cartesian coordinate representation for an angle (possibly keeping track of the coordinates squared if we want to stick to rational arithmetic.)

Radians are only really useful for solving calculus problems by hand, or writing academic math/science papers where it’s important to stick to established conventions. For practical computation radians are almost always an inferior choice, more computationally expensive and less precise (they’re built into lots of existing libraries, so can often be convenient despite the inefficiency).

Grads are part of how the metric system was constructed: the metre was originally 1/10,000,000 of the distance from the North Pole to the equator via Paris, so 1 grad of arc on a great circle is 100km. Compare the nautical mile, which is 1 minute of arc.

> I personally like just writing fractions of a full turn

Well just multiply your fraction with τ = 2π and you get radians. E.g. a whole revolution is τ radians (or 360°), half a revolution is ½τ radians or (180°), etc.

We do have 12 visible divisions on the non-thumb fingers, though!


Each finger has three segments, starting with the index and ending at the pinky, and you count by pointing to the relevant segment with your thumb.

One of the Native American tribes went with base-8 because they counted the spaces between the fingers, since you can haul things like bottles with those.

I too wish base-12 were used more. Given how much a penny is worth, getting rid of decimal money and substituting a base-12 coin would solve a lot of pizza / dinner cost split problems.

> American units usually focus on divisibility and factoring of integers

The other day, right next to the problematic awg definition, I discovered the kcmil. Which is a kilo times pi/4 micro square inch, i.e. a thousand "circular mils".
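For anyone who wants to sanity-check that definition, here's a small Python sketch (the conversion function is mine, not from any library):

```python
# One circular mil is the area of a circle 0.001 inch in diameter, so
# 1 kcmil = 1000 * (pi/4) * 1e-6 square inches.
import math

MM2_PER_IN2 = 25.4 ** 2  # 645.16 mm^2 per square inch

def kcmil_to_mm2(kcmil: float) -> float:
    in2 = kcmil * 1000 * (math.pi / 4) * 1e-6
    return in2 * MM2_PER_IN2

print(round(kcmil_to_mm2(1), 4))    # 0.5067 -> 1 kcmil is about half a mm^2
print(round(kcmil_to_mm2(250), 1))  # 126.7  -> a common large-feeder size
```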

Have you heard of the "cubic ton"? I am not kidding.


"Hundred" used to mean different things for different commodities.

four score for some things, six score for others, or for onions, a hundred was 15^2

That is beautiful.

I can see what you are trying to say, but I really don't think you're right. Yes, fractions are sometimes nicer with other subdivisions than decimal, but e.g. angles of a triangle: right-angled with 90 deg + 2 x 45 deg would be 50 + 2 x 25 deg. Works fine. Ultimate precision isn't necessary in everyday use.

The real reason why the US hasn't switched is that it is really painful to change something like this.

If you have never wired a house for electrical, then sure, the units seem arbitrary. But if you do, then the units seem very comfortable. 12ga for most typical 20a circuits, 14ga for light, 10ga or thicker for some heavy-duty circuits.

Hey boss, we seem to be all out of 3.31mm^2 romex, all I got left is 2.08mm^2. And why did you mention cross-sectional area? Why not radius? or diameter? Or better yet, circumference? I'll take awg12 and awg14 please.

The "it's more natural" argument, fielded by both sides, is largely a myth - take it from someone who lived in several countries and struggled with different units. Pretty much the only reason why something feels natural to you is because that's what you are accustomed to using.

There's nothing more natural about an inch than a centimeter, and nice, round, easy-to-remember numbers can be had on both scales for "natural", commonly found distances. Ditto for pounds and kilograms, Celsius and Fahrenheit, acre and hectare etc.

Objectively, metric wins because it uses the same decimal scale as our number system, and because the units are designed to establish the most straightforward relations between different quantities.

Yes, using 12 for a base has some advantages due to more divisors, but not being consistent with decimal wipes them all out (and traditional units don't consistently use 12, either - consider units of volume, for example). In an ideal world, we'd have 6 fingers on each hand, and use base-12 everywhere; alas...

I don't know what's natural, but I live in Finland where we use the metric / SI system extensively. Some people don't even know how long an inch is, let alone a foot.

Recently I was building a roof with an old, very experienced builder. Turns out that on construction sites, nails and planks are always discussed in inches, even when they're actually metric. So a 60mm nail would be "a two point fiver" ("kakspuokki" or such in Finnish).

The reverse is true for metric pipe threads. They're named with mm but are actually the same dimensions as the inch threads. Luckily the inch numbers have only an arbitrary relationship to any dimensions of the thread anyway so nobody will get confused trying to measure them.

The sizes of lumber and other construction materials are inch based, rounded to mm, even in Finland. Instead of 1/4", 1/2", and 3/4" we have 6.5mm, 12mm and 18mm.

Note the inconsistency in 1/4".

I mean I hate the US customary system (i'm from a metric country and metric early-education), but jeez, AWG is fine IMO.

What's the purpose of having mm^2? You can't measure it any easier (how the heck are you gonna measure area without calipers? you're gonna need a gauge with holes in it to identify wires either way), nor does it make the numbers any easier to express.

For manufacturing purposes, expressing the radius or diameter might be good, but for using them, the AWG number is really nice and streamlined.

For manufacturing you want the cross-sectional area because that relates most directly to the quantity of material.

You may have a point when it comes to mm vs. inches, etc. But my post was about logarithms vs. cross-sectional area vs diameter vs. circumference. Does your point still hold?

Cross-sectional area is the most important metric for wires because resistance is inversely proportional to it (R = ρL/A), and resistance in turn determines current-carrying capability.

It's really a question that ought to be asked to someone who actually deals with wires in the industry as their day to day job, but I suspect that area is actually the most useful metric, because it can be directly plugged into formulas for tensile strength and electric resistance.

This doesn't really preclude a logarithmic scale, but it should be the kind that's easy to convert (i.e. increasing numbers denote increasing area). Looking at AWG, it could actually even be decimal, like dB. Consider: 17 gauge is almost exactly 1 mm^2 in area, so if we pick exactly 1 mm^2 as 0 on our hypothetical scale, then 10 would be 10mm^2 - close to 7 gauge, and -10 would be 0.1mm^2 - close to 27 gauge. And there are plenty of industries that already know how to work with dB scale, and use the shortcuts that it offers.
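A rough Python sketch of this hypothetical decimal scale. The AWG diameter formula is the standard one; the scale and the function names are invented for illustration:

```python
# Sketch of a decimal (dB-like) wire scale: value s corresponds to a
# cross-section of 10**(s/10) mm^2, pinned so that s = 0 is exactly 1 mm^2.
import math

def awg_area_mm2(n: int) -> float:
    """Cross-sectional area of AWG gauge n, from d(n) = 0.005in * 92**((36-n)/39)."""
    d_mm = 0.005 * 92 ** ((36 - n) / 39) * 25.4
    return math.pi / 4 * d_mm ** 2

def decimal_size(n: int) -> float:
    """Where gauge n lands on the proposed 10*log10(area / 1 mm^2) scale."""
    return 10 * math.log10(awg_area_mm2(n))

print(round(decimal_size(17), 2))  # ~0.16: 17 gauge is almost exactly "0"
print(round(decimal_size(7), 2))   # ~10.23: close to "10", i.e. 10 mm^2
print(round(decimal_size(27), 2))  # ~-9.91: close to "-10", i.e. 0.1 mm^2
```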

By the way, while looking up related things, I've discovered the existence of a weird unit called "circular mil" (basically, cross-section of a wire 1 mil in diameter) that is, apparently, already used in US for wires that are out of bounds on AWG gauge scale. Which seems to indicate that cross-section area is, indeed, the preferred metric.

That's the trick. In a sense, the units follow the application, and hence English units always feel a bit natural. They have a tendency to have integer multiples and simple fractions.

And when I say natural, I don't mean elegant. They often tend to be thoroughly arbitrary. But they match the needs, which is usually reasonably pragmatic. For example, 360 degrees is ideal for simple in-head directional geometry, but radians are by far simpler in algebra or very precise measurements (because Pi can often be neatly factored out).

Once you start using enough decimal points, all units are lousy. I remember using angstroms in astronomy because they fit better into the optics theory, and the distances are already so absurd that it didn't matter. So you may as well use the units that are convenient, and just get good at swapping.

(PS: glass sheets are sold by the square foot, but in thicknesses measured in millimeters. Turns out to be pretty convenient that way.)

The Sumerians divided the circle into 360 degrees because the Sun's annual path took 360 days to come around a full circle. It's no wonder that 360 degrees feel natural for directions.

Another thing I've encountered: in South India, there is a measure of distance & time called a "nazhika". It is 24 minutes. 2 1/2 nazhika is one hour, 60 nazhikas make a day. A nazhika is also roughly the time taken for a normal person to walk 1 mile. Hence a nazhika is also used as approximately one mile. Seems a bit weird until you remember that light-years involve the same identification of time and distance.

Non-metric units like these and others ("foot") have probably very good reasons behind them, and they could be used by people in their day-to-day activities while not carrying measuring instruments with them. Metric System is more systematic, but people "lose touch" with intuitions of quantity.

I assume that the Sumerians were observant enough to realize that there were still a few days missing, and that 360 was actually chosen because it is a more divisible number.

The Sumerians did, in fact, periodically insert intercalary months to offset the difference, which suggests that 360 is in part a close approximation and in part convenient for applications.

I feel like unit debates often result from a misunderstanding between sides of "is perfect the enemy of good?" I thought it suspicious that they would have miscounted the days, but you're right: it's so dang useful that it's worth a little error (with occasional corrections) for such convenience.

Of course, at some point it becomes a huge pain and you buckle down and choose something just as arbitrary but easier to handle. And that's why we don't count angles in the milliseconds since 1970. Or days in a year; though we do sometimes measure 3D angles in solid minutes, which seems like a metaphor taken too far :)

How many kiloseconds will you spend working next week?

Even in metric-using countries, nobody uses it for time.

For actual applications of spherical trig, though, degree, minute, second makes a lot of sense, primarily because the earth rotates about one degree every 4 minutes, and if you know this then you can do manual navigation via the stars and many other things.

However for many things I do use metric time, just not for the human aspect of it. For example, for one customer (admittedly in the sciences), we had to help them estimate how much hardware they needed for additional load. So you do the work in seconds, because at that point the math is easiest, and convert using ratios afterwards.

I for one would love to have metric time. Just today I had to add some durations together to get total duration, which would have been simple thing to do with metric time.

Of course for practical use the SI second wouldn't be a very good solution. Traditionally the second is derived from the length of the day, and I think that would make sense for metric time too. 1 milliday would be somewhat close to 1 minute and 50 millidays (or maybe half a deciday) would be close to one hour. Of course the name should probably be something other than "day" to reduce confusion.

So I could be working something like 16 decidays next week.
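A minimal sketch of the arithmetic, assuming the hypothetical "deciday"/"milliday" units proposed above (1 deciday = 2.4 h, 1 milliday = 86.4 s):

```python
# With day-based decimal time, adding durations is plain decimal arithmetic;
# conversion to familiar units happens only at the edges.
def decidays_to_hours(dd: float) -> float:
    return dd * 2.4

def millidays_to_minutes(md: float) -> float:
    return md * 1.44

print(decidays_to_hours(16))    # 38.4 -> "16 decidays next week" is a workweek
print(millidays_to_minutes(1))  # 1.44 -> a milliday is indeed close to a minute
print(decidays_to_hours(0.5))   # 1.2  -> half a deciday is close to an hour
```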

I did not know this. Thanks.

Regarding your PS: The most baffling measurements to me are tires. The diameter is expressed in inches, the width in millimeters, and the sidewall height is given as a percentage of the section width. So you get 255/40R17 to describe a tire on a 17” rim that's 10” wide with 4” sidewalls. (Or a 430mm rim with a tire that's 255mm wide with 100mm sidewalls.)

And try replacing your wheel with, say, a 16- or 18-inch one, and calculating the tire size that keeps the same overall diameter...
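The mixed-unit decoding, and the rim-swap calculation, can be sketched like this (the function names are made up for illustration):

```python
# Decoding a 255/40R17 tire: 255 mm section width, sidewall 40% of the width,
# on a 17 inch rim.
def overall_diameter_mm(width_mm: float, aspect_pct: float, rim_in: float) -> float:
    sidewall = width_mm * aspect_pct / 100
    return rim_in * 25.4 + 2 * sidewall

def aspect_for_same_diameter(width_mm: float, rim_in: float, target_mm: float) -> float:
    sidewall = (target_mm - rim_in * 25.4) / 2
    return 100 * sidewall / width_mm

d = overall_diameter_mm(255, 40, 17)
print(round(d, 1))  # 635.8 mm overall, i.e. about 25 inches
# Moving to a 16 inch rim while keeping the same overall diameter:
print(round(aspect_for_same_diameter(255, 16, d)))  # 45 -> 255/45R16 is close
```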

One of the neat things about degrees and plane geometry is how many regular polygons you can have without partial degrees.

The number of sides depends on the divisibility of 360 by the number of angles, so:

3, 4, 5, 6, 8, 9, 10, 12, 15, 18, 20, 24, 30, and so forth.

For angle measurement where you want a closed geometry figure at the end, degrees are extremely elegant, and 360 is 2^3x3^2x5
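That list is just the divisors of 360 that are at least 3, which is easy to check (a small illustrative sketch):

```python
# An n-gon with whole-degree angles needs n to divide 360: the interior angle
# is 180 - 360/n degrees, which is an integer exactly when n divides 360.
def whole_degree_polygons(limit: int = 30) -> list[int]:
    return [n for n in range(3, limit + 1) if 360 % n == 0]

print(whole_degree_polygons())  # [3, 4, 5, 6, 8, 9, 10, 12, 15, 18, 20, 24, 30]
```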

360 is arbitrary I hope you know. https://en.wikipedia.org/wiki/Degree_(angle)#History

Wouldn't a much more natural unit for angular measurement be the full turn, i.e. 2*pi radians? That way angular measurements are represented as their fraction of a circle (which is pretty easy to visualize; 180 degrees becomes 1/2.)

You'd lose the definition of angle as arc length / radius, which would make your proposed unit just as arbitrary as degrees - most formulas involving angles would need to contain a conversion factor.

You'd also lose the small-angle approximations sin(theta) ≈ tan(theta) ≈ theta, which in my field are used extensively to convert nonlinear to linear equations.

Nonetheless, your proposed unit already exists and is called tau. Or write it as 2pi if you want to be more easily understood.

The most relevant point here is that the natural choice of units is very subjective and depends on the task at hand. For example, particle physics uses "natural units" where all units are powers of gigaelectronvolts: https://en.wikipedia.org/wiki/Natural_units#.22Natural_units...

Same deal for metric, though. I know I need 0.75mm2 for small appliances, 1.5mm2 is your run-of-the-mill cable for up to 10A (comparable to 20A at 110V), and 2.5mm2 or 4mm2 for heavy-duty circuits.

I've sometimes heard the same remark about temperatures: jeez, how do you guys manage to work with something as unintuitive as degrees celcius? But really, you just get used to whatever it is you're using.

Wait, is there a temperature measurement out there that's more intuitive than "water freezes at zero and boils at 100"?

In Sweden, when running wiring we'll refer to the dimensions by monikers like "one and half squared", "six squared" and so on, corresponding to "1.5mm^2" etc. It has never felt very bulky to me, and doesn't seem like more effort than saying 14-gauge (or at least close enough).

If you're going to knock the other way, wouldn't it be fair to at least check what it is rather than making something up as a strawman?

Your example of mm is not true at all. There's no need for 3 significant figures for electrical wire size. Where I am, it's 2.5mm^2 for 10 A circuits, 1.0mm^2 for lighting, etc. The 2nd digit is always a 0 or a 5 so it's even simpler than it looks.

In New Zealand, I'm fairly sure that the standard is to use diameter to measure wire. It's something like 2 mm for a 10 Amp circuit.

awg14 or whatever isn't exactly the perfect gauge for whatever application it is used for. It is somewhat arbitrary. A nice metric system of gauges would have 2 mm, 2.5 mm, 3 mm, etc. and there would be still be a nice round number gauge for each application.

For some value of "nice round number". What really matters with most wires is the current carrying capacity, which largely depends on the cross-sectional area, not the diameter. Equally-spaced diameters don't give you equally-spaced areas.

If you have an industrial production process which is going to produce a set number of wire sizes, and you want them to be maximally useful for a wide variety of applications, you want a logarithmic scale.

Responding to commenter e2e8: Using 2mm, 3mm, etc. would be substantially less functional.

Once you have a log scale, it hardly matters whether you measure diameter or area, that’s just a constant factor.

In the ideal case, the log scale would have a slightly easier to compute definition for the dilation at each step than the 39th root of 92 (~1.1229).

Perhaps they could use the 6th root of 2 instead (~1.1224). Then every 6 steps you’d get a factor of two. Size 0 could be defined as 1mm diameter (or size 0 could be 1 millimeter squared cross sectional area, with 3rd root of 2 as the step each time).
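A quick check of how close the two steps are (a throwaway calculation, not production code):

```python
# Comparing the actual AWG step (39th root of 92) with the proposed
# 6th-root-of-2 step.
step_awg = 92 ** (1 / 39)
step_alt = 2 ** (1 / 6)

print(round(step_awg, 4))  # 1.1229
print(round(step_alt, 4))  # 1.1225
print(round(100 * (step_awg / step_alt - 1), 3))  # per-step difference: 0.042 (percent)
# With the 6th-root step, six gauge steps double the diameter exactly:
print(round(step_alt ** 6, 12))  # 2.0
```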

In practice if you’re wiring a house, it doesn’t much matter.

* * *

One of the reasons the metric system built on a base ten number system is so frustrating for practical purposes is that powers of ten are entirely arbitrary and indivisible, and tenth roots of ten or numbers expressed in terms of natural logarithms are even worse. We’d be much better off with a general-purpose base twelve number system, plus log scales uniformly designed around the twelfth root of 2. [Western music scale, ISO paper sizes, etc. would fit right in.]

AWG is exponential not logarithmic.

For an exponential scale you can just pick some reasonably evenly-spaced round numbers and repeat them at different factors of 10: 1, 2, 5, 10, 20, 50, ...

You can plot out the wire gauges you want on a log scale and then round them to nice whole (metric measurement) numbers. That is in fact what is done for things like fasteners.

And then when you try to scale your whole design up, whoops, all the rounding changes and you need to redo everything.

Anyway, neither way here is right or wrong. One optimizes for uniformity of scaling, the other optimizes for intelligibility with a base ten number system.

It's not like you are making tight fits with wires anyway, a millimeter here or there won't matter.

However, using something like the E12 series would work pretty well.
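For reference, the E12 series is roughly twelve logarithmic steps per decade. A quick sketch comparing the standardized nominal values with the ideal ones (the E12 values are the standard nominals; the comparison itself is my own check):

```python
# The E12 nominal values next to the ideal twelve-per-decade
# logarithmic steps 10**(k/12).
E12 = [1.0, 1.2, 1.5, 1.8, 2.2, 2.7, 3.3, 3.9, 4.7, 5.6, 6.8, 8.2]
ideal = [round(10 ** (k / 12), 2) for k in range(12)]

print(ideal)
# [1.0, 1.21, 1.47, 1.78, 2.15, 2.61, 3.16, 3.83, 4.64, 5.62, 6.81, 8.25]
# Most nominals match the ideal to two significant figures; a few (e.g.
# 2.7, 3.3, 3.9, 4.7) are slight historical roundings that stuck.
```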

Sure. Those numbers I gave could be areas. I am not certain this is the right standard, but IEC 60228 specifies 0.5 mm2, 0.75 mm2, 1 mm2, 1.5 mm2, 2.5 mm2, 4 mm2 etc...


Just like with software, backwards compatibility is a double-edged sword.

You can look at almost any land assignment, building or commercial sale ever in the history of the United States and be able to compare / modify / extend it without any conversions. If my great-great-grandfather claimed 160 acres from the Homestead Act, any documentation from that time is just as easily understood today without any conversions.

Differing units only become an issue when globalization factors in (which is a fairly recent phenomenon). Many fields in the states, such as the sciences, have indeed chosen interoperability over backwards-compatibility, but these decisions are done on a field-by-field basis.

I'm in the camp that it'd generally be better to just rip the band-aid off and switch everything, but in general I value interoperability over backwards compatibility with software too. That said, who am I to tell Microsoft that they shouldn't worry about Windows backwards-compatibility? Who am I to tell farmers that they should measure their land in hectares?

> You can look at almost any land assignment, building or commercial sale ever in the history of the United States and be able to compare / modify / extend it without any conversions.

What state? Are those 160 international acres, or 160 US survey acres?


What weighs more - an ounce of feathers, or an ounce of gold? (An ounce of gold because gold is measured in Troy ounces...)

As much as I hate the metric system myself, I don't think it's true that U.S. measuring culture is a self-consistent system that avoids unnecessary conversions.

For example the energy content of fossil fuels tends to be measured in BTU per (whatever), while electrical energy is measured in kilowatt hours. On the other hand, automobile engines are often rated in horsepower.

I find this sort of thing is ubiquitous in engineering. In metric-only countries, the situation is slightly less bad because that system provides somewhat fewer opportunities for mischief.

In imperial vs. metric, I don't think the actual units are as big of a difference as the convention of using fractions vs. decimals.

When doing manual work, e.g. woodworking or metalwork it's pretty convenient to use fractions. You tend to divide things in half or quarters.

With most things not made by hand any more, using fractions and inconsistent units like inches, feet, yards and miles is a nuisance for engineers.

Inches, feet, yards and miles aren't really inconsistent, because they are all related by somewhat reasonable integer ratios. In this computer age, there isn't even much advantage in replacing all those ratios with powers of ten.

It's when people inflict mixtures of BTUs with h or Joules with eV on us that I get aggravated. Each of those units has its place, but only in particular domains. Worse, the people most likely to use domain-specific units, are the ones most likely to not name the unit they are using.

Engineers are such people.

The SI system helps reduce the risks a little. But only a little, I have seen German engineers measure velocity in degrees.

If you're working with floating point in meters, all it takes is a look at the exponent (e.g. 1.75923e-5) to figure out whether it's micrometers, centimeters or kilometers. Not as practical with a non base-10 unit system.

Who says people should collectively forget what an inch is?

Every other country made unit conversions.

The people will use old units for some time to buy cheese at the counter and all is well.

Yeah, right up until your car lease contract limits you to 2000 Swedish Miles per year. (Note, these are about 6x longer than American miles).

Or you buy fruit at the supermarket in Indonesia in ounces and kg (an ounce is an hg).

I guess anyone who knows what Swedish mil is will specify kilometres instead.

I cannot remember a single time I have seen a contract specifying mil instead of km.

("Mil" == 10km == 10 000m is used here as well only not in formal or scientific contexts that I can think of but maybe that is a difference between Sweden and here.)

My car lease (in Sweden) is in Swedish miles.

2000 miles (i.e. 20k km).

The problem with your argument is that most other countries have converted to metric without any issues regarding the fact that your land used to be 1000 acres and is now 404 Ha.

and for some of these countries, their official records can even be older than the 250 years you'll find in the US.

They took an existing system and standardized it with as few changes as possible. It's the equivalent of continuously updating the same code base instead of rewriting from scratch.

You know what is worse than imperial units? Measurements that were traditionally inches but converted to millimeters, inconsistently.

I do woodworking in a metric country. It's terrible. My "quarter inch" chisel is 6mm but my "quarter inch" plywood is 6.5mm so it won't fit in a groove made by the chisel. 3/4" chisel is 20mm but plywood is 18mm. All the measurements are inch based but converted to metric arbitrarily rounding up or down.

3/4" is the only measurement that works well; it's 19.05mm. But a 19 mm drill bit isn't a part of any standard set.

Some automotive applications are similar. The Toyota Hilux pickup truck has an indestructible reputation because it was cloned by the Japanese from a US truck. They rounded every measurement up to nearest millimeter. It's beefier than the original while having a smaller and less powerful engine. You do need some exotic sized wrenches to work on it, though.

The English system of measure is based on offsets that made retooling factories easy to do in-house. It's very similar to metric paper sizes. Look into the history; it's actually quite logical.

Appelbaum: So the people blocking adoption of the metric system weren't backward-looking traditionalists, but cutting-edge industrialists?

Mihm: That's correct. While the anti-metric forces included outright cranks, including people who believed that the inch was a God-given unit of measurement, the most sophisticated and powerful opponents of the metric system were anything but cranks. They were engineers who built the industrial infrastructure of the United States. And their concerns, while self-interested, were not entirely off base. Whatever the drawbacks of the English units, the inch was divided in ways that made sense to the mechanics and machinists of the era: it was built around "2s" rather than "10s," with each inch subdivided in half and in half again—and so forth. This permitted various sizes of screw thread to have some logical correspondence to all the other increments. The same was true of the sizes of other small parts that were essential to modern machinery.


The philosophy seems to be reversed with the ISO 216 paper size standards like A4.

Anyway, the main advantage of metric seems to be its universality between different markets, rather than the somewhat silly idea of ideal relationships (such as water in one arbitrary form having a certain weight and volume) replacing a system that was derived from a hundreds-of-years process resembling a genetic algorithm.

I wonder if it is an internal/external optimization. The whole point of metric is that it is built around the number system, leading to easy conversion between units.

The whole point of the American unit system is that they are optimized for specific use, but definitely not for conversion.

ISO paper sizes are a great counterexample, actually, because they took an application-centric, rather than a conversion-centric, approach. I.e. when you print a signature you do so on a larger piece of paper, and then you fold, crop, and bind. Each fold cuts the paper size in half.

So ISO paper sizes are much more like American units than American paper sizes are.
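The fold-in-half scheme is exactly how the ISO 216 A-series is generated. A minimal sketch in integer millimetres (ignoring the standard's exact rounding rules):

```python
# A0 is one square metre with a sqrt(2):1 aspect ratio (841 x 1189 mm), and
# every smaller size is the previous one folded in half across the long side,
# so the aspect ratio is preserved.
def a_series(n: int) -> tuple[int, int]:
    w, h = 841, 1189  # A0 in millimetres
    for _ in range(n):
        w, h = h // 2, w  # halve the long side; old width becomes new height
    return w, h

print(a_series(4))  # (210, 297) -- the familiar A4
print(a_series(5))  # (148, 210) -- A5
```

The preserved aspect ratio is also why a photocopier can scale A4 to A5 with a single fixed reduction factor (1/sqrt(2)).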

ISO paper sizes assume that content is scale-invariant.

US paper sizes assume that you have specific design sizes to your printing technology (picas) and then give you a grid of an even number of those units.

Graphic designers / typographers I know tend to prefer the US paper sizes.

Regular people who want to make photocopies of enlarged/reduced pages tend to prefer ISO sizes.

Totally misses what ISO paper sizes are designed for, namely book binding and signature printing. That's why the scale-invariant approach is so important. Once you know how many leaves in a signature and what page size you want, it is easy to figure out which paper size you want to print on.

So where do 3/8" and 5/16" threads play into this? They aren't easily divided by 2, or multiplied by 2, to make a larger thread.

3/8 is halfway between 1/4 and 1/2. 5/16 is halfway between 1/4 (4/16) and 3/8 (6/16).

And going the other way, 2 * 3/8 = 3/4, and 2 * 5/16 = 5/8, both of which are also standard sizes.

5/16" is the world's most compatible thread. The numbers are close enough that, in the most common thread pitch, you can usually get away with using it interchangeably with M8.

Yeah the engineers love the process of calculating if that wire can carry the specified current. Because it's so industry friendly.

This is complete nonsense and sounds like musings of a desperate advocate.

If you want to believe something, you start to accept the lamest arguments.

You don't really recalculate that stuff over and over, do you? Every electrician I've worked with has a couple of the main ratings memorized and just applies it to standard. Anything more complex requires a calculator or tables either way, and once you're not doing the math in your head, it really doesn't matter what units you use. They're all arbitrary, so you may as well snap 'em to an integer value of convenience.

You'd certainly also want a universal one for conversion, but not for every day tedium. Fahrenheit is nice because it spans the human experience in a nice range (say damn cold 0 to rather hot 100). But that's bunk if you're in context of chemistry where water's properties are far more comparative (0 freezes to 100 boiling). Though absolute scales are always a bit sporky, since they latch to a scale and wonk it sideways (-273 is a silly number no matter what anyone says).

And so on. Scales and units are merely benchmarks. Literally. Pick the right bench for the job and follow the marks.

(Though with enough effort you can make any scale work. Kinda like hammers and threaded carpenter nails.)

Edited; typing on phones ruins grammar.

I actually did this a lot when I was designing electric motors. One detail the original article didn't note is that the 39th root of 92 is very close to the 6th root of 2, off by about half a percent. This makes scaling for voltage changes simple. Going from US 120V to European 240V? Go up three wire gauges and double the number of turns.

In practice, it was not quite that simple, because of the half percent error, and wire insulation doesn't follow the same scaling pattern, but it was still quite handy, and much less trouble than metric, where you constantly had to dig out the wire gauge chart.
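The rule of thumb above (three gauge steps to halve the copper cross-section, so you can double the turn count) checks out numerically; a quick sketch:

```python
# Three AWG steps change the diameter by 92**(-3/39), hence the area by the
# square of that. Is that really a factor of two?
area_ratio = 92 ** (-6 / 39)
print(round(area_ratio, 3))  # 0.499 -> three steps down in size ~ half the area
print(round(100 * abs(area_ratio - 0.5) / 0.5, 2))  # 0.25 (percent deviation)
```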

The thing that always gets me about this is how incomplete it is (at least in most people's experience)

What is the unit of mass smaller than an Ounce? Some people know that this is a Grain (7,000 in a pound, which means the awfully convenient division of 437.5 per oz - assuming avoirdupois of course)

What, if anything, is smaller than an inch? Power-of-two fractions down to 1/64, then "thous" (15.625 thous in 1/64"), although I do like how biblical (KJV) it looks to be writing "thou".
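The arithmetic in those two paragraphs, spelled out (avoirdupois assumed, as the comment says):

```python
# Mass: the grain sits below the ounce.
grains_per_pound = 7000
grains_per_ounce = grains_per_pound / 16   # 437.5, the "awfully convenient" figure

# Length: binary fractions of an inch down to 1/64, then thousandths ("thou" or "mil").
thou_per_inch = 1000
thou_per_64th = thou_per_inch / 64         # 15.625 thou in 1/64"

print(grains_per_ounce, thou_per_64th)
```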

> What if anything is smaller than an inch?

Then we use picas and points. You know, like 12-point font, which is 1/6 of an inch tall. 72 points to an inch, 12 points to a pica.

Get with the program man!

Too bad we had to go and make displays with different pixel densities...

Most people never use units smaller than can be accounted for by fractional inches and ounces in their daily lives, and never did. The only people who ever used grains were pharmacists, jewelers, and the like. The only people who ever had to resort to thous were precision machinists and the like.

Open Amazon, and search for something as mundane as plastic bags or trash cans. Notice that their thickness is specified in "mils". How many people make purchasing decisions on those products, for use in their daily lives?

All you need to know there is that a 3 mil is thicker than a 2 mil. That doesn't require any kind of calculation at all.

Most units are used in the same manner - no-one is mentally converting kilometers into meters to compare two km distances, for example.

The point is that people do have a need for units that are tiny fractions of an inch, even in day to day life.

Miles and km are the same, just on the other end of the scale. A straight magnitude comparison requires no calculation.

The units that people need to perform calculations with are the ones used in measuring for cooking, sewing, home improvement projects, and things of that nature.

Have you considered how much it would cost to change units, and how absolutely little would be gained?

I would imagine the gains would be extraordinary - the deadweight cost of interacting with the metric world must be huge. The needlessly complicated calculations involved in dealing with things like fractions of inches impose their own subtle tax on productive effort. And let's not forget the avoidance of occasional catastrophes like space probes doomed by a conversion miscalculation or an airliner running out of gas in the middle of nowhere for a similar reason.

Having said all that, I'm not sure whether this is really on-topic. The original article is very interesting and an example of an American measurement system that improved on an imperial precedent.

"Extraordinary gains?"

Ordinary gains are about 4% a year (long-term returns of broad investments in the stock market, adjusted for inflation). If you buy a piece of equipment for $100 it should return $4/year (on top of maintenance costs, depreciation, and the like) otherwise you'll do better by buying a stock-market index fund.

How much do you think it would cost an arbitrary manufacturing company to switch over all their tools and equipment to metric measures, update and test all their designs and schematics, run them through any applicable regulatory agencies, and such? Include the cognitive up-front cost of switching, maintaining dual toolchains and inventories for any gradual transition, the opportunity cost of the profitable projects you would have to postpone while your company experts see to the metric switchover, and everything else. Consider adjusting the return downward a little to account for appropriate risks in this process.

Do you see savings substantially in excess of 4% a year? If so, that's extraordinary, and a worthwhile investment! If not, the company would do better for itself and its shareholders by spending money elsewhere, or returning that money through dividends or stock buybacks so it can be put in a stock market index fund.

Clearly I was talking about switching the entire American society from imperial to metric. Your shift of focus to the impact on a single company and its return on capital is certainly interesting, but I am not 100% convinced it's germane. It's not necessarily a good idea to apply a Wall Street lens (quarters trump decades!) to every decision. Other countries around the world have successfully absorbed the short-term pain involved and are reaping long-term gains. Sticking to imperial forever means paying an imperial deadweight tax forever.

Other countries around the world have wasted billions and trillions on a lot of things (senseless wars, for starters) and gotten through the short-term pain that this has caused - that doesn't mean it was a good idea.

If nothing else, the $N billion of nationwide switching costs could have been used on, like, renewable power, or something.

It sounds as if you really think it's a good idea to stick with imperial. In your opinion is this a common attitude within the technologically sophisticated section of US society? I didn't think it was but maybe I am mistaken.

I prefer metric, from a policy perspective, I just think the path dependence has too strong a case to revamp industrial toolchains. (Changing the road signs and temperatures, by contrast, is both trivial and useless.)

Other things which are better that we'll probably never have include broader gauge railways, Dvorak for everyone, Wankel engines, and Betamax (vs VHS).

It's important to clarify whether you mean to change just the units used to describe the standard wire sizes, or changing the standard wire sizes themselves. No logarithmic scale is going to be easy to express precisely and concisely using a unit for diameter or cross-sectional area, but changing the standard sizes to something with convenient numbers in a new unit system would break compatibility with a massive amount of infrastructure.

>Have you considered how much it would cost to change units, and how absolutely little would be gained?

Nearly all of the world and most science and engineering measurements in the US have managed the conversion, so why not the rest of the United States?

Maybe it can be re-framed as: how much has been lost?

If the change were made, the time gained by absolutely everyone using a saner unit system could offset the cost, maybe just by avoiding errors?

These are ideas that have to be transformed in order to be shared with most everyone else.

Shameful confession: I do use pounds for weight.

> saner unit system

The general idea in most of these arguments is that "more sane" is a very hard thing to define. There are cases for which metric is more useful. There are cases where it's less useful. Yet the argument is almost never actually made that way; metric tends to be presented as an ideally perfect system not susceptible to even the most trifling criticism, and any case presented in which it's disadvantageous compared to an alternative is dismissed with an ad hominem attack implying the person making the case is just too much of a stupid American to grasp the heavenly best-in-all-possible-cases-for-all-people perfection of metric.

Truth is it's an arbitrary system optimized for a particular set of use cases. The traditional-ish units used in the US are a different arbitrary system, optimized for a different particular set of use cases.

> Truth is it's an arbitrary system optimized for a particular set of use cases.

Actually it's an arbitrary system optimised for calculations.

> Actually it's an arbitrary system optimised for calculations.

Which is a particular set of use cases.

It would be a serious cost. I saw cost estimates on changing the letterhead for a US state, and that was obscene. Changing units would make that look like a pittance.

Besides, what are we saying to a future Martian colony with conversion to an Earth-centric unit? The metre was originally defined in 1793 as one ten-millionth of the distance from the equator to the North Pole. Do we think our Martian brethren will be ill-educated in history? Would Mars demand its own metre? Are we sending a message of Earth's lasting dominance over the colony?

Perhaps, if we convert, we do the simple change of defining 1 foot as the distance light travels in one nanosecond. It's pretty damn close to that already, and then we don't have issues with Mars.
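How close is "pretty damn close"? A quick check using the exact SI speed of light and the exact definition of the international foot:

```python
c = 299_792_458        # speed of light in m/s, exact by definition
foot = 0.3048          # metres per international foot, exact by definition

light_ns = c * 1e-9    # metres light travels in one nanosecond
print(light_ns / foot) # ~0.9836 feet, i.e. about 1.6% short of a foot
```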

Have you considered how much it costs to keep them?

I strongly suspect that it was anything but "accidental", but was rather related to just how much you could draw down the diameter of a wire in one step without breaking it.

If you can make a B&S Gauge X+1 wire from a B&S Gauge X wire in one drawing step, while making a metric size N wire from a size M wire requires two drawing steps, that actually matters.

The article alludes to this, but doesn't actually explicitly state it.

I remember from reading "Napoleon: A Life" by Andrew Roberts that Napoleon wasn't a big fan of the base-10 metric system, but he figured it would be the best way to break with the hundreds of different measuring systems in place in France at the time and enforce an empire-wide standard.

I much prefer base-12 or base-16 systems, especially when measuring and hanging drywall!

On the other hand, the British waited way too long (until 1971) to decimalize their currency. Before that the pound was made up of 240 pence.

Maybe, but knowing mm^2 doesn't really help either. This AWG system has the benefit of discretization of wire thickness, so manufacturers can optimize their processes for only ~30 gauges instead of answering "what do you mean you don't have 3.1415mm^2 wires?!"
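For reference, the discrete AWG sizes follow a simple closed form: the standard formula is anchored at 0.005" for gauge 36 and 0.46" for gauge 0000 (written as n = -3 here):

```python
def awg_diameter_inches(n: int) -> float:
    """Diameter of AWG gauge n in inches; use n = -3 for 0000 (4/0)."""
    return 0.005 * 92 ** ((36 - n) / 39)

print(awg_diameter_inches(36))  # 0.005 by definition
print(awg_diameter_inches(-3))  # ~0.46, the other anchor point
print(awg_diameter_inches(12))  # ~0.0808, common household circuit wire
```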

This is a bit of a middlebrow comment that we tend to unfortunately find at the top on HN.

What measure of time do you use? Hour, minute, second, month, day, year? Those are pretty arcane. The first four are pretty arbitrary, and existed way before America.

The derivation isn't all that interesting; the use is. Everybody who cares already knows what 12 ga. wire is.

People in general don't give a [ expletive deleted ] about the source code; they just want the answer.

> There must be some sociological phenomenon going on that keeps the US from adopting modern units. Maybe can't admit a shortcoming and fix it?

There's a sociological phenomenon at play here all right... a deeply pathological form of envy that Freud called "denial of USA-number-one-ness."

Okay, I'll play devil's advocate here.

There's a huge (but not insurmountable) lock-in effect behind wire gauges. All the wire I can buy is sized in it. All the wire strippers, crimpers, pins, sockets, plugs, jacks, terminal strips, insert/removal tools and clamps I can buy are sized around the AWG standards. Bulkhead passages in ships and aircraft are sized for carrying certain numbers of wires of certain gauges in bundles. Tens of millions of engineering drawings specify wire sizes in AWG.

So why should we in the US change all of this? Just for the sake of being "modern"? I'll need a more persuasive argument than that.

A somewhat more interesting problem is that we live in a centrally controlled economy with many centers. There's a paperwork storm downstream of thousands of local building codes. Most are minor variations on minor details of the NEC, but it would be a huge job to harmonize and metric-ify them. For example, where I live, politicians "had to do something" after someone died in a pool electrocution decades ago, so our local code is NEC plus a microscopic tightening of 1950s NEC swimming pool regs (which ironically are probably looser than 2016 NEC swimming pool regs, negating the politicians' intent from decades ago).

Anyway, at least for a while, if you want legal electrical work done, you're going to have an incredibly expensive and dangerous cutover. It's likely that metric-cutover mistakes will cause as much property damage and death as "X" years of not harmonizing under one system, where "X" is probably many more years (lifetimes?) than you'd expect.

And that's just the economic "center" for commercial and residential structures. There are centers in lots of other industries. I referred to the one that I deal with directly (fighter aircraft) in my original post. I'm sure you've thought about multiplying this effort across hundreds of other industries...

Yes, that's the core of the issue. The US is a big monobloc, an economic superpower with not only the most advanced economy but also a lot of the history tied to developing the world's industries.

Other places saw more benefit to adopting SI units either because they were smaller and closer to lots of other places with a need to harmonise, or playing catch-up. My own country, Britain, falls in the first group - though the population still uses some everyday old units, it didn't manage to be a hold-out despite its historic position and industrial heritage.

It's hard to see the position being static in the longer term though, particularly if world economic growth gradually leaves the USA with a relatively smaller slice (in absolute terms much bigger, but a smaller proportion) of the larger future world economy.

I agree. The only way things in the US will change with regards to standards like this will be if it becomes excruciatingly painful to continue being different.

I think you're making some assumptions. When I was coming up through the school system in the 90s, I thought metrication was a federal government thing (since it is). I doubt my parents are aware of the origins of the metric system. Things may have changed for the youngest generation though.

> It’s worth noting that instruments were being tuned to this scale well before the invention of logarithms. I assume it was done by ear or perhaps by geometry, not by algebra.

Well temperament wasn't known before 1681; there were predecessors that came close, but none used the twelfth root of two. Logarithms were introduced by John Napier in 1614 [1], the twelfth root of two was first calculated by Marin Mersenne in 1636 [2], and well temperament was introduced by Werckmeister in 1681 [3].

> Around 1600 Simon Stevin did attempt to calculate numerical values for the pitch intervals by decomposing 12th roots into combinations of square and cube roots; his results were not flawless.

He had the right idea but not the right math.

[1] https://en.wikipedia.org/wiki/John_Napier

[2] https://en.wikipedia.org/wiki/Twelfth_root_of_two

[3] https://en.wikipedia.org/wiki/Well_temperament
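Stevin's square-and-cube-root decomposition works because 12 = 2 · 3 · 2, so a 12th root is a square root of a cube root of a square root. A quick check (his errors were in the hand arithmetic, not the idea):

```python
from math import sqrt

# 2**(1/12) = sqrt(cbrt(sqrt(2))), since ((2^(1/2))^(1/3))^(1/2) = 2^(1/12)
semitone = sqrt(sqrt(2) ** (1 / 3))
print(semitone)       # ~1.059463, the equal-tempered semitone ratio
print(2 ** (1 / 12))  # same value, computed directly
```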

(I'm the OP here.)

My source for the statement that "instruments were being tuned to this scale well before the invention of logarithms" is an article by Edward Dunne and Mark McConnell, "Pianos and Continued Fractions," Mathematics Magazine, Vol. 72, No. 2 (Apr., 1999), pp. 104-115. They write: "Guitars in Spain were evenly tempered at least as early as the fifteenth century, two hundred years before Bach. And Hermanus Contractus, born 18 July 1013, invented a system of intervallic notation that anticipated equal temperament."

My source on Simon Stevin's work is Rudolf Rasch, "Tuning and Temperament," published as Chapter 7 in The Cambridge History of Western Music Theory. Rasch gives an interesting account of the state of root extraction before the advent of logarithms. He attributes the errors to "Stevin’s sometimes rather reckless rounding of digits after the decimal period, which are often truncated rather than rounded."

"In metric, one milliliter of water occupies one cubic centimeter, weighs one gram, and requires one calorie of energy to heat up by one degree centigrade—which is 1 percent of the difference between its freezing point and its boiling point. An amount of hydrogen weighing the same amount has exactly one mole of atoms in it. Whereas in the American system, the answer to “How much energy does it take to boil a room-temperature gallon of water?” is “Go fuck yourself,” because you can’t directly relate any of those quantities."

- Wild Thing, Josh Bazell
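For what it's worth, the metric side of that calculation really is one line. A sketch, assuming a US gallon and a 20 °C room temperature (and heating only to the boiling point, ignoring the heat of vaporization):

```python
gallon_ml = 3785.41            # millilitres in a US gallon
room_c, boil_c = 20, 100       # assumed room temperature to boiling, Celsius

# 1 mL of water ~ 1 g, and 1 cal raises 1 g of water by 1 degree C
kcal = gallon_ml * (boil_c - room_c) / 1000
print(f"{kcal:.0f} kcal")      # ~303 kcal
```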

As a guitar player I'm glad the article also mentioned equal temperament tuning. The frets of a guitar for example are spaced using the 12th root of 2. Musical theory is a tangled rat's nest, but at least that part is easy to understand.
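The fret rule in code, with a typical 648 mm (~25.5") scale length assumed for illustration: each fret divides the remaining vibrating length by 2**(1/12), so fret 12 lands exactly at the string's midpoint (the octave).

```python
SCALE_MM = 648  # assumed scale length, nut to bridge

def fret_position_mm(fret: int) -> float:
    """Distance from the nut to the given fret."""
    return SCALE_MM - SCALE_MM / 2 ** (fret / 12)

for fret in (1, 5, 12):
    print(f"fret {fret:2d}: {fret_position_mm(fret):6.1f} mm")
# fret 12 comes out at 324.0 mm, exactly half the scale length
```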

