In the 1990s, Grand Central was rewired, and everything except railroad traction power was converted to 60Hz. All conversion equipment was replaced with solid state gear. It took quite a while just to find everything that was powered off one of the nonstandard systems.
It wasn't until 2005 that the last 25Hz rotary converter was retired from the NYC subway system. (Third rail power is 600VDC, but subway power distribution was 13KV 25Hz 3-phase.)
The only thing missing is 400Hz power!
Do you mean 50Hz there?
>"Until the 1990s, Grand Central Station in New York had almost everything - 60Hz commercial power, 40Hz LIRR power (Pennsylvania Railroad standard) 25Hz NYC Subway power, 700VDC Metro North power, 600VDC subway third rail power, and some old Edison 100VDC power."
and then subsequently states:
."In the 1990s, Grand Central was rewired, and everything except railroad traction power was converted to 60Hz."
I am asking because I am trying to make sense of the entire comment. They also mention 4 or 5 other electrical frequencies and voltages, so there's a lot packed into that comment. I am genuinely interested in the comment and trying to learn something.
A summary: once upon a time, Grand Central Station had AC at multiple frequencies in addition to 60Hz (it also had some DC power). In the 1990s it was rewired so that the AC only used the (American) standard 60Hz.
I suppose it's hard to make sense of if you didn't know that 60Hz is the standard in the U.S.?
They probably had some old 60Hz three phase supply, maybe something weird like a "wild leg" configuration, but when everything was modernized they likely switched to 480V three phase for the big stuff like the ventilation motors, and 240V single phase with neutral for supplying smaller motors and 120V systems.
I think before 1948, LA's power grid ran at 50Hz :)
This made matters trickier after Fukushima, as the nation is effectively two smaller electricity grids, not one large one - so making up for the shortfall became harder than it could have been. (However, there's a massive frequency converter interface between the two grids.)
Edit: Aw, shucks - now that I revisit the article, I see the exact same points being made in that article's comment section. My bad.
The problem still exists, but many newer machines have their own inverters, so it's less of an issue...
Also, running a 60Hz motor on a 50Hz grid will cause it to run hotter (larger current than nominally on 60Hz; also, less efficient cooling as it turns slower). If the designer designed to a price point rather than to a standard, this may be an issue.
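A rough back-of-envelope sketch of why (the motor rating and pole count here are illustrative, not from any particular machine): synchronous speed scales with frequency, and at a fixed voltage the V/f ratio, which is roughly proportional to core flux, rises when the frequency drops.

```python
# Back-of-envelope check, not a motor model: a hypothetical 4-pole
# induction motor rated 230 V / 60 Hz, moved to a 230 V / 50 Hz supply.
def sync_rpm(freq_hz, poles):
    # Synchronous speed of an AC induction motor.
    return 120 * freq_hz / poles

rated_v, poles = 230.0, 4
rpm_60 = sync_rpm(60, poles)   # 1800 rpm
rpm_50 = sync_rpm(50, poles)   # 1500 rpm: the shaft-mounted fan turns ~17% slower

# V/f ratio is roughly proportional to magnetic flux in the core;
# more flux means more magnetizing current and more core heating.
flux_increase = (rated_v / 50) / (rated_v / 60)   # ~1.2

print(f"{rpm_60:.0f} -> {rpm_50:.0f} rpm, flux up x{flux_increase:.2f}")
```

So the motor both generates more heat and sheds it more slowly, which is exactly the "runs hotter" effect described above.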
On the macro scale, however, you've got a problem if, say, you suddenly lose a power plant in the 50Hz region of your country. You cannot then simply make up for the shortfall by distributing the load between all plants in the country (or, unless you're an island nation like Japan - from your neighbours, too) - only the ones in the 50Hz part of it.
This is just a long-winded way of saying that the larger your power grid, the more robust it is when a power plant goes offline.
The Japanese are in the unenviable position of having two small national grids rather than one large one - and, for an encore, they're on a bunch of islands and likely unable to import significant power from neighbours, too!
Hence any plant downtime is felt much harder there than it would be anywhere else. Tough luck.
The timezone database (maintained by people who are very particular about making sure that a specified time is a well-known time) has a note in the northamerica data file:
# From Paul Eggert (2016-08-20):
# In early February 1948, in response to California's electricity shortage,
# PG&E changed power frequency from 60 to 59.5 Hz during daylight hours,
# causing electric clocks to lose six minutes per day. (This did not change
# legal time, and is not part of the data here.) See:
# Ross SA. An energy crisis from the past: Northern California in 1948.
# Working Paper No. 8, Institute of Governmental Studies, UC Berkeley,
# 1973-11. http://escholarship.org/uc/item/8x22k30c
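The six-minutes figure checks out, assuming the slowdown applied for roughly twelve daylight hours: a synchronous clock simply counts mains cycles, so at 59.5Hz it runs at 59.5/60 of real speed.

```python
# Synchronous clocks count mains cycles, so at 59.5 Hz they run
# at 59.5/60 of real speed. Assume ~12 "daylight hours" per day.
nominal_hz, actual_hz = 60.0, 59.5
daylight_hours = 12

minutes_lost = daylight_hours * 60 * (1 - actual_hz / nominal_hz)
print(f"{minutes_lost:.1f} minutes lost per day")  # 6.0
```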
As for the U.S. refusing to adopt the planet-wide metric system? You've got me there. That's just… Weird.
Language isn't close to a solved problem. Hopefully the next hundred years can finally see the reconstruction of the Tower of Babel for a generation of the world soon to come. But there also needs to be a legitimate reason for us to want to unify language. If we operate independently of one another and let businesses control any interactions by proxy of money, we will stay separate.
As for language, I'm kind of split. On one hand language is a tool for communication that becomes more efficient the more standardized it is. On the other hand, language is closely related to art and culture. I'm Icelandic (and a poet at heart) and I love my language deeply. I can succinctly say things in Icelandic that can't easily be expressed in English. And that probably applies to every language.
So I'm torn. I don't want to lose my language. But I also want a more integrated world.
The transition happens much more slowly, although I've been watching it accelerate over the past couple of decades. English is encroaching on the language. We speak Icelandic, but there are small grammatical changes happening, making the language more English - and at the same time, we're losing Icelandic vocabulary.
My prediction is that my grandkids (probably at least 13-15 years in the future, hopefully) will speak Icelandic very similar to what I do, it'll just be a liiiittle different.
But these small changes will certainly accumulate and the pace of change will probably accelerate with more communication and integration. So who knows what will happen in the next 50-100 years.
Language is the major barrier to freedom. I can theoretically get a job and move to Dublin, London, Berlin, Rome, Cadiz, Porto, Budapest, Tallinn or Athens. In reality the language prevents it in most cases.
In 100 years' time we could either live in a feudal society where you can live/work in a small area owned by a local baron who reports to an international mega-corporation, with international-style communications in a language nobody knows - just like 14th century England and Latin - or we could have a free, borderless world where there is frictionless movement and communication. I hope technology will enable the latter before the former entrenches again.
It doesn't have to be a federal law. Change can start gradually with private companies. Pokemon Go, for instance, shows only kilometers and doesn't offer imperial units as an option. All water bottles in the US have both milliliters and fluid ounces on their label.
For instance, a 4 x 8 foot sheet of plywood down at Home Depot isn't really 4 x 8 feet; it actually has an imperial measurement of 47.938 x 95.938 inches, which is approximately 1217 x 2437 mm - the standard metric size is 1220 x 2440 mm.
Thicknesses (as noted in the table too) have similar "whole" metric numbers. You see this also on some (not all) general materials sold to the American consumer public - the actual material is metric sized, but marketed with imperial measurements.
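A quick check of the numbers quoted above (the inch is exactly 25.4 mm by definition):

```python
MM_PER_INCH = 25.4  # exact, by definition

# The quoted actual sheet dimensions, converted to millimetres.
actual_in = (47.938, 95.938)
actual_mm = tuple(round(x * MM_PER_INCH, 1) for x in actual_in)
print(actual_mm)                            # (1217.6, 2436.8), just shy of 1220 x 2440

# The nominal "4 x 8 foot" sheet in metric:
print(48 * MM_PER_INCH, 96 * MM_PER_INCH)   # 1219.2 2438.4
```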
I think this is done both for cultural reasons, as well as just "momentum" and that the public is used to it. But the reality is (almost) everything is now metric, the public just isn't fully aware of it yet.
Also, since the gap between sheets takes space, plywood (and drywall) is a bit less than the stated size, to give you breathing room when putting up 4x8s repeating every 4 and 8 feet.
That said I don’t know why a nominal 2×4 is actually 1½ × 3½. Used to be finishing work shaved it down, now they just make ‘em that way to be ornery.
That's actually required by federal law. A lot of commercial regulations in the US require metric measurements in addition to imperial.
Don't tell people who are annoyed by metric, but all the 'imperial' units we use now are 100% based on metric units. Our metrology has been metric since that time, all the day-to-day usage is derived units.
Which is why an inch is exactly 2.54cm instead of some rounded fraction.
As well as the Dvorak keyboard layout, I imagine
Also, Esperanto is ugly as fuck. It has, I believe, 4 accented letters, and I have no idea why anyone would think when devising a new language that accents or umlauts are good ideas (unless all the letters supported them and had consistent meaning). Otherwise, just use new characters.
Edit: Not sarcasm. I work with imperial daily farming, and worked with metric doing chemistry in college.
> Edit: Not sarcasm.
Heh. I think you'd be disappointed. I haven't heard of any modern country seriously considering moving from metric to imperial units.
> They're sane and easily divisible into eighths and more with a single significant digit of the next smaller
Divisible maybe. But definitely not sane.
Let's take a look at lengths and weights for example:
How many inches in a foot? 12. Ok. Then it must be 12 feet in a yard. Nope. It's 3. How many yards in a mile? Guessing 1000 or 1200. Nope. Apparently 1760.
Let's try weights. Going up from ounces. Since we already did imperial lengths and saw there are 12 inches in a foot, guessing there should be 12 ounces in a pound. Wrong again. It's 16. Ok, so then 1 US ton should be 1600 pounds right? Nope, wrong again. Apparently it is 2000!
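The irregularity is easy to see if you just write the chain down; any converter has to hard-code every factor separately.

```python
# Each step up the imperial chain uses a different, unrelated factor.
LENGTH = {"inches_per_foot": 12, "feet_per_yard": 3, "yards_per_mile": 1760}
WEIGHT = {"ounces_per_pound": 16, "pounds_per_us_ton": 2000}

inches_per_mile = 12 * 3 * 1760
print(inches_per_mile)   # 63360, nothing you could derive from a pattern

# Metric, by contrast, is one rule: shift the decimal point.
mm_per_km = 10 ** 6
print(mm_per_km)
```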
I grew up with metric then came to US and lived here for more than a decade. Apart from knowing how many inches in a foot I had to look up all the other ones because they are non-intuitive and don't follow any pattern to me. I can estimate how much is a yard, an ounce, a pound, an inch, and a foot. But again I could intuitively estimate things even better when it came to meters, kilometers and kilograms.
Your biggest complaints are that (1) metric units are not quantities you use in everyday life and (2) that you don't know how large a metric quantity is.
Both are just because you are unfamiliar with the system; people working with this system do not have this problem at all. Regarding (1), people don't care that a litre is too large to drink, they just know that a glass is 0.2-0.25L, so you can get 4-5 glasses out of a litre bottle. The benefit of base-10 units is that they're easy to convert and use; people don't care if something is in cL/dL/L because it's just a comma placement away.
Regarding (2), people have a very good feeling about how big their everyday units are. E.g. a metre is about a step (large/small step depending on your size), that's fine for rough measurements, and about as (in)accurate as your "a foot is my forearm".
Half of 12 is 6, half of 1/4 is 1/8, I drill the hole at 6 and 1/8".
It's easy to get a fractional measurement of whatever is in front of you and immediately work with that measurement in your head then find the corresponding mark on your measuring tool. There is no reason the same system would not work with metric, it just seems to be more common with imperial units.
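The halving trick above, written out with exact fractions (a sketch; the 12 1/4" span is just an example):

```python
from fractions import Fraction

span = 12 + Fraction(1, 4)      # a 12 1/4" span, as in the comment above
center = span / 2
print(center)                    # 49/8, i.e. 6 and 1/8"

# Imperial tape measures mark eighths directly, so "6 and 1/8" is
# immediately findable; the same point in decimal millimetres
# (~155.575 mm) falls between the marks on a metric tape.
print(float(center) * 25.4)
```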
E.g. IKEA cabinets are 60 cm wide (90, or 120 for larger cabinets), common countertop depth is 60 cm, and so on.
That way we get to use sane and consistent units and we can easily see/measure the middle of a 60cm piece of wood or whatever.
It also tickles me how eyeballing these things is apparently something desirable. Whenever I work around my house I measure everything twice to be sure...
Not everything in the world is (or should be) provided by Ikea :)
Or the length of one's foot, even.
You're right about meters not being as good a "human scale" measurement as feet. The idea of average height being 1.8 m is pretty awkward on its face. Turns out, though, the metric world has settled on starting with centimeters. But it's not clear that this is better, because now you end up throwing around high-magnitude numbers like "183 cm" rather than "6 ft".
Given that we're clearly comfortable going with a diminutive unit (viz. cm vs m), I've always thought it would be better for the world to settle on the decimeter as the reference unit instead. It's larger than a centimeter but smaller than a meter, which is what we're after, and it's about the width of one's hand, which is arguably a more natural choice for something "human scale" than the foot. The snag is that "foot" still rolls off the tongue a lot more easily than "decimeter". So we go ahead and say 1 dm = "1 hand". It's a great unit, because if we want, we can scale up or down to meters and centimeters with (base 10-derived) constant factors, which is so easy that anybody can do it in their head.
The only snag left is that "hand" is already in use as a unit. This turns out to be less problematic than it sounds, because the legacy hand is an obscure unit really only used in horse breeding. And we're in luck, because as its name suggests, the imperial "hand" is named after the span of one's hand (with fingers extended), so they're roughly the same—it's not as if you end up with one name for two wildly differing sizes. This is the same kind of "conversational equivalence" we get with a ton and a metric tonne. That is, in conversation you're basically never reduced to needing the speaker to clarify which it is that he or she means, because you just don't need that kind of precision—a ton and a tonne are both very large masses in the same ballpark as one another.
Perhaps most importantly, the transition from feet to hands is fairly straightforward in conversational use, because you end up saying that "1 ft" equals "about 3 hands".
People make a big deal about pi versus tau, but getting widespread adoption of the "hand" as a unit seems to me to be a much more worthwhile cause, because it would have a much bigger practical impact on everyday life than tau ever would.
>Or the length of one's foot, even.
If you're using size 14 (US; male), 15,5 (US; female), 48 (Europe), 12,5 (Mexico)...
The good thing about standards is that there's so many to choose from.
thus underscoring your point: we can't even agree on how to represent numbers, let alone measurements!
Don't get me started on "tiny pints" used in the U.S.
The U.S. is on metric; NIST defines and curates standards on the SI and is a critical, founding participant in the CGPM. Manufacturing and engineering specifications (especially for the military) are usually SI. It's just that commercially, imperial units are still popular. Most people aren't engineers or scientists and so they don't think about this a lot, but marketing loves inertia.
I worked as a scientist in the US. Everything is metric. Cars are assembled with metric bolts.
The only thing left that's imperial is generally the consumer-facing units of pounds and gallons.
There's a lot of domain-specific units in use which aren't even imperial originally (fluid barrel, troy ounce/carat, ton of cooling, horsepower, AWG) that will stick around for a long time since they're used so frequently in finance, planning and B2B transactions, along with other imperial units that'll remain because they've taken on domain-specific uses (acre of land, fabric yard, bushel, mils) even when the engineers and logistics folks touching that same stuff are using SI equivalents.
You're right that consumer marketing is still toward imperial. I think it's just going to be a slow process of gradual weaning away as older people pass on. I'm personally of a "in-the-middle" generation (X) - I tend to be more comfortable with imperial units, but if metric is required, or I think it might be a better measurement to use (depending on the purpose), then I'll use it instead.
Admittedly these are oldish examples, but https://en.m.wikipedia.org/wiki/Mars_Climate_Orbiter springs to mind, and my 2005 Jeep is a crazy hodgepodge of metric and imperial bolts.
It's frustrating not being able to repair the tools I buy from home depot using tools bought from home depot. I couldn't even go metric if I wanted to. They don't sell raw materials in metric. You get 4x8' sheets of plywood and something called a 2x4 which seems to have nothing to do with either measuring system. It's not even Cartesian.
The great idea was having larger and smaller units for a given thing related to the base unit by powers of 10. That's a lot easier to work with than 12 inches to a foot, 3 feet to a yard, 1760 yards to a mile.
The stupid idea was basing the meter on the distance between the North Pole and the Equator.
What they should have done is based it on existing systems of units. The meter should have been equal to 3 feet. The definition of the foot at the time, in both England and on the continent I believe, was a bit fuzzy, but they all had such a unit and it was reasonably consistent everywhere. Then they could have worked at making the definition of the foot/meter more precise and reproducible, keeping it within the range of existing practice.
That would have given us the benefits of a metric system, but with the conversion between metric and prior units easier. Converting feet to meters would have then been a simple division by 3. An inch would be exactly 25 mm. (25 is easy to multiply by in your head: just divide by 4 and shift the decimal point two places to the right. Similarly, it is easy to divide by in your head if you do it as a multiply by 4 and then shift the decimal two places to the left.)
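The mental shortcut spelled out as code, under the comment's hypothetical inch of exactly 25 mm:

```python
def times_25(n):
    # multiply by 25 = divide by 4, then shift the decimal two places
    return (n / 4) * 100

# A hypothetical 36" span under the "inch = 25 mm" proposal:
print(times_25(36))   # 900.0 mm
print(36 * 25)        # same answer the long way
```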
Similarly for all other metric base units except for temperature.
For temperature, as far as I can tell, there was NO good reason to switch to Celsius. Unlike with distance, we don't make bigger or smaller units out of multiples of the base unit. We just state the numeric value, in decimal, possibly with modifiers like kilo or micro, and the base unit. In other words, with temperature we were kind of already doing metric, in the sense of using powers of 10 and writing things in decimal.
So the only thing switching from the older, widely in use Fahrenheit scale to Celsius actually made better was changing the anchor points for the scale. Fahrenheit based his scale on a 0 point of the coldest temperature he could make in his lab, and his 100 point at what he thought was human body temperature.
Celsius based his on water freezing at 100 and boiling at 0, but the whole universe thought having lower numbers mean hotter was stupid, so it was quickly flipped to freezing at 0 and boiling at 100.
Now those anchor points are better than the ones Fahrenheit picked, because they are easier to reproduce and more consistent. (They still suck...but they suck a lot less). Fahrenheit's degree size was better, though, especially for the range of temperatures that most people live in. Celsius' degree is too coarse.
The right approach would have been to make Fahrenheit the scale for metric temperature, but anchor at it better points. Instead of 0 as coldest Fahrenheit could make in his lab and 100 human body temperature, use the same anchors Celsius did: freezing and boiling of water. Define freezing as 32 Fahrenheit and boiling as 212 Fahrenheit.
This would have made as much sense as adopting Celsius, and no one would need a new thermometer.
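For reference, the familiar conversion between the two scales as they actually ended up, fixed by the same 32/212 anchor points discussed above:

```python
def c_to_f(c):
    # 100 Celsius degrees span the same range as 180 Fahrenheit
    # degrees (212 - 32), hence the 9/5 factor.
    return c * 9 / 5 + 32

def f_to_c(f):
    return (f - 32) * 5 / 9

assert c_to_f(0) == 32 and c_to_f(100) == 212
print(round(f_to_c(98.6), 1))   # 37.0, roughly body temperature
```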
Same comment for temperatures. Both scales emerged around the same time (mid-1700s). It's not like people switched from Fahrenheit to Celsius...
1. I like the sentiment though (base the unit on something universal in nature).
2. It explains, btw, why 10000 km are about 5400 nautical miles (90 degrees between equator and pole, 60 minutes per degree)
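The arithmetic in point 2, spelled out: a nautical mile was historically defined as one minute of arc along a meridian, and the original metre made the pole-to-equator arc exactly 10,000 km.

```python
# Pole-to-equator distance, per the original metre definition:
quarter_meridian_km = 10_000

arc_minutes = 90 * 60           # 90 degrees pole-to-equator, 60' each
km_per_nmi = quarter_meridian_km / arc_minutes
print(arc_minutes, km_per_nmi)  # 5400 minutes, ~1.852 km each
```

That ~1.852 km matches the modern nautical mile, which is now defined as exactly 1852 m.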
Why is that stupid?
The point of SI (and its predecessors like MKS and CGS dating from the 19th century) is readily achievable, coherent and replicable standards for not just length, but also mass, time, temperature, electric current, amount (of a uniform substance), luminous intensity and several derived units like volume (of an arbitrarily shaped container), pressure, electric charge, force, and so forth.
Some of the fundamental units were harder to insert into a coherent system in an easily replicable and achievable way.
You've chosen to look at two such units -- length and temperature.
The metre has an interesting history whose beginnings suffered from difficulties in achieving independent reproducibility of the standard metre. Starting in the 17th century, various approaches were explored, with two leading candidates surviving into the 19th century, both relating to the geometry of the Earth in principle measurable everywhere with suitable equipment. One candidate required a detailed survey of a meridian and an almanac of angular measurements one could make against objects in the sky or objects receding over the horizon or alternatively with a map angular measurements of objects of known height disappearing over the horizon. Another candidate required an excellent portable frequency standard and an almanac relating that to time-of-day in a location-dependent fashion.
The first is the version that survived until the definition of the metre was tied to the properties of atoms and the universality of the speed of light, mostly because it was more reproducible. This was the "meridional" version; one could readily produce a high precision metre prototype with good equipment and a stable platform on a large calm body of water extending to the horizon along a meridian (this means one could do so essentially on the shore of a large lake). The definition is one ten-millionth of 1/4 of a great circle through both poles of a geoid that averaged out slight differences in oblateness of the Earth's surface. Apart from error terms relating to the non-uniformities in the Earth's true surface, realizations using trigonometry against objects in the sky suffered uncertainties because of the several sources of variation in the rotational speed of the planet. With the advent of GPS and decent approaches to defining a working geoid, a "meridional" approach is still viable, although I would be surprised if anyone seriously proposed doing so as a replacement for the present definition based on the speed of light.
The other leading candidate was the "pendulum" method. When one constructs any pendulum whose half-period is one second anywhere on the surface of the planet, the arm of the pendulum will have a length very close to one metre. One critical problem is that local mass concentrations, altitude, and latitude all influence the length, and already in the 18th century it was clear that the length of the pendulum could vary by several millimetres within a radius of even a few hundred kilometres in some places, and there was no a priori way to determine all of the local contributions in order to achieve the same accuracy possible with the meridional method. Worse, precisely calibrating a seconds pendulum was a difficult technical challenge even in laboratories, even though the underlying mathematical formula was fairly simple. The problem is that in ideal situations the dominant driving term is "g_0", the standard acceleration of terrestrial free fall (as it is now known). Unfortunately the actual acceleration of objects near the surface in free fall in vacuum varies significantly across the whole of the planet, and can even vary over relatively short timescales at one location, and there is no a priori way to determine the expectation value with great reliability. Indeed we've had relatively poor data with which to build a global almanac until the 21st century with satellite observatories like GRACE and GOCE, and even now relying on a seconds pendulum for defining a metre is an unattractive proposition.
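The seconds-pendulum relationship above follows from the simple pendulum formula T = 2π√(L/g); with a half-period of one second, L = g/π². A quick sketch of both the length and its sensitivity to local gravity (g values here are illustrative):

```python
import math

def pendulum_length(half_period_s, g):
    # Simple pendulum: T = 2*pi*sqrt(L/g), with T = 2 * half_period.
    T = 2 * half_period_s
    return g * (T / (2 * math.pi)) ** 2

g_standard = 9.80665                       # "standard gravity", m/s^2
print(pendulum_length(1.0, g_standard))    # ~0.9936 m, "very close to one metre"

# Local g varies; even a 0.5% swing moves the would-be "metre" by ~5 mm,
# which is exactly the calibration problem described above.
print(pendulum_length(1.0, g_standard * 1.005))
```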
The metric unit of temperature is the kelvin, not the degree Celsius. The Celsius scale is based on kelvins, but with a 0 point (that of the triple point of a particular standard of purified water at a particular pressure) that is fairly straightforwardly achieved with decent precision even in a typical school science classroom setting. The kelvin is defined in terms of an exact fraction of that triple point, and it "only" suffers some difficulties in the exact definition and realization of 0 K.
The kelvin is due to be redefined in the next few years taking a fixed value for the Boltzmann constant k_B which can be expressed in terms of J K^-1 where J is Joules and K is kelvins, while the modern Joule is already defined in terms of the Planck constant h, the speed of light in vacuum, and the second. This redefinition is aimed principally at coherency as mentioned at the top, as we replace features common near the surface of the Earth everywhere people live with physical constants expected to be the same everywhere in the observable universe. A strong parallel goal is reproducibility and realizability of the units; the kelvin was already easily reproduced with high precision, and few metrologists are wholly comfortable with a definition that could be much harder to "show" in a lab or factory.
> "more consistent"
The Kelvin scale has always been well-integrated with the other units of SI and its predecessors, particularly since the beginning of the late 19th century programme of defining units in terms of universal physical constants.
The equivalent in U.S. Customary Units is the Rankine scale, which has the same 0 point as the Kelvin scale, but using Fahrenheit-sized degrees (which in the U.S. are anyway defined by NIST as 5/9 K, and NIST prefers "rankine" over "degree Rankine"). There is no (formal) "Imperial" rankine; AFAICT the whole of the former British Empire uses kelvins either in the Kelvin scale or (when discussing weather or cooking, for instance) the Celsius scale, although proximity to the USA and aborted-by-1980s-politics conversion to metric leads to Canada using a mix of units -- e.g. Celsius in weather reporting and Fahrenheit in household cooking.
However, given the exact 5/9 conversion factor, using rankines vs kelvins is a matter of choice. The placement of a zero point in a scale with such degrees is essentially arbitrary (although there are obvious attractions to some realizable ground state as the zero point; while "absolute zero" might be approached asymptotically with close approximations of an ideal gas, it is not an obviously perfect choice on theoretical grounds), and is a matter of suitability. Thus there is no clear "better" between the two everyday temperature scales; each has advantages and drawbacks. Equally importantly, neither offers more scope than the other for improving the definition of the degree or of the zero point.
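The relationships among the four scales, using the exact factors mentioned above (kelvin/Celsius offset 273.15, Rankine/Fahrenheit offset 459.67):

```python
def k_to_r(k):
    return k * 9 / 5             # same zero point, Fahrenheit-sized degrees

def k_to_c(k):
    return k - 273.15

def r_to_f(r):
    return r - 459.67

water_boils_k = 373.15
print(round(k_to_c(water_boils_k), 2))           # ~100.0 C
print(round(r_to_f(k_to_r(water_boils_k)), 2))   # ~212.0 F, via ~671.67 R
```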
Essentially all scientific applications use the Kelvin scale; most of the world is comfortable using Celsius in non-scientific applications. There is almost no use of the Rankine scale (even in the USA), and most people in the USA are comfortable using Fahrenheit in non-scientific applications. Some cultures use a mix of Celsius and Fahrenheit in everyday situations. And of course, many cultures have never used the Fahrenheit scale. None of these cultures or the economies they participate in seem to be on the verge of collapse because they have made a "wrong" choice of temperature scale, though it is notable how quickly the everyday use of Fahrenheit collapsed in most of the former British Empire.
Advances in (long-baseline) interferometry in the late 19th century could have been directed at improving the definition of the "meridional" metre, but it would have been odd not to take advantage of the short-baseline interferometry that is at the heart of the "wavelength" definition proposed by Michelson (of the famous Michelson and Morley experiment). Unfortunately a consistently reproducible monochromatic emitter was not achieved until the middle of the 20th century, and even with the advent of solid-state lasers, the "wavelength" interferometry approach has more sources of uncertainty than the present light-second definition.
That is why standards are hard. Because it requires those with power to voluntarily relinquish it, which very rarely will happen. And then to never try to circumvent standards with first mover advantage to reclaim that kind of monopoly power. At least right now societies around the world are very poorly structured to align the incentives towards cooperation.
My favourite is using the FAT file system format for the system partition. No checksums, fragile on disk format, and no built in support for mirroring just to name a few.
And why they kept the idea of having the clock in local time is baffling. That just leads to errors when DST fails to apply or is applied twice.
To be fair, humanity did standardize that long ago. There is a country that is refusing to switch to the standard.
Meanwhile, more people in Africa have cell phones than plumbing, and there is realistically almost nowhere in the world without some form of cellular data service now. All these people, however, are in isolated language specific silos of content. Even the Indian Internet is radically different from what I see despite both being in mostly the same language (with some Hindi mixed in).
I am always worried about how little interaction there is, through a medium of effectively no barriers than the ones we make ourselves, between the people of western powers and everyone else who is currently online but either not informed about what the Internet is (and thus just uses SMS) or is isolated from us by language barriers.
So many crazy machines were built before there was either a need or an effective way to standardize anything power related. Keep in mind that many kinds of manipulations of electricity are easy and cheap now, but that certainly wasn't the case when people started converting from steam/water to electricity!
One of my favorite, somewhat related, such machines was the multi-pole "generator" used to create carrier waves for radio transmission in the thin era between spark gaps and vacuum-tube oscillators.
Most "waterwheels" turned far more slowly than 1000 rpm. More like a few hundred at most. They used pulleys and belts, and eventually CVTs.
Doesn't work. If you want fast standardization at all costs, the only way to get it is forced standardization, with "obey or go to the gulag" orders sent from a "tyrannical government" and "violent" and feared standards-enforcement authorities.
When you let people vote, standardization takes aaaages. When you let corporations vote, it gets even worse... it may never truly happen, because the agreed-upon "standards" have purposeful ambiguities sprinkled everywhere 'cause "but that's too hard to implement", and practical implementations never fully obey because "what you gonna do, can't practically sue us for not implementing this". Also, standardization favors commoditization, hence it doesn't make much business sense if you're the "big guy", and it isn't so easy to do if you're the little guy (and you're also eating away your future profits if you're the small guy that knows it's gonna grow big fast).
Personally, I believe Microsoft helped to engineer the downfall of SGI, via the whole Fahrenheit graphics "co-operation" - which also led to SGI making Windows NT workstations (in place of Irix).
That isn't to say that SGI didn't have some bumbling management of the time, but I do think that Microsoft took advantage of the situation to get them to chase a red herring to accelerate their demise, while also gaining a lot of new knowledge and IP via the sharing agreements they had for Fahrenheit.
Then again, had SGI not imploded, we wouldn't have NVidia today...
Nvidia is basically one of the worst actors in the market, though. Artificially increasing prices and artificially segmenting the market, using anticompetitive methods to gain an advantage (oh how everyone loves GameWorks), etc.
Understatement of the century.
> Then again, had SGI not imploded, we wouldn't have NVidia today...
That gets the timeline all wrong. Nvidia is a much older company than you are probably aware of, and might have been quite successful (and quite capable of attracting top talent) had SGI managed to stagger along for quite a bit longer. SGI never, ever had a shot at making the mass-market stuff that let Nvidia grow as quickly as they did. It wasn't in their DNA.
You are right about Fahrenheit, though - it doomed everything.
My brother recently visited a hydro dam in northern Minnesota that had one turbine operating at 25hz even as recently as the 90s, serving at least one industrial customer still running equipment that predated the interconnected 60hz grid.
120VAC --> 1.0T2.0 --> 240VAC --> line --> 220VAC --> 1.8T1.0 --> 120VAC
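A rough sketch of the voltage arithmetic implied by that chain, assuming "1.0T2.0" denotes a 1:2 step-up turns ratio, "1.8T1.0" a 1.8:1 step-down, and roughly 220V surviving at the far end of the line:

```python
# Hypothetical walk through the transformer chain above (ideal transformers,
# turns-ratio notation assumed from the "AToB" labels in the comment).
def through_transformer(v_in, primary_turns, secondary_turns):
    """Ideal transformer: output voltage scales with the turns ratio."""
    return v_in * secondary_turns / primary_turns

v_line = through_transformer(120.0, 1.0, 2.0)       # step up: 240 V onto the line
v_far_end = 220.0                                    # assumed voltage after line drop
v_out = through_transformer(v_far_end, 1.8, 1.0)     # step back down
print(round(v_out, 1))  # ~122.2 V, close enough to nominal 120 V
```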
I understood it to make transmission easier but the US would face that problem at least as much as anywhere else...
I've seen three old houses run off two 20 amp fused circuits.
Sliding friction, high-current contacts, 60 times a second? Seriously?
There was once the argument that DC would cause the lamp filament to vibrate like AC would (in the presence of the Earth's magnetic field?) - and thus if you ran your lamp on DC it would last longer due to not being mechanically stressed (I think this was one of the pitches behind those buttons, too).
I think it was later found that the argument had little validity, and was more a marketing pitch. That said, a lamp does experience a moment where, when the current (AC or DC) is switched on, the filament does "flex" - partially from magnetism, partially from thermal loading as it heats up. This flex, over time, does produce a mechanical stress on the filament. It's a major reason why incandescent lamps typically burn out when you turn them on.
I blame politics for both.
The inter-island HVDC link in New Zealand was still using some mercury-arc equipment as late as 2012.
Where it becomes relevant is for television, and in both cases the refresh rate matches the mains frequency: PAL is 50Hz interlaced, and NTSC is 60Hz interlaced (strictly 59.94Hz since the switch to color).
Except for PAL-M, which is PAL but 60Hz interlaced (Brazil uses 60Hz mains frequency).
Note, btw, that you need around 60 frames per second for motion to look lifelike.
And gamers often want higher, because a game's input sampling is often hooked to the frame rate, so the higher the rate, the more responsive the controls are...
And only at that point does the whole 25-frames thing make sense, and it's something that can be dealt with by inserting an extra frame every so often.
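A minimal sketch of that "insert an extra frame every so often" idea - duplicating one source frame per 24-frame group so 24 fps footage fills a 25 fps stream. This is just an illustration of the principle, not any broadcaster's actual conversion method (in practice PAL transfers often simply ran the film 4% fast):

```python
def stretch_24_to_25(frames):
    """Duplicate one frame per 24-frame group so 24 fps content fills 25 fps."""
    out = []
    for i, frame in enumerate(frames):
        out.append(frame)
        if i % 24 == 23:       # once per second of source footage...
            out.append(frame)  # ...repeat the last frame as filler
    return out

one_second = list(range(24))               # one second of 24 fps "frames"
print(len(stretch_24_to_25(one_second)))   # 25
```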
Before TVs were invented, we had actual films being rotated by hand crank or by a motor - their fps is independent of electrical mains. But cathode ray tube TVs were made to match mains frequency. Then to avoid having to do some kind of conversion, it was easier to film your movies to match TVs.
I think 24 fps for sound film originally arose from the requirements of the optical soundtrack: slower running 35mm film wouldn't have enough resolution for decent sound quality. That's my guess anyway.
Silent films were often shot at slower rates, around 16-18 fps (which explains the widespread comic "sped-up" look of video copies of silent movies, as the transfer was done the easy way: by playing them back at 24).
There are tricks you can do with shutter angle that negate this, but they are beyond the scope of this comment.
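The comic speed-up factor from a lazy transfer falls straight out of the frame-rate ratio; a quick check, assuming 16-18 fps originals played back at 24:

```python
def playback_speedup(shot_fps, played_fps):
    """How many times faster the action appears when film shot at one
    rate is simply run through a projector/telecine at another."""
    return played_fps / shot_fps

print(playback_speedup(16, 24))  # 1.5x faster - noticeably comic
print(playback_speedup(18, 24))  # ~1.33x faster
```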
In 1919, more than two thirds of power generation in New York was 25 Hertz and it wasn't until as late as 1952 that Buffalo used more 60 Hertz power than 25 Hertz power. The last 25 Hertz generator at Niagara Falls was shut down in 2006.
It basically doesn't make a difference what polarity you feed it, if it's positive going into the rotor then it's positive going into the stator as well. When the rotor changes polarity, so does the stator.
It was very confusing as the clock consistently ran too fast, and the timer ended before food got as hot as it previously had. I was surprised that something like that would be built into the microwave, and that it would be able to guess something like that. Eventually I unplugged it for a couple hours, and it went back to normal when I plugged it back in.
Me neither: http://ieeexplore.ieee.org.sci-hub.cc/document/6444314/
I presume it was used in a lab or something, it weighed quite a bit and didn't look like something you'd have on your kitchen counter.
Surely ordinary light bulbs don't care about the frequency. Do they mean the electronics for fluorescent lamps? Were those common in the 1940s?
If you didn't, your electric clocks would run 12 minutes fast every hour (for example).
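That 12-minutes-per-hour figure is just the frequency ratio: a synchronous clock counts mains cycles, so a 50Hz-designed clock on a 60Hz grid runs 60/50 = 1.2x fast. A quick sanity check:

```python
def clock_drift_minutes_per_hour(design_hz, mains_hz):
    """Error of a synchronous (mains-cycle-counting) clock, in minutes
    gained (positive) or lost (negative) per real hour."""
    return 60.0 * (mains_hz / design_hz - 1.0)

print(clock_drift_minutes_per_hour(50, 60))  # 12.0 -> 12 minutes fast per hour
print(clock_drift_minutes_per_hour(60, 50))  # -10.0 -> 10 minutes slow per hour
```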
Fluorescent lamps became really popular during WWII, so there were many around in 1945.
PG&E (California's primary gas and electric utility) still has DC tariffs, though I believe they provision it by installing a converter at the point of use. I believe this is just for elevators.
Parts of Back Bay in Boston were still wired for 100V DC mains voltage into the 1960s.
A local ham friend built a high-power tone generator back in the '70s... he sent slow Morse across town in the early hours one night by turning all the streetlights in his neighbourhood on and off.
With global trade, electronics are simply made to work anywhere on 100~240V 50/60Hz, so manufacturers don't have to make several different models. (They obviously still ship with different plugs, but the actual converters work with almost any input.)