Notably, electrification of rural areas lagged well behind that of cities and towns until the Rural Electrification Administration was created in 1935:
> "The REA continued into the postwar era and helped the percentage of electrified farms in the United States rise from 11 percent [1935] to almost 97 percent by 1960. The New Deal had helped rural America achieve near-total electrification."
This is comparable to the situation with high-speed internet in the US at present:
> "The Federal Communications Commission (FCC), a New Deal agency established in 1934, estimates that today a quarter of rural Americans and a third on tribal lands do not have access to broadband internet, defined as download speeds of at least 25 megabytes a second. Fewer than 2 percent of urban dwellers have this same problem."
This is what you get if you privatize and deregulate basic infrastructure services: huge holes in coverage and overpriced monopolistic control of the rest of it.
Agree, infrastructure is worthless, applications are everything.
Mr Edison is famous for "the lightbulb" (actually, "a"), but what he really did was lightbulbs + power stations (through General Electric).
Mr Birdseye is famous for frozen fish, but what [the company who bought his patent] really did was frozen fish + freezers in supermarkets.
The joke about the first telephone being the hardest sell (because there's no-one to call) has another problem, of no phone-lines, exchanges or (today's) cell-towers.
\muse I wonder if a solution to holes/monopoly abuse is to ease entry-to-market? The standard incumbent response is to deny oxygen to entrants, by giving great deals at the low end (like today's "free tiers"). Though, historically, regulatory capture instead raises barriers to entry.
Infrastructure is an enabling function. To claim it’s worthless is largely missing the point. It reminds me of the mechanical engineers I once worked with in rocket engine testing. Many claimed software was largely worthless because it was only replacing existing analog alternatives.
To the article's point, the problem is helping people connect the dots between the infrastructure and the work they really care about.
It would be interesting to see how many of these criticisms are criticisms of these things as they stood and not for their potential; it seems ridiculous to place the onus on the critics to see if someone can execute well in the future.
Tbf, the engineers I'm referencing didn't design the rockets but rather the test stands. They were used to things like manually actuated valves, analog data collection, etc. In their world, they could have come up in industry with little to no software. Even after adoption, they thought software testing was a waste of time because "it's just software, it doesn't break." Obviously, they were thinking as mechanical engineers, for whom "breaking" is synonymous with "wearing out" or "physically breaking".
I wouldn't say infrastructure is worthless, more that infrastructure creates new market opportunities, and without it, markets just won't function well. For example, good roads allow farmers to transport their produce to distant markets in all weather conditions, good electricity distribution means farmers start buying washing machines, better broadband means rural people might start buying online services and so on.
Trying to game basic infrastructure for profits runs counter to this notion, and it's thus an area of the economy where government management makes the most sense.
Applications are infrastructure's value - without produce to transport (or other applications), what value roads? Infrastructure is means to application ends.
So yes, given benefiting applications, infrastructure improvements derive value.
Their value is entirely derived. They have no intrinsic value. They are, in themselves, worthless.
Just semantics.
I tend to agree with your take on public infrastructure, but I haven't thought about it enough to form a definite opinion.
I don’t think the commenter you’re replying to is using “through” in the sense of “providing power stations with the assistance of GE”, but rather “providing power by founding GE”.
> This is what you get if you privatize and deregulate basic infrastructure services: huge holes in coverage and overpriced monopolistic control of the rest of it.
shouldn't every square inch of Alaska be electrified, internetted, and cellphone towered? that way, just in case I consider whether to live there, I won't have to think about the inconvenience of it, it will make my decision easier.
i.e. spending money on expensive infrastructure to service small numbers of customers is not necessarily a brilliant idea. Who knows, it may not have even paid itself back for all remote communities within the lower 48; many rural communities are even smaller now than they were then.
That doesn't mean there weren't net benefits from the rural electrification act, but you are wrong to pitch it as "evil corporate barons vs everybody else". How about all of us together decide what we can afford? Would you accept your kid's argument that you pay to electrify (to code, mind you, and union electricians) your kid's treehouse in the backyard just because he accuses you of being a greedy tyrant if you don't?
The problem is, no farmers, no food. No resource extraction, no stuff. Cities cannot exist without rural workers.
But what rural worker wants to leave their children to unfair advantage? These things have value beyond economics, they are required to maintain population and workers of absolutely essential jobs.
(Otherwise, these areas further depopulate, and you have to provide other incentives to get people to do said work.)
what you are saying is, it's an economic decision, which is what I am saying. But you're not recognizing it as an economic decision, you're just saying "food. do it"
What I will agree with, is that while it could be wrapped into an economic decision, if so, then "making sure there are hospitals there", or police, or schools is also an economic decision.
Which pretty much means that with such reasoning, all centralized planning is. Many economists might disagree.
The electric company probably won't see much profit from electrifying a rural area, but getting rural areas access to electricity is good for society and helps everyone in the long term. It's the kind of big picture thinking that the government should incentivize.
The alternative of what? Socializing and subsidizing infrastructure that costs even more for rural dwellers, so the city folks pay for the $20k cable a single rural family requires?
Edit: to be less snarky, there are plenty of examples of doing something inefficient for the benefit of society as a whole. The New Deal and the ADA are two examples that come to mind.
Unfortunately, we don’t do much of that anymore in the US.
I agree with your basic point--and subsidies are a complicated topic. Cities can't exist in isolation.
That said, Starlink, and presumably competitors at some point, does change the game. As does probably 5G and successors. They don't replace last mile/last 10 mile wired Internet for all cases but we do increasingly have viable alternative for more rural locations.
My brother's house had a 1Mb/s down ADSL wired connection and he was the last house on the road that could get "broadband" at all. With Starlink, he's able to work, stream video, etc. which wouldn't have been possible before.
It’s also important to recognize companies like SpaceX benefit heavily from socialized infrastructure. For example, they lease a pad at Kennedy Space Center and use DoD infrastructure at Vandenberg.
While there's perhaps some excessive idolatry of SpaceX and shade on ULA and NASA initiatives, I'm not sure that subsidies--somewhat overt or otherwise--are necessarily a bad thing. Certainly DARPA has a long history.
I tend to agree. I think the hybrid system has a lot of benefits, but unfortunately many of these discussions seems to devolve into framing the issue as a false dichotomy.
With that same argumentation, it can be argued that moving food from rural areas into cities is doing something for the benefit of people living in cities, not society as a whole.
Also, internet and such for rural people benefits society as a whole, indirectly. For example, better crop yields (through better access to information) and such.
But if you take a look at the massive amounts of subsidies (to both farmers and consumers), you realize food (in the US, at least) is not a “free market”. Same with fuel, etc.
Society can clearly tolerate huge amounts of inequality and has done so in the past, far more so than what we have today (kings vs peasants). Also see my comment above for my views on people who claim to speak for all of society.
I didn’t claim society can’t tolerate any inequality. The question is how much it can tolerate and remain stable. There’s an argument that the feudal systems you mentioned are no longer the norm because that inequality led to alternate systems.
Regarding your first response, you may want to visit the HN guidelines regarding shallow dismissals.
Clearly it's a lot more than slow internet for farmers?
Feudal systems didn't end because of inequality, they were the opposite: the feudal system ended when serfs were allowed to enclose their communal land and were given it as private property. It went from an equal system that suffered a tragedy of the commons ("the commons" literally referring to the common land that existed under the feudal system), to a system of private farmers who were able to capture some of the wealth from their effort and land.
>Clearly it's a lot more than slow internet for farmers?
Again, I'm not sure if the intent of this is another shallow dismissal or if you're failing to make the connection I'm trying to communicate.
>the feudal system ended when serfs were allowed to enclose their communal land and were given it as private property.
Why do you think the nobility allowed this? History tells us it's rare that a consolidated power willingly gives up that power. Is it possible that the inequality of rights and taxation led to the nobility's hand being forced here?
Any way you look at it, many historic societies existed for hundreds of years with levels of inequality vastly greater than those experienced currently by any farmer in the U.S.
The parent can be correct and still miss the forest for the trees. Too much of HN seems focused on being pedantically correct while missing the larger context. They can be right about the mechanism that led to the end of feudalism while still missing the "why".
You are using the current situation of a farmer (who has access to the internet and other regulated monopolies) as evidence for the case against those very services. If you remove that level of access, I don't think you have the point you think you do.
It's like pointing to someone who gets government assistance and saying "See? There isn't enough quality of life differential to warrant them getting assistance." The very fact that they receive assistance is exactly why that level of discrepancy is lower.
If you're saying that it's possible that greater inequality can be tolerated, that misses the point. I'm not stating a claim about an absolute level; I'm saying society should be aware that a tipping point exists. The same is said for taxation; people can point to past eras where people paid a higher effective tax rate and claim that as evidence that the current level should be raised. We can probably debate what the appropriate level is (but I doubt anyone actually knows), but that wasn't the point. Arguing the "right level" of inequality doesn't negate the point that there is some level that will make society relatively unstable.
Besides, the historical comparison may not be as apt as you think. There are many reasons the comparison breaks down; technology, for one, allows for easier and more effective coordination. I don't think many feudal peasants could organize quickly via carrier pigeon, while today we have social media; peasants couldn't vote to overturn their government, etc.
>Why do you believe such a thing exists in the U.S.?
Because it exists for every civilization and essentially every system. The U.S. is not inherently exceptional in this regard.
And, yes, MAD is plausible and also a strong deterrent. But a central crux of the theory is that the system is inhabited by rational agents. Humans can even be modeled as rational agents most of the time, but I don't believe they adhere to this model all of the time. Behavioral psychology seems to bear this out.
My point was that helping some minority often does help society as a whole.
The ADA, strictly speaking, only helps those with disabilities. But guess what? Having people with disabilities be able to access goods, services, and work just like anyone else helps society as a whole.
I don't really expect a straight answer to this, because the sort of people who loftily declare what is best for society as a whole usually just get angry instead of answering, but how do you justify the claim that this helps society as a whole? The ADA helps the minority by making the lives of the majority worse (higher costs, taxes, more effort etc). Making that tradeoff might be a highly moral position and justifiable on that basis alone, but there's no way to justify it on the basis of helping society as a whole. Society isn't a single thing that can be said to be helped or hindered. It helps a few people by hindering the many. It'd be better to just admit that and then argue on moral grounds, like via reference to religion.
Not the OP, but I think part of the answer is that society benefits from ensuring a certain amount of baseline rights, privileges, and opportunities are open to all its citizens, even when it's inconvenient. It's not just a moral position, it also helps ensure one group doesn't consolidate power and resources.
For example, I would want to make sure you still have access to voting even if it costs a disproportionately high amount of money to ensure that right for you or if it's much more probable that you would vote against my personal selfish interest. We generally want a society with a shared set of core principles rather than just a purely transactional society.
You’re right I’m not thinking of something super objective (I’m also not particularly interested in an objective measure of what’s good for society), but here are a few ways the ADA makes society better:
1. More people are able to contribute to society in ways they were unable to before. Personally, my life is better because I’ve learned from people of various abilities. That wouldn’t be possible if my school and workplaces were unable to accommodate them.
2. We all get older, we all lose abilities as we age. The ADA is helpful to literally every elderly person. Not to mention all able-bodied people are one accident away from requiring ADA amenities.
3. As other comments mentioned, some ADA regulations help more than just the disabled.
4. I don’t expect this to be a majority HN perspective, but I believe we’re only as free as the worst off in our society. Lifting the floor is always a good thing, in my opinion.
The ADA also helps anyone who has kids and a stroller. Also helps people when they are injured and immobile for a time. Dealing with these things in other countries makes you appreciate the ADA so much more.
Not knowing much about the REA, I doubt humanitarian arguments were behind it. I imagine it was successfully sold to Congress as a farm subsidy. Rural areas were where the farms were, and farms needed electricity.
We could do better, but there are still some contemporary examples. For instance it's costly to provide comprehensive medical care in rural areas. One of the effective functions of medicare / medicaid is to subsidize rural medical care.
For the individual, maybe. At a societal level, at least some people are required to live in rural areas, so we can have the raw materials society requires.
The fact that disability isn't a choice means that disabled people can't become undisabled, and abled people probably shouldn't make a choice to become permanently disabled.
Because rural/urban living is a choice, this means people can come in and out in response to supply and demand via the price mechanism. If rural areas are viewed as undesirable, rural wages will rise until it justifies people being there.
Neither of those had anything to do with moving people from the countryside into the cities, as far as I'm aware. The Chinese Communist Party is still trying to suppress rural-urban migration (e.g., via hukou).
One difference is that urban areas get that benefit with lower cost due to the inherent scaling available in dense population. The benefit might be the same, the cost is not.
I’d argue the isolated benefit is the same, but the network effect amplifies it. When a ton of people do music together, it becomes part of the culture. That effect is more difficult to scale in low density populations, which means their rate of improvement is probably lower.
> When a ton of people do music together, it becomes part of the culture.
I can't think of a better reason to subsidize the additional cost for rural broadband: without it, there is no nationally accessible "culture" that unifies us.
The utility market is generally a regulated monopoly. The market and govt both recognized the economies of scale with public infrastructure. As part of the tradeoff to be allowed to operate as a monopoly, utilities must also ensure rural or unprofitable customers still receive access. It’s all part of the deal and intrinsically tied to your point.
Exactly. People in this thread are perfectly happy to throw unknown heaps of other people's money at the problem until it is solved. And it's not like the government has a track record of efficiently spending money in the first place.
If farmers feel their lifestyle isn't good enough in proportion to the work they do, they should raise the prices of the food they sell, not get paid by the backdoor through subsidies.
Nobody wants to ban rural living, just stop subsidizing it, or subsidize it less. I don't get why responses to these suggestions are always framed this way: "but you need the countryside!" Well, yes... and that's what paying for goods and services is for.
If the situation were reversed, these kinds of defenses would not convince the people making them that rural dwellers ought to subsidize urban living.
Eh, we decided a while ago that certain things are worth subsidizing if they benefit the country as a whole.
Education, healthcare, transportation, retirement -- these are all things where we could say "People can just pay for these things if they value them", but instead we determined that some subsidies are good for the country as a whole (we often even want more subsidies for some of these).
Likewise, it's important to have a population that's willing to live and work in rural industries that supply big city centers. Subsidies to provide some of the infrastructure that more dense areas have can help the nation accomplish that needed population mix more easily.
This argument only makes sense if you're only subsidising consumption, but infrastructure increases the productive capacity of the economy as a whole, so the government gets a return on its subsidy in increased taxes.
So food prices skyrocket to make up for the lack of subsidies, and everyone gets a food purchase subsidy to make up for the high food prices since nobody wants people starving in the streets. Heck, Minnesota just became the fourth state to offer breakfast and lunch for free to all students regardless of ability to pay.
You can shuffle the board around however you want, but the truth is that agriculture (and therefore rural life) is going to be paid for one way or another.
So let food prices rise. This is a more honest method of accounting, because it is the people consuming the food who pay the cost of producing the food.
The status quo is that people indirectly pay for food through taxes, higher service cost, and other distortions that have no relation to how much food they consume. They are forced to pay a subsidy whether they like it or not.
Don't forget foreigners: If your country has a subsidized food production system and you're exporting food, then foreigners are benefiting from your subsidies. (This is actually a debate in Canada vs. USA regarding milk.)
It's vastly more economical to build infrastructure for a few million people spread over a metro than over a state, and cities are centers of commerce and industry. Cities not only universally more than pay for themselves, they also pay for nonproductive rural areas, which are very expensive to maintain and provide little revenue.
Basically unless you grow food the city folks would be economically better off if you didn't exist. Your world view is exactly the opposite of reality.
City dwellers also buy things manufactured in rural areas. Most of them expect a nationwide transportation network that isn't just interstate highways and gas stations. In addition to food, there is all the resource extraction that needs workers, who have families, and need healthcare etc. Basically a lot of people living in rural areas are either doing things that, in part, support people who live in more urban areas, or they're supporting those people.
Cities eat up tax dollars too. Boston's Big Dig was basically a $10 billion or so gift to Boston from, not only Western Massachusetts taxpayers but the rest of the country. (The Speaker of the House was from the Boston area.)
If you want to be mercenary about it, farms already require only a tiny fraction of the rural population to run, and in the future will require even fewer. We need the land. Virtually all of the folks, not so much.
This is absolutely true, and the required subsidies for those places with mass exoduses will presumably drop over time (though they will likely increase per remaining person for the few people who still need to be there).
Some of that is right, but believe it or not farmers also need local services like health care workers, tradespeople, some semblance of government, etc.
All of what I said is right and all of what you said is right. The grandparent comment stated that.
> As a rural tax payer I'm paying for some city dwellers to sit around all day doing nothing. All the benefits you enjoy as a result of living in a modern society have been paid for by someone other than you.
No part of that is right. Economically rural areas are a drain justified only by farming, votes, and a charitable desire to serve our fellow man. The idea that a handful of farmers are somehow paying for the teeming masses in the city to sit on their hands is wrong on its face.
Because the 11% became 97% via the New Deal and government investment, not through market forces. That's not reaching to bring up politics, that's the history of what happened. Rural areas are more expensive to run physical infrastructure for, for fewer people, compared to an urban or suburban environment; that underlying fact hasn't changed since the 1920's when electricity was being run. How it immediately becomes political is the question of who pays for what, and how much of my taxes are going towards something that doesn't directly benefit me.
But that proves my point. There was no “new deal” that got us to 75% broadband coverage. 75% is clearly way better than 11% therefore not a valid parallel.
There absolutely has been a ton of government money funneled, basically, to Verizon and Comcast shareholders to try to provide rural Internet service via the Telecommunications Act of 1996, along with everything else that came after it. That the government hasn't been getting good value for its money (75%, and at only 25 Mbit, compared to 97% for electricity) is a whole other topic.
> "USDA has been investing in rural telecommunications infrastructure for decades. Hundreds of millions of dollars are annually available in the RUS programs both by loans and grants all to support modern broadband e-Connectivity in rural communities."
> "In 2018, USDA introduced the ReConnect Program, which has invested over $1 billion to date to expand high-speed broadband infrastructure in unserved rural areas and tribal lands."
We don't let shady private corporations squat on the roads and freeways and charge tolls to anyone who wants to drive on them, why should we allow such behavior on the internet trunk fiber optic cables either - or on the copper/aluminum electricity grid, for that matter?
Arguably, having broadband now is more important than having electricity then. The network effects of nearly everyone having broadband are reducing access to offline alternatives (e.g. bank branches closing), whereas anything you can do with electricity can likely be done without electricity or any additional infrastructure beyond what's on the farm.
75% broadband coverage is a generous interpretation, especially as the floor for what counts as broadband has risen. 25 Mbps down, 5 Mbps up isn't actually that broad or useful as 4K TVs and multiple simultaneous video calls become the norm.
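The headroom argument is easy to sanity-check with back-of-the-envelope numbers. A minimal sketch; the per-stream bitrates below are illustrative assumptions (real figures vary widely by codec, resolution, and service), not measurements:

```python
# Illustrative per-stream bitrates in Mbps (assumptions, not measured values):
STREAM_MBPS = {"4k_video": 15, "hd_video": 5, "video_call": 3}

def household_load(counts: dict) -> int:
    """Sum the assumed bandwidth of concurrent streams, in Mbps."""
    return sum(STREAM_MBPS[kind] * n for kind, n in counts.items())

link = 25  # the FCC's 25 Mbps download floor

load = household_load({"4k_video": 1, "video_call": 2})
print(load, load <= link)   # 21 True  -> fits, but with little headroom

load = household_load({"4k_video": 1, "hd_video": 1, "video_call": 2})
print(load, load <= link)   # 26 False -> one extra HD stream exceeds the floor
```

Under these assumptions a single household hits the 25 Mbps ceiling with quite ordinary usage, which is the commenter's point.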
>Because the 11% became 97% via the New Deal and government investment, not through market forces.
That 11% wasn't a static state. Electrification was happening, government or not.
The New Deal certainly sped it up, but it takes a career-politician level of dishonesty to take a government program that increased the rate of electrification and give said program credit for all electrification after its commencement.
I think it's fairer to say that there was rural demand for electricity, but that privately owned electricity grids saw no profit in meeting that demand. Electric appliance manufacturers did want to sell their products to people in rural areas, and they realized that if government built out the electrical grid, they'd benefit from it, as would rural electricity consumers. That's why FDR's Rural Electrification programs were widely popular.
The only opponents were the electric power companies who realized they might lose their captive markets if the idea spread to the cities. Hence they started buying lots of politicians.
Market forces would have correctly restricted the number of people living in places that are inefficient to live in. Rural communities are massively more expensive to build infrastructure for. Instead we have locked ourselves into supporting an abnormally large number of people far from where infrastructure can be efficiently provided.
And those people living in rural areas (which are required to sustain societies, unlike urban areas) would become poorer and less educated as time went on due to the economic inefficiencies of serving them well. You are arguing for the emergence of class stratification as if that's somehow the desired version of our society, which is just insanity. There is a _reason_ we share costs as a society! Free market thinking is shallow and should not be applied here, full stop.
The issue isn't politics (though it is true that the ISPs have bought both political parties in Congress), it's economics. Building out basic infrastructure is one of the core necessities for widespread economic growth, and that includes roads and bridges, the electricity grid, the water supply, and the fiber-optic network.
It's not a very complicated issue, and I've never seen anyone present a coherent argument that privatized infrastructure improves economic activity overall, it's generally the opposite isn't it?
I suppose if your metric is the concentration of wealth in fewer hands, then yes, private infrastructure facilitates that outcome, but it shrinks the overall economic activity in terms of production of goods and services (i.e. consider the farmer who can't get to market because the bridge is washed out, the washing machine manufacturer who can't sell to the farmer who has no electricity, and so on).
There's plenty of arguments for privatized infrastructure, if you never heard them then you weren't looking for them. For example telecoms, energy and TV monopolies have been broken repeatedly in the 20th century in countries around the world, nobody wants them back. There were way more infrastructure monopolies in the early parts of the 20th century than there are now.
I'm sorry, isn't that an argument against privatized infrastructure, which always becomes monopolistic control of infrastructure by private interests? It's not like competing systems of basic infrastructure are at all plausible, we're not going to have multiple private roads systems are we?
No? Private infrastructure doesn't always become monopoly, that's why most infrastructure monopolies were created by nationalization at a time when that was all the rage. Railway monopolies, radio (e.g. BBC), telecoms, steel, water, electricity ... lots.
> we're not going to have multiple private roads systems are we?
Toll roads exist but indeed, roads are one of the cases where building and maintaining them is easy so you don't lose much from a state monopoly, and they take up a lot of space so duplication is unfortunate. But there are relatively few cases like that.
Are competing water companies going to build multiple water pipe systems to people's homes? Are competing electricity companies going to cover cities with multiple independent competing grids? Are competing ISPs going to build separate fiber networks to everyone's door? Should there be multiple rail networks owned by private parties, or just one that everyone uses cooperatively?
These are cases where it only makes sense to build one system, and any such system should be under public control, not under private monopoly control by some rent-extracting shareholder outfit.
>> Are competing {water,power,internet} companies going to cover cities with multiple independent competing {pipes}?
Sure and they have done in the past. The existing networks weren't initially built by governments, they were built by private entrepreneurs and then nationalized. That's true at least for power and internet, admittedly I don't know for water piping, that might be old enough to pre-date private companies of any significant size.
Now, should they is a different question from will they. In many cases it's OK to allow that; I think the US still has private railway lines, while in other places there aren't any left. But I agree that for networks that require enormous amounts of land and where there's ~no scope for innovation, government monopolies can be beneficial. Water, power, and roads seem clear cut, and we can add gas and sewerage to that. Rail is an interesting middle ground where countries go back and forth, because there is actually scope for innovation in how the signalling works and governments are typically extremely slow to deploy improvements. Note that most civilized countries don't nationalize the endpoints: power generators, gas wells, trains, etc. are usually owned by private companies. Also, in practice these networks are often built and maintained by private contractors.
"The existing networks weren't initially built by governments, they were built by private entrepreneurs and then nationalized."
This is simply not true in many cases, and the Rural Electrification program is Exhibit A, isn't it? Exhibit B is the national freeway system. Exhibit C is PG&E's takeover of the government-built Hetch Hetchy-to-California power grid, granted a complicated story.
Certainly the private sector has a role to play, i.e. the production of water pipe itself is a competitive business, so is the production of fiber optic cable, and of tools and equipment - but the systems themselves should never be privately owned.
I think you're looking only at the 20th century? The first electricity systems were built by companies (Tesla, Edison, Westinghouse, etc.). The first high-quality roads were privately built and run.
> What are some of the things people do that they need something that fast?
Remote programmer work. I push and pull around many GBs every day. 8Mb/second would be -at best- difficult to bear. I also often do teleconferencing (and screen sharing) while pushing around lots of data.
If there were multiple people on my LAN who were doing remote programmer work, or who were just -say- watching "streaming" video while I was working, or -say- chose to patch a video game while I was working, 8Mb/second would make working impossible.
Hell, even 40Mb/second is pretty terrible. I recently moved from a 1400/40 Mb link to a ~300/300 Mb link. Despite the dramatic reduction in download speed, it's way, way, way better.
I have around 8mbps (1 mbyte/sec) and downloading everything big is painful. Nvidia driver - 10min. Vbox update - 3min. Linux netinst - feels like forever. Big npm/docker/etc updates - few minutes of waiting. Witcher 3… ohh.
Also you can’t watch 1080p without stutters while waiting. You can’t watch 1440p60 or 4k in any case.
Why do we need 25 MBit on all the farms and rural lands? (quoted from another subthread)
Because otherwise these areas will get stuck with <25mbps forever, and there’s already no reserve in 8-10mbps.
8Mbps (please capitalize the "M" for "mega" and lowercase the "b" for bits) is 8,000,000 bits per second.
That translates to about 0.954 MiB/sec (Mebibytes per second, see the "i" and the capital "B"?) which is close enough to 1.
We must keep in mind that there are two disparate things being measured in this thread and by FCC. The FCC is ostensibly measuring advertised signaling rates. Your Gigabit Ethernet signals at 1,000,000,000 bits per second, but can't transfer data that fast. Likewise, my cable ISP signals at 100Mbps down and 5Mbps up (yeah, it's criminal) but download speeds are a different thing.
Download speeds can, and should, be measured in computer-oriented mebibytes per second, rather than bits per second, because you are, after all, transferring files. (And yeah, disk space is often measured in powers of ten...) Your ISP's data cap is undoubtedly measured in gibibytes or tebibytes (even though they call them gigabytes or terabytes, that's what the units mean.)
So when you run speedtest.net or Ookla or whatever, you're measuring the actual throughput that the computer's network interface can squeeze through the narrowest straw in your link to the server. That is necessarily a touch lower than the lowest signaling rate of whatever equipment is in-between. Internet connections are sold by trumpeting signaling rates, but those are NOT download nor upload speeds. Never confuse them, because they are overly-optimistic estimates of your maximum throughput (which is infeasible given most PHY and link-layer frame designs.)
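To make the decimal-megabit vs binary-mebibyte conversion concrete, here's a quick Python sketch (the function name is mine, just for illustration):

```python
def mbps_to_mib_per_s(mbps: float) -> float:
    """Convert an advertised link rate in megabits/s (decimal mega)
    to mebibytes/s (binary mebi, 1 MiB = 1024 * 1024 bytes)."""
    bits_per_second = mbps * 1_000_000
    bytes_per_second = bits_per_second / 8
    return bytes_per_second / (1024 * 1024)

# An "8 Mbps" link moves at most ~0.95 MiB/s of raw bits,
# before protocol overhead shaves off a bit more.
print(round(mbps_to_mib_per_s(8), 3))    # 0.954
print(round(mbps_to_mib_per_s(100), 2))  # 11.92
```

So a "100 Mbps" cable plan tops out around 11.9 MiB/s of file transfer, and real throughput lands somewhat below that once framing and TCP overhead are accounted for.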
I don’t get why my comment caused this. No one would think of millibytes per second in that context, and the 4% difference between mebi and mega plays no role here either. I don’t even know if my meters are binary or decimal and am curious where this assumption comes from.
8Mbps is just about enough to stream netflix in hd. If you have other household members doing anything else it's inadequate. It's also borderline inadequate for any online gaming whatsoever.
Hitting network speed limits doesn't just cap you in those scenarios, it degrades very badly very quickly.
In my experience, a big problem is upload, which is sometimes as low as a tenth or a hundredth of the download. That really hurts remote work. Another factor is that high speed internet tends to have lower latency, because I guess you just have to build out more infra. That helps remote work and gaming.
My job is video game networking. The problem isn't steady state usage, it's burst usage combined with other devices on the network. 15kbps might be enough for 98% of use cases but you occasionally need way more.
Downloading 50GB Steam games in minutes so you can join your friends without having everyone wait around. And, most importantly, flexing on people with speed test results :D
If anyone in the household works from home, that 8 will be totally saturated by a single video meeting. If there are young kids that want to watch netflix, two parents needing to work, and if you're a developer who has to pull docker images or download a large file, God help you. I have 35 Mbps now and it still gets very painful sometimes.
"Live within the bounds of what's available" seems to be a lost concept these days.
I'm rural, I work remote, and I've done so for quite a while on about a 15/3 connection. I've got somewhat better now, and I have Starlink for the house right now (though at the ever-increasing costs, I'm debating dropping it and going back to a rural WISP for bulk transfer).
If you're on a lower speed connection... you don't try to live life like you're on gigabit. You cache content locally (Jellyfin or Plex solves a lot, DVD season pack bundles are dirt cheap on eBay and a USB DVD reader can read anything), you do lower bandwidth stuff, and you work around the availability. I've taken many video calls with audio over a cell phone, because my ISP was having a crappy day.
You can invent scenarios in which you "need" gigabit, but they sound like the artificially constructed situations they are, because not everyone has 16 people working from home with another 12 insisting on their own individual 4k streams.
This seems like your position: don't try to improve things, just work around the situation.
That seems silly, and I don't think it's ever been a widespread driving philosophy. Much more common (and sadly, also lost these days) is building a better world for our kids and their kids, etc.: trying to improve life for us and those around us, with hope that the next generation has it better than we did.
>You can invent scenarios in which you "need" gigabit,
If you have to work around slow internet by buying DVD's instead of streaming, pre-downloading movies you want to watch, or calling in to video calls to get reliable audio, I think it's fair to say that you "need" faster internet. Maybe not gigabit, but definitely faster than whatever you have now.
I think you want to be a bit ahead of where the technology is now, to have some room for future possibilities, and assuming you're pulling cable you might as well pull the "biggest" one you can (i.e. fiber). To give an analogy, we recently remodeled our kitchen, which required rewiring the electrical to current standards. That alone used up all the remaining circuits of our 100 amp panel. But we have several gas appliances that we eventually want to switch to electric heat-pump technology (water heater, dryer, furnace), and an EV which I'd like a level 2 charger for, though that would most likely mean a service upgrade to 200 amps. On the flip side, we have 1000Mbps; thinking about whether we have enough bandwidth to do X isn't even a thing now.
You remodeled in 2023 and you only provisioned 1 Gbit? There are already 2.5 and 5Gbit copper Ethernet PHYs out there. You said "pull the biggest one you can" but - 1Gbit of fiber?
I mean, I guess if your chief usage of traffic is to the outside world, a really snappy LAN doesn't mean much at all; it's not your bottleneck. But I'm confused about your post which appears to contradict itself. Perhaps I misunderstood, and your kitchen remodel only involved AC electrical and no networking. No IoT fridge yet?
If I were pulling network cabling today for a remodel, it'd surely be multi-mode fiber in the 100Gbit range, right? Right?
The 1 Gbit is the service to the house, not the LAN, and it was (still is?) the fastest we could get at the time we got it (2016). The remodel was the kitchen cabinetry, electrical, and plumbing only, no network. While I personally don't see a need for an IoT fridge in my foreseeable future, I'm not opposed to internet connected appliances generally if I see an advantage to having one.
The Internet is a utility and flows like water, so in this analogy, homes should have enough Internet to meet their daily needs, but they don't need so much as to be a factory. For a theoretical household of 4, 8 Mbit is way too low, but it does say that 10 gigabit might be excessive. The thing is though, that the analogy breaks down when running fiber allows for future backend upgrades and faster future speeds over copper.
Infrastructure is about sufficient capacity for peak not average usage. The first thing you have to understand is that bandwidth is oversubscribed and largely asymmetrical. You may easily only get 70% of bandwidth and you'll need to get a LOT down to have even a modest up.
4K streaming video can easily be 25Mbps. A family of 3-4 people can easily have multiple TVs, and each one can be using bandwidth even when nobody is attending to it. Presumably this family has none, because at 8Mbps none of them would work. The average family has 25 internet-using devices, including computers, laptops, consoles, smart devices, etc. Meanwhile average websites have ballooned up to 2MB, or 16Mb, per page. If you get 70% of max bandwidth and divide it even 10 ways, you'll easily be waiting around 10 seconds per page. It's also common now to have a camera out front triggered by motion, but this requires more upstream than your 8Mbps connection from 1999 is liable to have, since most connections aren't symmetrical. Same with video conferencing, which will largely be impossible.
What's that, you say? Johnny wants to play the latest triple-A game? Well, it's 80GB of data. With oversubscription and other devices you'll be very lucky to average more than 2Mbps over the 4 days this will require, during which the family connection will suck even more than it normally does.
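The "4 days" figure above checks out; a back-of-the-envelope Python sketch (function name is mine):

```python
def download_days(size_gb: float, effective_mbps: float) -> float:
    """Rough days to download size_gb gigabytes (decimal GB)
    at a sustained effective rate in megabits per second."""
    bits = size_gb * 1e9 * 8                    # total bits to move
    seconds = bits / (effective_mbps * 1e6)     # rate in bits/s
    return seconds / 86400                      # seconds per day

# 80 GB game at ~2 Mbps effective: about 3.7 days.
print(round(download_days(80, 2), 1))  # 3.7
```

At a healthy 300 Mbps of effective throughput the same game lands in well under an hour, which is the whole point of the thread.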
1 megabyte per second seems extremely slow. You need 6-7x more to stream 4K properly, and downloading modern games would take forever. If you have a couple of people using it simultaneously, even HD might not be great.
I use that too. And beyond that, the original analogy was that the light bulb was so helpful that it spread to all the farms and rural lands by 1960. Why do we need 25 MBit on all the farms and rural lands?
Well, because people live on those lands. I grew up in a rural area, and my limited access to internet almost failed me in multiple classes that required online testing. My bare-minimum Hughesnet setup would drop you to 50kbps after you depleted the "Full Speed" 50 gigabytes a month at 25MBit speeds. No worries though, only $10/gigabyte to get back online so you could take your Biology quiz without getting kicked off halfway through.
Now that stuff like Starlink exists, it's easier to give the finger to Hughesnet and exploitative WISPs. Even still, the years I spent growing up with bare-minimum internet at cable-package prices has made me spiteful. It should have been addressed long before the private sector got around to fixing it.
Fellow rural dweller here. I've tried explaining to people just how good it felt to give Hughesnet the bird (in some regions called "the finger") when Starlink rolled in, and until you've had nothing else it's hard to imagine how much it affects you.
Also people forget: farmers have families and kids, etc, and while the farmer themselves may not need much internet, the kids can't even do basic school anymore without it. But that said, farmers still have and watch TVs, facetime with their families, stream music, etc.
All that said though, Starlink is up to like $120/month and still not reliable enough to fully ditch the backup exploitative WISP if you work from home as I do, so Starlink is walking dangerously close to the line of exploitative. But they're here and working, so I will happily pay the money. I just hope that over time the $/b will get a lot more competitive.
Agriculture moves a fair amount of data around. You probably don't need 25 Mbit today, but need more than the ancient infrastructure can supply. The infrastructure being built to be capable of closing the latter gap is able to handle much more both for reasons of accommodating future needs and simply where the technology is now for modern installations.
And for that reason, as a farmer, I can get gigabit service on my farms, but where I live in an urban area where the infrastructure isn't as old and is still moderately capable I am topped out at 50 Mbit service.
It isn't by definition a problem that there are holes in coverage. People in the US live hugely far apart from each other because the government has paved far more roads than they should have. Bringing broadband to people in rural areas is hundreds of times more expensive than in urban areas because people are too far apart.
>> "The Federal Communications Commission (FCC), a New Deal agency established in 1934, estimates that today a quarter of rural Americans and a third on tribal lands do not have access to broadband internet, defined as download speeds of at least 25 megabytes a second. Fewer than 2 percent of urban dwellers have this same problem."
The current FCC broadband speed is 25 megabits/second, not bytes.
This article doesn't read like a PR piece in the way PG describes at all, though, in that there is no discernible tie-in to AI or GPT as far as I can tell. So if the goal was to promote something other than the idea that people can sometimes be hard to convince of the value of new technologies that eventually become ubiquitous, then I think it failed in that respect. I could see a similar article being written about the internet itself someday, after everyone who lived through the initial skepticism about its economic and social benefits (still dubious) eventually passes on.
I'm sure the person promoting electricity had financial interest in it. You know there is much nuance here yet you still had to make this simplistic comment.
Put your real name and email address in your profile and respond with your full identity exposed. I want to see how history plays out. Then I can go back on these old threads to see who was actually wrong.
Who were the idiots against the reality of global warming? The people who so fervently used every excuse to deny the reality of an impending catastrophe?
There is no difference between those people and the people who consistently attempt to use every avenue available to attack the abilities of AI. These aren't rational people. They are people with an agenda that pushes them to modify their perception of reality around them to fit that agenda. That agenda is fear. Fear that a machine can surpass us and replace the software craft we have spent years honing.
My advice to you is to open up your mind a bit.
Heck I put my full identity and contact info in my profile. I stand by my views without hiding behind a throwaway account. If history proves me wrong the record is here for everyone to see.
Some might fall into the category you've described, but I'd hazard a guess that a lot more are afraid of the rise of AI due to its owners. The cost to develop and operate these machines is high, and you can be sure whoever is using them to replace work done by people today will capture and hold every possible penny.
People are afraid that AI will not serve the common good, and will instead serve a very rich few. Why? Because that's how it's always been with new advances, and more than ever how it is today. The vast gaps in wealth inequality will grow much larger with AI - it needs to be addressed first.
Agreed. But why be delusional? The tool is right at your fingertips. Why deny the reality of the situation rather than face the truth?
If they fear the AI owners, attack the owners directly. Don't attack reality itself and say the AIs are just stochastic parrots and there's no risk to jobs at all.
Fun fact, might be off on the details, but Triscuits were called that because they were marketed as being cooked by “elecTric” ovens which meant uniform heating and no burned Triscuits!
Electric Biscuits! Try Triscuits!
Edit: since people like my fun fact here’s the Twitter thread where a guy talks about it.
Uniscuit would be bread. Biscuit translates to "twice cooked" in French. This is similar to biscotti which translates to the same thing in Italian. Fun fact that many French and Italians don't even realize.
Much like its more popular relative, unicorn, uniwheat didn't make it onto Noah's ark. Perhaps you can find some that is still viable, frozen in the permafrost.
Triscuits and panko both refer to the finished products. For the set of all Triscuits, there does not exist any element which will ever again be baked. Ditto for panko.
In English, we use the habitual aspect for things that have been, and are still, done on a regular basis. If you wish to speak about a currently-implemented process for making food, for instance, you say "panko is baked by passing an electrical current through it".
If you have a small bag of panko on the counter, you may point to it and say "an electrical current was passed through this panko to bake it", but you could also construct the former sentence and be completely correct. But your use of the past tense in a general statement about panko implies that it is no longer made by that process, which leads those of us who speak English to incorrect conclusions. This confusion can be further compounded by the fact that we were discussing events of many decades past, so Triscuits, for example, may no longer be baked in electric ovens, although they certainly could still be.
> your use of the past tense in a general statement
It wasn't my use or my statement.
> implies that it is no longer made by that process
No. That's a possible interpretation, but it is by no means implied.
> leads those of us who speak English to incorrect conclusions
I'm a native Standard American English speaker. I did not jump to that conclusion. In Standard American English, there is a specific construction for expressing that idea: "panko, which used to be cooked by putting electrodes into the dough"
If you'll refresh your memory about the context of this comment thread, you will see several posters using past-tense to refer to historical situations and that is the context into which you interjected your thing about panko. I assumed that you knew that panko had once been made that way and was no longer. Others may have assumed that as well, given the established context and the way you wrote the sentence.
I have no idea about how either triscuits or panko are made nowadays. Thus, I referred to them in the past tense talking about how they were made when they were originally created.
> In 1920, New York Edison built a brand new power generation facility that could generate 770,000 kilowatt-hours. For reference, the city of New York City now uses about 100,000 kilowatt-hours per minute.
There must be some units confusion here. Surely they didn't quote the lifetime energy output of Edison's power plant? Maybe they mean it could generate at a power of 770,000 kilowatts? But then they say NYC consumes a power of 6,000,000 kilowatts, and it seems unlikely that Edison's power plant was already running at more than 10% of today's NYC needs. Maybe they meant 770,000 kilowatt-hours per day (i.e. 32,000 kilowatts)?
It did make many jobs and people redundant, though. But because the world was growing, the economy grew with it and replaced that with other jobs. If growth stagnates, there's no guarantee for job replacement.
And any parallel with AI/GPT is completely absurd, even though it's the reason why this is upvoted.
>>If growth stagnates, there's no guarantee for job replacement.
No guarantees, but no hard rules either.
PCs are the ultimate clerical and administrative machines. You don't need secretaries or typists. Don't need memos and mailrooms. Stuff gets filed automatically.
We put one on every desk. Typists and secretaries went away, but administration went on a growth spurt, whether it's school admin, corporate HR or hospital billing.
Administrative work became much more plentiful once PCs proliferated.
We write more letters, file more forms, sign more agreements. Maybe that stuff is valuable, and since we can do more of it with computers, we do. Or maybe it has nothing to do with efficiency or value.
Whatever the case, it demonstrates that the "progress Vs luddites" debate can't be solved with a simple model.
Absurdity assumes a reasonable world. Sometimes the world is weird.
It's unfounded regardless of what you believe will happen.
We are still months / years too early to see how transformative exactly GPT based AI opportunities will be.
We are clearly beyond the "neat academic research" phase and well into the product-building phase of this new technology, but some things are only clear in hindsight. The spectrum goes from "useful niche tools" to "industrial revolution", and we must not forget that even very successful technological breakthroughs are usually marketed way beyond their actual capabilities.
People in the second row who are now betting hard on AI may well be the billionaires of tomorrow, or they will be forgotten, their failures swallowed up by survivorship bias.
I'd say the opposite: electricity actually did see fantastical predictions of automation ending work altogether, which, despite it being an enormously useful enabling technology, still haven't come to pass. It enabled people to get jobs more than it cost them jobs.
No, I'm saying electricity didn't "steal people's jobs", even taking into account roles that ceased to exist altogether, because it phased in slowly, created far more jobs than it took away, and people whose roles were "replaced" by it simply adapted to different (usually better) jobs. The absurd predictions of the time were all wrong, not because of how they were phrased, but because as a simple matter of fact electricity neither heralded a post-work utopia nor forced workers' wages to stay at subsistence level.
I don't think "stealing jobs" means that there will be fewer jobs in absolute terms. It usually means (at least that's how I usually see it used) that people who currently have a job, and are trained to do it, will no longer be able to do it, and switching to another job is not easy: it's not just skills, often you also need to relocate somewhere else, etc.
The problems with coal miners losing their jobs is not because we don't have other jobs available. It's that the lives of those people will be upended and it's not surprising that people resist that.
Honestly I reckon they went for the wrong unit with Watts. Most things measure absolute values (e.g. miles) and then the speed is the derivative unit (miles/hour). Whereas Watts they went the other way. I reckon that's why everyone's confused and people expect to see "I use this much [stuff] in total", "this appliance uses this much [stuff per unit time] while it's on", because that's the way every other unit works.
At least for mains electricity, there is a good argument for using the derived W and the doubly-derived Wh (J / s derives W; W × h derives Wh). It makes sense because electricity is not (for practical purposes, historically) stored by consumers, only consumed at the time of use.
For virtually all applications, the instantaneous power is of overriding utility: power is how bright your light bulb is, how fast your hairdryer dries, how large a wire is required when building your house. (Since voltage is constant, this could even be given as amps!) In addition, most home appliances don't have a meterable output aside from operation time: if you run your lightbulb for an hour, you get an hour of light, but that's not really something you can measure. If you run your bicycle for an hour, you've gone somewhere, which is a distance you can measure!
Accumulated consumption of electricity is not really important for anything save billing, so if all you know is Watts and hours, Wh is a useful unit.
For transmission and distribution networks, even the Watt and Wh are relegated to secondary position; the VA (volt-amp) and VAh are more useful (since Vrms × Arms != real power in watts, due to power factor/phase shift). For one bit of infrastructure, the amps are the most important operational consideration (as V is approximately constant), and reporting as VA is useful to compare with other equipment running at a different voltage.
The SI system has the Joule for that. The real "problem" is that 1 Joule of energy is just too small to be practical in most situations, so we often resort to larger units such as a kWh (equal to 3.6 MJ).
Watts are the right mental model. Your stuff uses some amount of power while running, be it 50W for an old lightbulb or 1kW for a space heater. You run this stuff for some amount of time during the day. Multiply each load by its hours of use and add these up across a city, and you get the total energy spent.
And then you can even multiply the watt-hours by the price to get costs.
Disagree. kWh is a unit most households would be familiar with. I bet the average budget-conscious Joe knows their price per kWh and how many kWh a washing-machine run would use up. So for a lay audience it is a good term.
I know the analogy of the day is AI, but I’ll make the case for cryptocurrency as the better analogy. I think everybody sees potential use for AI - probably more than it can actually do.
The technology that I hear being called “useless”, “pure speculative hype” etc is crypto and defi. Maybe today, because of lack of infrastructure, network effects, productized apps, etc it’s not as useful as traditional banking and fiat currency, but the reality is that there’s a future where we don’t need banks and nationalized currencies, and that is an enormous value-add for society as a whole. It may not happen today, 10 or even 100 years from now, but we will look back and find the idea that people had to be convinced of this absurd.
This isn't an argument. By this logic, you could just as easily say that in the future, people will look back and scoff at the amount of time and energy wasted chasing a form of decentralized currency rife with problems that would never be solved at scale. You can't presuppose a conclusion and then extrapolate from it.
It’s true, all I’m doing is speculating, but that’s all anybody is doing here. This article didn’t make it to the front page because it’s rich with content about how electricity companies ran a successful marketing campaign to evangelize the use cases for electricity. It’s basically a newspaper clipping and a few paragraphs of summary.
It’s here because people are projecting their feelings about today’s disruptive technologies onto this example about electricity, with the implication that people will one day look back and wonder why anyone doubted <technology X>.
Here’s what I’m saying: AI is a part of our future, and very few people doubt that. AR/metaverse may not be perfect today but I think most people would agree it’s bound to be part of our future. Green tech stuff like electric cars, renewable energy, etc most people agree are part of the future and won’t argue with you.
But for some reason, when you so much as whisper “crypto” on HN, people shout you down with rude and dismissive comments (not yours, but take a look around). I don’t work in crypto, I don’t have substantial holdings in crypto, I’m not shilling.
I’m just saying that as a technology, it feels the most misunderstood despite its (to me) obvious transformative potential, which I think makes it the best analogy for electricity in the early 1900s of the available options. Does that argument hinge on the assumption that crypto is an inevitable part of future life? Yes - but so does every other argument here.
Crypto doesn't harm anyone who doesn't choose to buy into the get rich quick schemes. If some people somewhere trade crypto between them, it's none of your business.
AI, on the other hand, harms millions of people right now - by government surveillance, face recognition, spam bots and SEO, deep fakes, exam cheating, job losses, killer drones etc., with almost no upside for the little guy.
It's the other way round: Crypto mining has wasted an awesome amount of resources and is still ongoing despite energy shortages, mass extinctions, air pollution from coal plants and global warming.
AI also uses a lot of energy, for no benefit. And traditional money system wastes even more energy and other resources, and employs millions of clerks doing meaningless work.
While I think that much of what crypto bros did is quite the waste of time and money, I believe that it has potential to be revolutionary.
If something really different has to start these days, centralized services are an easy target for the powers that be. The only way to circumvent them are truly decentralised systems.
The thing is - AI is a very apt analogy today. People outside the tech sphere often think it's a toy, but don't see the real productive uses of LLMs, for example.
In contrast, today's version of crypto has had its popular moment. Like the dirigible, it got a lot of mainstream coverage as a "promising" "revolutionary" technology, and it has made its millionaires and billionaires. Like the dirigible, it has been found wanting. There is some chance that the future will involve CBDCs, but I think most people agree that the ship has sailed on the Bitcoin-Ethereum-NFT-based "metaverse" that crypto entrepreneurs wanted to create.
I see you live in the San Francisco Bay Area. If you go outside that little bubble, you won't see very many billboards advertising anything other than personal injury lawyers, restaurants, and casinos.
Useful electricity was not born suddenly into a world that had never heard of electricity. People had been playing with electrical toys and scientific equipment for generations before advances made industrial electricity possible. Electricity may have earned any number of different cultural reputations for its associations with aristocrats, magicians and quacks.
Just yesterday I rewatched James Burke’s Connections, episode 3, Distant Voices, which vividly illustrates some of the ways people tried using electricity.
Early electric service was less reliable than gas. My house had mixed gas lighting and electric, and there were even fixtures that were both. Not sure how long this transitional period lasted.
Adoption of phone service was even slower. First and second generation systems were pretty crappy by today’s expectations.
When I was in grad school, my cheap old rental house had light fixtures that allowed you to choose between electric and gas in one fixture. The gas pipe had long ago been disconnected, but the electric bulb sockets were stamped "Edison Patent."
What I imagine is that electric lighting could have started out in commercial or municipal use, and spread out into the general population as it got cheaper and more reliable. The same thing happened with cell phones and the Internet.
Also think of the process of wiring a house, and, because you were well-off enough to do it in the first place, having it done to look nice. I don't think that was exactly a cheap process back then. It still isn't. The amount of cabling even for basic lights is not that small.
Indeed, and it's something that greatly benefits from being done before the house is finished.
When I lived in Texas during a housing boom, we had an electrician on call for our factory. He told me that he would often stop at a construction site on his way home in the evening, and quickly make some extra money by completely wiring a house.
"...a brand new power generation facility that could generate 770,000 kilowatt-hours" - what does this even mean? Did the facility produce a certain amount of electricity and then it had to be shut down, after it had produced 770MWh ? Can't produce any more kWh so just fire everyone and demolish the facility?
This is an interestingly common units issue. I can understand confusing bits per second and bytes per second - most of the time the capitalization of the units isn’t important. But no one confuses miles and miles per hour!
I forgot for a moment where I was going with this, but now back on track: if they wanted to make the two values comparable, Edison's plant should also produce 770 MWh every minute.
> if they wanted to make the two values comparable
When was the last time you saw a journalism piece try to make units comparable?
That number could mean absolutely anything; there is no telling what people were thinking at each step of the telephone game from transcribing the source all the way to a finished and edited piece.
You are not wrong, and now I'm more confused. Unfortunately the linked report 404s, but an old copy was available through the Wayback Machine (it is exploring market needs wrt. photovoltaic systems in NYC). The introduction states that the city's total electrical consumption in 2015 was 52836 GWh.
Math time: (52836 × 1000 × 1000) / (365 × 24 × 60) = 100525 kWh of energy consumed in a minute. So that checks out.
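A quick sketch of that arithmetic, using only the figure already quoted in this thread (52,836 GWh for 2015), in case anyone wants to re-run it:

```python
# NYC's total 2015 electrical consumption, per the report cited above.
annual_gwh = 52836

annual_kwh = annual_gwh * 1000 * 1000  # GWh -> MWh -> kWh
minutes_per_year = 365 * 24 * 60       # 525,600 minutes

kwh_per_minute = annual_kwh / minutes_per_year
print(round(kwh_per_minute))  # 100525 -- kWh consumed in an average minute
```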
On the other end of the comparison, by the early 1900s AC had largely won and plants were appearing left and right like flowers in a field. I can't find the exact station or its capabilities just by searching for the 1920 date.
Edison's first commercial station in Pearl Street from 1882 (still DC, I think) had 6 dynamos producing 100 kW of power each: https://en.wikipedia.org/wiki/Pearl_Street_Station Which is... let's see... 600 kWh every hour! :) Or 10 kWh per minute.
If the author suggests that Edison's plant produced enough electricity to cover present-day NYC's consumption a mere seven times over, that doesn't seem quite right. 770,000 kWh in an hour is 12,833 kWh per minute, in which case you need to build about eight Edison plants to match the demand.
(I divided so many numbers in this comment, I sincerely hope that I did them right)
I think the 2015 electric consumption was about 10,000x what the 600 kW Edison plant could generate?
This is a great example of how things get simpler if we drop the "over time" part of the units and simplify it to just the average power draw.
So in 2015 NYC consumed 52836 GWh, which means the average power draw is 52836 GWh / (365 × 24 hours) ≈ 6031510 kW. As in, at any given moment in 2015 NYC was on average pulling 6031510 kW, or about 6.03 GW.
The Edison Pearl Street station could output 600 kW (and that's the theoretical peak of all 6 dynamos; actual output was probably less).
6031510 kW / 600 kW = 10052.5, so I think our current consumption is about 10,000x higher, not 7x-10x higher, than the Pearl St station's output!
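The same comparison as code, so the ~10,000x figure can be checked (Pearl Street's 600 kW is the peak from the Wikipedia page linked above, so treat the ratio as a lower bound on reality):

```python
annual_gwh = 52836         # NYC consumption in 2015, from the cited report
hours_per_year = 365 * 24  # 8,760 hours

avg_draw_kw = annual_gwh * 1e6 / hours_per_year  # average power draw in kW
pearl_street_kw = 6 * 100                        # six 100 kW dynamos, at peak

ratio = avg_draw_kw / pearl_street_kw
print(f"{avg_draw_kw:.0f} kW average draw")   # ~6031507 kW, i.e. ~6.03 GW
print(f"{ratio:.0f}x Pearl Street's output")  # ~10053x, matching the figure above
```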
Journalists consistently use silly or incorrect units when discussing power usage. At least for this article the units aren't flat-out wrong, just silly, and I can see how "kilowatt-hours per minute" could be a bit more intuitive to readers.
(And don't get me started on how USB battery manufacturers advertise capacity in obtuse units like 27000 mAh @ 3.7 volts instead of just using 99.9 watt-hours or 27 amp-hours.)
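For what it's worth, the conversion between the two ratings is a one-liner once you know the nominal cell voltage (3.7 V is the usual lithium-ion figure):

```python
def mah_to_wh(mah: float, volts: float = 3.7) -> float:
    """Convert a milliamp-hour rating at a given cell voltage to watt-hours."""
    return mah / 1000 * volts

# The advertised 27000 mAh @ 3.7 V pack, in watt-hours:
print(f"{mah_to_wh(27000):.1f}")  # 99.9
```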
Usually electric generation facilities or devices are described by what they can handle at their peak. I feel as if this vulgarization of units really makes it harder to understand the intangible nature of electrical demand, which is ephemeral by nature.
Most of the time we see "watts" it really means "watt-hours", which measures work. We're used to flattening rate measurements into instantaneous readings, like the speedometer in your car: if you are going 60 miles per hour, you can expect to travel 60 miles in one hour if you maintain that rate. A 60 watt appliance, however, will consume 60 watt-hours over 1 hour of use, which is like saying we went "60 miles" in the example above.
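To belabor the analogy with a trivial sketch: power is a rate, and energy is that rate multiplied by time, exactly like distance is speed multiplied by time.

```python
power_w = 60    # a 60 W appliance: a rate, like 60 mph
hours_on = 2.5  # how long it runs

# energy = power x time, just as distance = speed x time
energy_wh = power_w * hours_on
print(f"{energy_wh} Wh of work done")  # 150.0 Wh of work done
```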
Edison used a clever adoption hack which is still worth learning from today.
The gas companies of course fought the adoption of electricity. They would send people to electricity demonstrations. These people would have metal bars sewn into their sleeves, and they would lean them against the bus bars, causing frightening sparks, fires, etc.
In response, Edison suggested putting the wires in “conduits” — the gas lines already in the house! This both addressed the fear and kicked gas out.
(This anecdote is in Utterback's "Mastering the Dynamics of Innovation")
And there were likely multiple competing and incompatible formats in the early days. Utility is greatly reduced when some of the stuff you want won't work with other stuff.
Edison's was DC. He hated AC and did public demonstrations of killing animals with AC just to "prove" how dangerous it was. The War of the Currents was a very interesting blip in history, and ultimately Tesla and Westinghouse won with AC because of its ability to travel over longer distances with minimal loss, the fact that minor variations in frequency can be used to determine the load/demand ratio (allowing power plants to respond to load changes), and the fact that its voltage can easily be stepped up or down by passive devices (transformers).
If I think of everything I use electricity for these days, lighting* is pretty far down the list of usefulness. The sun can fill 80% of my lighting needs, giving up on the 20% really doesn't seem like it would be that painful.
I actually thought the list of items in the ad in the article was far more convincing than lighting as a use case.
Today my list of more important use cases would include things like long distance communication, refrigeration, transportation*, cooking/manufacturing, computing, data storage, small cameras, medical uses*...
* except insofar as I use electricity to make light for information-transfer purposes. Something like e-ink would be an adequate substitute though.
* ICE engines do fill a lot of this niche, but not all of it. Subways, elevators, and the like are made much better via electricity.
My great-grandfather, when he built his suburban house after emigrating to the US from Scotland via Canada, included gas lines in the walls for gaslights in spite of the easy availability of electricity. Just in case.
Sure? Electricity is a means, not an end. There are many alternative ways to generate force, or lighting, or heating, or cooling, or food supplies. Where electricity won out, it was by being superior in the particular application, followed by a long process of gradual production refinement to deliver on the theoretical promise. Plus, we all have to accept that new tech doesn't replace every single use case of old tech, and that's OK. BBQ grills are fun in a way that electric stoves or microwaves are not.
The thing that's missing from the list is the tools and inventions which people had to be convinced of the usefulness of, and which ended up being useless.
"This happened for X so it will happen for Y" isn't an argument on its own - you either need to make a connection between X and Y, or say something fundamental about Y which makes the statement true.
Yeah it reminded me of a 1980s Saturday Morning Cartoon PSA I saw as a kid evidently called the Computer Critters that ran on ABC. Basically a bunch of attempts to convince people they needed to get a computer at home. https://youtube.com/watch?v=9rDIPyVqbHs
Not all people. There is a division of people here: people able to rationally see the consequences of a new technology.
And people who have to bend their perception of reality to protect a vested interest. For software engineers, the skills and attributes we take pride in are our software craft and our intelligence. So it's normal to see that the attacks on AI are especially vicious on HN.
I'd say the division is about 50/50.
GPT-4 will not replace us. But it's a herald for something that will. That is reality.
> But it's a herald for something that will. That is reality.
This is mostly what the people described as "rational" say all the time, but there's no rationality, or even a deep conversation, about what to do if this happens. I can flip a coin and half of the time it will tell me that it's the end! Both of these camps are arguing like political sides, and the conversation is usually a repeated instance of some beliefs on both sides.
I believe, instead, we can talk about the practical ways we can deal with this change. For example we can start with looking at what other fields that got automated did. Unions? Regulations? Wild west? Free market capitalism? Monopolies? What? Or we can discuss how to take advantage of this new change. Just sayin…
People who believe such a change will happen are in a position to lead the conversation, in my opinion. This is not the first impactful change in history. It's not even clear if it's the biggest one, and as with any other change, people who have a rational understanding of it are in a better position to propose solutions to the problems it brings.
There is another problem, though, which is very important to note. As impactful as this change is, there is so much nonsense and bullsh*t going around it because some people are financially invested. Sometimes people make statements without revealing their true intentions. I imagine that a person who is right now integrating ChatGPT into something and dreaming about getting rich fast is not going to believe whatever cautionary tale others tell. This specific aspect of the current hype, unlike the actual product, is dramatically similar to the crypto hype. It doesn't help either.
Well, it's obvious that people back then were simply too short-sighted to comprehend the revolutionary potential of electricity. As usual, it took the genius of a few forward-thinking entrepreneurs and engineers to drag society, kicking and screaming, into the modern age.
Looking back, the skepticism seems laughable, but it underscores a recurring theme throughout history: people's irrational resistance to change, even when the benefits are as clear as day. It takes those with true vision to push past the naysayers and bring transformative technologies into the mainstream.
I mean, just look at the internet or smartphones. The masses were skeptical, but visionaries like Steve Jobs and Tim Berners-Lee paved the way for the indispensable tech we enjoy today. It's almost too easy to predict that many of the technologies we doubt today will become cornerstones of tomorrow's society. But, of course, only a select few have the foresight to see that.
It's ok if people want to be behind the curve. No skin off my back if they want to delay their personal use of a transformative technology. Progress will happen with or without them.
people had to be convinced that the investment was worth it.
In more recent examples, people didn't have to be convinced that mobile phones were useful; they had to be convinced that it was worth paying €100 upfront + €30/month to use one of these.
One area where the usefulness of the technology vs. the investment is abundantly clear is EVs. Everybody knows they're better for the environment, way cheaper to recharge, smoother, noiseless, and gearless. But they're almost three times as expensive as non-EVs!
FYI the author of the article is Rose Eveleth and she did the excellent "Flash Forward" podcast. She's recently wound it up but I'd recommend the old episodes (probably don't start with the final season though).
Rightfully so. If you want people to get hyped about a thing, explaining what the thing can do for them should be an obvious necessity. But I guess that's an outdated mode of thinking; modern advertising campaigns rely more on emotional manipulation than on a rational exposition of product features and benefits. Instead of promoting electricity by showing people light bulbs and electric appliances, I expect a modern advertiser would tell you that popular people all like electricity, and that if you like electricity too, you might become popular as well. Instead of showing people electric lights, you could just show some young attractive models having a picnic in a lush city park, with a narrator saying something about "trailblazers and innovators", maybe referencing famous popular figures like Gandhi for no apparent reason.
> explaining what the thing can do for them should be an obvious necessity.
Obvious?? You are taking too much of the American way of life and its values for granted. The Party can just order something "progressive" and the people will jump with enthusiasm. You just need to train them that not jumping with enthusiasm when the Party orders something is dangerous for their career or, more effectively, for their life.
Communism is Soviet power plus the electrification of the whole country.
— Vladimir Lenin
Works like a charm. The only drawback of this approach is that you'll get not the American way of life but the Soviet way of life, and "progressive" will mean not what is really progressive, but what the Party orders. But well, is it that important?
My comment describes the presence and practices of communists in Russia in 1920.
If something after reading it makes you think that the communists are somewhere else, this "something" is not my comment. I can only guess, but maybe this "something" is your own living experience. No?
Thankfully no one followed up with the potential risks and costs. So a century later, when the planet is baking and sinking, no one could possibly imagine giving up the new "necessity."
Note to crypto bros: This doesn't apply to cryptocurrency or blockchains. If cryptocurrency or blockchains could be more useful for most people than what already exists (fiat currency, traditional banks and payment systems), we would know it by now.
HN community member response: Given that Ethereum launched in July 2015 and prior to that, no programmatic blockchain existed, on what basis did you select your parameter of 7.75 years as the amount of time necessary to elapse before we're sure no valid at-scale use cases exist?
Crypto bro response: loaning my USDC on Notional for a fixed rate of 4.6% and then bridging it to Arbitrum Nova to then send my friends and family interest-bearing US dollar payments for zero transaction fees seems pretty f'ing useful to me.
> loaning my USDC on Notional for a fixed rate of 4.6% and then bridging it to Arbitrum Nova to then send my friends and family interest-bearing US dollar payments for zero transaction fees seems pretty f'ing useful to me.
Note to AI dorks: This doesn’t apply to neural networks. If neural nets could be more useful for most applications than what already exists (expert systems, human intervention), we would know it by now.
As long as BTC has tons of "price action," it can't live up to the dream of a stable non-central-bank-backed currency. It will continue to be a speculative asset. At this point, the only actual "inflation hedge" in the cryptocurrency space seems to be Monero. Everything else has "price action" like levered NASDAQ.
https://penntoday.upenn.edu/news/how-appliance-boom-moved-mo...
Notably, electrification of rural areas lagged well behind that of cities and towns until the Rural Electrification Administration was created in 1935:
https://livingnewdeal.org/a-light-went-on-new-deal-rural-ele...
> "The REA continued into the postwar era and helped the percentage of electrified farms in the United States rise from 11 percent [1935] to almost 97 percent by 1960. The New Deal had helped rural America achieve near-total electrification."
This is comparable to the situation with high-speed internet in the US at present:
> "The Federal Communications Commission (FCC), a New Deal agency established in 1934, estimates that today a quarter of rural Americans and a third on tribal lands do not have access to broadband internet, defined as download speeds of at least 25 megabytes a second. Fewer than 2 percent of urban dwellers have this same problem."
This is what you get if you privatize and deregulate basic infrastructure services: huge holes in coverage and overpriced monopolistic control of the rest of it.