People had to be convinced of the usefulness of electricity (smithsonianmag.com)
293 points by olalonde on March 19, 2023 | 303 comments



Lightbulbs powered by electricity were a convincing app, and so were the electricity-powered washing machine and similar labor-saving appliances:

https://penntoday.upenn.edu/news/how-appliance-boom-moved-mo...

Notably, electrification of rural areas lagged well behind that of cities and towns until the Rural Electrification Administration was created in 1935:

https://livingnewdeal.org/a-light-went-on-new-deal-rural-ele...

> "The REA continued into the postwar era and helped the percentage of electrified farms in the United States rise from 11 percent [1935] to almost 97 percent by 1960. The New Deal had helped rural America achieve near-total electrification."

This is comparable to the situation with high-speed internet in the US at present:

> "The Federal Communications Commission (FCC), a New Deal agency established in 1934, estimates that today a quarter of rural Americans and a third on tribal lands do not have access to broadband internet, defined as download speeds of at least 25 megabytes a second. Fewer than 2 percent of urban dwellers have this same problem."

This is what you get if you privatize and deregulate basic infrastructure services: huge holes in coverage and overpriced monopolistic control of the rest of it.


Agree, infrastructure is worthless, applications are everything.

Mr Edison is famous for "the lightbulb" (actually, "a"), but what he really did was lightbulbs + power stations (through General Electric).

Mr Birdseye is famous for frozen fish, but what [the company who bought his patent] really did was frozen fish + freezers in supermarkets.

The joke about the first telephone being the hardest sell (because there's no one to call) has another problem: no phone lines, exchanges or (today's) cell towers.

\muse I wonder if a solution to holes/monopoly abuse is to ease entry-to-market? The standard incumbent response is to deny oxygen to entrants, by giving great deals at the low end (like today's "free tiers"). Though, historically, regulatory capture instead raises barriers to entry.


Infrastructure is an enabling function. To claim it’s worthless is largely missing the point. It reminds me of the mechanical engineers I once worked with in rocket engine testing. Many claimed software was largely worthless because it was only replacing existing analog alternatives.

To the article's point, the problem is about helping people connect the dots between the infrastructure and the work they really care about.


> Many claimed software was largely worthless because it was only replacing existing analog alternatives.

Reminds me of the famous and unfortunate quotes [1]

[1] https://www.ittc.ku.edu/~evans/stuff/famous.html


The Bill Gates quote at the end is unsourced. In the unlikely event Gates ever actually said or wrote that, somebody would have a citation.


He denies it himself, but a plausible explanation given here is that he perhaps meant '~ in the lifetime of this system': https://skeptics.stackexchange.com/questions/2863/did-bill-g...

(It goes back to 1985 apparently, so does seem unlikely it came from nowhere, since it wouldn't have seemed so ridiculous then.)


It would be interesting to see how many of these criticisms were criticisms of these things as they stood rather than of their potential; it seems ridiculous to place the onus on the critics to judge whether someone can execute well in the future.


"Many claimed software was largely worthless because it was only replacing existing analog alternatives."

The most glaring flaw in that argument, especially with that specific context, is that those rockets would not exist without software.


To be fair, the engineers I'm referencing didn't design the rockets but rather the test stands. They were used to things like manually actuated valves, analog data collection, etc. In their world, they could have come up in industry with little to no software. Even after adoption, they thought software testing was a waste of time because “it's just software, it doesn't break.” Obviously, they were thinking as mechanical engineers, where “breaking” is synonymous with “wearing out” or “physically breaking”.


I wouldn't say infrastructure is worthless, more that infrastructure creates new market opportunities, and without it, markets just won't function well. For example, good roads allow farmers to transport their produce to distant markets in all weather conditions, good electricity distribution means farmers start buying washing machines, better broadband means rural people might start buying online services and so on.

Trying to game basic infrastructure for profits runs counter to this notion, and it's thus an area of the economy where government management makes the most sense.


Applications are infrastructure's value - without produce to transport (or other applications), what value roads? Infrastructure is a means to application ends.

So yes, given benefiting applications, infrastructure improvements derive value.

Their value is entirely derived. They have no intrinsic value. They are, in themselves, worthless.

Just semantics.

I tend to agree with your take on public infrastructure, but I haven't thought about it enough to form a definite opinion.

\tangentially related: arguing metabolism pathways need to exist before genes can improve them https://www.quantamagazine.org/a-biochemists-view-of-lifes-o...


> Mr Birdseye is famous for frozen fish

ahem, that's Captain Birdseye, thank you very much!



> power stations (through General Electric)

Edison founded GE.


I don’t think the commenter you’re replying to is using “through” in the sense of “providing power stations with the assistance of GE”, but rather “providing power by founding GE”.


> This is what you get if you privatize and deregulate basic infrastructure services: huge holes in coverage and overpriced monopolistic control of the rest of it.

shouldn't every square inch of Alaska be electrified, internetted, and cellphone towered? that way, just in case I consider whether to live there, I won't have to think about the inconvenience of it, it will make my decision easier.

i.e. spending money on expensive infrastructure to service small numbers of customers is not necessarily a brilliant idea. Who knows, it may not have even paid itself back for all remote communities within the lower 48; many rural communities are even smaller now than they were then.

That doesn't mean there weren't net benefits from the rural electrification act, but you are wrong to pitch it as "evil corporate barons vs everybody else". How about all of us together decide what we can afford? Would you accept your kid's argument that you pay to electrify (to code, mind you, and with union electricians) your kid's treehouse in the backyard just because he accuses you of being a greedy tyrant if you don't?


The problem is, no farmers, no food. No resource extraction, no stuff. Cities cannot exist without rural workers.

But what rural worker wants to leave their children at an unfair disadvantage? These things have value beyond economics; they are required to maintain the population and the workers of absolutely essential jobs.

(Otherwise, these areas further depopulate, and you have to provide other incentives to get people to do said work.)


what you are saying is, it's an economic decision, which is what I am saying. But you're not recognizing it as an economic decision, you're just saying "food. do it"


Not quite.

What I will agree with, is that while it could be wrapped into an economic decision, if so, then "making sure there are hospitals there", or police, or schools is also an economic decision.

Which pretty much means that, with such reasoning, all centralized planning is an economic decision. Many economists might disagree.


Everything that involves resources is certainly at least an economic decision, or a political-economic decision, etc.


The electric company probably won't see much profit from electrifying a rural area, but getting rural areas access to electricity is good for society and helps everyone in the long term. It's the kind of big picture thinking that the government should incentivize.


The alternative of what? Socialize and subsidize infrastructure costs even more for rural dwellers so the city folks pay for the $20k cable a single rural family requires?


Yes.

Edit: to be less snarky, there are plenty of examples of doing something inefficient for the benefit of society as a whole. The New Deal and the ADA are two examples that come to mind.

Unfortunately, we don’t do much of that anymore in the US.


I agree with your basic point--and subsidies are a complicated topic. Cities can't exist in isolation.

That said, Starlink, and presumably competitors at some point, does change the game. As does probably 5G and successors. They don't replace last mile/last 10 mile wired Internet for all cases but we do increasingly have viable alternatives for more rural locations.

My brother's house had a 1Mb/s down ADSL wired connection and he was the last house on the road that could get "broadband" at all. With Starlink, he's able to work, stream video, etc. which wouldn't have been possible before.


It’s also important to recognize companies like SpaceX benefit heavily from socialized infrastructure. For example, they lease a pad at Kennedy Space Center and use DoD infrastructure at Vandenberg.


While there's perhaps some excessive idolatry of SpaceX and shade on ULA and NASA initiatives, I'm not sure that subsidies--somewhat overt or otherwise--are necessarily a bad thing. Certainly DARPA has a long history.


I tend to agree. I think the hybrid system has a lot of benefits, but unfortunately many of these discussions seem to devolve into framing the issue as a false dichotomy.


Strictly speaking, that's doing something inefficient for the benefit of people living in rural areas, not society as a whole.


With that same argumentation, it can be argued that moving food from rural areas into cities is doing something for the benefit of people living in cities, not society as a whole.

Also, internet and such for rural people benefits society as a whole, indirectly. For example, better crop yields (Through better access to information) and such.

So, I think your argument is deeply flawed.


The difference is that food is paid for on the free market. Government infrastructure isn't.


But if you take a look at the massive amounts of subsidies (to both farmers and consumers), you realize food (in the US, at least) is not a “free market”. Same with fuel, etc.


Sometimes the benefit comes in the form of a stable society. The idea that there is a clear 1-to-1 transactional benefit is myopic.


Are you saying farmers would revolt if their internet was slow?


I’m saying society can only tolerate a certain amount of inequality.


Society can clearly tolerate huge amounts of inequality and has done in the past, far more so than what we have today (kings vs peasants). Also see my comment above for my views on people who claim to speak for all of society.


I didn’t claim society can’t tolerate any inequality. The question is how much it can tolerate and remain stable. There’s an argument that the feudal systems you mentioned are no longer the norm because that inequality led to alternate systems.

Regarding your first response, you may want to visit the HN guidelines regarding shallow dismissals.


Clearly it's a lot more than slow internet for farmers?

Feudal systems didn't end because of inequality, they were the opposite: the feudal system ended when serfs were allowed to enclose their communal land and were given it as private property. It went from an equal system that suffered a tragedy of the commons ("the commons" literally referring to the common land that existed under the feudal system), to a system of private farmers who were able to capture some of the wealth from their effort and land.


>Clearly it's a lot more than slow internet for farmers?

Again, I'm not sure if the intent of this is another shallow dismissal or if you're failing to make the connection I'm trying to communicate.

>the feudal system ended when serfs were allowed to enclose their communal land and were given it as private property.

Why do you think the nobility allowed this? History tells us it's rare that a consolidated power willingly gives up that power. Is it possible that the inequality of rights and taxation led to the nobility's hand being forced here?


The parent is correct.

Any way you look at it, many historic societies existed for hundreds of years with levels of inequality vastly greater than experienced currently by any farmer in the U.S.


The parent can be correct and still miss the forest for the trees. Too much of HN seems focused on being pedantically correct while missing the larger context. They can be right about the mechanism that led to the end of feudalism while still missing the "why".

You are using the current situation of a farmer (who has access to the internet and other regulated monopolies) as evidence for the case against those very services. If you remove that level of access, I don't think you have the point you think you do.

It's like pointing to someone who gets government assistance and saying "See? There isn't enough quality of life differential to warrant them getting assistance." The very fact that they receive assistance is exactly why that level of discrepancy is lower.

If you're saying that it's possible that greater inequality can be tolerated, it's missing the point. I'm not stating a claim about an absolute level; I'm saying society should be aware that a tipping point exists. The same is said for taxation; people can point to past eras where people paid a higher effective tax rate and claim that as evidence that the current level should be raised. We can probably debate what the appropriate level is (but I doubt anyone actually knows), but that wasn't the point. Arguing the "right level" of inequality doesn't negate the point that there is some level that will make society relatively unstable.

Besides, the historical comparison may not be as apt as you think. There are many reasons why the same comparison is disrupted; technology for one allows for easier and more effective coordination. I don't think many feudal peasants could organize quickly via carrier pigeon, while today we have social media; peasants couldn't vote to overturn their government, etc.


> I'm saying society should be aware that a tipping point exists.

Why do you believe such a thing exists in the U.S.?

I personally find the arguments for mutually assured destruction very plausible.

Whether it's between countries or between factions within a country with WMDs, a real fighting war will practically wipe nearly everyone out.


>Why do you believe such a thing exists in the U.S.?

Because it exists for every civilization and essentially every system. The U.S. is not inherently exceptional in this regard.

And, yes, MAD is plausible and also a strong deterrent. But a central crux of the theory is that the system is inhabited by rational agents. Humans can even be modeled as rational agents most of the time, but I don't believe they adhere to this model all of the time. Behavioral psychology seems to bear this out.


> Because it exists for every civilization and essentially every system. The U.S. is not inherently exceptional in this regard.

How does that follow? Clearly the presence of WMDs changed the game.

I also don't see how rationality, or its absence, relates to this point?


My point was that helping some minority often does help society as a whole.

The ADA, strictly speaking, only helps those with disabilities. But guess what? Having people with disabilities be able to access goods, services, and work just like anyone else helps society as a whole.


I don't really expect a straight answer to this, because the sort of people who loftily declare what is best for society as a whole usually just get angry instead of answering, but how do you justify the claim that this helps society as a whole? The ADA helps the minority by making the lives of the majority worse (higher costs, taxes, more effort etc). Making that tradeoff might be a highly moral position and justifiable on that basis alone, but there's no way to justify it on the basis of helping society as a whole. Society isn't a single thing that can be said to be helped or hindered. It helps a few people by hindering the many. It'd be better to just admit that and then argue on moral grounds, like via reference to religion.


Not the OP, but I think part of the answer is that society benefits from ensuring a certain amount of baseline rights, privileges, and opportunities are open to all its citizens, even when it's inconvenient. It's not just a moral position, it also helps ensure one group doesn't consolidate power and resources.

For example, I would want to make sure you still have access to voting even if it costs a disproportionately high amount of money to ensure that right for you or if it's much more probable that you would vote against my personal selfish interest. We generally want a society with a shared set of core principles rather than just a purely transactional society.


You’re right I’m not thinking of something super objective (I’m also not particularly interested in an objective measure of what’s good for society), but here are a few ways the ADA makes society better:

1. More people are able to contribute to society in ways they were unable to before. Personally, my life is better because I’ve learned from people of various abilities. That wouldn’t be possible if my school and workplaces were unable to accommodate them.

2. We all get older, we all lose abilities as we age. The ADA is helpful to literally every elderly person. Not to mention all able-bodied people are one accident away from requiring ADA amenities.

3. As other comments mentioned, some ADA regulations help more than just the disabled.

4. I don’t expect this to be a majority HN perspective, but I believe we’re only as free as the worst off in our society. Lifting the floor is always a good thing, in my opinion.


The ADA also helps anyone who has kids and a stroller. Also helps people when they are injured and immobile for a time. Dealing with these things in other countries makes you appreciate the ADA so much more.


Not knowing much about the REA, I doubt humanitarian arguments were behind it. I imagine it was successfully sold to Congress as a farm subsidy. Rural areas were where the farms were, and farms needed electricity.


We could do better, but there are still some contemporary examples. For instance it's costly to provide comprehensive medical care in rural areas. One of the effective functions of medicare / medicaid is to subsidize rural medical care.


> doing something inefficient for the benefit of society ... ADA

Having a disability isn't a choice. Living rurally is.


For the individual, maybe. At a societal level, at least some people are required to live in rural areas, so we can have the raw materials society requires.


The fact that disability isn't a choice means that disabled people can't become undisabled, and abled people probably shouldn't make a choice to become permanently disabled.

Because rural/urban living is a choice, this means people can come in and out in response to supply and demand via the price mechanism. If rural areas are viewed as undesirable, rural wages will rise until it justifies people being there.


We should minimize our eco-footprint with high density living, not subsidizing high environmental impact lifestyles.

How does the rural folk having high speed internet benefit society?


A great idea! A terrific plan like this needs a clever name.

We could call it the Great Leap Forward, or maybe the Cultural Revolution.


Neither of those had anything to do with moving people from the countryside into the cities, as far as I'm aware. The Chinese Communist Party is still trying to suppress rural-urban migration (e.g., via hukou).


> How does the rural folk having high speed internet benefit society?

How does an urban population having high speed internet benefit society? Whatever your answer, that's the benefit for rural folks too.


One difference is that urban areas get that benefit with lower cost due to the inherent scaling available in dense population. The benefit might be the same, the cost is not.

I’d argue the isolated benefit is the same, but the network effect amplifies it. When a ton of people do music together, it becomes part of the culture. That effect is more difficult to scale in low density populations, which means their rate of improvement is probably lower.


> When a ton of people do music together, it becomes part of the culture.

I can't think of a better reason to subsidize the additional cost for rural broadband: without it, there is no nationally accessible "culture" that unifies us.


The utility market is generally a regulated monopoly. The market and govt both recognized the economies of scale with public infrastructure. As part of the tradeoff to be allowed to operate as a monopoly, utilities must also ensure rural or unprofitable customers still receive access. It’s all part of the deal and intrinsically tied to your point.


Let's cut off their electricity and phone too and forcibly relocate them. It's not like other countries haven't done that in the past. /s


When? I can't think of any examples of forced rural-urban migration other than the Highland Clearances in the UK, which isn't really comparable.


Pol pot did it in Cambodia


That was urban-rural migration.


Rural people need reliable access to power for farming.


We can just make them pay for it instead of mooching off everybody else.


Exactly. People in this thread are perfectly happy to throw unknown heaps of other people's money at the problem until it is solved. And it's not like the government has a track record of efficiently spending money in the first place.


Someone's gotta grow the food. Our farmers deserve Netflix.


And eDoctor visits, farmer forums, IoT data services, banking, accounting apps, etc


Or perhaps they deserve to be treated like adults and decide for themselves what infrastructure they want to pay for.


I think the larger point is that guaranteeing access ensures they have the opportunity to make that choice.


Enjoy starving to death then.


If farmers feel their lifestyle isn't good enough in proportion to the work they do, they should raise the prices of the food they sell, not get paid by the backdoor through subsidies.


How do the food, livestock, minerals, and wood that grow in rural areas benefit society?


Those are priced into the economy. The value the cities provide is transferred to rural for food, livestock, and minerals.

Handing rural people an extra $50k, in addition to the payment for the goods and services, to install the infrastructure doesn't seem right to me.


I wonder how cities would support themselves without the rural landmass? The benefit is two-way, and a $20k price tag does not capture it all.


Nobody wants to ban rural living, just stop subsidizing it, or subsidize it less. I don't get why responses to these suggestions are always framed this way: "but you need the countryside!" Well, yes... and that's what paying for goods and services is for.

If the situation were reversed, these kinds of defenses would not convince the people making them that rural dwellers ought to subsidize urban living.


Eh, we decided a while ago that certain things are worth subsidizing if they benefit the country as a whole.

Education, healthcare, transportation, retirement -- these are all things where we could say "People can just pay for these things if they value them", but instead we determined that some subsidies are good for the country as a whole (we often even want more subsidies for some of these).

Likewise, it's important to have a population that's willing to live and work in rural industries that supply big city centers. Subsidies to provide some of the infrastructure that more dense areas have can help the nation accomplish that needed population mix more easily.


This argument only makes sense if you're only subsidising consumption, but infrastructure increases the productive capacity of the economy as a whole, so the government gets a return on its subsidy in increased taxes.


So food prices skyrocket to make up for the lack of subsidies, and everyone gets a food purchase subsidy to make up for the high food prices since nobody wants people starving in the streets. Heck, Minnesota just became the fourth state to offer breakfast and lunch for free to all students regardless of ability to pay.

You can shuffle the board around however you want, but the truth is that agriculture (and therefore rural life) is going to be paid for one way or another.


So let food prices rise. This is a more honest method of accounting, because it is the people consuming the food who pay the cost of producing the food.

The status quo is that people indirectly pay for food through taxes, higher service cost, and other distortions that have no relation to how much food they consume. They are forced to pay a subsidy whether they like it or not.

Don't forget foreigners: If your country has a subsidized food production system and you're exporting food, then foreigners are benefiting from your subsidies. (This is actually a debate in Canada vs. USA regarding milk.)


>So let food prices rise.

"Let them eat cake"


If it costs an extra $20k to set up a new farm then we let food prices rise slightly to cover the costs. Why does the government have to step in?


As a rural tax payer I'm paying for some city dwellers to sit around all day doing nothing.

All the benefits you enjoy as a result of living in a modern society have been paid for by someone other than you.


Rural farm land and dense urban cities are equally important. We're all fundamentally interconnected.


Isn't farming (the thing that AFAIK keeps most rural places, well, rural) subsidized in the USA to a similar level as in the EU?


It's vastly more economical to build infrastructure for a few million people spread over a metro rather than over a state, and cities are centers of commerce and industry. Not only do cities more than pay for themselves universally, they also pay for nonproductive rural areas, which are very expensive to maintain and provide little revenue.

Basically unless you grow food the city folks would be economically better off if you didn't exist. Your world view is exactly the opposite of reality.


City dwellers also buy things manufactured in rural areas. Most of them expect a nationwide transportation network that isn't just interstate highways and gas stations. In addition to food, there is all the resource extraction that needs workers, who have families, and need healthcare etc. Basically a lot of people living in rural areas are either doing things that, in part, support people who live in more urban areas or they're supporting those people.

Cities eat up tax dollars too. Boston's Big Dig was basically a $10 billion or so gift to Boston from not only Western Massachusetts taxpayers but also the rest of the country. (The Speaker of the House was from the Boston area.)


> Basically unless you grow food the city folks would be economically better off if you didn't exist.

"Basically, if we didn't need to eat, we'd be better off of our digestive system didn't exist."

What a hilariously strange world view. There are comparative advantages for both urban and rural areas, and both are vitally important.

Urban areas optimize for concentration of labor. Rural areas optimize for land-and-resource-dependent operations.

You won't build a successful large scale R&D lab in a small farm town, but you also won't build a successful mining operation in downtown LA.

(Also as an addendum, there are so many industries dependent on land and resources other than just agriculture).


If you want to be mercenary about it farms already require only a tiny fraction of the rural population to run and will in the future require even fewer. We need the land. Virtually all of the folks not so much.


This is absolutely true, and the required subsidies for those places with mass exoduses will presumably drop over time (though they will likely increase per remaining person for those few necessary remaining people).


Some of that is right, but believe it or not farmers also need local services like health care workers, tradespeople, some semblance of government, etc.


All of what I said is right and all of what you said is right. The grandparent comment stated that.

> As a rural tax payer I'm paying for some city dwellers to sit around all day doing nothing. All the benefits you enjoy as a result of living in a modern society have been paid for by someone other than you.

No part of that is right. Economically rural areas are a drain justified only by farming, votes, and a charitable desire to serve our fellow man. The idea that a handful of farmers are somehow paying for the teeming masses in the city to sit on their hands is wrong on its face.


The comment wasn't meant to be taken so literally, the point was that almost everything is paid for by someone else.


How is 11% having electricity comparable to 75% having broadband? Feels like you’re really reaching to make this segue to politics.


Because the 11% became 97% via the New Deal and government investment, not through market forces. That's not reaching to bring up politics, that's the history of what happened. Rural areas are more expensive to run physical infrastructure for, for fewer people, compared to an urban or suburban environment; that underlying fact hasn't changed since the 1920's when electricity was being run. How it immediately becomes political is the question of who pays for what, and how much of my taxes are going towards something that doesn't directly benefit me.


But that proves my point. There was no “new deal” that got us to 75% broadband coverage. 75% is clearly way better than 11% therefore not a valid parallel.


There absolutely has been a ton of government money funneled, basically, to Verizon and Comcast shareholders to try and provide rural Internet service via the Telecomms Act of 1996, along with everything else that came after that. That the government hasn't been getting good value for its money, and that coverage sits at 75% (and at only 25 Mbit) compared to 97% for electricity, is a whole other topic.


https://www.usda.gov/broadband

> "USDA has been investing in rural telecommunications infrastructure for decades. Hundreds of millions of dollars are annually available in the RUS programs both by loans and grants all to support modern broadband e-Connectivity in rural communities."

> "In 2018, USDA introduced the ReConnect Program, which has invested over $1 billion to date to expand high-speed broadband infrastructure in unserved rural areas and tribal lands."

We don't let shady private corporations squat on the roads and freeways and charge tolls to anyone who wants to drive on them, why should we allow such behavior on the internet trunk fiber optic cables either - or on the copper/aluminum electricity grid, for that matter?


Arguably, having broadband now is more important than having electricity then. The network effects of nearly everyone having broadband are reducing access to offline alternatives (e.g. bank branches closing), whereas anything you can do with electricity can likely be done without electricity or any additional infrastructure beyond what's on the farm.


75% broadband coverage is a generous interpretation, especially as the floor of what counts as "broadband" has risen. 25 Mbit down, 5 Mbit up isn't actually that broad or useful as 4K TVs and multiple simultaneous video calls become the norm.


>Because the 11% became 97% via the New Deal and government investment, not through market forces.

That 11% wasn't a static state. Electrification was happening, government or not.

The New Deal certainly sped it up, but it takes a career politician level of dishonesty to take a government program that increased the rate of electrification and give said program credit for all electrification after its commencement.


I think it's fairer to say that there was a rural demand for electricity, but that privately owned electricity grids saw no profit in meeting that demand. Electric appliance manufacturers did want to sell their product to people in rural areas, and they realized that if government built out the electrical grid, they'd benefit from it as would rural electricity consumers. That's why FDR's Rural Electrification programs were widely popular.

The only opponents were the electric power companies who realized they might lose their captive markets if the idea spread to the cities. Hence they started buying lots of politicians.


Market forces would have correctly restricted the number of people living in inefficient places to live. Rural communities are massively more expensive to create infrastructure for. Instead we have locked ourselves into supporting an abnormally large number of people away from where infrastructure can efficiently be provided.


And those people living in rural areas (which are required to sustain societies, unlike urban areas) would become poorer and less educated as time went on due to the economic inefficiencies of serving them well. You are arguing for the emergence of class stratification as if that's somehow the desired version of our society, which is just insanity. There is a _reason_ we share costs as a society! Free market thinking is shallow and should not be applied here, full stop.


Or some of them would have moved, or not moved there in the first place, or increased the prices of their goods.

Something so cynical about such low expectations. "If we weren't in charge, they would all devolve into savagery!"


Not sure I'm the cynical one here. Even so, bad things really do happen sometimes.


https://www.broadbandsearch.net/blog/digital-divide-urban-ru...

The issue isn't politics (though it is true that the ISPs have bought both political parties in Congress), it's economics. Building out basic infrastructure is one of the core necessities for widespread economic growth, and that includes roads and bridges, the electricity grid, the water supply, and the fiber-optic network.

It's not a very complicated issue, and I've never seen anyone present a coherent argument that privatized infrastructure improves economic activity overall, it's generally the opposite isn't it?

I suppose if your metric is the concentration of wealth in fewer hands, then yes, private infrastructure facilitates that outcome, but it shrinks the overall economic activity in terms of production of goods and services (i.e. consider the farmer who can't get to market because the bridge is washed out, the washing machine manufacturer who can't sell to the farmer who has no electricity, and so on).


There's plenty of arguments for privatized infrastructure, if you never heard them then you weren't looking for them. For example telecoms, energy and TV monopolies have been broken repeatedly in the 20th century in countries around the world, nobody wants them back. There were way more infrastructure monopolies in the early parts of the 20th century than there are now.


I'm sorry, isn't that an argument against privatized infrastructure, which always becomes monopolistic control of infrastructure by private interests? It's not like competing systems of basic infrastructure are at all plausible, we're not going to have multiple private roads systems are we?


No? Private infrastructure doesn't always become monopoly, that's why most infrastructure monopolies were created by nationalization at a time when that was all the rage. Railway monopolies, radio (e.g. BBC), telecoms, steel, water, electricity ... lots.

> we're not going to have multiple private roads systems are we?

Toll roads exist but indeed, roads are one of the cases where building and maintaining them is easy so you don't lose much from a state monopoly, and they take up a lot of space so duplication is unfortunate. But there are relatively few cases like that.


Are competing water companies going to build multiple water pipe systems to people's homes? Are competing electricity companies going to cover cities with multiple independent competing grids? Are competing ISPs going to build separate fiber networks to everyone's door? Should there be multiple rail networks owned by private parties, or just one that everyone uses cooperatively?

These are cases where it only makes sense to build one system, and any such system should be under public control, not under private monopoly control by some rent-extracting shareholder outfit.


>> Are competing {water,power,internet} companies going to cover cities with multiple independent competing {pipes}?

Sure and they have done in the past. The existing networks weren't initially built by governments, they were built by private entrepreneurs and then nationalized. That's true at least for power and internet, admittedly I don't know for water piping, that might be old enough to pre-date private companies of any significant size.

Now should they is a different question to will they. In many cases it's OK to allow that. I think the US still has private railway lines. In other places there aren't any left. But I agree that for networks that require enormous amounts of land and where there's ~no scope for innovation, government monopolies can be beneficial. Water, power and roads seem clear cut. We can add gas and sewerage to that. Rail is a sort of interesting middle ground where countries go back and forth because there is actually scope for innovation in how the signalling works and governments are typically extremely slow to deploy improvements. Note that most civilized countries don't nationalize the endpoints. Power generators, gas wells, trains etc are owned by private companies usually. Also, in practice these networks are often built and maintained by private contractors.


"The existing networks weren't initially built by governments, they were built by private entrepreneurs and then nationalized."

This is simply not true in many cases, and the Rural Electrification program is Exhibit A, isn't it? Exhibit B is the national freeway system. The takeover of the government-built Hetch Hetchy to CA power grid by PG&E is exhibit C, granted a complicated story:

https://www.dailyjournal.com/articles/293699-a-hundred-years...

Certainly the private sector has a role to play, i.e. the production of water pipe itself is a competitive business, so is the production of fiber optic cable, and of tools and equipment - but the systems themselves should never be privately owned.


I think you're looking only at the 20th century? The first electricity systems were built by companies (Tesla, Edison, Westinghouse etc). The first high quality roads were privately built and run:

https://en.wikipedia.org/wiki/Turnpike_trust

(well, semi-private at least, corporations didn't exist in the same way that far back)

And the internet was all private and still is, with no change on the horizon.


>deregulate basic infrastructure services

What part of ISPs have been deregulated? Is it the bit where they're legal monopolies in many jurisdictions?


25 megabytes per second is 209,715,200bps. That's over twice as fast as my cable ISP's present downstream connection. Is that what they really meant?


No, looks like they confused the units. It should be Mbit.


Even if they mean MBit, I rarely need more than 8. What are some of the things people do that they need something that fast?


> What are some of the things people do that they need something that fast?

Remote programmer work. I push and pull around many GBs every day. 8Mb/second would be -at best- difficult to bear. I also often do teleconferencing (and screen sharing) while pushing around lots of data.

If there were multiple people on my LAN who were doing remote programmer work, or who were just -say- watching "streaming" video while I was working, or -say- chose to patch a video game while I was working, 8Mb/second would make working impossible.

Hell, even 40Mb/second is pretty terrible. I recently moved from a 1400/40 Mb link to a ~300/300 Mb link. Despite the dramatic reduction in download speed, it's way, way, way better.


I have around 8mbps (1 mbyte/sec) and downloading everything big is painful. Nvidia driver - 10min. Vbox update - 3min. Linux netinst - feels like forever. Big npm/docker/etc updates - few minutes of waiting. Witcher 3… ohh.

Also you can’t watch 1080p without stutters while waiting. You can’t watch 1440p60 or 4k in any case.

Why do we need 25 MBit on all the farms and rural lands? (quoted from another subthread)

Because otherwise these areas will get stuck with <25mbps forever, and there’s already no reserve in 8-10mbps.


Please let's not confuse more units here.

  8Mbps (please capitalize the "M" for "mega" and lowercase the "b" for bits) is 8,000,000 bits per second.
  That translates to about 0.954 MiB/sec (Mebibytes per second, see the "i" and the capital "B"?) which is close enough to 1.
We must keep in mind that there are two disparate things being measured in this thread and by FCC. The FCC is ostensibly measuring advertised signaling rates. Your Gigabit Ethernet signals at 1,000,000,000 bits per second, but can't transfer data that fast. Likewise, my cable ISP signals at 100Mbps down and 5Mbps up (yeah, it's criminal) but download speeds are a different thing.

Download speeds can, and should, be measured in computer-oriented mebibytes per second, rather than bits per second, because you are, after all, transferring files. (And yeah, disk space is often measured in powers of ten...) Your ISP's data cap is undoubtedly measured in gibibytes or tebibytes (even though they call them gigabytes or terabytes, that's what the units mean.)

So when you run speedtest.net or Ookla or whatever, you're measuring the actual throughput that the computer's network interface can squeeze through the narrowest straw in your link to the server. That is necessarily a touch lower than the lowest signaling rate of whatever equipment is in-between. Internet connections are sold by trumpeting signaling rates, but those are NOT download nor upload speeds. Never confuse them, because they are overly-optimistic estimates of your maximum throughput (which is infeasible given most PHY and link-layer frame designs.)
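To make the unit bookkeeping concrete, here is a minimal sketch (the 0.9 factor for framing/protocol overhead is an illustrative assumption, not a measured number; real overhead varies by link type):

  # Convert an advertised signaling rate (Mbps, 10^6 bits/s) to an
  # approximate usable download throughput in MiB/s (2^20 bytes/s).
  def mbps_to_mib_per_sec(signaling_mbps, overhead=0.9):
      usable_bits_per_sec = signaling_mbps * 1_000_000 * overhead
      return usable_bits_per_sec / 8 / (1024 * 1024)

  print(mbps_to_mib_per_sec(25))    # FCC 25 Mbps floor -> roughly 2.7 MiB/s
  print(mbps_to_mib_per_sec(8))     # 8 Mbps -> roughly 0.86 MiB/s ("about 1 MB/s")
  print(mbps_to_mib_per_sec(1000))  # gigabit -> roughly 107 MiB/s at best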


I don’t get why my comment caused this. No one would think of millibytes per second in that context, and the 4% difference between mebi and mega plays no role here either. I don’t even know if my meters are binary or decimal and am curious where this assumption comes from.


8Mbps is just about enough to stream netflix in hd. If you have other household members doing anything else it's inadequate. It's also borderline inadequate for any online gaming whatsoever.

Hitting network speed limits doesn't just cap you in those scenarios, it degrades very badly very quickly.


> It's also borderline inadequate for any online gaming whatsoever.

8Mbps is more than enough for a few dozen people to play any modern game on the same connection.


In my experience, a big problem is upload, which is sometimes as low as a tenth or hundredth the download. That really hurts remote work. Another factor is that high speed internet tends to have lower latency, because I guess you just have to build our more infra. That helps remote work and gaming.


My job is video game networking. The problem isn't steady state usage, it's burst usage combined with other devices on the network. 15kbps might be enough for 98% of use cases but you occasionally need way more.


I just checked multiple games from Destiny 2 to MW2 and Valorant. None peaked at more than 100-150 Kbps, and that's the high end.


Not if they're using GeForce NOW or any similar service. One person needs 15 Mbit for 720p, never mind a few dozen.


Downloading 50GB Steam games in minutes so you can join your friends without having everyone wait around. And, most importantly, flexing on people with speed test results :D


If anyone in the household works from home, that 8 will be totally saturated by a single video meeting. If there are young kids that want to watch netflix, two parents needing to work, and if you're a developer who has to pull docker images or download a large file, God help you. I have 35 Mbps now and it still gets very painful sometimes.


"Live within the bounds of what's available" seems to be a lost concept these days.

I'm rural, I work remote, and I've done so for quite a while on about a 15/3 connection. I've got somewhat better now, and I have Starlink for the house right now (though at the ever-increasing costs, I'm debating dropping it and going back to a rural WISP for bulk transfer).

If you're on a lower speed connection... you don't try to live life like you're on gigabit. You cache content locally (Jellyfin or Plex solves a lot, DVD season pack bundles are dirt cheap on eBay and a USB DVD reader can read anything), you do lower bandwidth stuff, and you work around the availability. I've taken many video calls with audio over a cell phone, because my ISP was having a crappy day.

You can invent scenarios in which you "need" gigabit, but they sound like the artificially constructed situations they are, because not everyone has 16 people working from home with another 12 insisting on their own individual 4k streams.


This seems like your position: don't try to improve things, just work around the situation.

that seems silly, and I don't think that's ever been a widespread driving philosophy. Much more common (and sadly, also lost these days) is building a better world for our kids and their kids, etc. Trying to improve life for us and those around us, with hope that the next generation has it better than we did.


I try. Which is why I reject a lot of the digital nonsense that's just attention vampires for the sake of advertising profits.


>You can invent scenarios in which you "need" gigabit,

If you have to work around slow internet by buying DVD's instead of streaming, pre-downloading movies you want to watch, or calling in to video calls to get reliable audio, I think it's fair to say that you "need" faster internet. Maybe not gigabit, but definitely faster than whatever you have now.


I think you want to be a bit ahead of where the technology is now to have some room for future possibilities, and assuming you're pulling cable you might as well pull the "biggest" one you can (i.e. fiber). To give an analogy, we recently remodeled our kitchen, which required a rewiring of the electrical given current standards. That alone used up all the remaining circuits of our 100 amp panel. But we have several gas appliances that we eventually want to switch to electric heat-pump technology (water heater, dryer, furnace) and an EV which I'd like to have a level 2 charger for, but that would most likely mean a service upgrade to 200 amps. On the flip side, we have 1000Mbps; thinking about whether we have enough bandwidth to do X isn't even a thing now.


You remodeled in 2023 and you only provisioned 1 Gbit? There are already 2.5 and 5Gbit copper Ethernet PHYs out there. You said "pull the biggest one you can" but - 1Gbit of fiber?

I mean, I guess if your chief usage of traffic is to the outside world, a really snappy LAN doesn't mean much at all; it's not your bottleneck. But I'm confused about your post which appears to contradict itself. Perhaps I misunderstood, and your kitchen remodel only involved AC electrical and no networking. No IoT fridge yet?

If I were pulling network cabling today for a remodel, it'd surely be multi-mode fiber in the 100Gbit range, right? Right?


The 1 Gbit is the service to the house, not the LAN, and it was (still is?) the fastest we could get at the time we got it (2016). The remodel was the kitchen cabinetry, electrical, and plumbing only, no network. While I personally don't see a need for an IoT fridge in my foreseeable future, I'm not opposed to internet connected appliances generally if I see an advantage to having one.


I definitely don't want a fridge that can self replace the mayo. I'm not sure what other people need something like that for


The Internet is a utility and flows like water, so in this analogy, homes should have enough Internet to meet their daily needs, but they don't need so much as to be a factory. For a theoretical household of 4, 8 Mbit is way too low, but it does suggest that 10 gigabit might be excessive. The thing is, though, that the analogy breaks down when running fiber allows for future backend upgrades and faster future speeds over copper.


Infrastructure is about sufficient capacity for peak not average usage. The first thing you have to understand is that bandwidth is oversubscribed and largely asymmetrical. You may easily only get 70% of bandwidth and you'll need to get a LOT down to have even a modest up.

4K streaming video can easily be 25Mbps. A family of 3-4 people can easily have multiple TVs and each one can be using bandwidth even when nobody is attending to it. Presumably this family has none because none of them will work. The average family has 25 internet-using devices including computers, laptops, consoles, smart devices etc. Meanwhile average websites have ballooned up to 2MB or 16Mb per page. If you get 70% of max bandwidth and divide it evenly 10 ways you'll easily be waiting around 10 seconds per page. It's common now to have a camera out front triggered by motion, but this requires more upstream than your 8Mbps connection from 1999 is liable to have since most connections aren't symmetrical. Same with video conferencing, which will largely be impossible.

What's that, you say Johnny wants to play the latest triple-A game? Well, it's 80GB of data. With oversubscription and other devices you'll be very lucky to average more than 2Mbps over the 4 days this will require, during which the family connection will suck even more than it normally does.
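A quick sketch of that arithmetic (the 70% utilization, the 10-way split, the 2 MB page, and the 80 GB game at an average 2 Mbps are the figures assumed above, not measurements):

  # Back-of-the-envelope household bandwidth budget
  link_mbps = 25
  usable_mbps = link_mbps * 0.7           # ~17.5 Mbps actually available
  per_device_mbps = usable_mbps / 10      # ~1.75 Mbps per device

  page_megabits = 2 * 8                   # a 2 MB page is 16 Mb
  print(page_megabits / per_device_mbps)  # ~9.1 seconds per page load

  game_megabits = 80 * 1000 * 8           # an 80 GB game (decimal GB)
  seconds = game_megabits / 2.0           # at an average of 2 Mbps
  print(seconds / 86400)                  # ~3.7 days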


1 Megabyte per second seems extremely slow. You need 6-7x more to stream 4K properly, and downloading modern games would take forever. If you have a couple of people using it simultaneously, even HD might not be great.


Do you live alone?


Docker.


I use that too. And beyond that, the original analogy was that the light bulb was so helpful that it spread to all the farms and rural lands by 1960. Why do we need 25 MBit on all the farms and rural lands?


Well, because people live on those lands. I grew up in a rural area, and my limited access to internet almost failed me in multiple classes that required online testing. My bare-minimum Hughesnet setup would only give you 50kbps after you depleted the "Full Speed" 50 gigabytes a month of 25MBit speeds. No worries though, only $10/gigabyte to get back online so you could take your Biology quiz without getting kicked off halfway through.

Now that stuff like Starlink exists, it's easier to give the finger to Hughesnet and exploitative WISPs. Even still, the years I spent growing up with bare-minimum internet at cable-package prices has made me spiteful. It should have been addressed long before the private sector got around to fixing it.


Fellow rural dweller here. I've tried explaining to people just how good it felt to give Hughesnet the bird (in some regions called "the finger") when Starlink rolled in, and until you've had nothing else it's hard to imagine how much it affects you.

Also people forget: farmers have families and kids, etc, and while the farmer themselves may not need much internet, the kids can't even do basic school anymore without it. But that said, farmers still have and watch TVs, facetime with their families, stream music, etc.

All that said though, Starlink is up to like $120/month and still not reliable enough to fully ditch the backup exploitative WISP if you work from home as I do, so Starlink is walking dangerously close to the line of exploitative. But they're here and working, so I will happily pay the money. I just hope that over time the $/b will get a lot more competitive.


All that tractor DRM has to phone home somehow!


Agriculture moves a fair amount of data around. You probably don't need 25 Mbit today, but need more than the ancient infrastructure can supply. The infrastructure being built to be capable of closing the latter gap is able to handle much more both for reasons of accommodating future needs and simply where the technology is now for modern installations.

And for that reason, as a farmer, I can get gigabit service on my farms, but where I live in an urban area where the infrastructure isn't as old and is still moderately capable I am topped out at 50 Mbit service.


It isn't by definition a problem that there are holes in coverage. People in the US live hugely far apart from each other because the government has paved far more roads than they should have. Bringing broadband to people in rural areas is hundreds of times more expensive than in urban areas because people are too far apart.


>> "The Federal Communications Commission (FCC), a New Deal agency established in 1934, estimates that today a quarter of rural Americans and a third on tribal lands do not have access to broadband internet, defined as download speeds of at least 25 megabytes a second. Fewer than 2 percent of urban dwellers have this same problem."

The current FCC broadband speed is 25 megabits/second, not bytes.


Ah just from how this is all written is obvious someone with a financial interest is trying to promote GPT-4.


You think this five paragraph article on Smithsonian Magazine's site is part of a conspiracy?


'conspiracy' is a loaded way to refer to http://www.paulgraham.com/submarine.html


This article doesn't read like a PR piece in the way PG describes at all though in that there is no discernible tie-in to AI or GPT as far as I can tell, so if the goal was to promote something other than the idea that people can sometimes be hard to convince of the value of new technologies that eventually become ubiquitous then I think it failed in that respect. I could see a similar article being written about the internet itself someday after everyone who lived through the initial skepticism about its economic and social benefits (still dubious) eventually passes on.


I'm sure the person promoting electricity had financial interest in it. You know there is much nuance here yet you still had to make this simplistic comment.

Put your real name and email address in your profile and respond with your full identity exposed. I want to see how history plays out. Then I can go back on these old threads to see who was actually wrong.

Who were the idiots against the reality of global warming? The people who so fervently used every excuse to deny the reality of an impending catastrophe?

There is no difference between those people and the people who consistently attempt to use every avenue available to attack the abilities of AI. These aren't rational people. They are people with an agenda that pushes them to modify their perception of reality around them to fit that agenda. That agenda is fear. Fear that a machine can surpass us and replace the software craft we have spent years honing.

My advice to you is to open up your mind a bit.

Heck I put my full identity and contact info in my profile. I stand by my views without hiding behind a throwaway account. If history proves me wrong the record is here for everyone to see.


Some might fall into the category you've described, but I'd hazard a guess that a lot more are afraid of the rise of AI due to its owners. The cost to develop and operate these machines is high, and you can be sure whoever is using them to replace work done by people today will capture and hold every possible penny.

People are afraid that AI will not serve the common good, and will instead serve a very rich few. Why? Because that's how it's always been with new advances, and more than ever how it is today. The vast gaps in wealth inequality will grow much larger with AI - it needs to be addressed first.


Agreed. But why be delusional? The tool is right at your fingertips. Why deny the reality of the situation rather than face the truth?

If they fear the AI owners, attack the owners directly. Don't attack reality itself and say the AIs are just stochastic parrots and there's no risk to jobs at all.


I wonder what the Amish take on AI is.


25 MB?? Holy schnikes..


Fun fact, might be off on the details, but Triscuits were called that because they were marketed as being cooked by “elecTric” ovens which meant uniform heating and no burned Triscuits!

Electric Biscuits! Try Triscuits!

Edit: since people like my fun fact here’s the Twitter thread where a guy talks about it.

https://twitter.com/sageboggs/status/1242968530250870786?s=4...


Heh, I always assumed Triscuit was named to be one better than a Biscuit. Which itself is one better than the mythical Uniscuit.


Uniscuit would be bread. Biscuit translates to "twice cooked" in French. This is similar to biscotti which translates to the same thing in Italian. Fun fact that many French and Italians don't even realize.


> Biscuit translates to "twice cooked" in French.

One letter too many: bi-cuit.


Languages evolve over time and add random letters between words. Also bis = twice in Latin.


I am now on a quest to create the long lost Uniscuit...


Much like its more popular relative, unicorn, uniwheat didn't make it onto Noah's ark. Perhaps you can find some that is still viable, frozen in the permafrost.


Neat, I guess similar to panko? But that was more of a wartime necessity


I assume that the Triscuits were cooked by a fairly conventional electric oven, not like panko, which was cooked by putting electrodes into the dough.


"was"? Has something changed that they do not use this process anymore?


Yes. They're done.

Triscuits and panko both refer to the finished products. For the set of all Triscuits, there does not exist any element which will ever again be baked. Ditto for panko.


In English, we use the habitual aspect for things that have been, and are still, done on a regular basis. If you wish to speak about a currently-implemented process for making food, for instance, you say "panko is baked by passing an electrical current through it".

If you have a small bag of panko on the counter, you may point to it and say "an electrical current was passed through this panko to bake it", but you could also construct the former sentence and be completely correct. But your use of the past tense in a general statement about panko implies that it is no longer made by that process, which leads those of us who speak English to incorrect conclusions. This confusion can be further compounded by the fact that we were discussing events of many decades past, so Triscuits, for example, may no longer be baked in electric ovens, although they certainly could still be.

https://twitter.com/sageboggs/status/1242968548949004288/pho... Thanks.


> your use of the past tense in a general statement

It wasn't my use or my statement.

> implies that it is no longer made by that process

No. That's a possible interpretation, but it is by no means implied.

> leads those of us who speak English to incorrect conclusions

I'm a native Standard American English speaker. I did not jump to that conclusion. In Standard American English, there is a specific construction for expressing that idea: "panko, which used to be cooked by putting electrodes into the dough"

That construction was not used here.


If you'll refresh your memory about the context of this comment thread, you will see several posters using past-tense to refer to historical situations and that is the context into which you interjected your thing about panko. I assumed that you knew that panko had once been made that way and was no longer. Others may have assumed that as well, given the established context and the way you wrote the sentence.

So I hope this clears it up for everyone.

Thanks.


I have no idea about how either triscuits or panko are made nowadays. Thus, I referred to them in the past tense talking about how they were made when they were originally created.


> I assumed that you knew that panko had once been made that way and was no longer...the way you wrote the sentence.

Again, I was not the one who made the comment. Please stop with the inappropriate behavior.

> Thanks.

Don't do this, please.


You're welcome.


> That construction was not used here.

Except that is how "was" was used by the two people you were responding to.


> In 1920, New York Edison built a brand new power generation facility that could generate 770,000 kilowatt-hours. For reference, the city of New York City now uses about 100,000 kilowatt-hours per minute.

There must be some units confusion here. Surely they didn't quote the lifetime energy output of Edison's power plant? Maybe they mean it could generate at a power of 770,000 kilowatts? But then they say NYC consumes a power of 6,000,000 kilowatts, and it seems unlikely that Edison's power plant was already running at more than 10% of today's NYC needs. Maybe they meant 770,000 kilowatt-hours per day (i.e. 32,000 kilowatts)?


“Don’t worry about people stealing an idea. If it’s original, you will have to ram it down their throats.”

Howard H. Aiken


"Electricity will steal all your jobs! "

"You don't need to worry about electricity, but about other people knowing to harness electricity better than you!"

Ridiculous and at the same time actually true.


It did make many jobs and people redundant, though. But because the world was growing, the economy grew with it and replaced that with other jobs. If growth stagnates, there's no guarantee for job replacement.

And any parallel with AI/GPT is completely absurd, even though it's the reason why this is upvoted.


>>If growth stagnates, there's no guarantee for job replacement.

No guarantees, but no hard rules either.

PCs are the ultimate clerical and administrative machines. You don't need secretaries or typists. Don't need memos and mailrooms. Stuff gets filed automatically.

We put one on every desk. Typists and secretaries went away, but administration went on a growth spurt. Whether it's school admin, corporate HR or hospital billing, administrative work became much more plentiful once PCs proliferated.

We write more letters, file more forms, sign more agreements. Maybe that stuff is valuable, and since we can do more of it with computers, we do. Maybe it has nothing to do with efficiency or value.

Whatever the case, it demonstrates that the "progress Vs luddites" debate can't be solved with a simple model.

Absurdity assumes a reasonable world. Sometimes the world is weird.


Could be a case of Jevons paradox?

https://en.m.wikipedia.org/wiki/Jevons_paradox

Sometimes when things look weird to us, it just means that it's counterintuitive, not necessarily irrational.


The parallel may well be wrong, but I wouldn't go as far as calling it absurd.


It's unfounded regardless of what you believe will happen.

We are still months or years too early to see exactly how transformative GPT-based AI opportunities will be.

We are clearly beyond the "neat academic research" phase and well into the product-building phase of this new technology, but some things are only clear in hindsight. The spectrum goes from "useful niche tools" to "industrial revolution", and we must not forget that even very successful technological breakthroughs are usually marketed way beyond their actual capabilities.

People in the second row who are now betting hard on AI may well be the billionaires of tomorrow, or they will be forgotten and swallowed up by confirmation bias.


I'd say the opposite: electricity actually did see fantastical predictions of automation ending work altogether which, despite it being an enormously useful enabling technology, still haven't come to pass. It enabled people to get jobs more than it cost them jobs.


You're not saying the opposite of what I said :-) I literally said that electricity did indeed cause many jobs to go away.

My point was more about the way these predictions are phrased and that they may sound absurd, regardless of how they would pan out


No, I'm saying electricity didn't "steal people's jobs" even taking into account roles that ceased to exist altogether, because it phased in slowly, created far more jobs than it took away, and people whose roles were "replaced" by it simply adapted to different (usually better) jobs. The absurd predictions of the time were all wrong not because of how they were phrased but because, as a simple matter of fact, electricity neither heralded a post-work utopia nor forced workers' wages to stay at subsistence level.


Ah I see what you mean.

I don't think "stealing jobs" means that there will be fewer jobs in absolute terms. It usually means (at least that's how I usually see it used) that people who currently have a job and are trained to do it will no longer be able to do it, and switching to another job is not easy: it's not just skills, but often you also need to relocate somewhere else, etc.

The problem with coal miners losing their jobs is not that we don't have other jobs available. It's that the lives of those people will be upended, and it's not surprising that people resist that.


> the city of New York City now uses about 100,000 kilowatt-hours per minute.

Also known as 6 GW.
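
For anyone who wants to check that conversion, a minimal sketch using the figure quoted above:

    kwh_per_minute = 100_000          # NYC consumption as quoted
    kw = kwh_per_minute * 60          # kWh per hour is just kW
    gw = kw / 1_000_000
    print(gw)                         # 6.0 -> about 6 GW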


kWh per unit of time is a terrible unit and we should stop using it. Kilowatt-hours aren't great either, although I guess they're convenient.


Honestly I reckon they went for the wrong unit with Watts. Most things measure absolute quantities (e.g. miles) and then the rate is the derived unit (miles/hour), whereas with Watts they went the other way. I reckon that's why everyone's confused: people expect to see "I use this much [stuff] in total" and "this appliance uses this much [stuff per unit time] while it's on", because that's the way every other unit works.


At least for mains electricity, there is a good argument for using the derived W and the doubly-derived Wh (J/s gives W, W × h gives Wh). It makes sense because electricity is not (for practical purposes, historically) stored by consumers, only consumed at the time of use.

For virtually all applications, the instantaneous power is of overriding utility: power is how bright your light bulb is, how fast your hairdryer dries, how large a wire is required when building your house. (Since voltage is constant, this could even be given as amps!) In addition, most home appliances don't have a meterable output aside from operation time: if you run your lightbulb for an hour, you get an hour of light, but that's not really something you can measure. If you run your bicycle for an hour, you've gone somewhere, which is a distance you can measure!

Accumulated consumption of electricity is not really important for anything save billing, so if all you know is watts and hours, Wh is a useful unit.

For transmission and distribution networks, even the watt and Wh are relegated to a secondary position; the VA (volt-amp) and VAh are more useful (since Vrms × Arms ≠ real power in W, due to power factor/phase shift). For one piece of infrastructure, the amps are the most important operational consideration (as V is approximately constant), and reporting in VA is useful for comparing with other equipment running at a different voltage.
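
A minimal sketch of that last point, with purely illustrative values for voltage, current and power factor:

    v_rms = 230.0        # volts, illustrative
    i_rms = 10.0         # amps, illustrative
    power_factor = 0.8   # cos(phase shift), assumed for the example

    apparent_va = v_rms * i_rms            # what the wires and transformers must carry
    real_w = apparent_va * power_factor    # what actually does work and gets billed
    print(apparent_va, real_w)             # 2300.0 VA vs 1840.0 W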


The SI system has the Joule for that. The real "problem" is that 1 Joule of energy is just too small to be practical in most situations, so we often resort to larger units such as a kWh (equal to 3.6 MJ).


Why is it a terrible unit? And what would you replace it with?

It's reasonably human-scale, which is better than Joules...

If you want to pick on a terrible unit, pick on BTU.


>And what would you replace it with

Just plain old kW. kWh per hour is just kW.

A kW is relatable. About the same as an electric kettle, microwave oven, or 100 lightbulbs.


Watts are the right mental model. Your stuff uses some amount of power while running, be it 50 W for an old lightbulb or 1 kW for a space heater. You run this stuff for some amount of time during the day. Add it all up over a day for a city and you get some total power draw. Multiply by hours and you get the energy spent.

And then you can even multiply the watt-hours by the price per kWh to get costs.
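
As a rough sketch of that mental model (the tariff is just an assumed example figure):

    watts = 1_000            # e.g. a space heater
    hours_per_day = 4
    price_per_kwh = 0.30     # assumed tariff; varies by region

    energy_kwh = watts / 1_000 * hours_per_day   # power x time = energy
    daily_cost = energy_kwh * price_per_kwh
    print(energy_kwh, daily_cost)                # 4.0 kWh, 1.2 per day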


kW is a useful measure of power; kWh is a useful measure of total energy consumed.

kWh/{time} is a useless and confusing unit. So yes, I think we all agree kWh/s is a bad unit, but kWh (or GWh or similar) are all useful units.


If you want human-scale, then use calories :)


Which one? Scientific calories, or nutrition calories?


"I went 50 miles an hour for 45 minutes."

This conveys useful additional information.


Saying "I'm using 40 kWh per minute" does not imply anything about how much I'm using in total, though, only a rate.


Just like 40 Mph doesn’t tell you how far you have traveled. But if you provide a duration as well then you do know.


Disagree. kWh is a unit most households would be familiar with. I bet the average budget-conscious Joe knows their price per kWh and how many kWh a washing machine run would use. So for a lay audience it is a good term.


People pay for electricity by the kWh, so kWh/minute is easily mentally convertible to $/minute. Watts -> $/minute requires multiple conversions.


I pay for my electricity once per month, not per minute.


Or, roughly 5x the power it takes to send a DeLorean through time


I know the analogy of the day is AI, but I’ll make the case for cryptocurrency as the better analogy. I think everybody sees potential use for AI - probably more than it can actually do.

The technology that I hear being called “useless”, “pure speculative hype” etc is crypto and defi. Maybe today, because of lack of infrastructure, network effects, productized apps, etc it’s not as useful as traditional banking and fiat currency, but the reality is that there’s a future where we don’t need banks and nationalized currencies, and that is an enormous value-add for society as a whole. It may not happen today, 10 or even 100 years from now, but we will look back and find the idea that people had to be convinced of this absurd.


This isn't an argument. By this logic, you could just as easily say that in the future, people will look back and scoff at the amount of time and energy wasted chasing a form of decentralized currency rife with problems that would never be solved at scale. You can't presuppose a conclusion and then extrapolate from it.


It’s true, all I’m doing is speculating, but that’s all anybody is doing here. This article didn’t make it to the front page because it’s rich with content about how electricity companies ran a successful marketing campaign to evangelize the use cases for electricity. It’s basically a newspaper clipping and a few paragraphs of summary.

It’s here because people are projecting their feelings about today’s disruptive technologies onto this example about electricity, with the implication that people will one day look back and wonder why anyone doubted <technology X>.

Here’s what I’m saying: AI is a part of our future, and very few people doubt that. AR/metaverse may not be perfect today but I think most people would agree it’s bound to be part of our future. Green tech stuff like electric cars, renewable energy, etc most people agree are part of the future and won’t argue with you.

But for some reason, when you so much as whisper “crypto” on HN, people shout you down with rude and dismissive comments (not yours, but take a look around). I don’t work in crypto, I don’t have substantial holdings in crypto, I’m not shilling.

I’m just saying that as a technology, it feels the most misunderstood despite its (to me) obvious transformative potential, which I think makes it the best analogy for electricity in the early 1900s of the available options. Does that argument hinge on the assumption that crypto is an inevitable part of future life? Yes - but so does every other argument here.


Crypto doesn't harm anyone who doesn't choose to buy into the get rich quick schemes. If some people somewhere trade crypto between them, it's none of your business.

AI, on the other hand, harms millions of people right now - by government surveillance, face recognition, spam bots and SEO, deep fakes, exam cheating, job losses, killer drones etc., with almost no upside for the little guy.


It's the other way round: Crypto mining has wasted an awesome amount of resources and is still ongoing despite energy shortages, mass extinctions, air pollution from coal plants and global warming.


AI also uses a lot of energy, for no benefit. And traditional money system wastes even more energy and other resources, and employs millions of clerks doing meaningless work.


You don't see any benefits of AI?


While I think that much of what crypto bros did is quite the waste of time and money, I believe that it has potential to be revolutionary.

If something really different has to start these days, centralized services are an easy target for the powers that be. The only way to circumvent them is truly decentralised systems.


Remind me of the saying: "They laughed at Galileo; but they also laughed at Bozo the Clown."

NFT/cryptos are the Bozo the Clown of technological innovations.


The thing is - AI is a very apt analogy today. People outside the tech sphere often think it's a toy, but don't see the real productive uses of LLMs, for example.

In contrast, today's version of crypto has had its popular moment. Like the dirigible, it got a lot of mainstream coverage as a "promising" "revolutionary" technology, and it has made its millionaires and billionaires. Like the dirigible, it has been found wanting. There is some chance that the future will involve CBDCs, but I think most people agree that the ship has sailed on the Bitcoin-Ethereum-NFT-based "metaverse" that crypto entrepreneurs wanted to create.


There are billboards everywhere touting AI. My mom talks about it.


I see you live in the San Francisco Bay Area. If you go outside that little bubble, you won't see very many billboards advertising anything other than personal injury lawyers, restaurants, and casinos.


A friend’s sister came into town from Nashville, and I had the pleasure of explaining what ChatGPT was to her.


Useful electricity was not born suddenly into a world that had never heard of electricity. People had been playing with electrical toys and scientific equipment for generations before advances made industrial electricity possible. Electricity may have earned any number of different cultural reputations for its associations with aristocrats, magicians and quacks.

Just yesterday I rewatched James Burke’s Connections, episode 3, Distant Voices, which vividly illustrates some of the ways people tried using electricity.


Early electric service was less reliable than gas. My house had mixed gas lighting and electric, and there were even fixtures that were both. Not sure how long this transitional period lasted.

Adoption of phone service was even slower. First and second generation systems were pretty crappy by today’s expectations.


When I was in grad school, my cheap old rental house had light fixtures that allowed you to choose between electric and gas in one fixture. The gas pipe had long ago been disconnected, but the electric bulb sockets were stamped "Edison Patent."

What I imagine was that electric lighting could have started out in commercial or municipal use, and spread out into the general population as it got cheaper and more reliable. The same thing happened with cell phones and the Internet.


Also think of the process of wiring a house, and, because you're well enough off to do it in the first place, of having it done so it looks nice. I don't think that was exactly a cheap process back then. It still isn't. The amount of cabling even for basic lighting is not that small.


Indeed, and it's something that greatly benefits from being done before the house is finished.

When I lived in Texas during a housing boom, we had an electrician on call for our factory. He told me that he would often stop at a construction site on his way home in the evening, and quickly make some extra money by completely wiring a house.


"...a brand new power generation facility that could generate 770,000 kilowatt-hours" - what does this even mean? Did the facility produce a certain amount of electricity and then it had to be shut down, after it had produced 770MWh ? Can't produce any more kWh so just fire everyone and demolish the facility?


This is an interestingly common units issue. I can understand confusing bits per second and bytes per second - most of the time the capitalization of the units isn’t important. But no one confuses miles and miles per hour!


I bet they forgot to include that it can generate that 770MWh every hour.


770 MWh every hour is just 770 MW.

The hours cancel out!


But then the next sentence goes like this: "For reference, the city of New York City now uses about 100,000 kilowatt-hours per minute."

So...


minutes and hours still cancel out, you just have to do a little bit more arithmetic in the process.


I forgot for a moment where I was going with this, but now back on track: if they wanted to make the two values comparable, Edison's plant should also produce 770 MWh every minute.


> if they wanted to make the two values comparable

When was the last time you saw a journalism piece try to make units comparable?

That number can mean absolutely anything; there is no telling what people were thinking at each step of the telephone game from transcribing the source all the way into a finished and edited piece.


You are not wrong, and now I'm more confused. Unfortunately the linked report 404s, but an old copy was available through the Wayback Machine (it is exploring market needs wrt. photovoltaic systems in NYC). The introduction states that the city's total electrical consumption in 2015 was 52836 GWh.

Math time: (52836 × 1000 × 1000) / (365 × 24 × 60) = 100525 kWh of energy consumed in a minute. So that checks out.

On the other end of the comparison, by the early 1900s AC had largely won and plants were appearing left and right like flowers in a field. I can't find the exact station or its capabilities just by searching for the 1920 date.

Edison's first commercial station in Pearl Street from 1882 (still DC, I think) had 6 dynamos producing 100 kW of power each: https://en.wikipedia.org/wiki/Pearl_Street_Station Which is... let's see... 600 kWh every hour! :) Or 10 kWh per minute.

If the author is implying that Edison's plant produced only about 7.7 times present-day NYC's per-minute consumption (i.e. enough to power today's city for a mere seven minutes or so), the comparison doesn't seem quite right. 770,000 kWh in an hour is about 12,833 kWh per minute, in which case you would need roughly eight such plants running flat out to match today's demand.

(I divided so many numbers in this comment, I sincerely hope that I did them right)
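
For what it's worth, here is a minimal sketch re-running those divisions (the 1920 plant's 770,000 kWh is read as per hour, the generous interpretation):

    nyc_2015_gwh = 52_836
    minutes_per_year = 365 * 24 * 60

    nyc_kwh_per_min = nyc_2015_gwh * 1_000_000 / minutes_per_year   # ~100,525
    plant_1920_kwh_per_min = 770_000 / 60                           # ~12,833
    pearl_st_kwh_per_min = 600 / 60                                 # 6 x 100 kW dynamos -> 10

    print(nyc_kwh_per_min / plant_1920_kwh_per_min)   # ~7.8: roughly eight 1920-era plants
    print(nyc_kwh_per_min / pearl_st_kwh_per_min)     # ~10,052: about 10,000 Pearl Streets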


I think the 2015 electricity consumption was about 10,000x what the 600 kW Edison plant could generate?

This is a great example of how things get simpler if we drop the over time part of the units and simplify it to just the average power draw.

So in 2015 NYC consumed 52,836 GWh. The average power draw is therefore 52,836 GWh / 8,760 hours ≈ 6,031,500 kW. As in, at any given moment in 2015, NYC was on average pulling about 6,031,500 kW, or 6.03 GW.

The Edison Pearl Street station could output 600 kW (and that's the theoretical peak of all six dynamos; probably less output in practice).

6,031,500 kW / 600 kW ≈ 10,052.5, so I think our current consumption is about 10,000x higher than the Pearl St station's output, not 7x-10x higher!


These are two different power plants. I could not find more info about the one that opened in 1920 and is actually in the article.


And 100,000kWh / minute is just 6 gigawatts or 6,000,000 kilowatts. Google is great at unit math like this: https://www.google.com/search?q=100000+kWh+%2F+1+minute+in+G...

Journalists consistently use silly or incorrect units when discussing power usage. At least for this article the units aren't flat out wrong, just silly, and I can see how "kilowatt-hours per minute" could be a bit more intuitive to readers.

(And don't get me started on how USB battery manufacturers advertise capacity in obtuse units like 27000 mAh @ 3.7 volts instead of just using 99.9 watt-hours or 27 amp-hours.)
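
That battery-pack conversion, as a minimal sketch:

    mah = 27_000
    nominal_cell_voltage = 3.7      # volts, the usual Li-ion nominal figure

    amp_hours = mah / 1_000         # 27.0 Ah
    watt_hours = amp_hours * nominal_cell_voltage
    print(amp_hours, watt_hours)    # 27.0 Ah, 99.9 Wh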


That's the joke


Usually electric generation facilities or devices are described by what they can handle at their peak. I feel as if this vulgarization of units really makes it harder to understand the intangible nature of electrical demand, which is ephemeral by nature.

Most of the time we see "watts" it really means "watt-hours", which measures work. We're used to flattening rate measurements by quoting instantaneous points, like the speedometer on your car: e.g. if you are going 60 miles per hour, you can expect to travel 60 miles in one hour if you maintain that rate. However, a 60 watt appliance will consume that 60 watts over 1 hour of use, which is like saying we are going "60 miles" in the example above.


What? No. A 60 watt appliance will consume 60 watts for however long it is on. If it is on for 1 hour then it will have consumed 60 watthours!


Edison used a clever adoption hack which is still worth learning from today.

The gas companies of course fought the adoption of electricity. They would send people to electricity demonstrations. These people would have metal bars sewn into their sleeves and would lean them against the bus bars, causing frightening sparks, fires, etc.

In response, Edison suggested putting the wires in “conduits” — the gas lines already in the house! This both addressed the fear and kicked gas out.

(This anecdote is in Utterback's "Mastering the Dynamics of Innovation".)


It probably wasn’t that useful on day one. Took a while to get a lot of products out there that required electricity.


And there were likely multiple competing and incompatible formats in the early days. Utility is greatly reduced when some of the stuff you want won't work with other stuff.


Yes, that I recall reading about, there were even competing DC grids and AC grids. I think Edison's was DC.


Edison's was DC. He hated AC and did public demonstrations of killing animals with AC just to "prove" how dangerous it was. The Current Wars were a very interesting blip in history, and ultimately Tesla and Westinghouse won with AC because of its ability to travel over longer distances with minimal loss, the fact that minor variations in frequency can be used to gauge the current load/generation balance (allowing power plants to respond to load changes), and the fact that its voltage can easily be stepped up or down by passive devices (transformers).


That's exactly it, thanks for providing the missing context. As a youngster, I was a big fan of Edison and read a lot about him back then.


I mean, lighting was kind of a killer app


We still refer to the electricity bill as the “light bill” in Brazil.


I'm an American, and this is what my mother calls it, though I call it the electric bill.


Same in Poland


and electric motors.


If I think of everything I use electricity for these days, lighting* is pretty far down the list of usefulness. The sun can fill 80% of my lighting needs; giving up on the remaining 20% really doesn't seem like it would be that painful.

I actually thought the list of items in the ad in the article was far more convincing than lighting as a use case.

Today my list of more important use cases would include things like long distance communication, refrigeration, transportation**, cooking/manufacturing, computing, data storage, small cameras, medical uses***...

* except in so far as I use electricity to make light for information-transfer purposes. Something like e-ink would be an adequate substitute though.

** ICE engines do fill a lot of this niche, but not all of it. Subways, elevators, and the like are made much better via electricity.

*** Imaging devices especially come to mind


Where do you live? For half the year, we have less than 12 hours of sunlight.

Prior to electricity, humanity has spent a lot of time and effort to have light past sundown.


I think you massively underestimate how annoying not being able to see after dark is.


My great-grandfather, when he built his suburban house after emigrating to the US from Scotland via Canada, included gas lines in the walls for gaslights in spite of the easy availability of electricity. Just in case.


This classic TED talk by Jeff Bezos is a good one that references electricity, if you haven't watched it before:

https://youtu.be/vMKNUylmanQ


Never seen this one before. Thanks for linking to it!


Sure? Electricity is about means, not ends. There are many alternative ways to generate force, or lighting, or heating, or cooling, or food supplies. Where electricity won out, it was a gradual process of being superior in the particular application and a long process of gradual production refinement to deliver on the theoretical promise. Plus we all have to accept that new tech doesn't replace every single use case of the old tech, and that's OK. BBQ grills are fun in a way that electric stoves or microwaves are not.


People Had to Be Convinced of the Usefulness of Cars

People Had to Be Convinced of the Usefulness of Computers

People Had to Be Convinced of the Usefulness of the Internet

People Had to Be Convinced of the Usefulness of LLMs


The thing that's missing from the list are the tools and inventions which people had to be convinced of the usefulness of and which ended up being useless.

"This happened for X so it will happen for Y" isn't an argument on its own - you either need to make a connection between X and Y, or say something fundamental about Y which makes the statement true.


People Had to Be Convinced of the Usefulness of Programmatic Decentralized Public Chains (aka Crypto)


For the most part, they still do.


People Had to Be Convinced of the Usefulness of Flying Cars

If they are not convinced, the thing doesn't happen?


Yeah it reminded me of a 1980s Saturday Morning Cartoon PSA I saw as a kid evidently called the Computer Critters that ran on ABC. Basically a bunch of attempts to convince people they needed to get a computer at home. https://youtube.com/watch?v=9rDIPyVqbHs


Not all people. There is a division here: on one side, people able to rationally see the consequences of a new technology.

On the other, people who have to bend their perception of reality to protect a vested interest. For software engineers, the skills and attributes we take pride in are our software craft and our intelligence, so it's normal that the attacks on AI are especially vicious on HN.

I'd say the division is about 50/50.

GPT-4 will not replace us. But it's a herald for something that will. That is reality.


> But it's a herald for something that will. That is reality.

This is mostly what the people described as "rational" say all the time, but there's no rationality or even a deep conversation about what to do if this happens. I can flip a coin and half of the time it will tell me that it's the end! Both of these camps are arguing like political sides, and the conversation is usually a repeated instance of some beliefs on both sides.

I believe, instead, we can talk about the practical ways we can deal with this change. For example, we can start by looking at what other fields that got automated did. Unions? Regulations? Wild west? Free market capitalism? Monopolies? What? Or we can discuss how to take advantage of this new change. Just sayin…


How can we talk about what you're suggesting when half of the people don't even believe such a change will ever happen?


People who believe such a change will happen are in a position to lead the conversation, in my opinion. This is not the first impactful change in history. It's not even clear if it's the biggest one, and as with any other change, people who have a rational understanding of it are in a better position to propose solutions to the problems it brings.

There is another problem though, which is very important to note. As much as this change is impactful, there is so much nonsense and bullsh*t going around it because some people are financially invested. Sometimes people make statements without revealing their true intentions. I imagine that a person who is right now integrating ChatGPT into something and dreaming about getting rich fast is not going to believe whatever cautionary tale others tell. This specific aspect of the current hype, unlike the actual product, is dramatically similar to the crypto hype. It doesn't help either.

Edit: fix typo


Well, it's obvious that people back then were simply too short-sighted to comprehend the revolutionary potential of electricity. As usual, it took the genius of a few forward-thinking entrepreneurs and engineers to drag society, kicking and screaming, into the modern age.

Looking back, the skepticism seems laughable, but it underscores a recurring theme throughout history: people's irrational resistance to change, even when the benefits are as clear as day. It takes those with true vision to push past the naysayers and bring transformative technologies into the mainstream.

I mean, just look at the internet or smartphones. The masses were skeptical, but visionaries like Steve Jobs and Tim Berners-Lee paved the way for the indispensable tech we enjoy today. It's almost too easy to predict that many of the technologies we doubt today will become cornerstones of tomorrow's society. But, of course, only a select few have the foresight to see that.


It's ok if people want to be behind the curve. No skin off my back if they want to delay their personal use of a transformative technology. Progress will happen with or without them.


people had to be convinced that the investment was worth it.

In more recent examples, people didn't have to be convinced that mobile phones were useful; they had to be convinced that it was worth paying €100 upfront plus €30/month to use one of these.

One area where the usefulness of the technology vs the investment is abundantly clear is EVs. Everybody knows they're better for the environment, way cheaper to recharge, less prone to vibration, noiseless and gearless. But they're almost three times as expensive as non-EVs!


And one of the first things they did with it was use it as an inefficient means of execution.

Marvin the Paranoid Android: "Humans; you've just got to hate them"


Curious how the blogs and aggregators of yesteryear referred to the proponents of this new technology: was it AC-bros or DC-bros?


FYI the author of the article is Rose Eveleth and she did the excellent "Flash Forward" podcast. She's recently wound it up but I'd recommend the old episodes (probably don't start with the final season though).

https://roseveleth.com/


"Communism is Soviet power plus electrification of the whole country!" - V Lenin

Russian Joke: "Consequently, Soviet power is communism minus electrification, and electrification is communism minus Soviet power."


Rightfully so. If you want people to get hyped about a thing, explaining what the thing can do for them should be an obvious necessity. But I guess that's an outdated mode of thinking; modern advertising campaigns rely more on emotional manipulation than on a rational exposition of product features and benefits. Instead of promoting electricity by showing people light bulbs and electric appliances, I expect a modern advertiser would instead tell you that popular people all like electricity and that if you like electricity too, you might also become popular. Instead of showing people electric lights, you could just show some young attractive models having a picnic in a lush city park, with a narrator saying something about 'trailblazers and innovators', maybe referencing famous popular figures like Gandhi for no apparent reason.


> explaining what the thing can do for them should be an obvious necessity.

Obvious?? You are taking too much of the American way of life and values for granted. The Party can just order something "progressive" and the people will jump with enthusiasm. You just need to train them that not jumping with enthusiasm when the Party orders something is dangerous for their career or, more effectively, for their life.

At the same time, in 1920, on our side of the ocean, there was the State Plan for the Electrification of Russia: https://en.wikipedia.org/wiki/GOELRO.

Communism is Soviet power plus the electrification of the whole country.

— Vladimir Lenin

Works like a charm. The only drawback of this approach is that you'll get not the American way of life but the Soviet way of life, and "progressive" will mean not what is really progressive but what the Party orders. But well, is that really important?


COMMIES!!! COMMIES EVERYWHERE!!!

Seriously, is it 1953 where some of you live?


My comment describes the presence and practices of communists in Russia in 1920.

If something after reading it makes you think that communists are somewhere else, this "something" is not my comment. I can only guess, but maybe this "something" is your own lived experience. No?


Demonstration > Explanation.

If you can't do that for one reason or another, you use Marketing.


There are plenty of products where people choose marketing even though a demo and an explanation are possible. Most, in fact.


Crypto dudes be like "this proves that Web 3 etc etc etc"


It obviously doesn't prove anything.

However: History doesn't repeat, but it rhymes


"It rhymes with this specific instance of an event in history I cherry-picked to suit my narrative"


“To the electron: May it never be of use to anyone” ― J.J. Thomson


Even a new consumer of heroin needs to be sold it the first time


Somehow this doesn't surprise me.

And if I lived then, I'd be the crazy one talking about all the possibilities while people look at me dumbfounded. Same as today. >..<


Well, people have to be convinced of the usefulness of anything. You can't just approach a person with "trust me, it's good for you".


This is nothing new. Cue the old joke about customers wanting faster horses back when cars were new.


Thankfully no one followed up with the potential risks and costs. So a century later, when the planet is baking and sinking, no one could possibly imagine giving up the new "necessity."

Thank you, marketing & public relations.

Now to do it all again, with AI!


Electric cars are also viewed this way.


ITT: the libertarians of HN, having just gotten their bailouts, pivot back to their bootstraps narratives and objectivism.


Note to crypto bros: This doesn't apply to cryptocurrency or blockchains. If cryptocurrency or blockchains could be more useful for most people than what already exists (fiat currency, traditional banks and payment systems), we would know it by now.


HN community member response: Given that Ethereum launched in July 2015 and prior to that, no programmatic blockchain existed, on what basis did you select your parameter of 7.75 years as the amount of time necessary to elapse before we're sure no valid at-scale use cases exist?

Crypto bro response: loaning my USDC on Notional for a fixed rate of 4.6% and then bridging it to Arbitrum Nova to then send my friends and family interest-bearing US dollar payments for zero transaction fees seems pretty f'ing useful to me.

Degen response: lol ok bro gl with that


> loaning my USDC on Notional for a fixed rate of 4.6% and then bridging it to Arbitrum Nova to then send my friends and family interest-bearing US dollar payments for zero transaction fees seems pretty f'ing useful to me.

/r/ThatHappened.

Nobody does that.


post from 1965

Note to AI dorks: This doesn’t apply to neural networks. If neural nets could be more useful for most applications than what already exists (expert systems, human intervention), we would know it by now.


I stand by what I said.


Funny enough, with the current financial meltdown, we might find out very soon if BTC will live up to a large part of its original intended purpose.


As long as BTC has tons of "price action," it can't live up to the dream of a stable non-central-bank-backed currency. It will continue to be a speculative asset. At this point, the only actual "inflation hedge" in the cryptocurrency space seems to be Monero. Everything else has "price action" like levered NASDAQ.


I hear you and agree that it has always been speculative, and has been trading in sympathy/speculative bubble with US tech for quite some time.

However, it did de-couple from the NASDAQ after the SVB dust began to settle.


Good that you are able to declare that unilaterally.


I had to get it out of the way because crypto bros rarely miss a chance of interjecting their fad into any conversation.

I realize the irony of my comment.

I embrace that irony.



