A well-functioning organization would not devalue people who are more judicious about their use of time, preferring productivity over unnecessary socializing.
But while that does not describe a well-functioning organization, it's definitely true in practice. People who buck the silly social dynamics of office cultures will be perceived as less productive, whether it's true or not, and are frequently devalued.
A knee-jerk response to what I just wrote, of course, will be that maybe those people just can't see the real value of all these allegedly silly office rituals. But before you jump to that conclusion, consider that it's at least equally likely that the people perpetuating the rituals are overvaluing them.
The point is all of these social dynamics and office rituals should be open to being reexamined every so often to see if they're truly adding the value people think they're adding so they don't devolve into rituals people do because they're rituals. Keep the good ones, ditch the useless ones, and be proactive about objectively evaluating which are which.
The organization doesn't devalue people. Other individuals feel put off and alienated by people who act the way the author of the article describes. Like it or not, personal relationships matter, reducing friction matters, and small talk and the apparently wasteful social rituals can add to team and organization cohesion. Lone wolves, high-performing or not, get perceived as not being team players, as unwilling to help others even with small things, and as hostile to routine human interaction.
Some workplaces go too far in one direction or another. I would prefer working in a more casual and friendly environment, even if that meant engaging in idle chit-chat and signing birthday cards, rather than a workplace where everyone had to shut up and pretend to optimize their performance. In my long career I have always found jobs and freelance work through friends and former work colleagues, and a big part of that comes down to them perceiving me as someone they enjoyed working with and hanging out with, not just someone who optimized my productivity and told them to buzz off because I had to write more code.
> People who buck the silly social dynamics in office cultures will be perceived as less productive whether it's true or not and are frequently devalued.
Younger, I would have agreed with your sentiment. Now, I appreciate good coworkers. If I don't have a socialisation outlet during the day, it's just draining and I burn out faster. If you're a person that is just a grumpy Gus isolated in their cubicle, you can make your team less effective and undermine the team spirit.
This is where I feel like management fails. To build a team you need to really pick personalities that work well together, and honing and tuning the group composition is something that managers can do. Put the introverts together. Put night owls together. Parents are more understanding of taking something over to cover for someone, because they need it sometimes too.
At a certain point it just doesn't make sense to over-optimize for being highly interruptible, when the sacrifice (productivity) isn't worth the gain (satisfy outdated notions of office etiquette designed by extroverts who want to vampire other people's attention unnecessarily).
While I am very enthusiastic about reforming our system to a single payer system, one unique fact about the US that a lot of people on my side of the issue need to understand better is that making our healthcare system more socialized will almost certainly not bring down costs. They may even go up.
One of the big factors driving the lower costs of healthcare in other countries isn't the cost savings from socializing it, it's the fact that the US essentially subsidizes the entire drug research industry and the rest of the world doesn't have to pay for that like we do.
So even with single payer healthcare, we're still gonna have to be the nation that shovels all the money into the drug industry. I don't personally see a problem with that per se. We are a rich country. We should pay the lion's share. But the cost shouldn't be so directly passed on to consumers in the form of insane drug prices. Switch to a prize system in exchange for taking stuff off patent or something, e.g. the federal government buying the patent whenever something new comes out for an extremely high one-time price, then letting the generics market go nuts with it. That would bring down consumer prices without eviscerating drug company profit margins and destroying their incentive to innovate.
> the US essentially subsidizes the entire drug research industry
This isn’t entirely true. The US does pay more for drugs, but a lot of that money isn’t spent on research. In fact, pharmaceutical companies spend far more on advertising than on research.
It's true that the US pours a lot more money into drug R&D, and I understand that a lot of the push for more global and stronger patent law comes from pharmaceuticals. However, when medicines change owners and then get 10x price hikes years after being developed (like EpiPens, or generics that get a tweak plus a new patent), the problem is not just recouping the R&D.
The more serious problem is that reforming the machine that is the US health care industry is going to cost a LOT of jobs. That is going to make any kind of meaningful reform very difficult.
The small tweak = new patent loophole is easy to solve with patent abuse reform. The sting of the reduced profit margin by stopping patent abuse can be offset by making prizes for making stuff generic bigger than the profits they would get from patent abuse.
The job loss from transition to single payer is a tougher problem, but I would rather just rip that bandaid off than keep a lot of unnecessary insurance jobs around.
No. The largest cost of healthcare in the US is all the admin associated with it.
---
OECD Health Statistics data show the U.S. spent $1,055 per person on “governance and health system financing administration” in 2020, compared with the OECD12 average of $193 per person.
A 2021 study by McKinsey estimates hospital administrative costs at $250 billion and clinical services administrative costs at $205 billion, representing 21 percent and 27 percent respectively of 2019 NHE spending in these settings. A 2014 study by Himmelstein and colleagues comparing hospital administrative costs for the U.S. and five comparator countries found that the other countries spent 42 percent less than the U.S. on hospital administration.
The per-capita health care cost for the USA is estimated at around $11,590 in 2019, per the CDC.
Your $1,055 in administrative costs comes nowhere close to supporting "The largest cost of healthcare in the US is all the admin associated with it." In fact, it is less than 10% of that per-capita spending.
Salary and Wages for Physicians and Nurses: About 15 Percent
Prescription Drugs: About 10 Percent
Medical Machinery and Equipment: Less Than 5 Percent
-
Administrative Costs of Insurance: $1,055 per person on “governance and health system financing administration” in 2020, compared with the OECD12 average of $193 per person.
Administrative Costs to Providers: A 2021 study by McKinsey estimates hospital administrative costs at $250 billion and clinical services administrative costs at $205 billion, representing 21 percent and 27 percent respectively of 2019 NHE spending in these settings [aka 48% in sum, troupo]
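For what it's worth, the sub-10% claim can be checked directly from the figures quoted in this thread (a quick sketch, using only the two per-capita numbers above):

```javascript
// Quick arithmetic check on the figures quoted above: $1,055 per person in
// "governance and health system financing administration" (2020) against the
// ~$11,590 per-capita US health spend (2019). The years differ slightly, so
// treat this as a rough ratio, not a precise statistic.
const adminPerCapita = 1055;
const totalPerCapita = 11590;
const share = adminPerCapita / totalPerCapita;
console.log(`${(share * 100).toFixed(1)}%`); // prints "9.1%"
```

Note the provider-side McKinsey percentages are shares of spending "in these settings," i.e. different bases, so they aren't directly comparable to this per-capita ratio.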
Profits are a strong incentive to innovate. We should subsidize those profits but not do it the stupidest possible way by making consumers pay insane drug prices.
All you need to do is look at the profits (and running costs) of the health insurance industry to see where a lot of Americans' health spend is leaking to, completely ignoring the fact that the government is already spending more! Not to mention Americans still have to pay out-of-pocket costs over and above what insurance covers. You could still fund the drug industry out of that spend and make massive savings. Better yet, you could make it so no one's incentivized to push drugs other than those that are demonstrably effective (given an imperfect world where you can't 100% guarantee such things).
I have looked and I don't buy it. I think any realistic transition to a single payer healthcare system in the United States is going to leave us with the most expensive health care system in the world, for lots of complex downstream reasons. But that doesn't mean we shouldn't do it. It means we should embrace paying the most to get the highest quality product instead of pretending the goal should be to somehow be less expensive than everyone else while still getting the best care. It's unrealistic. You get what you pay for.
IMHO that's almost a meme: the US spending is paying for all the research other countries benefit from. It's usually used to defend the US system against criticism. The argument, by the way, isn't true.
Kind of like the argument that Europe is benefiting from US military spending and would be overrun by the Russians if it weren't for the US...
And give up membership in the most powerful military alliance in human history, reduce US international influence, and give up the most lucrative defence market in the world? This idea is so sound that even the GOP in Congress voted for a law preventing the POTUS from pulling the US out of NATO on his own.
I do agree, though, that Europe should be less dependent on the US; we should have a very solid and competitive defence sector supplying the various European armed forces.
Being a NATO member is not being a client state... And the thing I support is a stronger EU defence industry, not the US leaving NATO or whatever you read into my comment.
And yes, the idea of the US leaving NATO is so bad it is only laughable; the only ones happy about it would be Russia and China. US influence is not limited to Europe, though; NATO activities stretch to Afghanistan (past tense), Iraq (same), Ukraine, Africa, the Balkans...
As bad as NATO intervention was during the war-on-terror period, and boy was it bad, the alternative would be either Chinese or Russian dominance in those regions. And that would be even worse. NATO, human rights and all that is a different topic, though.
And where is Europe dependent on the US for defence? NATO is an alliance, one that served, and serves, each member state quite well... Sometimes ignorant American exceptionalism is tiresome...
Clearest example is Iceland that doesn't even have its own military.
If NATO is such an equal alliance, why is the US the only "partner" that has (a lot of) military bases, operating under their own laws, in other "partner" countries?
Edit: And my point of view definitely isn't from American exceptionalism. I oppose my country's (Finland's) NATO membership and watch in horror how there's now going to be 12 bases in Finland that are essentially under pure US control (Finnish laws don't apply there, or even to the US personnel outside the areas, and Finland has no say in US military coming and going as they like).
Iceland as a nation has fewer residents than the mid-sized German town I live in; guess what, my home town doesn't have a military of its own either. What Iceland has is strategic importance, just google SOSUS.
Agree on the bases thing; some of the CIA's kidnapping flights went through Ramstein in Germany and nobody did anything.
Not much I can say about Finland's NATO membership, I am actually neutral on that. Or rather, I fall into the group of people who see NATO expansion into former Warsaw Pact countries as something Russia could be less than thrilled about. That being said, only the future will tell if NATO membership is a good thing for Sweden, Finland and co.
Being a NATO member definitely doesn't make a nation a US vassal, the same way EU membership doesn't make a nation a vassal of Brussels. And yes, I know opinions differ on that one as well.
Of course what is meant by being a vassal is up to definition and degree. But a foreign standing army probably is somewhere on that spectrum.
Brussels at least has the pretence of the member citizens having (some) democratic control, although I find the EU so antidemocratic by design that this is indeed mostly a pretence. And the EU, for example, sets quite tight limits on how member countries can structure their economies.
> Edit: And my point of view definitely isn't from American exceptionalism. I oppose my country's (Finland's) NATO membership and watch in horror how there's now going to be 12 bases in Finland that are essentially under pure US control (Finnish laws don't apply there, or even to the US personnel outside the areas, and Finland has no say in US military coming and going as they like).
That's a very sensationalistic take. No military bases will be established. The agreement between Finland and the US is about prepositioned stockpiles for use in case of war. Since the stockpiles belong to the US government, it is natural that they demand unrestricted access to warehouses holding their stuff.
There will be areas which are under the rule of US military and can house weapons and troops at their will. What more would be needed for those to be characterized as bases?
Also, a foreign country positioning weapon stockpiles is a bit questionable for sovereignty in itself, and I find it quite wild to use that as a rationale for giving up the country's rule of law in those areas. Do you think the USA would be happy to have Finnish military areas within their borders just because there happened to be Finnish weapons there?
When someone says "military base", then most people imagine barracks full of life, tanks and IFVs being worked on in garages, people coming and going, groups of soldiers doing their PT in the background like in establishing shots of Hollywood movies. Guarded warehouses in the middle of nowhere are none of that.
And defense agreements that establish exceptions from local laws are nothing out of ordinary either. For example, the agreement between Finland and Sweden stipulates that visiting forces are excluded from customs procedures related to weapons, explosives and other dangerous goods.
The Finnish-American one is much more detailed and goes into weeds like excluding vehicles transported by the US into Finland from car tax and VAT. :) It's common sense, but countries that are ruled by law must have these things written down.
Guarded warehouses in the middle of nowhere give at least as wrong an image as "military base" does. The areas are mostly next to Finnish military bases, and the USA will be taking over some Finnish military infrastructure. And the DCA allows for permanent US troops there.
My hunch is that the word "base" is avoided because the public opinion on even NATO bases is split at best, and I'm quite sure US bases are significantly less popular.
You'd be surprised how detachments from NATO militaries are treated at US bases. Pretty much the way NATO countries treat US bases and detachments. Obviously there are more US bases abroad than non-US ones in the states, but still.
US military dominance (in the form of NATO in Europe) isn't so much about defending the countries per se as about defending its interests in those countries by low-key taking over their armies. You have a country quite well by the balls if their defence depends on you.
It takes a lot of money to bring a drug to market. Those costs have to be recovered and there has to be a net profit or no one would do it.
Pharma companies charge different prices for drugs in different markets. Markets with single payer systems usually restrict expensive drugs (either not permitting them or restricting their use to fewer cases) and/or cap prices. Some countries don’t honor pharma patents. Together these controls may make the drug unprofitable in many markets. Someone has to pay full price to make the drug research net profitable, or the pharma companies reduce research into stuff they lose money on.
It happens that the US market is favorable for pharma companies to recoup R&D costs by charging more than most anywhere else. This is possible because of the US regulatory environment and heavy lobbying by pharma.
This is what is meant when someone says that the US “subsidizes” drug costs for the rest of the world.
Note that I am not saying this is a good system; I’m just attempting to describe it.
> It takes a lot of money to bring a drug to market. Those costs have to be recovered and there has to be a net profit
A friend of mine who works in this industry explained how they come up with drug pricing and whether they decided to bring a drug to market. It's pure capitalism, of course--and why shouldn't it be?
It's a bummer, though, to think of the drugs that could have really, really helped some people but weren't lucrative enough to bother selling. Oh well!
> or no one would do it.
On the other hand, that's sort of like saying "there has to be a net profit in going to the moon or no one would do it" or "there has to be a net profit in selling flood insurance or no one would do it." Neither of those nets a profit, but someone does it all the same.
There's a non-trivial amount of R&D done with public funds (universities and/or grants) that goes into medicines that end up locked behind patents and privately owned.
I lived overseas for work. I once went to the pharmacy and they said "4" -- and I was like "4 hundred?"
No, they meant 4. The same medicine I'd pay $25 copay (with insurance kicking in another $70 or $100) cost $4 out of pocket overseas.
This is because single payer systems overseas negotiate down prices. The US does. not. This effectively means that the US is subsidizing the drug industry.
Love that site. I used their page on Secret of Mana as a reference [1] during my work making ROM hacks for the game [2] and a large focus of my work was restoring content that was cut, including unused graphics, dialog, and even unused music. Fun project.
Attitudes like that are why so many "real applications" have things like progress bars assembled from divs and JS instead of using a progress element or show/hide toggles assembled from divs and JS instead of using details/summary elements. Turns out there are so many HTML elements for a reason.
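For reference, the two native elements mentioned above look like this (a minimal illustrative sketch; attribute values are made up):

```html
<!-- Native progress bar: no divs or JS needed for the basic case -->
<progress max="100" value="40">40%</progress>

<!-- Native show/hide toggle: open/close behavior comes free from the browser -->
<details>
  <summary>More options</summary>
  <p>Content revealed without any JavaScript.</p>
</details>
```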
Everyone likes the idea of keeping things simple and using web-native constructs. The problem is that web-native constructs can't do the things people need them to do.
Even the progress bar example illustrates this: yes, there is a minimal progress indicator element shipped in the browser, but it is completely useless for all but rudimentary cases.
I think what that person meant was if browser default styles made semantic HTML look more beautiful, it would probably reduce the incentive for lazy devs to make div soup.
Like imagine if every browser preloaded a dozen attractive classless CSS frameworks for users and/or devs to choose from sort of like CSS Zen Garden.
If all browsers had that, I think we'd get less div soup.
(2) There are relatively few native components, and the ones that exist are limited. Not even what jQuery UI gave you 15 years ago. No cards, accordions, avatars, or other sorts of basic building blocks.
(3) No real support for common page layouts, like a dashboard or a hero marketing page, that sort of thing.
The argument is we'd be better off if ES Modules were never proposed or implemented because the require syntax is better. By splitting the ecosystem, the developer experience has been made worse for no tangible benefit. Like this is not just the pain of transition from an old thing to a new thing, it's a straight up mistake and it should've never happened.
It seems obvious to me too that leaving the module system without an official JS standard spec would have been terrible for JS... but maybe not to OP? Or do they think "exactly what CommonJS currently is" could and should have been that standard?
(I'm guessing there are reasons people, at least at the time, thought CommonJS wouldn't have worked as the standard, or needed to be improved upon and it was worth it to do so?)
I wish the OP actually covered some of this stuff. Without it, not being an expert, I'm having trouble following what exactly their critique is, although I understand they wish they didn't have to deal with both CommonJS and ES Modules... but yeah, we clearly needed a JS standard, right? Or not?
I think any discussion of the comparisons between CommonJS and ES Modules can be tough because "everybody" has already forgotten (or never learned in the first place) the lessons of AMD modules (and UMD modules).
CommonJS cannot run in the browser unbundled. Eventually the Node ecosystem built a lot of good bundlers that faked it well enough to kill AMD (and UMD) modules. But before all of that, AMD (and UMD, which was basically CommonJS in an AMD wrapper that could itself run as CommonJS, exactly the turducken that sounds like as a design pattern) was built to solve real browser runtime problems that CommonJS ignored or never solved: browsers are inherently asynchronous (downloading a JS file from a URL takes wall clock time more often than not); browsers have no view of a server's file system beyond basic HTTP verbs on previously-formed URLs, and querying for files, even without downloading them, is still a slow discovery process; and so forth.
It was good riddance to AMD (and UMD) when bundlers killed them, but bundlers in turn gave too much of an impression that CommonJS was "sufficient" as people forgot the lessons of AMD modules. It kind of was sufficient, for a time, if you didn't mind browser code being trapped in bundlers and build processes for presumably the rest of time.
ESM took the lessons of AMD and applied nice syntax to them (writing AMD modules by hand was an awful experience, personally; it led me to adopt TypeScript 0.x in a past life, before TypeScript was even officially "stable"), with further advantages that even AMD didn't support: ESM is async-always (and tree-shakeable) and makes no assumptions about server file system trees.
Good riddance to AMD and UMD. I can't wait for CommonJS to go the way of that dodo and finally wish good riddance to "mandatory" bundlers for CommonJS dependencies, too. I'm glad so many frontend developers never experienced the pains of AMD that they've almost collectively lost the lessons of AMD already. I can't wait for CommonJS to similarly feel like a bad memory of an old nightmare.
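To make the contrast concrete, here's an illustrative sketch of the AMD callback style described above. The `define()` here is a tiny stand-in written for this example so the snippet runs on its own; the real RequireJS/AMD API did far more (async script loading, config, plugins):

```javascript
// Minimal stand-in for AMD's define(name, deps, factory), just to show the
// shape of the API: every dependency had to be threaded through a callback.
const registry = {};
function define(name, deps, factory) {
  registry[name] = factory(...deps.map((d) => registry[d]));
}

// AMD style: modules as named factories with explicit dependency arrays.
define('mathUtils', [], () => ({ add: (a, b) => a + b }));
define('main', ['mathUtils'], (mathUtils) => mathUtils.add(2, 3));

console.log(registry.main); // prints 5

// The ESM equivalent is declarative, statically analyzable, and async by
// design, with none of the boilerplate:
//   import { add } from './mathUtils.js';
//   export const result = add(2, 3);
```

Hand-writing that wrapper around every module (and keeping the dependency array in sync with the callback parameters) is the tedium the parent comment is describing.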