Microsoft researched what made employees happy (zdnet.com)
201 points by alok-g on July 10, 2022 | 171 comments



I think there is a major framing issue here.

Speaking for myself, and the many colleagues I’ve spoken to about this, collaboration isn’t the issue.

“Collaboration” is the issue.

I’ve had some of my most meaningful (and happiness-inducing) experiences at work when collaborating deeply with colleagues.

I’ve had some of the most frustrating experiences at work when “collaborating”.

The distinction simply being: “Collaboration” in the form of “interacting in realtime” or “existing in the same space” can also be 99% distraction.

“Real” collaboration - the kind where the collective consciousness of the group results in outcomes that could not have been achieved solo - is where the magic lies.

I’m absolutely happier with more autonomy and less distraction. But that is not the antithesis of collaboration.

A pendulum that swings too far in the other direction will leave folks feeling isolated, and will reduce the potential of a group. Finding the right balance and distinguishing between real collaborative interaction vs. “acting out” something that looks like collaboration on the surface is critical.


They probably could have just shortened it to autonomy. Collaboration is great when it’s not forced - hence autonomy.

I also like to describe autonomy as being treated like an adult. So much of what companies do now is infantilizing.


I believe this is key. Reading texts about the early Industrial Revolution, there is this concept of alienation, and I think it can be applied to a lot of jobs pretty well.

Having a stake in the result, and the freedom to choose which exact task to perform at a given time to accomplish it, seems more satisfying for an employee than a top-down approach.

At the time of the Industrial Revolution, the comparison was often between farming (where the farmer works hard but has some autonomy) and the factory, with its more specialized and repetitive tasks.


That doesn't really capture the nuance either. Autonomy can be isolating too. Collaboration is something your team needs to at least enable if not explicitly foster.


The article links to another article by the same author, about a restaurant whose secret to keeping staff for 20-30+ years is autonomy:

https://www.zdnet.com/article/a-restaurant-owner-gives-tech-...


> infantilizing

Curious to understand this better. Could you give some examples?


Not the OP, but I have a few from my workplace. My favorite example was being reminded to wear sunscreen, because it is summer, by our "safety coordinator" at an all-hands meeting. Or the "ladder training", where I was taught to use a ladder. The cause for that is a "zero accident policy", which is perceived as a good thing - and how could it not be? Accidents are bad. Never mind the absurdity of a zero accident policy: criticizing a policy like that very quickly puts a "reckless and dangerous" stamp on you, as the only reason accidents do not happen is the policy.

The policy leads to reams of paperwork being filled out, which creates a need for more administrative personnel, and as those people get paid to do that, it is in their best interest that it stays that way. So if you think about lifting that 50 lbs box as a healthy adult in prime physical condition, you had better ask your safety coordinator first, because if anyone sees you, you will get a slap on the wrist.

My theory is that at the heart of what I am describing is fear. As long as the company can demonstrate that the employee was instructed to play it safe, they have the upper hand should it come to a lawsuit. And I have a hunch that it also has to do with philosophy. Some people believe that the world can be a safe place for everyone, and they will control and subdue to make that a reality. And they are right: lack of action is safe. What they fail to see is that it leads to stagnation and death.


That’s about insurance. While someone might care about your safety, they spend to save money and reduce liability.

It varies by state as well. In New York, for example, any third-party work-related fall from a height can be the company’s responsibility. You need to have a bunch of procedures around ladders and lifts to get insurance or avoid excessive rates.

A zero accident policy means that you don’t accept employee injuries as part of the job. There’s usually a safety committee that looks at root causes for accidents and at processes to control them. From a company POV, I have no idea what your physical condition is, but when you do something stupid like deadlift and transfer that 50 lb piece of equipment on a ladder, I have to pay for it.

I worked at a place years ago where the procedure to decommission a hard drive was to puncture the platters with a drill press and remove the electronics with a pry bar. All good until an IT guy got metal debris in his eye, and required extensive surgery. Turns out the IT folks removed safety equipment to make the process faster, didn’t wear eye protection, and had the press improperly situated. Correcting any of those things would have prevented that accident - and they were fubar because no process existed to question it, and everyone felt comfortable doing what they were doing. It may seem like infantilism, but there’s a guy who suffered and continues to suffer from an eye injury because nobody was accountable for his safety.


The reminders to be safe are to ensure that people don't grow complacent about things that are dangerous.

Our school has a mandatory safety training before students go on internships, and the reason for that is that interns have died in the past. A lot of things can be dangerous at work (including ladders) and not everybody is familiar with the associated hazards. Ladders, compressed air, electricity, machinery, confined spaces, chemicals, dust, and so on are all things that are dangerous, and the fact that they injure or kill people every year even though they're known hazards is a testament that people are too complacent around them.


> The reminders to be safe are to ensure that people don't grow complacent about things that are dangerous.

Which doesn’t work, since now everyone gets complacent about the training instead.


I'd argue it looks like fear, but really it's fusion. People want to fuse all their needs together into one hive-mind reminder system so they have more time and energy to spend on work. Smartphones and management are all too happy to help.

People can't see the self-evident anymore. The Kantian and Nietzschean thing-in-itself is dead. People have to be told not to walk into boxes, otherwise they will, while they dream of some heavenly work project in their mind.

How do we control it? Zero accident policies, you gotta remember not to walk into boxes and you gotta remember to slap on sunscreen, as if the material world only starts to exist again in people's minds when you mention it.


First we killed god, now we try to kill the thing-in-itself. Probably a lot harder to do the latter. I appreciate your perspective on this, I hadn't made that connection before. When you put it like this, I wonder if this fusion is a force that acts counter to individuation, basically being part of "something bigger", the workplace replacing community?


I once worked in the head office of a mining company in an IT role. There was a guy wearing a hard-hat standing next to the stairs with a clipboard taking notes to see if people were holding on to the handrail or not (as per corporate safety regulations).

You see, there must be "one rule for everyone". The head-office suits can't be seen as discriminating against the plebs at the mine sites. It's a perception thing, people won't obey if it's "rules for thee, not for me".

So... I got a talking to by HR for a solid hour because I didn't hold the handrail when taking two steps down to the cafeteria.


The US judiciary is far too friendly to personal injury lawsuits.

The prevailing atmosphere is it's never the injured person who is at fault, it's always somebody nearby who has money.


How do you know this to be true? From a cursory search it looks like 95% of lawsuits for personal injury never make it to trial, so the judiciary is never involved in the vast majority of cases.

Of the 5% that do make it to trial, only 56% of them result in any award for the plaintiff and if you look at the breakdown [1], that's because of auto accidents which can hardly be said to be a case of "blame is laid on the person who has money", especially when you look at how much money is awarded in such claims.

Premises injuries have only a 39% success rate.

Product liability has a 38% success rate.

Medical malpractice has only a 19% success rate.

What do you know about the judiciary that leads you to your claim?

[1] https://www.cloudlex.com/tips-and-tricks/personal-injury-cas...


How about how the playground equipment at schools has all been wussified due to lawsuits?

Or kids' chemistry sets that went in the 1960s from an advanced chemistry course to little more than kitchen recipes?

All the result of lawsuits.

Then there was a painter who fell off his own ladder while painting Steve Ballmer's mansion, who successfully sued Ballmer for zillions. And people in general who carry millions of dollars in liability insurance.

As for "not making it to trial", people often settle out of fear of what would happen in court, not because the lawsuits wouldn't get traction in court.


I have fond memories of my Gilbert chemistry set, circa 1964. It allowed me to learn chemistry and get in slightly serious trouble if I didn't pay attention. Totally excellent.

Not long ago, I checked out buying a chem set for my grand-niece. Completely uninspiring. Color changes. Baking soda/vinegar fizzes. I bought her good archery equipment. I'll take care of the chemistry inspiration with my own curriculum.


My Gilbert chemistry set was the last one before they were emasculated. Had a lot of fun with it, but didn't learn much chemistry :-)

I was smart enough not to eat the copper sulfate.


I think you're just making things up based on an unfounded and certainly unsubstantiated perception of reality.

Do you have any reference for your claim about Steve Ballmer? I tried numerous different ways of searching for it and none of them produce any kind of result. I also tried Bill Gates, Mark Zuckerberg, and some other rich tech CEOs but nothing comes up. Once again, you may have heard some kind of anecdotes or stories and mixed things up in your mind to create an image of what you think is real, but the actual statistics and facts do not back up your bold assertions.

For example, it's not lawsuits that put an end to the chemistry sets of the 1960s; it's the Toxic Substances Control Act together with the Toy Safety Act that removed lead, poisons, acids, and other toxic chemicals from chemistry sets that were marketed towards kids. It had nothing to do with lawsuits, but rather with legislation.

If you actually want to blame something for the decline of the modern chemistry set, it has more to do with policymakers' concerns about their use in illegal drug manufacturing than with any kind of lawsuit.


> Do you have any reference for your claim about Steve Ballmer?

It was some years ago in the Seattle Times. Sorry I didn't keep a clipping.

I was also injured once in a car accident, being hit by a garbage truck. The ambulance people told me I hit the jackpot. People came out of the woodwork recommending personal injury lawyers who could set me up for life. People at work all told me I had the million dollar injury. People were coaching me (unasked) on how to pretend I was much more injured than I was.

All I asked for was to pay the medical bills and lost time at work and my car. The opposition lawyer was aghast, and was certainly eager to sign that deal.

But I'm just imagining things, right?

I could be wrong about the chemistry set emasculation. But just a couple weeks ago on HN there was a long thread about the wussification of school playgrounds.


>It was some years ago in the Seattle Times. Sorry I didn't keep a clipping.

The Seattle Times has a searchable archive of all their articles going back 30 years.

Searches for "Ballmer lawsuit", "Ballmer painter", "Ballmer ladder" all turn up nothing. It looks like you made this story up; not in an intentionally deceptive manner mind you, but the same way old wives make up stories by mixing together pieces of different stories taken out of context together to fit a narrative.

Your anecdote about getting into an accident is tragic, but has nothing to do to support your claim that, and I quote:

"it's never the injured person who is at fault, it's always somebody nearby who has money."

Finally your point about wussification of school playgrounds is precisely what I wish to avoid. There was basically a conversation on HN where people parroted opinions about how playgrounds are wussified, and just like with this discussion it probably had no facts or evidence to substantiate the claim but you accepted the claims made in that discussion simply on the basis that it appeals to your belief system.

Do not believe everything you read on this site. You can be mindful that people on this site have opinions on certain topics, but do not take those opinions and then reassert them as if they are facts. This site is just as full of misinformation and biases as any other, where people pretend to be experts and assert categorical statements on subjects they have no expertise in because it appeals to their beliefs. Yes this place is more civil about spreading misinformation, but misinformation is no more factual just because the people spreading it do so politely.


> It looks like you made this story up

I understand that it is not in the ST archives, and because of that, you do not believe it. That's a perfectly reasonable position for you to take. I did not make it up, though. But I'll still withdraw it pending finding citable evidence.

As for playgrounds:

https://www.facesoflawsuitabuse.org/2018/06/lawsuits-take-sc...

https://www.paloaltoonline.com/news/2020/09/25/elementary-st...


> but has nothing to do to support your claim that, and I quote: "it's never the injured person who is at fault, it's always somebody nearby who has money."

The intended target of the lawsuit was the company that operated the truck that hit me, not the operator. The reason is simple, the truck driver didn't have any money, and the company did.

I hope you carry liability insurance, because if someone trips in your yard and breaks an arm and your house looks expensive, they're going to sue you.


>The reason is simple, the truck driver didn't have any money, and the company did.

The reason is simple, but has nothing to do with what you think it does. The legal principle is known as vicarious liability which means that employers are vicariously liable for accidents arising as a result of the actions of their employees during the scope and course of employment.

You were advised to sue the company because the company was the only entity who could possibly be liable for the accident you suffered.

https://en.wikipedia.org/wiki/Vicarious_liability


Exactly what I was talking about:

"The prevailing atmosphere is it's never the injured person who is at fault, it's always somebody nearby who has money."

And the reason for this is money. You can't sue an employee for $200 million, because he doesn't have $200 million. So let's make the employer, who has $200 million, liable, even though the employer was not negligent and did not cause the accident.

This isn't justice.

Thank you, though, I did not know there was a specific term for this. "Vicarious" liability, indeed.


> it's the Toxic Substances Control Act together with the The Toy Safety Act

I think it's good to note in cases like this that both were bipartisan (in passage if not in origination), lest one side or the other think this was an ideological issue.


Let's assume that all the successful "awards for plaintiff" claims were 100% justified.

That's 56% of 5% of total lawsuits.

So that suggests that 97.2% of those lawsuits are unsuccessful.
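To sanity-check that figure, here's a quick back-of-envelope sketch in Python, using only the 5% trial rate and 56% plaintiff win rate cited above (the variable names are mine, just for illustration):

```python
# Back-of-envelope: what fraction of personal-injury suits
# end without a trial award, given the figures cited above.
trial_rate = 0.05          # 5% of suits reach trial
plaintiff_win_rate = 0.56  # 56% of trials end in a plaintiff award

awarded_at_trial = trial_rate * plaintiff_win_rate  # 2.8% of all suits
no_trial_award = 1 - awarded_at_trial               # everything else

print(f"{no_trial_award:.1%}")  # 97.2%
```

Note this counts every settled case as "unsuccessful," which is exactly the assumption being made above.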

From WalterBright's comments my guess is that his position would be something like "a ton of money and time is wasted on lawyers and other legal stuff due to a system that accepts so many frivolous suits."

But another person could interpret that as "there are a lot of legitimate claims in that 97.2% that end up being unsuccessful because the US system is unfriendly to you if you don't have $$$ to pursue a case." Close to the complete opposite.

In terms of "how good the US judicial system is for these sorts of liability cases," I'm not sure that actually matters much, though! In either case, there's a lot of waste and a lot of failures in the system. A system that relied less on adversarial procedures and each side retaining their own counsel, and more on judges with independent fact-finders or arbitrators, seems very preferable! If you're of the "it's rigged against the plaintiff" view, this system could be much harder to drown in paperwork and spurious procedural stuff, like a large corporation can do to a small plaintiff today. And if you're in the "most of this shouldn't even get this far, it's too easy to file nuisance suits" camp, this system could toss out ridiculous things sooner.

This sort of "inquisitorial" or "nonadversarial" system is used in a few places in the US (like traffic court) and much more extensively in some countries.


I interpreted WalterBright's comment as an ignorant way of saying that people today don't take responsibility for their own actions and instead try to find someone rich enough to pin the blame on, especially as a means of enriching themselves.

The statistics do not support that claim.

If you wish to make a more nuanced claim and discuss it, then by all means feel free to do so directly, without speculating about what WalterBright meant. It would be nice to do it on the basis of some kind of evidence though.


"The US judiciary is far too friendly to personal injury lawsuits" from WalterBright is a quite different claim than simply "people don't take responsibility." It's about who the system pushes responsibility onto, plaintiffs or defendants.

My own position was in my second to last paragraph: the US system sucks regardless of if you think it favors plaintiffs or defendants.


The original post is literally two sentences and you omit the sentence that is pertinent to my point. I think that alone says all it needs to about the manner by which you are evaluating this topic.

All the best to you.


You keep saying you'd prefer to engage with a more specific claim about the legal system rather than argue about what WalterBright meant, yet you keep avoiding engaging with that part in favor of debating that meaning!

Rather, you're now narrowing the debate to whether their "the prevailing atmosphere" is referring to "the US judiciary" - the subject of the previous sentence - or society at large. Which I read differently than you, but it also doesn't matter.

Sooooo if you want to engage with my actual position, do so! What are your thoughts on adversarial legalism? Or if you want to keep playing debate cop instead, have fun, but... why?


> So that suggests that 97.2% of those lawsuits are unsuccessful.

If you assume the 95% of lawsuits that settle were 0% justified. On the other hand, I'll just assume that the guilt was so apparent there was no point in putting up a defense and wasting the money. So that means that a shocking 97.8% of personal injury claims are valid.
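The 97.8% figure follows directly once every settlement is treated as a valid claim; a minimal sketch of that arithmetic (variable names mine):

```python
# Arithmetic behind the 97.8% figure: count all settled suits
# as valid claims, plus the suits that win at trial.
settle_rate = 0.95        # 95% settle before trial
trial_wins = 0.05 * 0.56  # 56% of the 5% that reach trial

valid_claims = settle_rate + trial_wins  # flip of the 97.2% above

print(f"{valid_claims:.1%}")  # 97.8%
```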

The rest of your post relies on your assumption being valid, which I vehemently disagree with. I don't really think it's 100%, but I do think it's closer to 100% than 0%.


The rest of my post is actually saying the system sucks regardless of who's getting the short stick today.

I didn't interpret "don't make it to trial" as "100% settle" - my initial assumption was that they got thrown out instead. Probably it's a mix, that's a fair point.

Cases being settled, to me, though, still suggest that we would do better in a system where "bothering to fight it" isn't a question of deep pockets, and buying people off isn't an option, and things actually end up in front of judges. For transparency alone, vs confidential settlements.


> I didn't interpret "don't make it to trial" as "100% settle" - my initial assumption was that they got thrown out instead.

According to the source the OP cited, the vast majority settle (according to the source's source, it seems like 10-20% are thrown out.)

The primary reason they list for cases settling isn't a question of deep pockets. It's that the plaintiff's attorney makes a case for the jury award to be $X-$Y dollars, and Y-X is a small enough spread that everyone would rather just settle the case than go through a jury trial when they are 90%+ (I made up that percentage) in agreement. That is, when they agree the defendant owes $XXX +/- 5%. Why waste time on the last few thousand when they can split the difference and go home?

Meanwhile, if you're concerned about regulatory capture, a jury rather than a bench trial seems better. For what it's worth, the US system totally lets the judge decide the entire matter if both sides agree to waive the jury. It seems like that's the case in ~25-33% of cases that make their way to trial.


> From a cursory search it looks like 95% of lawsuits for personal injury never make it to trial, so the judiciary is never involved in the vast majority of cases.

These decisions don't happen in a vacuum. They happen in an environment where the companies (their lawyers) know that settling nearly every unreasonable lawsuit is still cheaper than risking going to trial.


Do you have a good, unbiased sample of instances of people injured by businesses?

Any random major corporation is dealing with hundreds if not thousands of lawsuits at any given moment. They can, and do, wreck the lives of thousands of people without it even being significant enough to report in SEC filings, let alone spark a journalist's interest.

And then there are all the injuries and deaths that lead to consulting a lawyer who says the life lost isn't worth enough to make a lawsuit practical. If you don't assess the number of these, it's pretty hard to say how friendly things are for personal injury lawsuits.


I'm not referring to corporations wrecking the lives of people. I'm referring to people accidentally injuring themselves and going for the jackpot lawsuit against any entity nearby with lots of money to extract.


My point was that what you are referring to does not represent personal injury in the US. You can be angry at specific instances like you describe, but it's totally unrepresentative.

In my opinion, which is not based on expertise as such, but having worked in the corporate litigation industry, and having had friends of friends and family injured and killed by corporations and businesses.


> having had friends of friends and family injured and killed by corporations and businesses

One wonders what profession these people are in. Also, being in the litigation industry, I'm sure you know that injuring and killing someone is a crime. I'm talking about accidents here.


>One wonders what profession these people are in.

Both people I'm thinking of were victims of interactions as consumers (of medical services/products), not employees, as it happens.

In one case, the lack of a lucrative career factored directly into the futility of litigation.

In the other, the harm was largely that a medication side effect led to impulsive decisions that undermined a good career and other relationships, and would've been difficult to prove in court.

How do I know this latter was real? Because I coincidentally know there was a massive lawsuit on behalf of victims whose cases were less ambiguous.

>Also, being in the litigation industry, I'm sure you know that injuring and killing someone is a crime

I am not a lawyer, and I'm not "in the litigation industry" for some years now. However, it's a strange thing to say that injuring or killing someone is always a crime. It's frequently not even a tort. Have you known nobody that's died due to a doctor's mistake? A prescription medication? Seen a "death panel" at a nursing home?

>I'm talking about accidents here.

Sure, and so was I. I mean, you can never really be certain, especially in the many cases that don't get litigated. That's why discovery is a thing, I think. Many, many cases just don't meet cost/benefit criteria and are never filed.

I like repeating this no matter how much I get downvoted each time: medical errors are one of the leading causes of death, behind only one or two like cancer. They are vastly underestimated by most people because there is no diagnostic nor billing code for fuck-ups. Anyone who's worked anywhere near the medical/insurance industry knows how everything revolves around codes, so it had a bitter ring of truth to me when I first read a certain article from Johns Hopkins.


Actually, I'm aware of the deadly mistakes that go on in the medical industry. A friend of mine's mother was killed in a hospital by administering the wrong drug. They could learn a lot from the aviation industry in how to minimize human error.

The software industry could also learn from the aviation industry, I've blathered on about this for years :-/


Seems like that’s been baked into the culture of American arbitration since its founding. Instead of top-down collective regulation, which was a relatively new advance made in the last century, the system from the start has always promoted individuals hashing out their differences in courts with torts. I am reminded that Lincoln had the rustic profession of lawyer when Illinois was still a semi-frontier state.

Isn’t that the aim of a libertarian society? In the absence of regulation, of overbearing government oversight or bureaucracy, individuals hash it out in courts? Would think that’s something that the American system since the Anglo common law days and the libertarian ideal agree upon.


Regulations that prevent force or fraud or infringing on other peoples' rights are perfectly valid in libertarianism. It's a proper function of government.


So should such regulation be enforced by government bureaucrats, rather than by individuals pursuing the proper enforcement of contracts via courts of arbitration?


I thought I was pretty clear.


What's wrong with telling people to wear sunscreen?

What's wrong with ensuring that people know how to use a ladder safely?

What's wrong with aspiring to have zero accidents in a work place?


> What's wrong with telling people to wear sunscreen?

The same thing that's wrong with telling people to wipe their asses when they use the toilet. I'd be offended that you don't think I've got that handled for myself.


Every office I've ever worked in has, sooner or later, had someone who needed to be told by management something about clothes/behavior/hygiene that you'd think should be obvious, and it was something different each time.


If it's in response to an issue, that's constructive and helpful; but imagine some 20-something restroom safety coordinator, unprompted, reminding you as you leave the restroom that you'll get a rash if you don't wipe yourself.

These "helpful tips" are mostly a power play. The framing is you are an idiot that has to be told that you get sunburned in the sun, let me guide you so you don't hurt yourself, aren't you so glad that I'm here to look out for you.


I forgot to wear sunscreen the other day at a work event and I’m still bright red two weeks later.


MSHA agrees with you about safety. And MSHA reacted sanely to the corona-panic by basically doing nothing more than reminding operators to allow miners to stay home when sick. https://www.msha.gov/news-media/press-releases/2021/03/10/us...

Yet OSHA got the publicity by attempting to mandate shots.


Wearing sunscreen is excellent advice, I do it consistently myself. In my opinion, it's highly appropriate to offer advice like this to friends and family.

But it would grate on me to be told this, or to be given advice on any other non-work matter, by my employer. It suggests a sort of paternalism that is out of line, and I would immediately assume ulterior motives (e.g. saving on health insurance bills) rather than genuine concern.


Seems you have a keyboard career.

I was glad to have been trained on proper blocking and hoisting, because things in the physical world at work can drop and hurt someone. During yearly MSHA refresher the trainer encouraged us to take eye protection from the company and use it at home.

>> ... sunscreen ... it's highly appropriate to offer advice like this ...

But you think it is appropriate to remind people about sunscreen use, while workplace safety reminders are not good for you?


Would you prefer an employer who didn't give a shit about your well-being and treated you like a lump of labour to be wrung of all economic value and tossed aside when there's no more to get from you?

Like is it really so grating that someone tells you to do something good for yourself, even if their reason is because it's good for them?

Really?


The same employer treated me like a lump of labour, wrung me out, and I could not leave because I was shackled by being on a work visa. And they knew it. Those were the most miserable years of my life. I'd still do it again, because of the perspective it gave me, but there are probably other ways that involve less trauma.

Which makes them telling me to use sunscreen even more grating. The reason I find it so off-putting to tell employees to wear sunscreen in an all hands meeting is that I do not see this as the role of my employer. What I do in my private life is my responsibility and my choice. This might be a cultural difference too, I remember being alienated by the possibility of drug tests. The copious amounts of cocaine I might or might not do in my spare time are none of my employers business.

And about cloaking things in "good": I have yet to see a bureaucracy that does care about "good". In my experience bureaucracy cares about risk mitigation and liability. My personal risk assessment works differently than that, it takes things like my human experience into consideration. So I get cranky when I feel my employer is encroaching on my personal space.


If someone, i.e. an individual person whom I know in real life, gave me advice then I would happily listen. This includes bosses or coworkers.

But when it is coming through a bureaucracy, by mass email from someone whom I don't know (if it is signed by a named individual at all), I would be less happy.

Maybe this reflects excessive cynicism on my part towards large organizations. I recognize that not everyone feels the same way.


> But when it is coming through a bureaucracy, by mass email from someone whom I don't know (if it is signed by a named individual at all)

What if it comes from XXX in HR, who just lost a family member to skin cancer and decided to use their corporate authority and "send to all" powers to remind everyone to use sunscreen.

What if it comes from a new study that shows that monitors cause skin cancer and it's new information most desk workers don't know to use sunscreen at their desk?

What if it comes every year in the "get ready for Summer" info packet?

What if it came with a reminder to be careful on Friday half-days that you weren't expecting?

I'm not sure what the cynicism is geared towards? It's impersonal? Isn't it a benefit that these aren't people who think you personally don't know to use sunscreen?

You said you questioned their motives. Would it help if you knew the people in HR and knew they were legitimately trying to help you? Would it help if they said "hey, don't get sick and raise our premiums. It comes out of the same pot of money everyone's bonuses come out of"?


> What's wrong with aspiring to have zero accidents in a work place?

The same thing that's wrong with aspiring to have "zero bugs" in a codebase. You can have zero open issues (redefining "bug"), or zero known bugs at release (JPL-level quality metrics, at a huge cost), but any reasonable software developer knows that bugs must be accepted as part of the process of coding. Similarly, humans must accept that "accidents" are part of living. Things are not and never will be perfect, no matter how many rules or systems or metrics you put in place.


Human injury is radically different than a bug in FB/TWTR mobile app.

Safety is expensive. In the USA, mine safety is regulated by MSHA. Is anyone surprised that much mine production (except heavy stuff like gravel, sand, and coal) is done overseas and imported? The lithium in your EV is not from the USA.


The US is third in global mining activity, behind China and Austria.


I think we mostly agree, though likely Australia. It takes a lot of gravel and sand to build stuff. Due to the weight and bulk, this mining is kept close to home.


There are thousands of examples if you work for a paternalistic megacorp. Case in point, our office now lets people "hotel" at desks, but we have to reserve them for time slots. We could, like the library, just sit and use unoccupied desks. But they add this level of control that's geared towards toddlers that can't operate in a shared public space.


This killed my spirit at one of my previous gigs at a stupid megacorp. I needed the cash and told myself it was worth a shot, even though I had a hunch that, having ADHD, it just wouldn't work. I needed to isolate myself to get work done, but I was looked at almost in horror when I brought my laptop to a quiet space away from my desk. I didn't use spare offices because those were reserved ahead of time like you say. So many wasted resources. Back at my desk there was the constant clacking of keys and some jackass on his bluetooth headset making sales calls from his open cubicle, typing away on his mechanical keyboard, or a buddy eating chips.

It could have been a really interesting technical experience, because the frontend was an absolute nightmare, but it serviced millions of people daily and billions in transactions annually. But I just couldn't catch up with so much bullshit. The best time was when nobody was there, so sometimes I'd work later, but then I'd be chastised for not showing up to the fucking standup on time.


Not to distract from your main point, but just an aside, have you ever looked into misophonia? A few friends of mine have it and your post heavily reminded me of them (the chips and keyboard, especially). For them, earplugs and/or ANC headphones made a world of difference in being able to focus on their work and not be driven crazy.

The open-layout office is hell for people who need quiet for focused work :( Sorry you had to deal with that.


Yes, it's something I've known about for a few years, and for that I'm very grateful for the move to remote work. At this point I attribute it to a comorbidity with some other bigger thing that results in a momentary ramp-up of anxiety, because it's not always a problem, but when it is I need to find somewhere else to be and it's really derailing. Headphones only help if I already had them on; I can't use headphones or any other sound-dampening devices to drown it out if I've already tuned into it. I've literally tried everything I could find. Concerta seems to help.


We all “need quiet for focused work.”

Focused work should be challenging or at least require a person’s dedicated attention, which is not possible when distractions are a normal occurrence in a workplace regardless of whether someone has a mental or physical condition that makes them less functional in a “normally” loud workplace.


I don't doubt that, but for many of us, it's a matter of degree. Like I don't mind soft conversation in the background, and I prefer soft bluegrass over dead silence... having SOME sound helps me focus and stay on rhythm mentally. If someone eats lunch while I'm working, I probably wouldn't even notice it, and even if I did, I'd just be like "What's for lunch?" And then go back to work.

But for my friends with severe misophonia (self-diagnosed, for the most part), hearing chips crunch or lips smacking will drive them out of the room in abject fury. It is, according to them at least, uncontrollable and unavoidable and it turns what would be "mild annoyance" for most of us into "severe anguish". At that point it's not even about losing some productivity, but being driven into such severe discomfort that you can't physically be in the proximity of the sounds anymore and you start to hate the other person who's caused it, and it all just unravels from there.

There's a huge difference between that and "oh, well, that construction noise outside is kinda annoying, I hope it stops soon".


Agreed. Though some ppl seem to not have much issue with certain noises.


Yep, my dad was forced to this "hotel" crap too.

...but he works in a field where stuff is still 80% on paper and he has colleagues with self-purchased reference books and the like literally from floor to ceiling.

He spent a week clearing out his office and will be going into full retirement now. There's no way he'll be spending time lugging around a pallet worth of books to a new hotel desk every morning - nor will he give his own books to the company.


This is only useful when headcount exceeds available office space (cubicles / desks). If that isn’t the case I agree it’s infantilizing


I think it's more than that. It's a factor of:

1. Having bad actors in the social group that will behave selfishly to the group's overall detriment.

2. Having ineffective mechanisms for punishing bad behavior.

I've been at a few different startups where we had a shortage of office space because of our growth. But instead of some reservation system and official corporate guidelines, we all managed to share effectively.

I don't remember anyone hoarding the scarce space, but even if they had, I imagine that someone would have pulled them aside and pointed out their behavior.


If you plan for it in advance, you lower the desk count to where it's useful, so as to save money. My current employer has started talking about moving to a new office building and hot desking etc. I've already started looking for a new job.


Must be annoying… Sounds like a poorly communicated capacity planning thing though?


Does the company need capacity planning in the form of making everyone do extra work, instead of having one person glance around and report on the number of free desks, or adding a button one can click only when they tried to find a desk and couldn't?


Agree, and nothing tells an employee they aren't valued more than not providing them an adequate workspace.


> Does the company need capacity planning in the form of making everyone do extra work

Serious question - is it a hard process? Or do you just have to check a box that says "I will be in the office and not WFH tomorrow"? Because it definitely seems like it's a minor inconvenience at worst.


I don't know about this company, but I've used something similar at other companies, and it involved loading a web site that's unbearably slow and picking a desk on a map of the floor plan daily. Maybe 6 clicks and 10 seconds of waiting for the thing to load if everything goes well and you don't have to zoom around the map or change buildings or floors. What's flow anyway?


That sounds like a horrible interface given the most common use is likely to be "same desk tomorrow please".

Although I wonder how that impacts flow. Presumably you can do it anytime during the day, so you can do it when you were taking a break anyway. Unless it opens at a specific time and you have to make sure your desk isn't sniped!


So the extra work part is infantilizing? I am not here to judge, just genuinely trying to understand.

PS: Should be obvious by now that I don’t work for a “paternalistic megacorp”…


It's not the extra work, it's the requirement to ask for permission to have a desk to do the work they're asked to do.


Let's say you are a company that hires a lot of flexible-remote workers. The deal is that everyone works from home but can come to the office if they want to have a real meeting or if they feel like it. Makes perfect sense they should book a desk if they intend to work from the office. What am I missing?


For some, hybrid work is becoming mandatory. Having to beg for space to work in the office when it is required is infantilizing.

Further, being forced to be in the office so the boss can see you're busy (and thus having your location tracked) is clearly infantilizing. That's what we went through in school as children. A lot of us expect better treatment as adults.

In any case, I don't personally find the argument that there is a reason compelling. There's plenty of things that have a reason but are unreasonable.


Severance, an Apple TV+ show, has a good take on this. It does a tremendous job mocking corporate "culture" and its infantilization of the workers.


Waffle parties and Hawaiian shirt days come to mind.


Daily standups. Open office plans.


Collaboration implies underlying work. Something the collaborators are doing together in addition to talking. The vast majority of the talking in my corporate environment either has no such referent, or is a one-sided accounting of the work you are doing up to someone whose job it is to allocate your time or supervise that work, not to personally advance it with you.

A good litmus test is what someone's calendar looks like. People who are in wall to wall meetings every day (managers, Staff+ engineers, TPMs, etc) can never, properly speaking, collaborate, since there is no time where they could be doing the work being collaborated on. They have coordination and supervisory functions that may be valuable, even indispensable, but what they do is different from collaboration.


Too many orgs conflate communication and collaboration. The heavy handed push for “Collaboration” always seems to stem from folks with anxiety around feeling left out, and leadership who is equally disconnected. E.g. “I’m hearing you’re not being very collaborative, spend more time bringing others along!” When really the problem is lack of communication and transparency creates fear/uncertainty/doubt. With that said, communication and coordination are the foundations for collaboration. Some orgs address those well, most don’t; too often it falls on the right interpersonal dynamics to create the environment for meaningful collaboration where 1+1=3.


"Collaboration" is the kind where someone read an article about a team building a successful project with lots of collaboration once, so they mandate pair programming on every ticket and mob testing for every PR.


Collaboration has its place for when you need it. Sure, you can get great results despite paying the mental cost of collaborating (which is even higher if it's happening through a computer screen).

The problem is that we see the great results of occasional collaboration and we want more of that. Creating more artificial collaboration when you're capable of doing the task by yourself will just be expensive (both in terms of salary and mental health): it's not the answer.


> Sure, you can get great results despite paying the mental cost of collaborating

When I refer to the beneficial kind of collaboration, I’m referring to the kind where the cost of not collaborating is to not achieve the outcome at all.

Paul McCartney is an excellent musician on his own. The Beatles achieved something as a group that no individual can achieve on their own.

I agree that adding collaboration in a setting that doesn’t require it just adds mental cost. But in other settings, that mental cost is minimal compared to the enhanced output.

> Creating more artificial collaboration when you're capable of doing the task by yourself will just be expensive

We’re on the same page here. That’s what I mean by “Collaboration” in quotes. The kind that is not necessary.

My main point/issue is that the article doesn’t seem to really acknowledge or explore that there is a good/bad kind. Just focusing on “employees need more time to do their actual work” is not useful if that employee’s actual work actually does require some form of collaboration.

The ability to distinguish between the two seems to be the core issue that requires exploration.


This is a great insight. Mapping productivity over collaboration intensity is a concave function. Its maximum is nowhere near the extreme ends.

Yet, if people are near the maximum they might desire more of what has worked in getting them there, but to their right they’ll only find decline.

This is such a common fallacy I wonder if it has a name.


Exactly the same for me: some of my best work and most enjoyable time has been collaborating closely with people. And yet, some of my worst time has been in settings that can also be described as collaborative!

What made a difference? I see it as having two types of "collaboration": working together vs working on the same thing in parallel.

The problem is that the formal processes managers use to ostensibly encourage collaboration—and, of course, to track and direct individual work—push hard towards the latter model. Everybody working on their own tasks makes the work so much more legible! Having some other person or some process determine who works on what, when and then implicitly (or explicitly!) judging people based on how quickly they complete "their" tasks is not only awful for autonomy and job satisfaction but also actively makes it harder to collaborate meaningfully. I've worked on teams where people were afraid to spend too much time working together or helping each other because they didn't want to "waste" the other person's time—after all, you're clearly wasting time if you aren't consistently producing finished tickets!—which is absolutely toxic for creativity, culture and collaboration, but also a natural response to the incentives and structure imposed by the process.

I'm gradually starting to look for my next job and finding a culture where I can actually collaborate is one of my main goals, but it seems hard to evaluate from the outside. It's not clear how much an individual manager can do to foster the right sort of environment—a lot depends on the culture and structure at higher levels on the organization, as well as cross-team interactions—and it's the sort of thing where everyone is going to say their team is collaborative (and even believe it), and I don't see how a team could show rather than just say that.


I work in a FAANG, and performance reviews are all about component ownership and getting other people to work for you (leadership!) but never ever about helping somebody else. What collaboration can we talk about when it is so explicitly discouraged by management?


Keep in mind that at Microsoft, as at every large company, a lot of time is wasted in meetings. What you're seeing is that the happiest workers AT Microsoft are the ones with more time for deep work and the ones with high autonomy. That lesson is probably generalizable. I don't think anyone is suggesting an end to collaborative work.


A good collaboration aims at making the parties very independent at some points, because you all share the same outlook and plan and can now go your own way until the next mental rendez-vous.


> Does it, in fact, intimate that collaboration may have become a buzzword for a collective that is more a bureaucracy than a truly productive organism?

I recently worked at a company that hired Agile consultants to install SAFe. We had PI planning each quarter, which ran for 3 full days. At its conclusion, the consultants, leaders, and PMO always called it "successful collaboration." I guess they were having lots of fun ... but the engineers were not. Most of it was us guessing estimates for things 6 weeks out, and management turning those into commitments. Needless to say, this whole circus became a negative experience for almost all the producers.


For people who don't know SAFe, here is the diagram: https://www.scaledagileframework.com/wp-content/uploads/deli...


Lol they should just change the name of “agile” to “brittle” and go with the slogan of “processes over people”


Agile has become exactly that. It is where you have people in marketing and business thinking they know more about tech than actual tech professionals.


Amazing how they can pack so much into a diagram yet still show absolutely nothing of value for understanding SAFe.


How does upper management look at that and not immediately say “Hey, waittasec, this is clearly bullshit!”?


That reminds me of the cell biology diagrams from grade school.


And to think that there are people who get paid for this.


LOL, SAFe uses an "agile release train", because everyone knows how agile trains are. They can stop on a dime and move in any direction, the very definition of agility.


How do you propose stopping estimates from becoming commitments? Genuinely asking because I struggle to not do this as a manager.


One thing to do is to show your estimates as distribution curves instead of averages. It can be helpful to choose estimates from a Fibonacci scale. Set the pessimistic estimate high enough that it’ll rarely be missed.

Edit: I change the units based on how far out I’m estimating. If I’m estimating the next six weeks, I’ll use hours. If I’m estimating a roadmap I’ll use weeks. The numbers get large quickly (1 2 3 5 8 13 21 34 55 89 144). It’s rare you’ll find yourself planning farther than 2 years out (104 weeks).

As you work, keep updating the estimated time remaining. If the initial estimate for a task is an 8 (between 5 and 13), keep reporting 8 until your estimate falls below 8, then report 5 (between 3 and 8). Avoid reporting intermediate values like 7.
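The reporting rule above can be sketched in a few lines. This is my own illustrative Python (the function name and helper are not from any real tool): report the largest Fibonacci value that does not exceed the remaining-work estimate, so the reported number only ever steps down bucket by bucket and never shows an intermediate value like 7.

```python
import bisect

FIB = [1, 2, 3, 5, 8, 13, 21, 34, 55, 89, 144]

def reported_estimate(remaining):
    """Largest Fibonacci value <= the remaining-work estimate.

    Keeps the report at the current bucket until the estimate falls
    below it, then drops to the next bucket down (8 -> 5 -> 3 ...).
    """
    if remaining < FIB[0]:
        return FIB[0]
    i = bisect.bisect_right(FIB, remaining) - 1
    return FIB[i]

print(reported_estimate(8))  # 8
print(reported_estimate(7))  # 5  (fell below 8, drop a bucket)
print(reported_estimate(4))  # 3
```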


Part of the answer is don't give estimates when you can help it. I've had a lot of success just having a roadmap and telling people, this is what we're working on now, this is what's next in the pipeline, and here's where you can track our progress for yourself.

Another tip I have is to constrain estimates to broad timescales like Q3 or H2 when you are able to (there are certain business scenarios that do require estimates).

A similar thing you can do is give your "estimate" not in the form of a date but in the form of ambiguity and scale. Or better yet just state the factors that would go into giving an estimate without offering one.

Lastly I'd suggest just not shying away from commitments but making them about effort not outcome. "I have 3 developers working on this full time and it will be their sole focus until we ship. Nothing you say or do is going to get this feature launched any faster than it's going to be now."


Given that other parts of the business work at different time scales, this doesn't always work. Two of the most common examples:

1) You work anywhere near money and commerce. The existence of Black Friday and the following weeks of shopping frenzy ensure that you will always be very aware what date Thanksgiving is. And that everybody will need timelines especially close around that date.

2) You work on a product that also gets marketing. There's a lead time of several months for a good marketing effort with coordinated press, and you really don't want to have all that lined up and then blow your timeline.

If you can help it, at all, learn estimation. Sure, don't share it if your management is incompetent at handling estimations, but the ability to predict a timeline with error bars is extremely useful. Practice by yourself. You'll be happy you did.

(And if your estimates are reasonably correct and you have decent management, you experience magic like "OK, then let's cut scope" or "Is there anybody who'd accelerate this if they were on your team". With a heavy nod to the fact that there are not enough managers who can pull off that magic - because they never understood estimation)


If you run estimates with an abstracted unit of measurement ("ideal developer days," "story points," "cups of coffee," "tshirt sizes" etc), then you get three important super powers for this.

1) Your long range timelines come with a specific, quantified error margin. It's no longer "we'll be done by December 15," it's "we'll be done by December 15 with a 95% confidence, or January 15 with a 99% confidence." (Not to mention, those CIs are real and so your estimates have a very high degree of certainty)

2) your estimates have explicit conditions built in, most notably "based on our current understanding of the work." The door is already explicitly open to respond to feature requests, changes, or just new information with an estimate change.

3) Your estimate adjusts very quickly, so the conversations from #2 are also clear.

You can make commitments under those circumstances a lot more easily.
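The "done by December 15 with 95% confidence" framing above falls out naturally from a Monte Carlo burn-down over historical velocity. A minimal sketch, assuming a points-based backlog; the velocity history and backlog size here are invented numbers for illustration:

```python
import random

random.seed(0)  # deterministic for the example

velocity_history = [21, 18, 25, 19, 23, 17]  # points finished per past sprint
backlog_points = 120

def sprints_to_finish():
    """One simulated future: resample a historical velocity each
    sprint until the backlog is burned down."""
    remaining, sprints = backlog_points, 0
    while remaining > 0:
        remaining -= random.choice(velocity_history)
        sprints += 1
    return sprints

runs = sorted(sprints_to_finish() for _ in range(10_000))
p50 = runs[len(runs) // 2]          # median outcome
p95 = runs[int(len(runs) * 0.95)]   # 95%-confidence outcome
print(f"50% confidence: {p50} sprints; 95% confidence: {p95} sprints")
```

Multiply the sprint counts by your sprint length and you get exactly the two-dates-with-confidence statement, and re-running the simulation after each sprint gives the quick adjustment from point 3.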


No idea why "story points" or "cups of coffee" or "shirt sizes" have much relation to time. I mean... I get it, but... many places I've worked also go to pains to say "this isn't hours, we're just estimating relative complexity". But plenty of issues are extremely complex, but may only take a few days, vs some items which are less complex, but larger (touching a number of files, or screens, etc).

With that rubric, 8 story points might take 3-4 days of focused concentration, and 3 story points might take 5-6 days of less focused but more brute force. Nowhere I've worked accepts that as legitimate, and want to redefine the language in to something that approximates time. So... why not just estimate days or hours anyway?

Both an estimate of "large shirt" and "30 hours" can have 'explicit conditions built in'. This will be 30 hours with my current understanding of the request. If that understanding changes, 30 hours will change. I don't think you need 'shirts' for that?

I can easily make commitments if the people I'm committing to are fine with a change in the dates. That's a big 'if', and not one that plays out positively most times.

This is the rub, because most places I've been at, the commitment ends up being treated as a deadline, because... that seems to be how people work. "Dec 15 with 90% confidence" becomes "dec 15" and other parties start making plans and decisions based on "dec 15" without any consultation or being looped in to the process, and when 'dec 15' has to become 'jan 10', many many people are impacted and generally upset.


I hear what you're saying in two parts:

"Why not just estimate hours anyway?"

Because humans are extremely bad at estimating time, which is borne out by studies many times over. A good overview is the original research on this, for which Kahneman et al won a Nobel prize (yes, the same Kahneman who would later go on to write HN favorite "Thinking, Fast and Slow"). The broad stroke is, the very best time estimators in the very best circumstances only underestimate their time needs by 33%. The norm is more like 80%. They propose a few time estimation strategies to get around it, like "third party estimation" and "tripartite estimation". But the simplest approach (which emerged in later research) is to ask them to estimate the "size" of a task, and use statistical correlation to convert that to a number.

This last is hand wavy unless you're familiar with the law of large numbers, the law that makes casinos profitable. A casino cannot (without cheating) determine the outcome of a single roulette spin. But they can predict with extremely high certainty the aggregate outcome of a thousand spins. This is the same with your estimates. You can't predict the correlation to time of a single story point. As you pointed out, sometimes something that looked complicated turns out to be easy and vice-versa. But given a sufficient sample size (of estimates with a consistent correlation to time), you can predict with extreme accuracy the time for 1000 story points.

"Consistent correlation to time" is a bit of a PITA in a group, BTW. If you have developers do their own estimations individually, each one will have a different correlation to time. You would need a very large sample size to overcome that much variation. This is why so many systems encourage team estimation, so the consistency is dependent on the team dynamic, which is much more stable even when adding/removing engineers. But as I said, if it's the same person or team always writing your tasks, you can use their team dynamic instead, since their story sizes will be consistent.

FWIW by sufficient sample size, I mean after about 3 sprints (of any duration) you can make reasonable predictions. After 6 sprints you'll have confusing outliers, and after about 9 sprints it will be clear with some numerical weight to it.
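The law-of-large-numbers claim can be demonstrated numerically. A toy Python sketch, with an invented "hidden" hours-per-point rate and invented noise: any single task's conversion to hours is way off, but over a big enough backlog the noise mostly cancels.

```python
import random

random.seed(1)  # deterministic for the example

TRUE_RATE = 4.0  # hidden "hours per story point" for this hypothetical team

def actual_hours(points):
    # Each individual task's real cost varies widely around the correlation.
    return points * TRUE_RATE * random.uniform(0.4, 1.6)

# A single task's estimate is typically way off...
mean_single_error = sum(
    abs(actual_hours(1) - TRUE_RATE) / TRUE_RATE for _ in range(1000)
) / 1000

# ...but over a whole backlog the noise largely cancels out.
tasks = [random.choice([1, 2, 3, 5, 8]) for _ in range(1000)]
predicted = sum(tasks) * TRUE_RATE
actual = sum(actual_hours(p) for p in tasks)
aggregate_error = abs(actual - predicted) / predicted

print(f"typical single-task error: {mean_single_error:.0%}")   # roughly 30%
print(f"aggregate error over 1000 tasks: {aggregate_error:.1%}")  # a few % at most
```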

Which brings up question 2, "the commitment ends up being a deadline". This is a human nature thing, you're right! But the problem isn't a mismatch between human nature and your estimate. The mismatch is between human nature and the uncertainty of reality. How you push to improve this is contextual to your org. In hard situations I reverse the statement of my estimate, to "if we set dec 15 as the deadline, there's a 5% chance we won't make it. What's our fallback?" Asking that question a lot is helpful. But there's no magic bullet to making leadership - or worse, people who are afraid of leadership - plan appropriately for uncertainty. The best you can do is expose the uncertainty as clearly as possible, and give lots of lead time for the times when they still run into conflict between deadline, resources, and scope. After that, it's the manager's job to "manage" things and decide which variable they will alter to break the conflict.

Put another way: reality is uncertain. When that uncertainty leads to a conflict between deadline, scope, and available resources - because that will happen sometimes per point 1 - only someone with deadline, scope, or hiring authority can solve it. That's (usually) not within your purview as a lead engineer. The best you can do is to 1) call out the uncertainty as clearly as you can, as early as you can, and 2) signal that conflict as early as you can, so those managers have maximum leeway. Abstracted estimation makes that possible. Guesses and hopes don't.


Apparently if you do zero estimating (or just count the number of stories), it makes ZERO difference when calculating your cumulative-flow diagram.

https://www.youtube.com/watch?v=QVBlnCTu9Ms

Money segment is at the 26m15s mark.

The wasted effort of trying to estimate stories provides zero business value in this context.


Yes, you can do the same math if your stories are consistently sized, or if you have a sufficiently large data set. All you need is a consistent unit that is related to complexity, risk, and time. The less precise that relation, the larger sample size you need.

For many teams who get requests from external stakeholders in widely varied technical environments - ie consulting, often - estimation is functionally just a conversion process to a consistent unit. But you're absolutely right that for some teams the stories themselves are a good enough unit.


I think the demand for timeline commitments comes from a worry that otherwise the request will never be done, or that it will languish in the queue behind less important things.

Where possible, I try to frame my relationship with my stakeholders in terms of their priority order for their requests of me, and to demonstrate consistent progress on that stream. Thus, we have an understanding that the things they ask for do get done, and they want a particular one done sooner, they can move it up in the stack rank.


Could not agree more with this, and it can scale to small teams. Establishing consistent “velocity” and demonstrating progress is way more important than estimating individual features.


It starts with understanding the meaning of the word "estimate" in the English language.


You could start by not doing estimates? Instead, take an educated guess of the complexity of the problem and then ask how long the enterprise is willing to commit to implementing it.

Estimates are just a way of delegating responsibility downwards. The organisation is rarely asked to estimate how much a feature is worth, despite this being probably the more important factor.


Imagine a timeline from left to right. On the left is where there's the least knowledge about the project, and on the right the most (when the project is done). Estimates are made at the far left, when the developers know the least. So as you move along the timeline and more knowledge is discovered, you need to reset expectations frequently.


If you need to actually make meaningful commitments, you need to track how long stories/projects/whatever actually take, and use the data to make a prediction. Subjective “estimates” can be an input, but they’re basically guesses.


My experience: Collaborating closely with smart people makes you happy. It can be so exhilarating - in a sustained way for a long time, if you're lucky and the universe aligns correctly.

And yes, frequent interruptions are okay in this context. You get into a rhythm.


Why do they want happy workers and how do they measure happiness?

When exactly did Microsoft abandon all the lessons of Peopleware[1]?

And what about flow[2]? I recall people blocking out calendar time and putting do not disturb signs on their office doors[3] when they needed time to get into flow. Good managers helped make that happen. Does that not happen there anymore?

[1] https://www.goodreads.com/book/show/67825.Peopleware

[2] https://www.quora.com/What-is-a-flow-state-of-mind-How-succe...

[3] We had actual offices, doubled up and individual. They had doors. It took almost 2 decades to finally work remotely so that I could have an office with a door again, this time in my home. Bliss!


Is this really all that surprising? In school everyone always complains about group work sucking.


Group work has all the bad attributes:

- can't be fired

- reward is low

- members often have the exact same skillset.

- nobody has superior authority. How does one even make a decision where people disagree?

Teachers have no experience working in the real world and will often say "you will need to learn to work with bad people".

Meanwhile every successful person will tell you to fire bad people as soon as possible.

If you assign a bunch of low performers to Steve Jobs and tell him he can't fire them and have to give them work, even Steve Jobs would probably fail.


School group work doesn't have to be bad.

We had a semester-spanning 6-person group project during my software engineering study. Long story short, by the fourth and final semester project, I cooked up a recipe that solved some of these problems. I had re-invented having a manager.

Because I was good at coding, and by this semester this was known so people trusted me to not just idle around, I could propose on day 1 that I'd just be the 'manager' with no assigned share of the work. I'd be the solver of problems as they popped up, the person that goes around making sure nobody is stuck and everyone can work. And also that everyone does work, which turned out to be important because the notorious Joel was assigned to us and, indeed, he barely showed up and didn't do work at home either. After some conversations, going back-and-forth, and spending time on this, I managed to convince the teachers to officially kick him out after about three weeks (into a ~half-year semester). The rest of the semester, I spent most of the project time helping with problems that others were stuck on. (Not that I was a superbly amazing coder, but two know more than one and this really helped our pace.)

It's a real shame that was the last semester of that study because the setup worked like a dream. Two of the group members weren't so strong in coding and we were one person down with roughly the same workload (though for this, too, I had some time to spend on 'lobbying', and we got a slight reduction), but by juggling tasks correctly and sitting together when necessary, it ended up working really well. One guy in particular I remember wasn't too good at the job, but he would always work hard and, once unstuck, make good progress for the day. Great guy, always in good spirits, fun to work with, and I'd hire him today if he applied despite his mediocre skills. I think knowing his strengths and weaknesses (knowing what tasks to assign and doing some pair programming) is all you need to have a great colleague in him that cheers up five others and is still a net-positive in the amount of work done.


> a great colleague [who] cheers up five others and is still a net-positive in the amount of work done.

I wanted to clarify here that I mean he would be a net-positive in the amount of work done regardless, and in addition cheers up anyone who's around him. I have positive memories and not sure that really came through when speaking of 'net positive' which is a rather low bar to reach.


I think this is a good anecdote that reiterates the kind of “collaboration” the article is talking about, but is not the “real” collaboration people strive for.

The reason group projects in school felt so terrible was because the end goal was collaboration, not some higher outcome. Collaboration can unlock outcomes that can’t be achieved as an individual, but collaboration can’t be the desired end state.


> The reason group projects in school felt so terrible was because the end goal was collaboration, not some higher outcome

At least in my school the end goal was never collaboration but getting a good grade. Thus the individuals who cared did most, if not all, of the work.

Those group projects only made sense if you were paired with equally skilled and motivated people.

And of course you had to LIKE them. It's extremely hard to be cooperative with people you consider assholes.


I’m sure that was your end goal, but when someone assigns a collaborative project, it’s usually to teach students how to collaborate, while forgetting that just forcing a group to do the same task together is not the same thing as collaborating.

It is from this base that problems emerge, IMO.

Kind of like “synergy”, which is a real thing, but hard to define and difficult to artificially achieve.


> In school everyone always complains about group work sucking

I supervised software engineering projects at university. It works well when students in a group have similar motivation and abilities. This is the case when they are free to choose who they collaborate with. Otherwise, students are usually frustrated.

In a company, teams can be much more heterogeneous (cultural background, skills, age, social status, experience...), and the pressure is higher: goals can be loosely defined, evaluation is less fair, stakes are higher for everyone, managers can be less benevolent than teachers...


In my corporate software projects, a particular engineer owns the outcome. They have lines of communication with the managers of other assigned engineers, and considerable pull in their performance reviews and promotions.

The floor is that someone cares and the others will make what could be mistaken for a good-faith effort, which is lightyears better than a school project.


> managers can be less benevolent than teachers...

Depends on what benevolent means here.

It is much easier to address low performers in a company setting.


It is perhaps surprising to a management class that believes collaboration is the ultimate good in all circumstances and you can never have enough of it.


FTA:

"By combining sentiment data with de-identified calendar and email metadata, we found that those with the best of both worlds had five fewer hours in their workweek span, five fewer collaboration hours, three more focus hours, and 17 fewer employees in their internal network size."


So if they are looking at calendar data, does "five fewer collaboration hours" only mean five fewer hours in meetings?

Some of the funnest and most productive work I've done was a project with three others where we were all in a single office. We were completely ad-hoc with working alone, pair programming, having quick design discussions, whatever was needed at the time. There was an enormous amount of collaboration (and an enormous amount of deep, solo work), but there were no meetings in my calendar.

Meanwhile in pretty much any job I have had my happiness has gone down the more full my official calendar becomes.


It would be awfully useful to know the baseline from which the 5 & 17 reductions are measured.


TL;DR^, came here to post the same quote (they are citing the original authors here). The rest of the article is filler and speculation based on this statement.


I dumped my last job exactly because collaboration became the mantra.

I tried to effect change but leadership didn't listen and my team wanted to drink the collaboration koolaid and be good team members.

Glad there's research I can point to.


The structure of the experiment seems to have been just a correlation between collaborative hours and some measure of happiness. I wonder if the causation doesn't run the other way to what they propose though. When I'm happy in work, I don't really feel the need to have to work to grow my network outside my team. I have worked a job though where things just weren't clicking, so I did put more effort into trying to make them click by meeting people. That would have included getting involved in more projects, and doing a bit more collaborative work.


This.

Things aren't going so well at work? Work harder. Work more. Then burn out and quit, of course.


I reckon that it's not the collaboration itself that makes you unhappy; it's the contemporary methods used: constant interruptions over a multitude of communication tools, frequent brain-melting meetings with no obvious purpose, and a lack of solid leadership, leaving the team members to their own devices.


Reading the article, the word "collaborating" felt like it referred to meetings, based on the words used around the context of people's calendars and availability that week.

Having fewer meetings and more individual contributor time will make anyone happy. You can at least do the job with that time and not stress about whether you will get it done that week, given whatever artificial constraint that week has created based on the "collaboration" you were involved in.

People like determinism, no surprises here.


I think it should be healthy to be allowed to simply do your job if you are an engineer. If you are a good one you'll know when you need help and when you need to grind.

Instead you are constantly being micromanaged by people who think that having lots of meetings and gathering lots of metrics will increase productivity, but it's a silly idea when you think about it.

The only thing that gets work done is, well... doing work, and any amount of time you take away from that isn't productive 9 times out of 10.


What this article is dancing around is: Our employees are doing fuck all or just enough and we have managed to create an environment that actively encourages this behaviour.

You'd (not) be surprised how often this happens. $BIG_TECH hire people clever enough to look after themselves as well as their employer.

Now, what makes people happy? This doesn't:

https://hbr.org/2022/06/why-microsoft-measures-employee-thri...?


Some companies have a culture that favours meetings and cross-team builds over documentation and internal platforms. It might just be that someone doing more collaboration is working on more ill-defined problems, or is at a different part of the development cycle.


I've done some work adjacent to this. Not this project, not with microsoft. By internal network size they probably mean people you have a lot of traffic with in meetings and emails. Presuming they're competent they probably only look at interactions with small groups or 1:1. By collaboration they just mean meetings with coworkers.

Without having read it in detail, it's likely just drawing correlations between communication metrics and survey results.

Lots of companies are paralyzed by their meeting culture. It's not really a surprise to anyone. But the data can paint it a bit more clearly if people don't know quite how to articulate the problem. It's pretty much always the same in a large org.


Is there a link to the study anywhere? I'm curious what kind of "internal network" survives removing 17 people while being more productive.

Certainly some do. But it means very very different things if their network was closer to 20 people or 200+.


The ZDNet article links to the HBR article [0].

Which traces to this networking measurement article [1].

[0] https://hbr.org/2022/06/why-microsoft-measures-employee-thri...

[1] https://hbr.org/2020/08/can-you-be-too-well-connected


Both of those mention "network" without really saying what they consider a network, and without even an average number, as far as I can tell :/ they spend much more time talking about how many meetings and emails are involved. (And their meeting load sounds low! Dang, all of my managers have had like 30+ per week. Though I don't think I've ever sent as many emails as even their low end lists. Maybe they're just avoiding Teams...)


> and without even an average number, as far as can tell

Taking their analysis at face value, the average/median doesn't matter. This was a relative comparison for people that were (i) motivated; and (ii) had work-life balance.

"network" was a function of how many people you interacted with via email/meetings. It's not surprising that people with better work life balance had less interactions. It is slightly surprising (IMHO) that people with less interactions were also more motivated.


It matters how much of a reduction 17 implies. Scale-free numbers can be extremely misleading.

E.g. last half, I had about 350 separate people I DM'd. Per year I generally break 500. If you include the size of email groups I have sent messages to, I'm consistently over 1,000 most months, possibly over 2,000 (maybe even significantly more). Microsoft has around 180,000 employees, that number could trivially go much much higher if I worked there.

17 is nothing if that's the criteria. Probably just noise, and you'll likely get different results next year.

For people I would consider to be my network... 30-40? Maybe 50? They're the ones I actually interact with repeatedly on purpose. Reducing that by 17 is substantial, and I don't think I'd be as useful if I cut that size.


> Microsoft has around 180,000 employees, that number could trivially go much much higher if I worked there.

> Reducing that by 17 is substantial, and I don't think I'd be as useful if I cut that size.

IMHO, you're missing the point.

They're allegedly controlling for the work environment by sampling and comparing peer groups at the same firm. Regardless of whether you think 17 is the right number for your situation, it's remarkable that there's a dividing line of 17 between the two peer groups in the study. Note that the 17 doesn't necessarily preclude someone having an even bigger network size delta from the non-thriving/non-balanced employees at MSFT.


Edit: maybe a better way to phrase this is that, without knowing criteria and scale, there's really no way to know if this is surprising and possibly informative, or utter nonsense and it's hard to find anything conclusive because they're trying to hide their horrifying abuse of statistics. I see the latter happen far more often than the former, which is why significant claims require significant evidence. So far this seems weirdly vague.

---

What I mean is that, depending on the criteria, their average network size could be 30 or 50,000.

A reduction of 17 from 50,000 is only 0.03%. That's overwhelmingly likely to be statistical noise and nothing else. But given enough dimensions to slice the data across, that could be the single strongest correlation, so it's the one they claim. You see that kind of thing all the time in bad data science.
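To make the scale dependence concrete, here's a quick back-of-the-envelope calculation (the 40 and 50,000 baselines are made-up illustrations for the two interpretations above, not figures from the study):

```python
# Hypothetical illustration: the same absolute reduction of 17 contacts
# means very different things depending on the baseline network size.
def relative_reduction(baseline: int, delta: int = 17) -> float:
    """Return the reduction as a percentage of the baseline."""
    return 100 * delta / baseline

print(f"{relative_reduction(40):.1f}%")      # small, hand-picked network -> 42.5%
print(f"{relative_reduction(50_000):.3f}%")  # broad metadata-derived network -> 0.034%
```

A 42.5% cut of a deliberate network is a substantial change in how someone works; 0.034% of a metadata-derived one is indistinguishable from noise.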

The claim could fall anywhere between "shockingly strong correlation" and "utterly meaningless". So far I haven't been able to find anything that implies one way or the other.


It says "five fewer collaboration hours". It's not clear how many hours of collaboration we're talking about.

It’d be interesting to see the actual figures before drawing a conclusion.

On one hand, I can imagine that if you're spending all your time in meetings it would be frustrating; meanwhile, I've seen what happens when teams don't collaborate. It's not a pleasant experience; it just causes tension to build up.


They should research paying people at market rate vs. not (current state) to see what the difference is :)


I don't know if I would trust a scummy anti-customer company to know what works best for people.


There's no one answer to what is going to make people happy. Some people prefer being solo, some people prefer a close knit team, and every variation. The key for people leaders is to let your team do what works best for them.


> By combining sentiment data with de-identified calendar and email metadata

For some reason the idea of my company using my email and calendar for stats like this bugs me a lot more than the idea of just reading them directly.


If people said their real feelings, the results would be about 99.9% sex



Collaboration is a means to an end, not a goal in itself.


… and made Teams


Teams is weird as fuck.

It's relatively decent, but its chat is so shitty that a student could rewrite it in a week and it'd work better, simply because they wouldn't come up with the idea of pasting the text's font too.

I haven't found that to be useful over roughly two years of using Teams, but it creates problems very often.

Discord's chat is many times better and more reliable, because on Teams you sometimes cannot escape from a `` block


What a joke. Anything but compensation. For the right salary I would be happy with all this.


Pity it seems like they haven't researched what makes customers happy...


What does "five fewer hours in their workweek span" mean?


Yes.

Let me also ask what 17 fewer collaborators means. I work in a company with 7 employees, so I should work with negative ten people for optimum statistical happiness!

Without the base values, the article's advice is subject to speculation, but I guess we could start with how the average bigcorp in 200? operated and go from there. Problem is, I don't know what those values were, either.

What's also interesting to me is that working in a team of more than 17 people is (was?) normal, that seems like a lot of people to coordinate with (even at Deloitte, my direct coworkers (developers) were a group of 3 others, part of a larger team of maybe 40 (mostly pentesters) but they also all worked in smaller groups of 1-4). Daily stand-ups with 4 people were long enough for my taste.


Is it not zoom calls and pizza parties? /s


Yeah as long as I have a bunch of grunt work, unnecessary fire drills, canceled projects, 2 AM pages, and low pay I'm a happy camper.


I hope they mention money, because sufficient amount of money will make me very happy.


TL;DR: for Microsoft, it's "to be energized and empowered to do meaningful work."


Microsoft Outlook and spreadsheets are what every employer craves, obviously



