
You're underselling the problem, which is that they profit off your addiction. Time you could be spending creating art, exercising, or learning a useful skill is instead spent watching an endless stream of junk videos hand-picked for you by the Chinese government.


Addiction? I use it for around 30 minutes a day. And sometimes you need to kick back and passively graze content.

You know - after all that exercise, skill learning and art creating...


Yes, addiction, that's exactly what I had in mind. YouTube, TikTok, etc., train your brain to crave satisfaction in short bursts and then move on to something else. Even the "smart intellectual geeky" content is consumed for 20 minutes, you get satisfied, and then move on with your day/media consumption, and never make the effort to deepen your understanding.

I stopped using that word on HN because whenever I suggest that "screens are addictive" I get downvoted (and it's not so much that it bothers me to lose imaginary points, it's that the comment then gets hidden and people don't get to see it, at which point I'm just yelling into the vacuum).


The average engineer at Boeing makes $120k/year. That's about $50k less than what a new grad with no experience will get from big tech.

Boeing doesn't have a culture problem, they have an idiot problem. The idea that you can hire competent engineers offering salaries like that is absurd.

They need to adopt a pay for performance mentality and bring in managers who are not afraid to fire underperformers.


Just where is an inexperienced new grad making $170k out of the gate? I find this difficult to believe. Are you normalizing for cost of living? I suspect most Boeing employees aren't based in the Valley.


A major Boeing campus is in Huntsville, AL, which is going to affect that average for sure.


> errors and unsafe acts will not be punished if the error was unintentional.

No sane organization would ever implement this. If someone repeatedly makes mistakes, they're going to get fired even if the mistakes are unintentional. Anything else is going to cause more safety issues in the long-term as inadequate employees are allowed to proliferate.


This is just blameless post-mortems, and many, many places implement them.

There is always going to be some level of "inadequate" employees, and also perfectly adequate employees who sometimes make mistakes, in any organization. If your organization requires that no employee ever makes a mistake in order to operate safely, then you have serious problems.

The purpose of a statement like that is that you don't just have a post-mortem that is like: "Our company went off the internet because an employee had a typo in a host name. We fired the employee and the problem is solved." When in reality the problem is that you had a system that allowed a typo to go all the way into production.


It's like that story of the pilot who, after his refueling technician almost caused a crash by using the wrong fuel, insisted that he always have that technician because they'd never make that mistake again.


That was the late, and definitely great, R.A. "Bob" Hoover; I am proud to have shared a beer with him at Oshkosh. His Shrike Commander was misfueled with jet fuel instead of avgas because it was mistaken for the larger turboprop model. Rather than blaming the individual refueler, he recognized that there was a systemic problem and developed an engineering solution: he proposed, and the industry adopted, a mutually incompatible standard of fuel nozzles/receptacles for jet fuel and avgas. You can find some great YouTube material on him, or the film "Flying the Feathered Edge".

https://sierrahotel.net/blogs/news/a-life-lesson

https://en.wikipedia.org/wiki/Bob_Hoover#Hoover_nozzle_and_H...

https://www.imdb.com/title/tt2334694/


Here's an old-timey video of Bob in his prime. At 8:55 he flies a barrel roll with one hand while pouring himself a glass of iced tea with the other. The hardest part was pouring the tea backhanded so the camera had a good view. Then he finishes with his trademark no-engine loop, roll, and landing.

https://www.youtube.com/watch?v=PT1kVmqmvHU&t=510s


The question is what you do with the technician after the 2nd mistake. That is to say, when does this logic break down?


That's not really the question:

Punishment culture assumes people naturally do bad, lazy things unless they are deterred by punishment and fear. Therefore we must punish mistakes.

That perspective has long been debunked. You don't see competent, skilled leaders using it. It turns out that generally people want to do well (just like you do), and they don't when they are scared / activated (in fight/flight/freeze mode), poorly trained, poorly supported, or poorly led. They excel when they feel safe and supported.

If you are the manager and the technician makes the same mistake the 2nd or 3rd time, you will find the problem the next morning in your bathroom mirror. :) At best, you have put them in a position to fail without the proper training or support. Leadership might also be an issue.


I would say that every skilled leader must use punishments and consequences to some degree.

If your tech gets drunk every day and doesn't do their job, you need to cut them loose. This isn't a management problem.

Sometimes people end up in positions where they are not suited and will continue to fail. If you hired a plumber and you need a doctor, that isn't an on-the-job training, support, or leadership issue.


> you need to cut them loose. This isn't a management problem.

That is 100% a management problem.

> Sometimes people end up in positions

I wonder how they got in those positions? That sounds like a management problem too.


It isn't always management's job to make the person work out in the role. Sometimes it is management's job to fire that person and find someone better.

Some people are bad fits for positions. They might look good on paper, they might be trying something new, they might lie to get hired, they might change after starting, they might have been a risky hire, or any number of reasons.


I think you're envisioning people all being absolutists who follow an exacting rule book and can't consider context. (that's covered by the *flexibility* tentpole)

As N approaches infinity, there's definitely a value of N at which we discover the root cause is the airman and have to move on from him. I don't think it's particularly interesting to try to identify a constant value for N because it's highly situational, and we know we have to do *just* and *reporting* as well, the reporting falls out when the just does.


You hit the nail on the head. I do perceive a lot of people being "no bad employee" absolutists.

All I am looking for is recognition that the value of N matters.

It is part of what I see as a broader phenomenon where people emphasize systems and ignore agents. In reality, agents shape systems and systems shape agents in continuous feedback.


If you implemented some changes so the mistake is caught before disastrous consequences, you're already doing better. Well enough to let the 2nd one slide. Even the 3rd. After that, action seems reasonable. It's no longer a mistake, it's a pattern of faulty behavior.


That is a big IF. At some point it comes down to the error type, and whether it is a reasonable/honest mistake.

The situation is very different if the fuel cans are hard to distinguish vs if the tech is lazy and falsifying their checklist.

Underlying any safety culture is a culture of integrity. No safety culture can tolerate a culture of apathy and indifference.


I expect there's precisely 1 safety culture that can tolerate a culture of apathy and indifference -- one in which no work is ever completed (without infinite headcount).

You apply risk mitigation and work verification to resolve safety issues.

Then you recursively repeat that to account for ineffective performance of the previous level of verification.

Ergo, end productivity per employee is directly proportional to integrity, as it allows you to relax that inefficient infinite (re-)verification.


Exactly! All this talk about man vs system misses the point that man is the system designer, operator, and component.

This is why Boeing can't just solve their situation with more process checks. From the reporting, they are already drowning in redundant quality systems and complexity. What failed was the human element.

Someone was gaming the system by saying that the doors weren't "technically" removed because there was a shoelace (or whatever) holding them in place, quality assurance was asleep at the wheel, and management was rewarding those behaviors.

Plenty of blame to go around.


Redesign the system again if it's unintentional. It is almost impossible to control humans to the degree that they never make mistakes. It's far better to design a system in which mistakes are categorically impossible.


I'm trying to push back on the knee jerk sentiment that there are no bad employees, only bad systems.

There are no systems that are human proof, and what kind of human behavior is tolerated is a characteristic of the system.

In fact, there are humans who lie, cheat, are apathetic, or are incompetent. Part of a good system is not only to mitigate, but to actively weed these people out.

For example, if someone falsifies the inspection checklist for your plane, you don't just give them a PIP.


> I'm trying to push back on the knee jerk sentiment that there are no bad employees, only bad systems.

Why is it important to you?


Because I'm an engineer in a quality-controlled field (medicine), and my personal experience is that firms place too much faith in quality systems and not enough emphasis on quality employees.

I see lots of engineers and QA following elaborate procedures with hundreds of checks, but not bothering to even read what they sign off on, so they can go golf all day.

People seem to think that you can engineer some process flow to prevent every error, but every process is garbage if the humans don't care or don't know what they are doing.

Every process is also garbage if you don't hire workers with the right skills demanded by that process. In an effort to drive down costs, lots of companies try to make up for talent with process, with poor results for both the companies and patients. You can't replace a brain surgeon with 2 plumbers and twice the instructions.


Very interesting, I'm glad I asked.

Similarly, I read some head of a leading engineering organization (I think a NASA head or maybe Admiral Rickover) who said, essentially, 'you can't replace ability with process'. All the process in the world, they said, will not substitute for highly able personnel.

But perhaps safety, not usually dependent on ability, is a different matter. Possibly, the problems you describe are a matter of leadership and management - which doesn't undermine your point; those also are things that can't, past a certain irreducible point, be replaced with process.


>Possibly, the problems you describe are a matter of leadership and management

I wholeheartedly agree that leadership/management is part of the problem. My main objection is the "no bad employee" rhetoric. Sometimes the problem with management is that they aren't getting rid of bad employees. Rot can start anywhere in an organization, and the rest of the org really needs to push back, not just management.

It actually reminds me a lot of the culture/discipline problems with some police departments in the US. It is hard to enforce and cultivate organizational culture top-down. Most of it is maintained peer-to-peer.


I guess it seems like that argument takes the discussion to an extreme. Does anyone actually advocate never firing employees? That there are literally no bad employees?

> It is hard to enforce and cultivate organizational culture top-down. Most of it is maintained peer-to-peer.

I think it's a combination. The leader has a large influence; they set the standards and the norms. At the same time, I agree with what you say about peers - perhaps peers spread and 'enforce' those norms. It may also depend on the size and age of the organization.


Falsifying the inspection checklist is not an honest mistake.


Yes, there are obviously bad employees, but the line for an actually incompetent/malicious employee is a lot further away than most people understand.

A lot of bad management is hand-waved as crappy employees (by management - shocking!)


I think that scales very much with the complexity of the task.

If you are talking about someone who can't serve coffee, the balance is clearly in favor of poor management over inadequate skills and trainability.

If you are talking about very specialized skills like aerospace engineering, I think the balance can move further in the other direction.

There is also the combination of the two, where in the interests of growth or cost savings, an organization has cut corners on the quality of talent hired.


I think that this anecdote [0] is appropriate for showing the glaring disconnects that can exist in the human<-->system symbiosis.

[0]: https://www.controlinmotion.com/news/news-archive/a-little-h...


It's seemingly simple: "oh, the technician keeps messing up."

Did the technician mess up (sometimes true), or were they doing their job in good faith - was it the system/protocol/organization that made the task mistake prone? Did someone else actually mess up but the situation made it look like it's the technician's fault? Does this technician do a task/service that is failure prone? Are there other technicians on other tasks that are far less failure prone? Here the former technician would seem poor, the latter, excellent, but it's a function of the task/role and not the person.

I've been "the technician" - I catch a lot of blame because people know I'm anti-blame-culture, so I'd rather take the blame on myself than point my finger at the next guy in line. I'm also willing to take on high-risk tasks for the greater good even if they suck and are blame-prone / risky. I believe in team culture in this way. If the organization doesn't respect that belief and throws me under the bus, I leave - which is quite punishing for them, since they remain completely unaware of a major internal problem. If an organization "sees me" and my philosophy, then together we get very, very good at optimizing the system to minimize the likelihood of failures / mistakes.


Well, certainly not after the first time, at least.

Imo it's a function of time, company and team culture, severity, and role guidelines.

If an employee makes a mistake but followed process, and no process change occurred, that's just the cost of doing business imo, and it could happen an unbounded number of times so long as it's in good faith from the employee.


My point is that good faith and sufficient competence are crucial. If the employee didn't care if the plane crashed, they are a bad fit.

If they can't read the refueling checklist, they are a bad fit.

Ideally you have system controls to screen and weed these people out too.


> a function of ... severity

Not severity; that sort of thinking is actually part of low-safety cultures. A highly safe culture requires the insight that people don't behave differently based on outcome. In fact, most people can't assess the severity of their work (this is by design; for example someone with access to the full picture makes the decisions so that technicians don't have to). So they couldn't behave differently even if they did somehow make better decisions when it matters.

But, and I'll reiterate the point for emphasis, people make all their decisions using the same brain. It is like bugs; any code can be buggy. Code doesn't get less buggy because it is important code. It gets less buggy because it is tested, formally verified, battle scarred, well specified and doesn't change often.


Would s/severity/impact/g also be counterproductive to safety culture? Genuinely trying to learn here; gotta be responsible/accountable and all.

Maybe impact relative to carelessness/aloof-ity?

I agree that an engineer/person will not behave differently based on outcomes, but if they know in advance that something can have a wide, destructive blast radius if some procedure is not followed, I feel there's a bit more culpability on the part of the engineer. Regardless, I don't feel I have a sufficient grasp on the concept I'm trying to define, so agreed: I shouldn't have included 'severity' in the function definition, nor any alternative candidate.


You take him into a boolean tree with another employee for quality and put him on an improvement plan?


Maybe. Or maybe you turn them over to the authorities, because the 2nd time their lazy and reckless disregard killed several people.


Exactly. https://asteriskmag.com/issues/05/why-you-ve-never-been-in-a... is a great article illustrating this in the airline industry itself.


> When in reality the problem is that you had a system that allowed a typo to go all the way into production.

That's a typical root cause, and is exactly what should come out of good post-mortems.

But human nature is human nature...


Just culture doesn't prevent you from firing someone who makes repeated mistakes.

In fact, Just Culture in itself provides the justification for this. As the next line says, "However, those who act recklessly or take deliberate and unjustifiable risks will still be subject to disciplinary action". A person who repeatedly makes mistakes is an unjustifiable risk.


When a punishment is applied with more deliberation, it can also be more severe.


Why is severity desirable? Or if it's not desirable, so what?


Severity is desirable iff it's justified. I wouldn't ever sign off on a policy that says "you'll be fired for a single mistake" (that would be a severity of punishment out of proportion to the risk/underperformance).

But a policy that never provided for the possibility of termination (insufficient maximum severity) is also not desirable.


> Severity is desirable iff it's justified.

It's necessary if it's (necessary & efficient & justified); it's never desirable IMHO.

Doing severe things because they are justified is just acting out on a desire or drive - internal anger - but now we can 'justify' the target and feel ok about it. Lynch mobs think they are justified.


Designing severe things to be included as part of a process is a desirable property of that system if the severe thing is sometimes required.

No one is designing a formal system that includes lynch mobs. But a formal system of repercussions for employee behavior that does not include firing is an incomplete system.

It’s not that firing itself is ever desirable, but rather that its inclusion in a disciplinary progression is desirable.


You can really dumb it down to: why didn't you follow the checklist? If someone makes the same mistake after being corrected three times, and the proper procedures exist for the worker to follow, then the safety culture provides the structure and justification for their dismissal.


No, you really need to smarten it up, and start off by making sure that your checklist is correct. Is it the correct checklist for the airplane model that you are building? Are all the right items on the checklist? Are they being done in the correct order? Do you have the correct validation/verification steps in your checklist? Does your checklist include all the parts that will need to be replaced? If the mechanic finds a quality issue while working the checklist and a job needs to be re-done, which checklists then need to be re-done? What other jobs are impacted by the rework?

All indications here (from the NTSB prelim and the widely reported whistleblower account) are that during rework for a minor manufacturing discrepancy, the mechanics on the shop floor followed bad manufacturing planning / engineering instructions to the letter, then the ball was dropped in error handling when the engineering instructions did not match the airplane configuration, because Boeing was using two different systems of record for error handling that did not communicate with each other except through manual coordination.

That's not the fault of the front-line assembly worker not following a checklist.


I agree with you. If the systems/procedures/checklists are bad it is not the fault of a front line worker.

I thought I was replying more to a parent comment about the inability to let people go who repeatedly make mistakes - which is acceptable unless they are not following procedures.


That's quite a leap from "unintentional" to "repeatedly."


Not at all: Systemic problems will result in repeated errors until the system is changed.


Ideally, as a result of the post-mortem, the same mistake shouldn't even be repeatable, because mechanisms should be introduced to prevent it.

And if someone keeps making new original mistakes, revealing vulnerabilities in your processes, I would say that it is a very valuable employee, a lucky pen-tester of sorts.


I once destroyed $10k worth of aerospace equipment. I admitted it immediately and my only reprimand was that my boss asked me if I learned my lesson. (I did)


I once destroyed an industrial manufacturing site with an unfinished robot program that ran because I allowed myself to be distracted mid-alteration.


And what happened?


Who do you think came up with this rule, bleeding-heart liberals? Stop and think for a second: why does that rule exist?

You described a fantasy world. In the real world everyone makes mistakes, and if the mistakes are punished, then there are no mistakes because no one reports them. That is, until a mistake is so catastrophic it cannot be covered up - that's how you get Chernobyl or the Boeing MAX.


The Boeing MAX (if you mean the crashes caused by MCAS) wasn't due to a "mistake" not being reported; it was deliberate and intentional on the part of company management. The system was designed badly and without redundancy, and without any information available to the pilots about its very existence, specifically because management wanted it that way. It wasn't caused by some kind of accident.


Every sane organization implements this. Failure to do so leads to fear of reporting mistakes, and you get Boeing. This isn't news.


If it's possible for an employee to unintentionally make the same mistake twice, that's purely management's failure. It's impossible to make systems completely fool proof, but once you know of a specific deficiency in your process you fix it. If you've corrected the issue, it should take deliberate effort for someone to do it again. An organization that knows its processes are deficient but makes no changes and expects a different result is insane.


I think the wording is clumsy, but this is analogous to no-blame processes. The wording is just accounting for the possibility of wantonly malicious or recklessly negligent work quality. Think someone either sabotaging the product, or showing up to work very high or drunk.


This.

A mistake like "accidentally turning the machine off when it shouldn't be" is a fixable problem.

If someone has attitude like "fuck the checklist, I know better", it is not really a mistake, and that person should be rightfully fired or at least moved to a position where they cannot do any harm.


Wowwww never become a manager please.


[flagged]


What a reductive attitude. If your planes fall from the sky because a single employee is negligent, the one to blame isn't the worker, the union, or the state. It's your business that has key deficiencies in implementing safety.

Your QC process has to be able to catch these things. It’s not like you can avoid any reliability in your process and just lean back and be like “don’t worry bro, it’ll fly, we hired all non-union workers”


This also sounds like someone in the New Boeing management chain. A lot of these culture problems were already evident during the 787 program, which is when MD's mismanagement started to fester, and when the hard-on for union busting first came to my attention.


What a reductive attitude. Nah, not just one negligent worker. Many, including QC staff.


> The one to blame isn't the worker, the union, or the state. It's your business that has key deficiencies in implementing safety.

I don't think you're charitably reading parent.

Sometimes, the employee is the problem. And sometimes, union contracts preclude changing processes (or make change unnecessarily burdensome) to improve outcomes.

Introducing another party into agreements doesn't come free.

I'm as pro-union as anyone around here... but it is another level of bureaucracy. And sometimes, protecting "our people" can include not letting go people who should be let go (e.g. decreasing headcount in assembly to hire more professional QA people).

That said, solutions do need to come from all levels, and I think unions do provide a necessary counter-pressure to management deciding only employees need to bear the brunt of change.


> where certain difficult to fire union parasites have a stranglehold

> where such fatally negligent parasitic workers can be easily shit canned

How charitable of a read does one need to give a comment that uses pejorative and hyperbolic language without providing any substantive refutation?


Forums improve through ignoring hyperbolic language, not responding to it.


Forums improve by rejecting rude and deceptive language - as Gen. Morrison said, the standard you walk past is the standard you accept.


Agree to disagree. IMHO, there are more people who thrive on internet conflict than thrive on being ignored.

It takes self control, but walking past the problem is sometimes the more effective solution.

"Man, that asshole on the Internet disagreed with me. I'm going to write a detailed rebuttal of why they're the asshole."

vs.

"Hunh, I wonder why I occasionally get down voted and no one ever responds to me? And also, the responses I do get are a lot nicer in tone than mine..."


You’re right that there is a challenge about giving attention when that’s what they’re seeking. I generally prefer either flagging (worse cases) or up-voting the one person who left a “dude, not cool” reply so they get clear community feedback.


Perhaps the worker is at fault, but in (good) aerospace companies one person's mistake cannot lead to a bad unit going out the door. At the very least, two people need to be wrong in the same way - but more likely, several people would need to miss the problem.

That being said, "operator error" as a reason for a problem is heavily discouraged (at least in official reports). Rather, it's typical to blame a human factors issue, commonly one of the "dirty dozen" [0]. I couldn't say exactly which ones are present at Boeing but frankly, from the public reports almost all of them sound applicable. With this in mind, process change is the only appropriate remedy, regardless of what the union thinks. I've never heard any union people get mad at something that improves their job though.

[0] https://skybrary.aero/articles/human-factors-dirty-dozen


I wouldn't put it past Boeing management to try to move more production to South Carolina. In fact, they put the 787 there thinking that the 787 would be the future of Boeing. It isn't, and it can't be, because of the problems in developing and building the 787. So their move to an anti-union state ended up being a dead end.

In the 787 project they pushed outsourcing of design to their suppliers so they could reduce dependence on their own engineers. That was a disaster. They spun out Spirit Aerosystems to fragment their unionized manufacturing workers. The same Jack Welch influence via Harry Stonecipher sets the management tone even today. It's the wrong culture.

They need to move headquarters and production of their next plane back to the Pacific Northwest, and they need to develop the next plane not in the way that they developed the 787, which was designed to be a thumb in the eye of the union and their engineers. It's a cultural problem. Look at the word "cultural" throughout the FAA report. The culture that Boeing had doesn't work, and they need to change it root and branch.


Yeah because the QC of the Carolina plant has been sooooo great.

Airlines are trying to get ahold of the planes built in Everett because the SC plant planes have quality control issues still, ten years in.

Pull the other one.


Can you point to where a union employee was a root cause of the Max debacle? Because as I read it, most of the mistakes were of a design nature (i.e., mistakes of white-collar employees). Your comment reads like you started with a conclusion and worked backwards.


Look up SPEEA


The existence of a labor union does not make your point. You point to an industry group; I'm asking if there is a specific instance in the Boeing 737Max mishaps that is attributed to a union-member disregarding generally good safety practices by the very nature of being defended by the union. I'm not aware of one, but the opposite is true: white-collar decisions can be shown to be direct contributors to the problem(s), and I don't think they have anything to do with union membership.

E.g., Boeing did not 1) classify MCAS correctly in their hazard analysis, and 2) even with their mischaracterized risk in the HA, they did not follow their own procedures to have redundant sensor readings mandatory for the equipment as classified. Those are designer decisions, not some labor-union issue. To my knowledge, those held responsible were relatively high level engineers, implying they were not being protected by the union for their decisions.


Wait till you hear how "difficult to fire" it is at Airbus assembly locations in Germany, Spain and France, and how unionised the workforces of these countries are... and yet Airbus doesn't ostensibly have a "fatally negligent parasitic workers" problem having an impact on its airliner safety. How come?


It could be that worker organization in Europe is mutually cooperative rather than antagonistic, as often found in the USA.


Furthering the insinuation that everyone has the right to work every job. Sometimes people suck at their job.


As your sibling comments mentioned, there's a difference between giving a chance for someone to learn from a single mistake without punishment, and allowing them to make the same mistake twice without taking matters out of their hands after.

If it's a really critical role, the training will have realistic enough simulation for them to make countless mistakes before they leave the training environment. Then you can assess their level of risk safely.


This whole thread is missing the fact that the NTSB had a theory that transparency leads to safer airplanes, they tried it, and it works. People hesitate to self-report when it comes with punishment (fines, demotions, or just loss of face among peers). You need a formal “safe space” where early reporting is rewarded and late reporting is discouraged.

Safety is a lot about trust, and there is more than one kind of trust. At a minimum: are you capable of doing this thing I need you to do? Will you do this thing I need you to do?


It's not just the NTSB, it's part of things like the Toyota Production System. There's ample evidence to show both that punishment discourages safety and that lack of punishment encourages safety, across multiple industries.


Yes, this is cross-industry best practice.

Goodhart's law also applies: as in the case of the door bolts, Spirit intentionally bypassed safety controls to meet performance metrics.

The Mars Climate Orbiter is another example. While unit conversion was the scapegoat, the real cause of the crash is that when people noticed that there was a problem they were dismissed.

The Andon cord from the Toyota Production System wasn't present due to culture problems.

Same thing with impact scores in software reducing quality and customer value.

If you intentionally, or through metrics, incentivize cutting corners, it will come at the cost of quality and safety.

I am glad they called out the culture problem here. This is not something that is fixable with more controls; it requires cultural changes.


> The Mars Climate Orbiter is another example. While unit conversion was the scapegoat, the real cause of the crash is that when people noticed that there was a problem they were dismissed.

Challenger too. Multiple engineers warned them about the O-rings. They weren't just ignored, but were openly mocked by the NASA leadership. (https://allthatsinteresting.com/space-shuttle-challenger-dis...)

A decade later a senior engineer at NASA warned about a piece of foam striking Space Shuttle Columbia and requested they use existing military satellites to check for damage. She was ignored by NASA leadership, and following (coincidentally) a report by Boeing concluding nothing was wrong, another 7 people were killed by a piss-poor safety culture. (https://abcnews.go.com/Technology/story?id=97600&page=1)


But but but what about my intuition and gotcha questions about how this could never work in practice?


The dirty secret of why traffic circles reduce accidents? Stoplights feel safer than they actually are, while circles feel more dangerous than they actually are. That nervousness becomes vigilance, which reduces accidents. It’s also why people intuitively hate them. They’re actually right, but also wrong.

Feeling safe is an illusion that governments try to maintain for their people. It’s one of their biggest jobs. But the illusion has dimensions and it’s hard to keep several going at once.


I think there is more nuance to it than that. Not everything is a mistake, not every mistake is recoverable, and not all skills are trainable.

The fundamental goal is to distinguish between recoverable errors and those that are indicative of poor employee-role fit.


Mistakes are the problem, as they will always happen.

The point is to build a culture where you value teamwork and adjust and learn from failures.

This isn't an individual team problem, this is an organization problem.

It is impossible to hire infallible, all knowing employees.

But it is quite possible to enable communication and to learn from past mistakes.

When you silence employees due to a fear of retribution, bad things happen.

People need to feel safe with calling out the systemic problems that led to a failure. If that ends up being the wrong mixture of skills on a team or bad communication within a team that is different.

Everything in this report was a mistake, and not due to gross incompetence from a single person.

The E door bolts, as an example, were directly attributed to metrics that punished people if they didn't bypass review. The delivery timelines and defect rates were what management placed value on over quality and safety.

Consider the prisoner's dilemma, which is resolved by communication, not by choosing a better partner.


I don't disagree with what you said about this instance, but I'm trying to push back on the knee jerk sentiment that there are no bad employees, only bad systems. There are both, and cultures that are too permissive of bad actors degrade the system.

Part of maintaining quality culture is maintaining red lines around integrity.

Like I said above, not all errors are recoverable or honest mistakes.

I work in medicine and a classic example would be falsifying data. That should always be a red line, not a learning opportunity. You can add QA and systemic controls, but without integrity, they are meaningless. I have seen places with a culture of indifference, where QA is checked out and doesn't do its job either.


> I work in medicine and a classic example would be falsifying data

Certainly nobody has ever thought about that before. In fact, there definitely isn't a second sentence in the definition of aviation's just culture that is being completely ignored in favour of weird devil's advocacy.

> 4) Just Culture- errors and unsafe acts will not be punished if the error was unintentional. However, those who act recklessly or take deliberate and unjustifiable risks will still be subject to disciplinary action.

Oh wait.


I have no problem with the stated safety culture.

I simply agree that "that everyone has the right to work every job" is not a reasonable interpretation of them.

As stated above, a reasonable reader should understand:

> Not everything is a mistake, not every mistake is recoverable, and not all skills are trainable. The fundamental goal is to distinguish between recoverable errors and those that are indicative of poor employee-role fit.


Who is claiming that "everyone has the right to work every job", though? The only person to even bring up the sentence is someone who's handwringing about an interpretation that nobody was making to begin with.

This is why I called it weird devil's advocacy, because what exactly is the point of jumping to caution people about something they aren't doing?


>Who is claiming that "everyone has the right to work every job", though? The only person to even bring up the sentence is someone who's handwringing about an interpretation that nobody was making to begin with.

That's the parent in the thread we are posting in. User Error-Logic replied, and I built upon their reply, adding that:

>goal is to distinguish between recoverable errors and those that are indicative of poor employee-role fit.

You and others wanted to dive further.


This is not a gotcha question. It's a smoke test: something that should be really easy to get right.

This is like building a calculator that can't get 1+1 correct, and then complaining nobody would ever need a calculator to calculate 1+1 so therefore it's an unfair question.


> Google doesn't want to offend or be racist

This is where you're wrong. A non-trivial percentage of Google's workforce does want to be racist and does want to offend.

And despite the fact that this group is a minority in the company, their lack of scruples allows them to have substantial power in setting corporate policy. Since they're willing to play dirty, they get their way more often than the employees who play by the rules.


We didn't allow them to do anything. They're just not accountable to us.

If you've been following Harvard's anti-Semitism drama over the last 4 months, it appears they're not really accountable to anybody. Neither US Congress nor their wealthiest donors have been able to force action from them.


It's not at all obvious to me that (or why) Yale or Harvard ought to be accountable to us. They're private universities and, as far as I know, they appear to be following the laws that they're subject to. (Following the law is a form of accountability, but a very weak form.)

If they want to suddenly condition admissions on a hash function of the applicant's name, I think that would be absurd, but I don't think I ought to have any say in that matter.


But Harvard's president was forced to resign?


Due to a citation scandal, which wasn't as big a deal as it was presented in context.

The wrong reason to go…


A better reason to go was that her research was crap based on bad statistics.

Things like the citation scandal are a signal for the kind of problem that led to that.


If the streaming service isn't even popular enough to refer to it by name, it has not dethroned Netflix.


>isn't even popular enough to refer to it by name,

The page title is "How Showmax, an African streaming service, dethroned Netflix", and the name of the service is also the very first word of the subtitle on the page. It is also referred to by name several other times throughout the article.

>it has not dethroned Netflix

If you go by subscriber count, which is a bit more robust than your criteria, it has.


No, the page title is "How an African streaming service dethroned Netflix." And that was also the title posted here originally.

Also, Showmax most definitely does not have more subscribers than Netflix. That is only true if you limit to a specific geographic region. But if you don't state as much in the title, you are lying by omission.


Open up the page, then mouse over the tab to read the page title. The page title is "How Showmax, an African streaming service, dethroned Netflix - Rest of World".

You can also press F12, and in the <head> section, read what is written in between <title> and </title>, which is what I quoted above.


Showmax has displaced Netflix to become the most popular streaming platform in Africa


That's a much better title than what was posted originally. You should apply for a job as an editor.


Personally I find it depressing. He could have retired years ago to work on a passion project. Why is he still dealing with the soul-sucking internal politics of a mega-corp? Unless he somehow enjoys walking on egg shells all day, it doesn't make sense to me.


Have you considered that maybe this is his passion project?


I think, as you alluded to, this may be more of a personality thing, e.g. an empath might see that sort of thing as a great opportunity.


The reason for the negative view of HN is that the community skews way farther left than the general public.

Case in point: this story comes from a website dedicated to promoting socialism and not one single person here is even questioning the credibility of the source.


I don't understand the secrecy about firing someone. If I were an employer, I'd want my remaining employees to know that what the fired person did was unacceptable.


Companies do not want to open themselves up to liability, so they usually go for the blandest possible description, even laying off and paying unemployment for someone who should be fired for cause.

