Hacker News | new | past | comments | ask | show | jobs | submit | ret2plt's comments | login

You couldn't own slaves in London in 1845, and in any case the name derives from the "Master of the household", so if you want to be mad about it, you should call it sexist, not racist. Or you could just be chill, stretch the meaning a bit and say the couple together are the masters of the household. But, now I'm curious: Where do you draw the line? You don't like git master branches and master bedrooms, but what about other uses? You can have a master key, master record, master a skill, create a masterwork, be a master to an apprentice, join the toastmasters, be a master of ceremonies at a formal event, you can dress up for comic con as Master Yoda, Master Chief, or Dumbledore (the Headmaster of Hogwarts), you can be a Master Chief in the US Navy, be the dungeon master for a game of D&D, get a Master's degree and so on. Which of these things are in your opinion bad and should be renamed?


This is very much like asking why are you focused on fixing one bug at a time in your software when you can fix every reported bug simultaneously?

I don't know man, maybe it's because fixing this one completely inconsequential bug faces so much backlash for no particular reason other than "change bad"?

And well done with using an example from a book series where the only Asian character is named Cho Chang and where there are elves with long noses in charge of the "central bank". That really works in your favour, you totally owned me [pun intended] with that one!


Good job finding something to complain about in one of the 13 examples I listed. This unassailable refutation utterly destroys my whole argument :(


PS: about the role in git: I don't feel that strongly about it, but I think master is somewhat more descriptive. The master branch contains the most up-to-date version of the source code, so e.g. if I'm working on a feature branch and a colleague pushes a bug fix that affects me into master, I need to merge/rebase to get the latest changes into my feature branch. So, while the master branch doesn't "rule" the feature branches, there is still the implication that changes to master should find their way into the feature branches at some point, which I think main doesn't convey as clearly.
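The workflow described above can be sketched in a throwaway repo. This is a minimal illustration, not anything from the thread; the branch and file names are made up, and `git init -b master` assumes git 2.28 or newer:

```shell
#!/bin/sh
# Sketch: a bug fix lands on master while feature work is in progress,
# and the feature branch picks it up via rebase.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q -b master            # requires git >= 2.28 for -b
git config user.email dev@example.com
git config user.name dev

echo base > app.txt
git add app.txt && git commit -qm "initial commit"

# Start a feature branch and do some work on it.
git checkout -q -b feature
echo feature > feature.txt
git add feature.txt && git commit -qm "feature work"

# Meanwhile, a colleague pushes a bug fix to master.
git checkout -q master
echo fix > fix.txt
git add fix.txt && git commit -qm "bug fix on master"

# Bring the fix into the feature branch: replay feature commits on top.
git checkout -q feature
git rebase -q master

ls    # the feature branch now contains both feature.txt and fix.txt
```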


Language is in constant flux. If a word has a remote historical connection to master/slave in a precursor project people nowadays don't even know, and people invent a "folk etymology" comparing it to a master record, do you really gain anything worthwhile from insisting on the history?


I grant it's not nothing, but I think it's not enough of something to make changes over it. Thinking of a master record or similar is the natural reaction when you learn about the terminology, and most young people have never used BitKeeper, so unless you go out of your way to explain why this is "bad" most people won't even know, so what do you gain from it?


Eh, that's some interesting historical trivia, but I don't see how that matters tbh. If everybody is fine with the word because they (quite reasonably) assume it was inspired by master record or something similar, and you bring up some 25 year old history (20 years when the debate got started) about not even git but a precursor, does that really help with anything? I am sure there are other innocuous seeming words that have a dark etymology, should we go search for them so we can update the language?


The question was "How did Scrum Master escape this treatment?" - I think I answered that question accurately.


> If everybody is fine with the word

Some people weren't evidently.


It's worse than that. The problem is that being truly rational is hard, unpleasant work that few people want to do. If you read an article that makes your political opponents look bad, you can't just feel smugly superior, you have to take into account that you are predisposed to believe convenient sounding things, so you have to put extra effort into checking the truth of that claim. If you follow the evidence instead of tribal consensus, you will probably end up with some beliefs that your friends and relatives won't like, etc.


> This is often seen in the form of very smart people also believing conspiracy theories or throwing their hands up around other massive issues. As an example, the "Rationalist crowd" has de-emphasized work on climate change mitigation in favor of more abstract work on AI safety.

To be clear, the argument (in rationalist circles) is not that climate change is no big deal, it's that there's already a ton of people worrying about it, so it is better to allocate some extra resources to underfunded problems.


I think it depends on how you frame it. If the Linux Foundation thinks this kind of research would generate useful information for the kernel project, then the developers' time wouldn't be wasted, just used in a different, yet productive, way. I concede that this is not an easy question, because the developers may have different opinions about the usefulness of this exercise, but at the end of the day, maintainers can run their projects how they see fit.


> You do not experiment on people without their consent. This is in fact the very FIRST point of the Nuremberg code:

> 1. The voluntary consent of the human subject is absolutely essential.

The Nuremberg code is explicitly about medical research, so it doesn't apply here. More generally, I think that the magnitude of the intervention is also relevant, and that an absolutist demand for informed consent in all - including the most trivial - cases is quite silly.

Now, in this specific case I would agree that wasting people's time is an intervention that's big enough to warrant some scrutiny, but the black-and-white way of some people to phrase this really irks me.

PS: I think people in these kinds of debate tend to talk past one another, so let me try to illustrate where I'm coming from with an experiment I came across recently:

To study how the amount of tips waiters get changes in various circumstances, some psychologists conducted an experiment where the waiter would randomly either give the guests some chocolate with the bill, or not (control condition)[0]. This is, of course, perfectly innocuous, but an absolutist claim about research ethics ("You do not experiment on people without their consent.") would make research like this impossible without any benefit.

[0] https://onlinelibrary.wiley.com/doi/epdf/10.1111/j.1559-1816...


I don't want to defend what these researchers did, but to equate infecting people with syphilis to wasting a bit of someone's time is disingenuous. Informed consent is important, but only if the magnitude of the intervention is big enough to warrant reasonable concerns.


>to wasting a bit of someones time is disingenuous

This introduced security vulnerabilities to stable branches of the project, the impact of which could have severely affected Linux, its contributors, and its users (such as those who trust their PII data to be managed by Linux servers).

The potential blast radius for their behavior being poorly tracked and not reverted is millions if not billions of devices and people. What if a researcher didn't revert one of these commits before it reached a stable branch and then a release was built? Linux users were lucky enough that Greg was able to revert the changes AFTER they reached stable trees.

There was a clear need of informed consent of *at least* leadership of the project, and to say otherwise is very much in defense of or downplaying the recklessness of their behavior.

I acknowledged that lives are not at play, but that doesn't mean that the only consequence or concern here was wasting the maintainers' time, especially when they sought an IRB exemption for "non-human research" when most scientists would consider this very human research.

