Sir, if I recall properly, that's exactly how you discard premises. Taking a premise and applying it to a more extreme situation just battle-tests it. It's done in justice and law courses, etc.
"If the common good justifies killing X agent, then it must be ok for a doctor to kill a patient in their sleep, take his organs and save three people with his two kidneys and heart."
Also, what Yale did wasn't right, and the students were just defending themselves... If, for example, you were robbed, wouldn't you care? "Oh, but there are thousands of people poorer than me, I shouldn't bother."
I based that statement on what I saw in Michael Sandel's HarvardX (edX partner) Justice course. There are a lot of thought experiments and hypothetical situations that premises are tested against to see if they still hold up morally; really interesting.
"If the common good justifies killing X agent, then it must be ok for a doctor to kill a patient in their sleep, take his organs and save three people with his two kidneys and heart."
Isn't that a variation of a semantic shift fallacy? [1]
Just an FYI: that wiki link does not seem to lead to an article about the fallacy in question, or any kind of fallacy for that matter. You might have been thinking of the continuum fallacy[1], false equivalence, straw man, or red herring. In any case, I agree that the reasoning in question falls within the bounds of one or another of the informal fallacies[2].