
I looked this up, because I was confused about what authority deemed this part of the English language a "grammatical mistake". Merriam-Webster notes

>Although it has been in use since the late 18th century, sense 3 is still attacked as wrong. Why it has been singled out is not clear, but until comparatively recent times it was found chiefly in scientific or technical writing rather than belles lettres. Our current evidence shows a slight shift in usage: sense 3 is somewhat more frequent in recent literary use than the earlier senses. You should be aware, however, that if you use sense 3 you may be subject to criticism for doing so, and you may want to choose a safer synonym such as compose or make up.

So it's not even an issue of grammar, it's just a meaning of "comprised" that some people reject. And it's a usage which is clear, widely used, and doesn't have logical issues (like "could care less"). Go figure.

It is an issue of grammar.

"To comprise" has the same grammatical usage as "to include" although the two words mean subtly different things. You'd never say "the care package is included of cookies and candy bars" and it's equally incorrect to say "the care package is comprised of cookies and candy bars."

If you want to imply that the care package has cookies and candy bars, but also may have other things, you say: "the care package includes cookies and candy bars." If you want to imply that the care package has just cookies and candy bars, you say: "the care package comprises cookies and candy bars."

As a linguist, it is absolutely not an issue of grammar. The grammar is 100% correct. Nit-picking over the so-called "correct" definition has nothing to do with grammar.

It is an issue of semantics, and a rather arbitrarily prescriptivist one at that. Actually, I'm not even sure I would say it's a semantic issue. Most speakers will easily understand the intended meaning.

>As a linguist, it is absolutely not an issue of grammar.

I keep seeing this construction a lot lately, but it always seems weird to me. Is it correct?

There is an implied "it is my opinion that..." because I am stating my opinion. I think most people are able to pick up on that naturally, and I don't feel it necessary to spell it out.

I am sure many here would argue that it doesn't adhere to their notion of correctness, however.

The issue here is the famous dangling modifier.


Even your implied "it is my opinion that" would still leave a dangling modifier (now the erroneous parse indicates that your opinion is a linguist, rather than the original erroneous parse suggesting that the usage is a linguist).

Dangling modifiers are an interesting issue because one often has to go all the way up to the pragmatics level in order to resolve the intended meaning (in this case, who or what could be a linguist?). Of course, speakers are usually quite good at that, so the ambiguity is usually only noticed in writing (and then people disagree about how significant the ambiguity is).

The key defining feature (from a linguistics perspective) of a dangling modifier is ambiguity. I really don't see how my statement could have been misinterpreted, but I would be interested to hear alternate interpretations.

Even with some ambiguity, a dangling modifier is not necessarily ungrammatical.

Prescriptivists tend to have a much broader notion, however (e.g. the ongoing 'hopefully' controversy; apparently they consider it illogical when a speaker adverbially describes their feelings without doing so in an expressly grammatical context).

If you really want to be correct, it's best to just do what other prescriptivists seem to believe is right and not think about it too much.

Personally, I think making the sentence "grammatical" by expanding it (I am a linguist and my opinion is that...) seems rather unnatural.

Along the same lines, the redditism "(Profession) here. (Opinion)." could also be excoriated by prescriptivists for being ungrammatical on the grounds that it is a sentence fragment, yet it gets the point across and I have yet to see anyone complain about it.

The subject "I" is never explicitly introduced anywhere in the sentence "As a linguist, it is absolutely not an issue of grammar", so the only explicitly available subject for the modifier to refer to is "it". (Compare "As a pragmatic issue, it is absolutely not a syntactic issue", "As a lexical issue, it is absolutely not a matter of syntax", etc.) So the problem is where the subject "I" comes from when it's left unmentioned, and a solution could be that it's pragmatically implied in any clause like "as a [type of person]" -- but surely not syntactically.

I am not really sure what you are arguing for here, but I think you may be hyperanalyzing and missing the point.

In a very narrow and technical sense, yes, you could argue that the sentence is structurally unsound.

> so the only explicitly available subject for the modifier to refer to is "it".

Just because it is "possible" does not mean it is actually parsed that way.

>So the problem is where the subject "I" comes from when it's left unmentioned,

I do not see this as a problem. I think self-referential phrases (i.e., referring to the speaker), are usually clear, which I believe is the case here. There are much more ambiguous examples that may be nearly meaningless to the listener, but they usually involve two separate third parties that cannot be differentiated by the phrase in isolation.

In normal human conversation, it is generally easy for the listener to determine when the speaker or listener is being referred to, regardless of whether there is an explicit syntactic reference.

In short: yes, the sentence could be more explicit (maybe it does violate certain notions of Standard English/prescriptive grammar/style choice), but no, it is not unintelligible (it does not escape most listeners' mental grammar).

I think we mostly agree because we agree that listeners will almost always understand the sentence as intended and we agree that there's a sense in which the sentence is unsound.

Maybe an analogy is sentences like "going to the store later, want to come?" (although the particular problem is slightly different). One interpretation is that English is becoming a pro-drop language in some contexts, but even many speakers who utter that sentence would agree that it's slightly syntactically unsound for omitting the pronouns (even though it didn't tend to harm understanding). (I'm not positive that this case is analogous to the dangling modifier case.)

>In a very narrow and technical sense

This is the sense in which a lot of people parse language. The whole point of correctly using language is that readers don't have to jump through hoops to understand what mistake you made and why you made it (i.e. what you actually meant), which ultimately makes communication easier.

I assume by the context of my quote that we are still talking about the dangling modifier I used or similar "errors".

>This is the sense in which a lot of people parse language.

I completely disagree. You are describing a sense in which they consciously analyze language. Mental grammars are much more plastic and forgiving than you seem to suggest. I do not know a single person who, when first reading such a sentence, would not be able to derive the intended meaning from it on the grounds that the modifier has no explicit referent, and I think suggesting otherwise is far-fetched.

Sometimes perhaps a genuinely ambiguous phrase may cause the reader to review and come up with an alternate meaning or two, but it is not as if their brain says "SYNTAX ERROR" and can't continue, bringing communication to a grinding halt.

>The whole point of correctly using language is that readers don't have to jump through hoops to understand what mistake you made and why you made it (i.e. what you actually meant), which ultimately makes communication easier.

I think you are suffering from an illusion that there exists an objectively and universally correct, monolithic, Standard English.

I also think that you are on the verge of a slippery slope argument by even suggesting that a speaker-referential dangling modifier caused readers to "jump through hoops" and hindered communication.

"I know what you meant" is probably the hardest thing for any prescriptivist to admit because it immediately destroys their very arbitrary arguments about "logic" and "efficiency" of language.

My point stands that there is a huge difference between a perceived poor style choice and a genuinely ungrammatical utterance. But human beings do not naturally speak ungrammatically. There are simply dialect-dependent variations in grammar, and the more prestigious dialect will almost always be touted as "the right one" by prescriptivists, who will then say that all other dialects are "ruining the language" and "making communication more difficult". By that line of thinking it's reprehensible that different languages even exist at all.

Sorry, but you are never going to get a universally agreed-upon and practiced standard. Yes, we need some rules to have a grammar and a language, but there is a huge range of how important those rules are for communication. Saying that any old "incorrect" use of language hinders communication is, to me, a bit like suggesting that people who jaywalk in a city are liable to gridlock its transportation system.

Unless we think of this as a species of disjunct (which might have been where you were going by describing the use of "hopefully")...

> As a linguist, it is absolutely not an issue of grammar. The grammar is 100% correct. Nit-picking over the so-called "correct" definition has nothing to do with grammar.

This is true. Sentences without correct grammar are unparseable (like "he jumped didn't far"). A lot of people seem to confuse semantic or logic issues with grammar issues.

Encyclopedias, dictionaries, and other types of reference materials often conform to a rigid and pedantically defined aesthetic. This aesthetic is about context, appearance, and intuition. Editing is an art form.

Conveying meaning differs from conveying the intended meaning with clarity. When you have to say 'most people', we already have a problem. Reference material should refer. Leaving it up to intellectual interpretation is simply no good for material that is meant to record and preserve information.

That said, this possibly has nothing to do with the difference between 'comprised of' and 'composed of'.

> Encyclopedias, dictionaries, and other types of reference materials often conform to a rigid and pedantically defined aesthetic. This aesthetic is more about context, appearance, intuition. Editing is an art form.

Right. That's called a "house style", which is not about correct or incorrect but rather about having a more-or-less arbitrary standard all writers must adhere to in order to make the final product more consistent.

Now we are arguing about the semantics of "grammar". How did we get here? :)

By involving a linguist in a discussion about the best way to write something.

I did not argue that either way was better or worse.

Oh sorry, I just meant that since the people discussing had no formal knowledge about languages, grammar, semantics, etc... the only mention about any formal concepts about them would slide the discussion into the basics.

I do agree with what you said.

In normal conversation, grammar can have a more general meaning. But since everyone here wants to argue about precision and correctness, I thought I should at least level the playing field.

It is in fact an issue of grammar. Irregularity is a primary feature of human language grammar--a source of delight to some and a source of annoyance to others. One of my favorite examples from language acquisition research is the small child who says "I goed to the store." In this case it's literally the exception that proves the rule.


"the care package is included by cookies and candy bars" ??

It's yet another fight between prescriptivists and descriptivists over what defines correct language usage. English is full of examples where incorrect usage became pervasive enough to become accepted as correct. But some people fight that process because they don't like that the language changes as a result of people using it ignorantly.

For me, it really comes down to how you view the purpose of language. If the purpose is to convey meaning as accurately as possible, then these changes devalue the language since the usages that result from ignorant speakers are, by nature, less precise. The oft-cited "begging the question" has a very precise meaning that's difficult to convey in other terms. When ignorant usage widens the meaning, we lose the ability to easily refer to the more narrow meaning.

On the other hand, you can view language as a shared cultural identity that's always shifting and evolving organically. In that light, these minutiae are pointless because the majority of speakers don't understand the subtlety and never will. You can try to educate people, but you'll just end up alienating them. As humans, we're evolved to learn language from our environment and the people who talk to us, not from a textbook. So why would we consider the textbook version to be the canonical version?

These two viewpoints are diametrically opposed and yet are both reasonable and there are many well-educated people on both sides.

> there are many well-educated people on both sides

However, there are not many linguists—y'know, people who study language and how it works—on the prescriptivist side. There are a lot of well-educated people who reject evolution, too, although you'll note that not many of them are biologists.

I have to say I find myself on both. In general I find the descriptive view of language to be far more flexible, pragmatic, and realistic. But certain phrases such as "all intensive purposes" evoke such a strong response from my gut that I just cannot abide it.

I agree...I find myself on both sides and that's why I presented both sides. I've decided that I want to learn as much from the prescriptivists as possible, since I find the evolution of language to be a fascinating topic. But I've also decided that I have very little interest in correcting people, so I try to vary my behavior contextually. I do my best to avoid mistakes in grammar in my own speech, but I don't correct other people that use them. And I try to use as much language precision as I feel the listeners are able to grasp.

It's almost like a client-capability protocol upgrade in the tech world...if you can use the enhanced protocol, you can benefit from it. But if your client (the person you're speaking to) doesn't understand the subtlety of your word choice, it's pointless and potentially confusing to choose the more advanced usages. So I fall back to the common usages that tend to bother prescriptivists.
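The protocol analogy above can be sketched in code. This is a hypothetical illustration (the function name, the capability string "pedantic-usage", and the example sentences are all invented for the sketch): pick the "enhanced" phrasing only when the listener advertises they can parse it, otherwise fall back to the common usage.

```python
# Hypothetical sketch of the capability-negotiation analogy:
# use the precise phrasing only when the listener advertises
# that they will parse it as intended; otherwise fall back.

def choose_phrase(listener_capabilities: set) -> str:
    enhanced = "The committee comprises five members."
    fallback = "The committee is made up of five members."
    if "pedantic-usage" in listener_capabilities:
        return enhanced  # listener will appreciate the distinction
    return fallback      # safe, common usage for everyone else

print(choose_phrase({"pedantic-usage"}))  # precise phrasing
print(choose_phrase(set()))               # common fallback
```

The design point is the same as in protocol negotiation: the fallback path must always work, because you can't assume capabilities the other side never advertised.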

But since this forum seems to be the more capable, here's one departure that people might find interesting. "All intensive purposes" is a malaprop, a specific language error that's almost never considered to be correct. But they can be fascinating examples of how we make sense of the world. My mother was a child psychologist for 30 years and collected malaprops from the children she saw. Children, especially those who have not yet started to read, have very small vocabularies and learn words phonetically from those that speak to them. When they encounter words that they don't know, they try to make sense of them from the context in which they're used and the similar words that they've already learned. This is a surprisingly effective technique, but often results in some interesting mistakes that surface as malaprops. My favorite example was a child that thought that Alzheimer's disease was actually "old timer's" disease. From that one mistake, you can really see how children approach unfamiliarity in their world.

Are people defending "all intensive purposes"? I think descriptivists get a bad rap sometimes for "anything goes" but that's not true either. All intensive purposes is clearly a mistake.

"Intensive purposes" (and "intense and purposes") is an eggcorn, so it's started the long walk to acceptance.


Edit: this is a nice article about it http://www.nytimes.com/1994/01/23/magazine/on-language-retur...

Do eggcorns inevitably become standard usage? Usually the replacement has some advantage, but here the original phrase is neither awkward nor obtuse, and it makes more sense.

> It's yet another fight between prescriptivists and descriptivists over what defines correct language usage.

I'm not sure that's accurate. Being a descriptivist doesn't stop you from wanting works that are important to you to adopt clear and consistent style; the mere fact of having a usage preference and seeking, within a collective work, to have that preference consistently applied is not equivalent to linguistic prescriptivism.

Well said. Though I think that in the context of an online encyclopedia, being precise has far more value than in most other contexts, and so the prescriptivists are in the right. This should be recognizably true even if you take the side of the descriptivists in other contexts.

Merriam-Webster uses a very permissive standard for grammar compared to most other usage guides. I like them, but you have to be careful following their advice. If you read their words fairly, you will note they say that "comprised of" is rejected by some readers.

It's the whole descriptive vs. prescriptive thing, and MW explicitly adopts a descriptive standard (they say so in their preface). A good introduction to this larger question is David Foster Wallace's (excellent) essay in review of Bryan Garner's (fantastic) usage guide (http://harpers.org/wp-content/uploads/HarpersMagazine-2001-0...).

If you are interested in usage, and are not familiar with Garner's book, I recommend it. (http://www.amazon.com/Garners-Modern-American-Usage-Garner/d...) It is superior to MW for most writers' purposes. (For students of linguistics, MW might win out.)

Garner, incidentally, doesn't much like "comprised of" -- http://www.lawprose.org/blog/?p=2385 (“invariably inferior”).

It is a grammar issue. Words often have common uses that don't make grammatical sense. "Comprised" means "included" or "contained." You wouldn't say that a group was included of four people or that a group was contained of four people. It's semantically backwards. Confusing it with "composed" makes it more difficult for people to understand its meaning when it's used correctly.

Well, maybe giraffedata will precipitate a change in language in which it becomes less acceptable again.

I don't really understand the impulse people have to say, "Language is constantly in flux and there is no authority on what is correct usage. Thus, you should not try to change people's language and I am the authority telling you this."

If you're descriptivist, then one of the things that you should describe is people's bottom-up attempts to change language by, say, excising 47,000 uses of a phrase from a commonly read reference material.

> If you're descriptivist, then one of the things that you should describe is people's bottom-up attempts to change language by, say, excising 47,000 uses of a phrase from a commonly read reference material.

I don't think descriptivists are obliged to treat a one-man crusade on a phrase with the same gloves as the grand-scale evolution of that phrase in the first place. 47,000 instances of "comprised of" across a zillion wikipedia articles written by a bunch of people is evidence of lingual evolution; one guy reverting those all to "comprised" isn't further evolution, that's just one guy changing something he doesn't like.

Sure. But, you know, when you use evolution metaphors, you get that what starts a change in a species is one mutation, right?

Someone first used the phrase "comprised of" in this sense, and that person, it would seem, got some traction. Giraffedata is trying to be the guy who gets traction on making the phrase ungrammatical. This does happen! At some point, we stopped for example using the second-person informal (thou). This didn't happen because the God of Zeitgeist changed the minds of millions of English-speakers all at once. It happened because individual people started to change how they used language, and it caught on.

Do I think that giraffedata's crusade is likely to be successful? Nope. But what I don't get is being all offended that someone is trying to change language on the basis of "language is always changing."

> But what I don't get is being all offended that someone is trying to change language on the basis of "language is always changing."

I think the offensive part comes from the implication that one way is right and one way is wrong.

It's not (merely) about right vs wrong, but about "destroys useful symmetries" vs doesn't, or "makes language less useful" vs doesn't.

If people start to use "up" to mean "down", that defeats the purpose of having words for them.

If a term's very existence is to provide a context-free way to disambiguate, it will be a pain when people expect you to infer the real meaning from context (cf literally).

In programming, if you accept some new idioms, but those make it hard to see if a given line of code mutates state, that will create pain for you down the line, even if it's "just the evolution of the language".
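The mutation point above has a concrete analogue in Python (chosen here just as an illustration): `list.sort()` mutates in place and returns `None`, while `sorted()` leaves the original alone, and a reader who confuses the two idioms will misread what a line of code does.

```python
# Two idioms that look similar but differ in whether they mutate:
# sorted() returns a new list; list.sort() reorders the list in place.
prices = [3, 1, 2]

snapshot = sorted(prices)  # no mutation: prices is unchanged here
prices.sort()              # mutation: prices itself is reordered

print(snapshot)  # [1, 2, 3]
print(prices)    # [1, 2, 3]
```

Both lines end up with the same ordering, but only one of them changed shared state, which is exactly the kind of distinction that gets lost when idioms blur together.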

So it's wrong to say that one way is right and one way is wrong?

... ad nauseam

A fair point. But common usage can find an obstacle in those whose beliefs or principles refuse to accept it. I still enjoy seeing the Sears Tower in feb-roo-airy, in spite of others' desire to visit the Willis Tower in Feb-you-airy.

As long as they don't try to drop a "nuke-you-lar" bomb, I won't correct them. I could care less, but then I'd be apathetic to their linguistic failings.

(Edit: grammatical error)

I've found that whenever anybody gets a burr up their ass over some minor grammatical point, they almost never know what they're actually talking about, have not even the most cursory understanding of language or grammar, and have instead built some foundation of their identity on a rule they rote-memorized, so they see anything that goes against that rule as a violation of their self-image.

To some people, fussing over a certain set of grammatical issues is the mark of an educated person. I remember one person who got angry with me when I responded to their lecture on the word "decimate" with a little bit of historical linguistics. Probably because it seemed like I was undermining their self-image.

What's interesting to me is which items are in the set of things to be fussed over. Split infinitives, yes. Is-doubling, no. "Decimate", yes. Any other borrowing from Latin, no. It's so arbitrary, yet some people cling to it so fiercely.

It can sometimes be hard to tell apart people who are educated from people who suffer from severe cases of Dunning–Kruger and have tied their supposed expertise deeply into their persona.

ha! you're correct of course.

The phrase "could care less" has a logic error. It makes more sense for someone to say, "I couldn't care less" when the person wants to express that he doesn't care.

In the UK, we pretty much always use the "couldn't" version. I'm guessing "could care less" is an Americanism; it always jars for me when I read it.

"Could care less" is very rare in England, although its use is growing under the influence of the Internet.

There's some suggestion that it comes from Yiddish style dialect. "I should be so lucky!", for another example.

Both "could care less" and "could not care less" work well in conveying the semantics "I care little".

One says, "although I care very little, there is some wiggle room to care even less". The other says, "I care so little, I couldn't care less than I do now".

Either way, I care little is the main message.

However, "couldn't care less" is more sensible, because what is the point of expressing that you care little, but still have room to care less? That sort of expression would only serve as a retort against an accusation that you do not care. ("A: You don't care at all! B: That is not true, I could care less.") B admits that he or she cares little, but objects to being characterized as entirely uncaring.

We should choose the expression based on its logical sensibility, rather than regional dialect.

I enjoy your analysis :-)

I don't know about "could care less" conveying the semantics - I genuinely paused when I first read it to work out which meaning it had.

Let's say "caring" goes from 0 to 10. "Could care less" includes everything from 1 to 10, assuming integer granularity - it's very much the right-hand side of the scale, anyway.

Only "couldn't care less" covers that 0 rating.

So - perhaps it's terrible in conveying semantics and only context/tone imply the disinterest being communicated?
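The 0-to-10 scale in the comment above can be written out directly (a trivial sketch, just to make the interval argument concrete):

```python
# The commenter's integer "caring" scale, 0 through 10.
scale = set(range(0, 11))

# "Could care less": there is room to care less, so caring > 0.
could_care_less = {c for c in scale if c > 0}

# "Couldn't care less": already at the floor of the scale.
couldnt_care_less = {c for c in scale if c == 0}

print(sorted(could_care_less))    # 1 through 10
print(sorted(couldnt_care_less))  # only 0
```

Only the second set pins down the "I care not at all" reading; the first is literally compatible with caring quite a lot, which is the ambiguity being described.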

Don't get me started on "could care less", my eyes sting when I read that. My other pet hate is "so fun", as in "that was so fun"...I blame those US kids channels for that one.

Why is 'so fun' so wrong?

Fun can usually be substituted freely for a word like 'entertaining'. Is there something wrong with saying "That was so entertaining"?

Because "fun" in that context is being used like a countable or mass noun, therefore "That was so much fun" would be the correct usage.

"Entertaining" is an adjective and you can use "so" as a modifier to express how entertaining whatever you were doing or watching is or was.

This probably explains it better than I can:


Fun also functions as an adjective. "That was a fun movie. It was fun. How fun? So fun."

I agree, there are definitely two conflicting usages here: an activity can be the noun "fun", in the same way that doing something can be "bliss" or "hell" or "a complete waste of everyone's time". And so while I can say "My drive in to work was hell", I can't say "My drive in to work was so hell" (because not an adjective). But if I can say "My drive in to work was enjoyable", and even "My drive in to work was so enjoyable", then what's wrong with saying "My drive in to work was so fun"?

I had no idea that fun was not allowed to be an adjective.

I assume the issue is with the construct "so fun", not the word "fun". And in that construct, the word "so" is an intensifier. It emphasizes the nature of the next word. You'll hear it in constructs like "that was so cool", or "that was so awesome", or other such phrases that kids like to say when they're excited about something.

Kids like Hamlet: "So excellent a king, that was to this Hyperion to a satyr; so loving to my mother..."

Good point. My comment left the impression that it was something only kids say, but it's absolutely not. "So" is a perfectly good adverb that's used elsewhere as well. It just happened that the immediate phrases that popped into my head were things kids were likely to say (probably because the grandparent comment referenced "kids channels").

I think it's because some people and style guides (e.g. Strunk & White) reject this use of "so" as being a generic, overly vague intensifier. Often these people suggest pairing it with a "that" clause, which allegedly makes it more acceptable.

So, instead of "so fun", some people would prefer that you write e.g. "so fun that I squealed with glee", or even just "it was incredibly fun".

At least that's the argument I've come across. To the extent I even think about such things, I couldn't really care less if people use "so" like this; the meaning seems perfectly clear.

Who cares if it has a "logical error", whatever that even means, if you understand. And I know you understand it.

Millions of people use this idiom completely intuitively, following their instincts of language that I, as someone who doesn't have English as their native language, can only be jealous of.

"I could care less" is either meant to be taken sarcastically, or it came to be because the "dn't c" consonant cluster is hard to pronounce, the same reason you don't pronounce the 'l' in could, the 'k' in knee, knight or knave, the 'p' in psychologist or pneumonia, or half the letters in Wednesday.

see: sarcasm
