>Although it has been in use since the late 18th century, sense 3 is still attacked as wrong. Why it has been singled out is not clear, but until comparatively recent times it was found chiefly in scientific or technical writing rather than belles lettres. Our current evidence shows a slight shift in usage: sense 3 is somewhat more frequent in recent literary use than the earlier senses. You should be aware, however, that if you use sense 3 you may be subject to criticism for doing so, and you may want to choose a safer synonym such as compose or make up.
So it's not even an issue of grammar, it's just a meaning of "comprised" that some people reject. And it's a usage which is clear, widely used, and doesn't have logical issues (like "could care less"). Go figure.
"To comprise" has the same grammatical usage as "to include" although the two words mean subtly different things. You'd never say "the care package is included of cookies and candy bars" and it's equally incorrect to say "the care package is comprised of cookies and candy bars."
If you want to imply that the care package has cookies and candy bars, but also may have other things, you say: "the care package includes cookies and candy bars." If you want to imply that the care package has just cookies and candy bars, you say: "the care package comprises cookies and candy bars."
It is an issue of semantics, and a rather arbitrarily prescriptivist one at that. Actually, I'm not even sure I would say it's a semantic issue. Most speakers will easily understand the intended meaning.
I keep seeing this construction a lot lately, but it always seems weird to me. Is it correct?
I am sure many here would argue that it doesn't adhere to their notion of correctness, however.
Even your implied "it is my opinion that" would still leave a dangling modifier (now the erroneous parse indicates that your opinion is a linguist, rather than the original erroneous parse suggesting that the usage is a linguist).
Dangling modifiers are an interesting issue because one often has to go all the way up to the pragmatics level in order to resolve the intended meaning (in this case, who or what could be a linguist?). Of course, speakers are usually quite good at that, so the ambiguity is usually only noticed in writing (and then people disagree about how significant the ambiguity is).
Even with some ambiguity, a dangling modifier is not necessarily ungrammatical.
Prescriptivists tend to have a much broader notion, however (e.g. the ongoing 'hopefully' controversy; apparently they consider it illogical when a speaker adverbially describes their feelings without doing so in an expressly grammatical context).
If you really want to be correct, it's best to just do what prescriptivists seem to believe is right and not think about it too much.
Personally, I think making the sentence "grammatical" by expanding it (I am a linguist and my opinion is that...) seems rather unnatural.
Along the same lines, the redditism "(Profession) here. (Opinion)." could also be excoriated by prescriptivists for being ungrammatical on the grounds that it is a sentence fragment, yet it gets the point across and I have yet to see anyone complain about it.
In a very narrow and technical sense, yes, you could argue that the sentence is structurally unsound.
> so the only explicitly available subject for the modifier to refer to is "it".
Just because it is "possible" does not mean it is actually parsed that way.
>So the problem is where the subject "I" comes from when it's left unmentioned,
I do not see this as a problem. I think self-referential phrases (i.e., those referring to the speaker) are usually clear, which I believe is the case here. There are much more ambiguous examples that may be nearly meaningless to the listener, but they usually involve two separate third parties that cannot be differentiated by the phrase in isolation.
In normal human conversation, it is generally easy for the listener to determine when the speaker or listener is being referred to, regardless of whether there is an explicit syntactic reference.
In short: yes, the sentence could be more explicit (maybe it does violate certain notions of Standard English/prescriptive grammar/style choice), but no, it is not unintelligible (it does not escape most listeners' mental grammar).
Maybe an analogous case is sentences like "going to the store later, want to come?" (although the particular problem is slightly different). One interpretation is that English is becoming a pro-drop language in some contexts, but even many speakers who utter that sentence would agree that it's slightly syntactically unsound for omitting the pronouns (even though it doesn't tend to harm understanding). (I'm not positive that this case is analogous to the dangling modifier case.)
This is the sense in which a lot of people parse language. The whole point of correctly using language is that readers don't have to jump through hoops to understand what mistake you made and why you made it (i.e. what you actually meant), which ultimately makes communication easier.
>This is the sense in which a lot of people parse language.
I completely disagree. You are describing a sense in which they consciously analyze language. Mental grammars are much more plastic and forgiving than you seem to suggest. I do not know a single person who, when first reading such a sentence, would not be able to derive the intended meaning from it on the grounds that the modifier has no explicit referent, and I think suggesting otherwise is far-fetched.
Sometimes perhaps a genuinely ambiguous phrase may cause the reader to review and come up with an alternate meaning or two, but it is not as if their brain says "SYNTAX ERROR" and can't continue, bringing communication to a grinding halt.
>The whole point of correctly using language is that readers don't have to jump through hoops to understand what mistake you made and why you made it (i.e. what you actually meant), which ultimately makes communication easier.
I think you are suffering from an illusion that there exists an objectively and universally correct, monolithic, Standard English.
I also think that you are on the verge of a slippery slope argument by even suggesting that a speaker-referential dangling modifier caused readers to "jump through hoops" and hindered communication.
"I know what you meant" is probably the hardest thing for any prescriptivist to admit because it immediately destroys their very arbitrary arguments about "logic" and "efficiency" of language.
My point stands that there is a huge difference between a perceived poor style choice and a genuinely ungrammatical utterance. But human beings do not naturally speak ungrammatically. There are simply dialect-dependent variations in grammar, and the more prestigious dialect will almost always be touted as "the right one" by prescriptivists, who will then say that all other dialects are "ruining the language" and "making communication more difficult". By that line of thinking it's reprehensible that different languages even exist at all.
Sorry, but you are never going to get a universally agreed-upon and practiced standard. Yes, we need some rules to have a grammar and a language, but there is a huge range of how important those rules are for communication. Saying that any old "incorrect" use of language hinders communication is, to me, a bit like suggesting that people who jaywalk in a city are liable to gridlock its transportation system.
This is true. Sentences without correct grammar are unparseable (like "he jumped didn't far"). A lot of people seem to confuse semantic or logic issues with grammar issues.
Conveying meaning differs from conveying the intended meaning with clarity. When you have to say 'most people', we already have a problem. Reference material should refer. Leaving it up to intellectual interpretation is simply no good for material that is meant to record and preserve information.
That said, this possibly has nothing to do with the difference between 'comprised of' and 'composed of'.
Right. That's called a "house style", which is not about correct or incorrect but rather about having a more-or-less arbitrary standard all writers must adhere to in order to make the final product more consistent.
I do agree with what you said.
For me, it really comes down to how you view the purpose of language. If the purpose is to convey meaning as accurately as possible, then these changes devalue the language since the usages that result from ignorant speakers are, by nature, less precise. The oft-cited "begging the question" has a very precise meaning that's difficult to convey in other terms. When ignorant usage widens the meaning, we lose the ability to easily refer to the more narrow meaning.
On the other hand, you can view language as a shared cultural identity that's always shifting and evolving organically. In that light, these minutiae are pointless because the majority of speakers don't understand the subtlety and never will. You can try to educate people, but you'll just end up alienating them. As humans, we're evolved to learn language from our environment and the people who talk to us, not from a textbook. So why would we consider the textbook version to be the canonical version?
These two viewpoints are diametrically opposed and yet are both reasonable and there are many well-educated people on both sides.
However, there are not many linguists—y'know, people who study language and how it works—on the prescriptivist side. There are a lot of well-educated people who reject evolution, too, although you'll note that not many of them are biologists.
It's almost like a client-capability protocol upgrade in the tech world...if you can use the enhanced protocol, you can benefit from it. But if your client (the person you're speaking to) doesn't understand the subtlety of your word choice, it's pointless and potentially confusing to choose the more advanced usages. So I fall back to the common usages that tend to bother prescriptivists.
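The protocol analogy can be sketched out concretely. This is a toy model with hypothetical names (`choose_phrase`, the capability string) invented purely for illustration, not a claim about how any real protocol works:

```python
# Toy model of capability negotiation applied to word choice:
# use the enhanced (precise) form only when the other party
# advertises support for it, otherwise fall back to the common form.

def choose_phrase(listener_capabilities):
    """Pick the precise term if the listener will understand it,
    otherwise fall back to the widely understood usage."""
    if "knows-begging-the-question" in listener_capabilities:
        return "that begs the question"   # narrow, precise sense
    return "that raises the question"     # common, safe fallback

precise = choose_phrase({"knows-begging-the-question"})
fallback = choose_phrase(set())
```

The point of the sketch is just that the "upgrade" only pays off when both sides support it; otherwise the safe fallback is strictly better.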
But since this forum seems to be the more capable, here's one departure that people might find interesting. "All intensive purposes" is a malaprop, a specific language error that's almost never considered to be correct. But malaprops can be fascinating examples of how we make sense of the world. My mother was a child psychologist for 30 years and collected malaprops from the children she saw. Children, especially those who have not yet started to read, have very small vocabularies and learn words phonetically from those who speak to them. When they encounter words that they don't know, they try to make sense of them from the context in which they're used and the similar words that they've already learned. This is a surprisingly effective technique, but it often results in some interesting mistakes that surface as malaprops. My favorite example was a child who thought that Alzheimer's disease was actually "old timer's" disease. From that one mistake, you can really see how children approach unfamiliarity in their world.
Edit: this is a nice article about it http://www.nytimes.com/1994/01/23/magazine/on-language-retur...
I'm not sure that's accurate. Being a descriptivist doesn't stop you from wanting works that are important to you to adopt clear and consistent style; the mere fact of having a usage preference and seeking, within a collective work, to have that preference consistently applied is not equivalent to linguistic prescriptivism.
It's the whole descriptive vs. prescriptive thing, and MW explicitly adopts a descriptive standard (they say so in their preface). A good introduction to this larger question is David Foster Wallace's (excellent) essay in review of Bryan Garner's (fantastic) usage guide (http://harpers.org/wp-content/uploads/HarpersMagazine-2001-0...).
If you are interested in usage, and are not familiar with Garner's book, I recommend it. (http://www.amazon.com/Garners-Modern-American-Usage-Garner/d...) It is superior to MW for most writers' purposes. (For students of linguistics, MW might win out.)
Garner, incidentally, doesn't much like "comprised of" -- http://www.lawprose.org/blog/?p=2385 (“invariably inferior”).
I don't really understand the impulse people have to say, "Language is constantly in flux and there is no authority on what is correct usage. Thus, you should not try to change people's language and I am the authority telling you this."
If you're descriptivist, then one of the things that you should describe is people's bottom-up attempts to change language by, say, excising 47,000 uses of a phrase from a commonly read reference material.
I don't think descriptivists are obliged to treat a one-man crusade on a phrase with the same gloves as the grand-scale evolution of that phrase in the first place. 47,000 instances of "comprised of" across a zillion Wikipedia articles written by a bunch of people is evidence of linguistic evolution; one guy reverting those all to "comprised" isn't further evolution, that's just one guy changing something he doesn't like.
Someone first used the phrase "comprised of" in this sense, and that person, it would seem, got some traction. Giraffedata is trying to be the guy who gets traction on making the phrase ungrammatical. This does happen! At some point, for example, we stopped using the second-person informal (thou). This didn't happen because the God of Zeitgeist changed the minds of millions of English-speakers all at once. It happened because individual people started to change how they used language, and it caught on.
Do I think that giraffedata's crusade is likely to be successful? Nope. But what I don't get is being all offended that someone is trying to change language on the basis of "language is always changing."
I think the offensive part comes from the implication that one way is right and one way is wrong.
If people start to use "up" to mean "down", that defeats the purpose of having words for them.
If a term's very existence is to provide a context-free way to disambiguate, it will be a pain when people expect you to infer the real meaning from context (cf literally).
In programming, if you accept some new idioms, but those make it hard to see if a given line of code mutates state, that will create pain for you down the line, even if it's "just the evolution of the language".
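A concrete example of the mutation point, using Python's standard library (nothing hypothetical here): two calls with almost the same surface shape, only one of which mutates.

```python
# list.sort() changes the list in place and returns None;
# sorted() leaves the original untouched and returns a new list.
# At the call site the two look nearly identical, which is exactly
# the kind of idiom that can hide state mutation.

data = [3, 1, 2]
copy = sorted(data)   # no mutation: data is still [3, 1, 2]
data.sort()           # mutation: data is now [1, 2, 3]
```

If a codebase freely mixes idioms like these, you can't tell from a glance at one line whether state changed, which is the "pain down the line" being described.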
... ad nauseam
As long as they don't try to drop a "nuke-you-lar" bomb, I won't correct them. I could care less, but then I'd be apathetic to their linguistic failings.
(Edit: grammatical error)
What's interesting to me is which items are in the set of things to be fussed over. Split infinitives, yes. Is-doubling, no. "Decimate", yes. Any other borrowing from Latin, no. It's so arbitrary, yet some people cling to it so fiercely.
There's some suggestion that it comes from a Yiddish-style dialect. "I should be so lucky!", for another example.
One says, "although I care very little, there is some wiggle room to care even less". The other says, "I care so little, I couldn't care less than I do now".
Either way, I care little is the main message.
However, "couldn't care less" is more sensible, because what is the point of expressing that you care little, but still have room to care less? That sort of expression would only serve as a retort against an accusation that you do not care. ("A: You don't care at all! B: That is not true, I could care less.") B admits that he or she cares little, but objects to being characterized as entirely uncaring.
We should choose the expression based on its logical sensibility, rather than regional dialect.
I don't know about "could care less" conveying the semantics - I genuinely paused when I first read it to work out which meaning it had.
Let's say "caring" goes from 0 to 10. "Could care less" includes everything from 1 to 10, assuming integer granularity - it's very much the right-hand side of the scale, anyway.
Only "couldn't care less" covers that 0 rating.
So - perhaps it's terrible in conveying semantics and only context/tone imply the disinterest being communicated?
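The 0-to-10 reading above can be made explicit with a toy model (an illustration of the scale argument only, not a claim about real semantics):

```python
# Toy model: "caring" is an integer from 0 (none) to 10 (maximal).
# Taken literally, "could care less" is true for any level above 0,
# while only "couldn't care less" pins the level to exactly 0.

def could_care_less(level):
    return level > 0   # there is still room to care less

def couldnt_care_less(level):
    return level == 0  # already at the bottom of the scale
```

Taken literally, then, "could care less" rules out only the one value the speaker presumably means to convey, which is why context and tone have to do the work.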
Fun can usually be substituted freely for a word like 'entertaining'. Is there something wrong with saying "That was so entertaining"?
"Entertaining" is an adjective and you can use "so" as a modifier to express how entertaining whatever you were doing or watching is or was.
This probably explains it better than I can:
I agree, there are definitely two conflicting usages here: an activity can be the noun "fun", in the same way that doing something can be "bliss" or "hell" or "a complete waste of everyone's time". And so while I can say "My drive in to work was hell", I can't say "My drive in to work was so hell" (because "hell" is not an adjective). But if I can say "My drive in to work was enjoyable", and even "My drive in to work was so enjoyable", then what's wrong with saying "My drive in to work was so fun"?
So, instead of "so fun", some people would prefer that you write e.g. "so fun that I squealed with glee", or even just "it was incredibly fun".
At least that's the argument I've come across. To the extent I even think about such things, I couldn't really care less if people use "so" like this; the meaning seems perfectly clear.
Millions of people use this idiom completely intuitively, following linguistic instincts that I, as someone who doesn't have English as their native language, can only envy.
"I could care less" is either meant to be taken sarcastically, or it came to be because the "dn't c" consonant cluster is hard to pronounce, the same reason you don't pronounce the 'l' in could, the 'k' in knee, knight or knave, the 'p' in psychologist or pneumonia, or half the letters in Wednesday.