Myself, I couldn't see that our garbage bin was dark green, or that my house was kind of green, until someone pointed it out. Now it seems impossible for me to miss, at least in daylight, but this fall I spoke to someone who was at my house and they were just as confused as I had been.
I'm still partially colorblind, but some colors seem to be just a matter of having them pointed out once, or seeing them in the right light, or something.
FTR, here's a photo of a house with the same color: https://byggebolig.no/eksterior-tak-ytterkledning-og-maling-...
There’s a sense that colour is a bit weird like that and your story about only seeing the green once it was pointed out to you reminded me of that.
Really this just shows how the British Empire managed to spread its word (literally) but failed to manage it properly. We now have a litany of different versions of English, which is interesting from a cultural perspective but runs entirely counter to what a language is for: communicating.
We really ought to start officially standardizing the English language and untangling the current written mess. That way we'd make 10% of English speakers' lives much better...
I do occasionally cringe at changes to English driven by the great influx of non-native speakers and the lack of education of even native speakers, my favorite being https://en.wikipedia.org/wiki/Begging_the_question
But hey, it's all good. This is how culture changes language. It's a living dynamic.
The lack of strictness in English means you can say something wrong but intelligible very easily, so we all get by.
By the way, it shouldn’t matter at all what a linguist thinks about that unless it is literally his field of expertise. And political sociolinguistics is not a field of expertise of most linguists.
Yeah but do they? It doesn't seem to me like a standard of this kind is anywhere near a priority for anybody. I'm not even a native speaker and I never felt the need of it. Why are we trying to fix something that's clearly not broken?
Traditionally, begging the question is the logical fallacy of assuming the consequent: proving that something is true conditionally on the fact that it is true and then deducing that it is true.
It is a mistranslation because its origin is a poor translation of "assuming" as "begging" and "the consequent" as "the question".
On the other hand, any direct translation of any early work of philosophy or logic (say, Plato) will have this problem, as the vocabulary Plato had to use just didn't yet have words for all the things in logic and philosophy that we now have words for. (And the earliest translations were direct translations through several languages, which is worse.)
How can we standardise English when we can't even agree on the spelling of the word 'standardise'?
But, seriously, one of the strengths of English is that it is a democratic language. It is usage which defines the language, not some official body.
Saying that English is a pretty anarchic language would probably be a more accurate statement, not that there's anything wrong with that fact.
As anyone who speaks French can tell you, the official body's prescriptions often have no relationship to the will of the people regarding how they actually use the language.
There's quite something to the polysynthetic view, in which modern French has a verb system with extensive clitics, but such an analysis, devoid of appropriate genuflection to Latin, gives the old grammarians fits.
I don't see how the situation would be different with English (especially given that French was not as international in the 1600s as English is today -- so that ship has long since sailed). Not to mention that English spelling reforms have been tried many times in the past and rarely caught on within a single country let alone worldwide.
Surely there is a happy middle ground.
* as are most well-known things without a strong factual basis
They are exactly that, less democratic, as the language standard is dictated by a committee of academics/politicians, which often leads to it diverging significantly from the spoken language. In the most extreme cases, language ends up being used for political purposes. For instance, in my home country, once known as Yugoslavia, following the breakup of the federation into new independent countries, there was a big push to separate each dialect of the more-or-less common language spoken on most of the territory into declaratively separate languages, in order to back up the whole new politics of separate national countries and to deepen the gap between them. So now we have four languages that are basically the same when spoken, but whose standard forms differ.
It's actually the other way around, the standard languages are very similar, but spoken language differs greatly, even between different areas of one country, to the extent of not being mutually intelligible.
More importantly, why would you want to do this? It just seems like a compulsive reaction to the purity of standardization. If inter-dialect communication is the goal, why not establish a standard accent as well?
Other languages are less dynamic, so they are less responsive to introducing new words. Some of them are controlled by official bodies, which may or may not represent the population, and are less popularity-driven. In those cases a guardian relationship exists where words are vetted by a few. English is vetted by all.
The previous couple of reforms, I believe, were successful because of the communists' ruling style. The first one was dramatic: it removed several letters from the alphabet and made a bunch of other changes. People who fled the communists in 1917–24 continued to use the old orthography for decades.
> The difference is that countries speaking the language agreed on setting common rules, with the goal of making official what is already becoming standard by daily use.
There is a problem with that. The standard in daily use is to follow the official spelling. Pronunciation can shift over time, but spelling is fixed. Any change in spelling (even an official one) feels like blasphemy. Some people can overcome this feeling, but most cannot, and do not want to.
That makes trying to regulate it more reasonable.
And that's not even getting into all the other English variations out there.
The point here is that for many words in English, there is no actual link between the letters in the spelling and the sounds in the spoken word.
I've been learning German for a few years. The spelling is easy. I spell 90% of words right first try. If I hear a word I can spell it well enough to look it up in a dictionary and learn what it means.
The opposite is true in English. Does a word have "ie" or "ei" in it? There is no way to know. Does a word starting with an "n" sound have a silent "k" in front (or worse, words like honest that start with an "o" sound but are spelt with an h)? No way to know. Does a word end in -ite or -ight? No way to know.
There are a lot of surprise complexities that serve a purpose (when you have 2+ words with the same pronunciation but different spellings) or that are just stylistic (colour vs color). I don't object to that.
But we should stop pretending you can sound out words, or that written and spoken words are linked; at least 25% of the time they're not. Just look at that last sentence: there is no W sound in "written" and no L sound in "should". No wonder kids struggle with this crap.
They may be complicated, but rules exist, and I find it interesting this was the first example you jumped to - there's a rhyme we're taught as kids that works the majority of the time: "'i' before 'e' except after 'c', or when sounded like 'a' as in 'neighbor' or 'weigh'".
When I was at school (I turned 8 in 2002, I'm a brit in case it matters :) ) it was just "'i' before 'e' except after 'c'".
That's actually already bad; why swap two letters around based on a third if the word sounds the same anyway?
Then we had to add two more caveats to the rule.
And English spelling still doesn't fit this rule, even with the extra bits bolted on. At best it's a guideline ("frequencies", or any other pluralised -cy word, seems to break it).
Before, we learnt 1000 different spellings. Now we learn a rule, but it's quite complex, and then we learn 200 exceptions to the rule.
But for what?
From now on, it's always "ie". Done. Wouldn't that be simpler? We could eliminate the I altogether. There is no "I" sound in neighbour or weigh. They should be neybor and way (or if you really need to distinguish all the other Way words, "wey").
Then instead of 1000 hours of frustration, little kids can all spell correctly on day one and get on and do 1000 hours of maths or reading or art or something.
Weird. Foreign. Neither. Keith. None of those are sounded as an 'a'.
That's why we can't realistically generalise and say these rules are actual rules - they're more rule of thumb. If you're lucky.
Then there are words like "stein", that just feel like the rules are too good for them.
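Just to show how leaky the rhyme is, here's the schoolroom rule written as a predicate and checked against a handful of common words. This is an illustrative sketch only: the word list and the "sounded like 'a'" flags are my own assumptions, not taken from any dictionary.

```python
# The schoolroom rhyme as a predicate: "'i' before 'e', except after 'c',
# or when sounded like 'a'". Returns True when the rhyme predicts "ei".
def rhyme_says_ei(after_c: bool, sounds_like_a: bool) -> bool:
    return after_c or sounds_like_a

# (word, does 'c' precede the pair?, vowel sounded like 'a'?, actual spelling is "ei"?)
# Flags are illustrative assumptions for this sketch.
words = [
    ("believe",  False, False, False),  # rhyme predicts "ie" -- correct
    ("receive",  True,  False, True),   # rhyme predicts "ei" -- correct
    ("neighbor", False, True,  True),   # rhyme predicts "ei" -- correct
    ("weird",    False, False, True),   # rhyme predicts "ie" -- wrong
    ("foreign",  False, False, True),   # rhyme predicts "ie" -- wrong
    ("neither",  False, False, True),   # rhyme predicts "ie" -- wrong
    ("science",  True,  False, False),  # rhyme predicts "ei" -- wrong ("cie"!)
]

# Collect the words the rhyme gets wrong.
exceptions = [w for (w, c, a, ei) in words if rhyme_says_ei(c, a) != ei]
print(exceptions)  # -> ['weird', 'foreign', 'neither', 'science']
```

Even this tiny hand-picked sample turns up four exceptions, which is roughly the point: the rhyme is a rule of thumb, not a rule.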
In English it's simply not possible, in general, to pronounce a word correctly that you've never heard said, or to spell a word you've never seen written. This is true even if you limit yourself to Germanic words.
I assumed all languages were like this (spelling only loosely/occasionally linked to pronunciation). But German manages it just fine.
I always found it frustrating being told to sound out words and everyone acting like this system worked when I was a kid. It put me off reading/writing etc. Not sure what I can suggest other than being honest and saying "yeah, you're right, that word makes no sense".
There is more variety in how my native Norwegian is spoken, and that is only spoken by 5 million people.
You also don't seem to distinguish between dialects, written forms, and a language. E.g. Norwegian is a language with dozens of dialects and four written forms, two of which are official. The various dialects map to one written form more closely than another.
Nothing stops English from being a language of different dialects and different written forms. The point isn't how many written forms there are but that those that exist are standardized. That is IMHO not a very hard thing. E.g. the largest area where English is natively spoken is the US. And the US has almost no variation in how the language is spoken.
Yes, I know Americans loudly object to this, but I have traveled all over the US and lived in several places. Honestly, there is not much difference between how somebody in Grand Forks, North Dakota speaks and how somebody in Honolulu, Hawaii speaks.
Yes, by American standards it may sound very different. But by, say, Swiss or Norwegian standards, not very much. Even tiny Britain has more regional variation.
One would simply create American, British, Australian etc standardized written forms, while trying to harmonize each written form as well as possible.
I agree with your point in general that English as spoken in international media isn't all that diverse, but if you look at the British Isles, where the language has traditionally evolved for centuries, you'll find a picture that more closely resembles other long-lived national languages with rich dialectal variation.
Non-English languages are way better at this. In Slovenia, for example, we have about 2 million speakers. Those are divided into 32 dialects, many of them mutually unintelligible.
Germany has a similar problem with regional dialects.
That’s why these languages have a standard official version. Because they need one.
The fact English doesn’t have such a standard is fantastic evidence to the relative lack of strong dialects.
Then again, we do have BBC English and American Movie English which act as de facto standards.
Oh and air traffic control english is also heavily standardized from what I’ve heard. Defined with the rigor of an API.
What’s interesting about English is the relative lack of dialects outside the UK. It’s already a very standardized lingua franca because of how it spread.
And what’s left is solvable with BBC/Hollywood English. Similar to how Germany has Hochdeutsch – an invented standard dialect everyone learns in school.
You can have a standard form of English taught in schools without demanding day-to-day standardization in common use. And we kinda already do. Just unofficially.
Iechyd da i chwi yn awr ac yn oesoedd! (Welsh: "Good health to you now and always!")
For German there aren't any standards maintained by an official institution. The Duden is an institution which tries, but it isn't official, IMHO.
This is also why the German language was so cohesive, but it is getting more and more diverse, because the German-speaking population is now more diverse.
And your assessment of German displays a lack of historical knowledge of how languages develop and get standardized. Most European languages tended to be far more diverse in the past. Contrary to your claim, all European languages have become ever more homogeneous in modern times.
Why, you ask? Because language tends to develop into different forms in isolation. Hence all of Europe has historically been a patchwork of languages. The formation of nation states through the 1800s, along with standardized schooling, is what caused the gradual homogenization of language.
It is the late founding of the US that is the reason American English is so homogeneous despite the size of the country and its diversity.
French, German, Norwegian, Dutch and plenty of other European languages were in fact so diverse that national standardization was needed in large part just to be able to write books which could be used throughout the country.
In fact in my native Norway creating a written form of Norwegian proved so difficult we ended up with something like four different written forms all with different spelling and grammatical rules. Two of these written forms are official and taught all over Norway. Neither written form really corresponds exactly to how anybody speaks as Norwegians speak a multitude of dialects with different pronouns, grammar and words which loosely map to either of the written forms.
But at least by having a language board we have somebody trying to make sure the spelling isn't the utter mess that English is. English grammar is easier than Norwegian grammar, but the spelling is the worst of any language I have learned.
The differences are much greater than the differences between Spanish or English dialects for example.
Nice idea, methinks, but I reckon there's not a snowball's chance. For starters, Anglophone countries would have to agree upon some structure or institution to act as a regulator à la the Académie Française (the moderator of the French language): https://www.thoughtco.com/academie-francaise-1364522. Given past and present politics, it seems to me they'd first declare war on each other before ever agreeing to that!
You simply create a British, American, Canadian, Australian etc. language board. Then these boards can choose to coordinate and cooperate. There is no problem having multiple written forms. The point is to have these written forms standardized and maintained.
E.g. Norwegian regularly imports new words from other languages. Usually the board will create Norwegian variants of these, following Norwegian phonetic rules. These become one of several valid variants of a word. E.g. we imported the word "genre" from French. The board made a Norwegian spelling for it, "sjanger." Both versions were valid until gradually "sjanger" replaced "genre."
The Dutch and the Flemish have different language boards, but they cooperate, so Dutch and Flemish are, to my knowledge, almost identical in spelling.
You come from a part of the world that is renowned for its sense and civility, whereas over the last 40 or so years the Anglophone countries have had internal squabbles over just about everything and anything, to the extent that they have—in parts—become almost dysfunctional; democracy just isn't working to the extent that it once did. I say this as someone who was born in one of them, worked in three, and travelled to the others at various times over the years.
Take the US for instance: the country has become so polarised over whether to wear facemasks in the COVID-19 pandemic—even after getting the best available advice to wear them—that one could be forgiven for thinking that insanity pills have been added to its water supplies. The very notion that someone or some authority might make suggestions about changing the way people write or speak—even if it's not aimed directly at specific individuals—would likely be taken as an affront by many.
Again, I'd love to be proved wrong as I've been saying for years that both pronunciation and spelling would be much easier if English were to introduce characters with accents/diacriticals into those awkward, pesky words such as 'through', 'thorough', 'thought' and so on.
In fact, I posted a rather long response on the neurosciencenews.com website asking if anyone knew of any research work that's been carried out with bad spellers and those with demonstrated dyslexia with the aim of helping them to overcome their reading difficulties 'by training them with accented text (albeit suitably contrived for the purpose)'. I went on to suggest 'that if it's not so then this might be an avenue worthy of research'. It seems to me, that if any progress is to be made in cleaning up English then the lever could well come from a successful—or even partially successful—way of treating dyslexia (as the research would have demonstrated that the changes to the language were worthwhile and should be made).
Your point about multiple implementations is worthy of note. If say changes were made to books specifically printed for dyslexic people then, over time, some or all of the changes could be more widely adopted.
My post hasn't appeared on neurosciencenews.com yet, presumably as it's a moderated site. I'll look at the post again and see if parts of it are suitable for posting here, if so then I'll post them directly below in reply to this post.
"From my experience, English is a dog of a language, its grammar is hither and thither, its spelling and punctuation are all over the place, and I really feel sorry for a non-native speaker who has to learn it.
I've never been a good speller nor a particularly good reader so reading a passage aloud in public is not something I particularly relish. On the other hand, my partner is not only an excellent reader and speller but also she can do cryptic crosswords with great ease—which is a task that's always eluded me. (It's always seemed pointless to me to deliberately increase the entropy of what one is saying by choosing cryptic words and meanings. One could just use clear text to avoid confusion.)
I put my lack of ability down to both my marginal aptitude for languages and not having much interest in learning them when I was at school. (Whilst it's possible I'm on the edge of being dyslexic I don't consider my reading handicap sufficiently large to bother me.)
My spelling was always worse than my grammar and the ways we were taught at school didn't help. For instance, spelling tests were marked out of 50 instead of 100 with two marks taken off for every spelling mistake. I cannot remember the total number of words in the test but it was well in excess of 50 and that meant one could score negative marks for spelling, which I did on occasions (but I wasn't the only one, there were also quite a few others). It seems to me that giving negative marks wasn’t the most productive way to engage students' interest.
There's no doubt that words such as 'pint', 'lint', 'through', 'though', 'thorough', etc. are a major problem for bad spellers, but it's the sheer number of them that's the problem; add the large number of 'strange' English proper names to this and we're in big trouble. Whilst the correct pronunciations of words like 'Wycombe' and 'Warwick' are comparatively well known to native speakers, there are many others of that kind which aren't—and I reckon those two words would be very problematic for those learning English as a second language.
That long intro leads to my main point, which is to ask a question I've asked many times before without ever having received an even partially satisfactory answer: why doesn't English use accents/diacritical marks to help resolve many of its peculiar spellings and wayward pronunciations? It seems to me that if ever a language needed diacriticals, it has to be English. The problem of how to pronounce 'pint' and 'lint' correctly would be solved instantly if a diacritical were applied to one 'i' and not the other (for instance 'ì', 'í' or 'î' could be used).
From my experience, if you ask those who are knowledgeable in English and competent in using it (such as English teachers or those who run grammar or spelling websites) about the potential usefulness of diacriticals in English, their responses are nearly always negative or at best nonchalant. As they have already mastered English without needing to resort to them, they never see any need to ponder the matter further—and those who would actually have benefited from their use have never had sufficient knowledge or wherewithal to push for their introduction, hence the complicated mess that we have today.
Moreover, both native speakers and those with English as a second language face significant problems when they first come across written words that are not common in everyday usage. For example, I recall that I first came across the words 'chiral' and 'enantiomer' in textbooks well before I heard them being used by professionals who knew how to pronounce them correctly, again diacriticals would have quickly solved the problem. One may well ask why not consult a dictionary and use the IPA references. Correct, one can do that but if one is bad at pronunciation and spelling then one finds so many such words it becomes a never-ending tedium, thus one just skips over them none the wiser.
It seems to me that people with dyslexia and or those who are having difficulty with pronunciation and spelling would be much better served if English used diacriticals; I base this on my own experience from having learned other languages. I studied French at school and like English, I was never particularly good at it, similarly, for some years I lived in Austria, so I've a smattering of German. What's relevant here is that when I was learning French it was drummed into me that it was essential to understand the differences in pronunciation of 'e', 'é', 'è' and 'ê', etc. Thus, I've not much difficulty in pronouncing words like 'd'être' or proper nouns such as 'Tahère' and 'Sainte-Sévère-sur-Indre'—a name certain cognoscenti will no doubt recognise (sorry the circumflex is missing, so three will have to suffice; right, I don't know any names that use all four pronunciations of 'e'). ;-)
What I am saying is that I could mount a reasonable argument to say that my pronunciation of certain French words that contain characters with diacriticals is better than it is of many English words whose characters are missing them! <...>"
In the end such a body would wind up either as descriptive, which is interesting academically but also useless from a standardization perspective, or prescriptive, which would just lead to it being happily ignored.
1) A dictionary is a representation of agreed upon words and their meaning (as in scrabble).
2) A dictionary is a list of how you are likely to encounter a word being used. (as in urban dictionary)
With the latter, a dictionary is a tool to help you better understand words/uses that can be unclear; with the former, a dictionary provides justification for your choice of wording (the dictionary defines this word so and so, therefore my usage was correct).
In my opinion dictionaries are trying to be 2), but people use them as 1). That is how we get a definition like "Literally: not literally, figuratively"; "literally" does not mean "not literally", it is simply used very often like that for emphasis.
One that comes to mind is "How it works?" Although it sounds weird, it is now common enough that there is no way to retroactively correct all the times it has come up.
The other one I see a lot is starting a sentence without an article e.g. "Asking, 'How it works?' is now considered grammatically correct. Reason being [,|that] it is frequently used in spoken English."
I've never heard that phrase before, so I guess it does qualify as weird, but it's clearly following the same grammatical construction as [looks at hamburger] "Hamburger?", i.e. asking "X?" as an offer or request for X.
It is more natural to say either "How it works" (with no question mark) or "How does it work?"
In some ways auto correct is a stronger enforcer.
I’m looking forward to the day when duck is considered a swear word.
This is because France doesn't contain the majority of French speakers in the world, and even if it did there doesn't seem to be an incentive to follow any rules they may publish.
Source: I live in Québec, we don't defer to the Académie.
Escaping to Ontario is my usual relief.
And it has different pronunciations depending on the region.
The article's argument is the lack of consistency, not standardization. Unless you are arguing that children--in whom dyslexia first manifests--are being exposed to the vast diversity of English across time and space, then there can be no doubt that mutability and lack of unified culture contributes nothing to dyslexia.
But that doesn't mean people outside of Spain follow the standard. (And depending on class and what part of Spain you live in, there's a good chance you don't follow the standard either, I'd imagine.)
For centuries people in Europe wrote in Latin not because it was anyone's natural language, but because of the reach it would grant.
And because in many cases local languages were less sophisticated and inadequate for philosophy/poetry.
If English had a central authority, I’d imagine that we’d have missed out on numerous great works.
Ok, but let’s just assume for a second that there was some standard.
Your statement would still be true.
Nobody is “confined to rigid rules” unless we also make the Grammar Police a real thing.
First, German is not standardized, and the attempts to (re-)standardize it are, let's say, controversial.
Even within Germany there are vast differences within the spoken language, mostly but not exclusively by region. The written language is more uniform but that's also a bit contested. I'm not a linguist but I'd guess the differences are at least as strong as in different parts of the UK.
For example in Berlin you would have the advantage that a native speaker of any form of German can pretty much understand the natives when they talk amongst themselves. You would not have that luxury in Swabia.
Second, the idea that this presumed standard crosses borders is... well sorry but it's absurd. Ask any Austrian or Swiss-German. Or even read their newspapers.
And finally, English is rapidly creeping into unofficial-but-official German. For example they routinely say "Vaccine" in the news instead of "Impfstoff" now. I could go on but I guess it's off topic.
German is a living language, and an international one, and as much as Duden might have been a guiding light to my generation it's not that for the kids at all.
(in German) https://de.wikipedia.org/wiki/Reform_der_deutschen_Rechtschr...
Which 10% is that?
The same holds for vocabulary - here, "nobody" complains that this is just something else that needs to be memorized. So, just consider the grammatical gender as part of the vocabulary word.
I'd wager this is due to the outsized influence of American and British culture in other nations, the sheer number of people who speak English as a second or third language, and the relative ease with which certain English words can be borrowed into languages with entirely different phonemes relative to the other way around.
"Standardising a language" never works. Prescriptivism rarely does because languages are living things and evolve no matter how many rules you throw at them.
> untangling the current written mess
Reforming the written language, however, is quite a possibility, and is often a good thing, because it actually follows a language's evolution rather than pretending that rigid rules reflect reality.
E.g. we used to count closer to Danish in one of the Norwegian variants - 27 used to be "syv og tyve" ("seven and twenty"), while it is now "tjuesju", both altering the order and the words for twenty and seven. The Danish form was abolished in 1951. You'll still hear people - especially older - use the Danish form now and again, but it has become relatively rare outside of small geographic areas.
It does not, however, work if you're not prepared to deal with real-world use. E.g. "syv" was reintroduced as a valid (but deprecated) word for "seven", because its use remained more persistent and proved harder to eradicate.
Norwegian language reforms have mostly been quite pragmatic in that respect - there's a general direction of travel, but the reforms sometimes undo changes that prove not to "take". But control over what is taught as "correct" in schools has proven to work quite well as a means of making these changes happen, as long as you're patient and accept that certain types of changes are a lot easier to make happen than others.
It also of course matters that the changes make sense to the users of the language. In the case of the "Danish counting", a lot of Norwegian dialects already used the "new" form, so it was a simplification, not a new invention - getting people to buy into something entirely new is generally harder.
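The order swap in the counting reform can be sketched as a toy converter, here restricted to the twenties since 27 is the example above. The word lists are minimal illustrative assumptions, not a complete account of either orthography:

```python
# Toy sketch of the 1951 Norwegian counting reform, numbers 21-29 only.
# Old "Danish" style puts the ones first ("syv og tyve" = seven-and-twenty);
# the reformed style puts the tens first and fuses the words ("tjuesju").

ONES = {1: "en", 2: "to", 3: "tre", 4: "fire", 5: "fem",
        6: "seks", 7: "sju", 8: "åtte", 9: "ni"}
OLD_ONES = {**ONES, 7: "syv"}  # "syv" belongs to the old counting style

def old_form(n: int) -> str:
    """Pre-1951 order: ones + 'og' + tens, e.g. 27 -> 'syv og tyve'."""
    tens, ones = divmod(n, 10)
    assert tens == 2 and ones > 0, "sketch only covers 21-29"
    return f"{OLD_ONES[ones]} og tyve"

def new_form(n: int) -> str:
    """Post-1951 order: tens then ones, fused, e.g. 27 -> 'tjuesju'."""
    tens, ones = divmod(n, 10)
    assert tens == 2 and ones > 0, "sketch only covers 21-29"
    return "tjue" + ONES[ones]

print(old_form(27), "->", new_form(27))  # syv og tyve -> tjuesju
```

The reform changed both the order of the components and the words themselves ("tyve" to "tjue", "syv" to "sju"), which is why the two forms share so little surface material.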
Apart from the many instances where it worked I guess. English is really the odd one out here for not having a standard body.
It never really worked in those instances. What you see in the books and what you hear in real life are often two quite different things.
Even things like grammar rarely survive real life.
In many languages, standardisation means that the language actually survives, because without it the language loses its utility and will be replaced by a dominant language.
Not an expert in that matter, I only try to replicate what I've heard.
Basque standardised quite late. To my understanding, people had difficulty understanding one another when they came from different valleys. That limited the language's utility, so Castilian (aka Spanish) was often used instead, which put pressure on the language. And what words do you learn then at school? And what do teachers learn? And what is used at the university?
And all the media, in which form will it be?
Romansh didn't standardise (or, better said, the standardisation didn't take hold) and faces that difficulty.
I bet you'll find a similar story in Irish Gaelic.
You are basing your assumption on languages using the Roman writing system, which actually isn't capable of expressing all the sounds used within a language. Not all writing systems suffer from this. For example, most Indian languages have an accurate sound-to-letter mapping, even to the level of writing short and long versions of the same sound with different letters. In order to speak correctly you just have to say the individual sounds together. There is no weirdness involved.
Romanian and Turkish disagree. And the spoken language still differs (probably less so in Romanian, more so in Turkish).
One can still speak incorrectly if they learn the sounds of the letters incorrectly, but the chances are reduced. Are Romanian and Turkish similar?
It's more than just sounds. The Turkish and Romanian alphabets capture the sounds of Turkish and Romanian quite well.
In Turkish, "I will do something" is written like this: "Bir şey yapacağım". And if you're going for proper enunciation, that's what you will say. In most situations most people will say "Bi şey yapıcam" (note the omission of the "r" in "Bir" and the contraction of "yapacağım" to "yapıcam").
Do the letters correspond to the sounds? Yes. Does it help? Nope: people will not speak the way it's written, because the written form encodes a very specific, rigid register. And, for example, elisions and contractions are very common in nearly every language, and are often frowned upon in written text (except for a small number of unavoidable ones).
And that's before we go into the plethora of sounds across dialects and regional variations. For example, Swedish has a combination of letters, `sj`, with at least four different pronunciations across Sweden. So the word for "seven", "sju", will be pronounced with [ɕ] in one part of Sweden, with [ɧ] in another, and so on. It's the same word; should it be spelled differently for each group of people?
And note, we're touching just 1% of 1% of the complexities of pronunciation :) They rarely, if ever, can be captured in written text and rules for the written text.
Wikipedia dedicates a full paragraph to this alone: https://en.wikipedia.org/wiki/Swedish_phonology#Fricatives
And yet, you still have dialects, you still have people using words and grammar in slightly and not so slightly different ways from the prescribed standard.
Funny how it "literally works".
And yet everyone speaks in their own dialects, pronunciation, many places have their own twists on grammar etc (RP is an accent, so grammar isn't relevant when discussing RP). Let me quote myself from my original answer:
--- start quote ---
--- end quote ---
So, Britain has standardised English. Did it work? Well, not really: almost everyone keeps speaking in their own way. As for government speeches... Here's the SNP's Ian Blackford in the House of Commons (not government, but still a high-ranking politician). And I believe Americans, Australians and New Zealanders have no problem understanding their politicians despite their English not being standardised.
As for TV: have you ever watched British TV outside BBC News programmes (which traditionally, but not necessarily, use a "BBC accent", which itself changes with time)? Just a few samples: the Big Narstie Show, EastEnders, and, yes, BBC News (around the 10:02 and 16:30 marks). Or even something like this, though that's an extreme example.
All that despite English having been standardised for more than a century.
edit: And arguably even spoken language eventually gets standardized through the written word. There are tons of regional German dialects; without High German (a more or less arbitrarily chosen dialect), people from different regions would have a hard time even talking to each other.
> Reforming the written language, however, is quite a possibility, and is often a good thing, because it actually follows a language's evolution rather than pretending that rigid rules reflect reality.
This is particularly interesting because the argument is that English reveals more dyslexia because it is irregularly phonetic -- but the Chinese characters used in Chinese and Japanese aren't phonetic at all. But they're still made of components (including radicals) arranged in horizontal and vertical orders.
I would have assumed that dyslexia would be just as much of a problem regarding the ordering of components within characters. Anyone have any idea why it's apparently not? Or is the article wrong?
Paraphrasing wildly, Wolf said that writing as a technology is something we built by repurposing different areas of our brain that evolved in response to our pre-literate environment. Our brains didn't evolve to read. Reading is one of those marvelous things that we hairless apes have managed by pushing ourselves. It's what we do.
Dyslexia is, essentially, to put it in computer terms, the result of slow throughput in the cognitive system we've built from our (spoken) language part of the brain and our visual-spatial part (which probably evolved in response to recognizing details on the landscape, etc). In most people, when they learn to read, the brain "wires up" this system in an efficient way. It's wired up cleanly. In people with dyslexia, the wiring is more like a rat's nest — or maybe "spaghetti" would be a better way of putting it. (Again, I'm paraphrasing!)
Dyslexia is more prevalent among left-handed people. If I remember the left and right sides of the brain correctly, language sits on one side of the brain and visual-spatial processing on the other. Now, left-handed people are right-brain dominant. And, if I remember correctly, left-brain dominant people are better at wiring up their brains when the "wiring" needs to cross from one side of the brain to the other. In general, right-handed people do a better job of building such a "system" than left-handed people.
Regarding Chinese and Japanese, the article talked about the way these writing systems are taught: namely, students are made to sound out what they're writing, over and over. So, if I were to guess, this slow and deliberate practice recruits all the parts of the brain involved more energetically, and as a result helps the individual compensate for any innate, relative deficiency in building a cognitive system that coordinates left and right sides of the brain: meaning, wiring up a brain that can read.
That's my best guess. Speaking out loud while learning to write may be the critical difference.
That's not true in any meaningful way, as far as I know. I took several years of Mandarin courses, and there were exactly zero times I was able to guess a pronunciation from a radical... and I'd learned 1,000+ characters by that point.
Perhaps that's true in some areas of technical vocabulary? Or there are a handful of examples that have the same pronunciation. But I'm pretty sure that trying to guess a pronunciation from a radical, you'd be wrong 99% of the time.
Note that when we survey the many dialects in China, we see that they more or less preserve the four Middle Chinese tones, even though the tones' realizations have diverged so much that the original Middle Chinese realizations are all but unreconstructible.
"Researchers looking at the brains of dyslexic Chinese children have discovered that the disorder in that language often stems from two separate, independent problems: sound and visual perception."
The article explains it in terms of teaching methods: the method they describe ties the order of strokes consistently to the parts of the word being recited along with them, which makes the phonetic associations consistent even though the symbology itself is not phonetic. A consistent phonetic writing system (which likewise reveals less dyslexia) has a similar feature. English is taught phonetically but has intense phonetic irregularities, so the phonetic connections are present but ambiguous. If phonetic associations formed while learning a symbol have an anchoring effect, that would seem to explain the identified effect.
Yes, you have symbols in Japanese, but each of those symbols can also be written out in hiragana (the phonetic script). And there are no special rules for pronouncing hiragana that change depending on the word (with a handful of caveats). Today's Japanese is very, very straightforward and easy to pronounce. Now, please note that I am not talking about accents: having an accent is a whole other issue that takes loads of practice to get rid of.
So I'm a bit confused about how Japanese is straightforward and easy to pronounce -- isn't it actually one of the hardest languages in the world to memorize how to pronounce, behind Chinese?
I know there are the phonetic hiragana and katakana, but they're used particularly with children and language learners. An adult reader has to memorize ~2,500 characters non-phonetically, no?
However, hiragana and katakana are not just for children or students. They form a significant portion of everyday text and are in fact necessary for constructing sentences properly in Japanese. For example, most grammatical particles, the equivalents of "of", "in", "to", "from", are written in hiragana. And foreign/imported words are usually written in katakana.
Regarding the ease of pronunciation - I think the grandparent comment was talking about the sounds of the language, not about memorizing how to read and pronounce the text.
Japanese speech is composed of a fairly small set of simple sounds, similar to Spanish. Having taught both Japanese and English, I would say the latter has many more tricky sounds - like th in "the", or ar in arm (especially how Americans say it). In contrast, if one knows the sounds of each Hiragana (consonant + vowel combination), they cover almost all of Japanese speech.
A Japanese word can typically be written in one of three ways:
1) Pure hiragana
2) Pure kanji
3) A mixture of kanji + hiragana
#3 comes into the picture especially when conjugating verbs: the base verb is written in kanji, but the various conjugated forms are tacked on in hiragana.
The important bit: the kanji essentially just compresses the hiragana into a single symbol, but every word that contains kanji can be uncompressed back into hiragana.
Japanese is definitely hard to learn, but it is not hard to pronounce. The difficulty comes from the fact that kanji can have multiple 'readings' depending on the context.
Source: Studying/struggling with Japanese for the last 5 years.
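The "compression" idea above can be sketched as a simple lookup. The readings used here are real (食 reads た in 食べる "to eat", 飲 reads の in 飲む "to drink"), but the tiny dictionary and the `to_hiragana` helper are purely illustrative; a real lexicon would need context-dependent readings per word, which is exactly the "multiple readings" difficulty mentioned above.

```python
# Toy sketch of "uncompressing" kanji back into hiragana.
# The dictionary below is illustrative, not a complete lexicon:
# each kanji is given the single reading it has in the example words.
readings = {
    "食": "た",  # as in 食べる (taberu, "to eat")
    "飲": "の",  # as in 飲む (nomu, "to drink")
}

def to_hiragana(word: str) -> str:
    """Replace each kanji with its (assumed single) hiragana reading."""
    return "".join(readings.get(ch, ch) for ch in word)

print(to_hiragana("食べる"))  # たべる
print(to_hiragana("飲む"))    # のむ
```

Note that a per-character table like this breaks down immediately for real text, since the same kanji reads differently in different words; that is the core of the difficulty described above.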
That said the actual sounds are few and really easy to vocalize (if you ignore pitch accent). Adults do use kana for certain words and in between words, but it's true that you do need to know a lot of kanji to read things fluently.
This is especially painful and disadvantageous for someone who is forced by the 'independent, liberal' Indian state to either study in English or endure a life of penury. Indian languages are so much easier to master, but the lack of economic and educational opportunity means this ability is not institutionally well-developed or supported either. The country is doomed to poverty for at least a century given such continuing, supposedly 'nationalist', colonial policies (in stark contrast to China, Korea, Japan and now the ASEAN nations as well).
कुत्ता रहा ना घर का ना घाट का ... ("The dog ended up belonging neither to the house nor to the ghat", i.e., neither here nor there.)
Just seems to me like English is not in the top 10 list of reasons why India might be "doomed to poverty for at least a century".
धोबी का कुत्ता घर का या घाट का नहीं, धोबी का है। ("The washerman's dog belongs neither to the house nor to the ghat; it belongs to the washerman.")
Amo - "master" or "I love"
Sal - "get out" or "salt"
Cerca - "close" or "fence"
And many more, and that's without getting into colloquial terms; for example, in my country:
Puya - "it stings", or "damn" as a euphemism for "puta"
I think English's biggest advantage is that it is easy to be understood in: its writing, while inaccurate in terms of pronunciation, is easy to understand and learn.
I'm not sure such an example exists in Spanish. The closest one I can come up with is "¿Cómo como? Como como como." ("How do I eat? I eat how I eat.")
For instance, 子 (child) and 古 (old) can both be pronounced "ko".
I only have a vocab of ~300 words but I can list at least a dozen or more off the top of my head. Far more than my experience with English.
Except "h". And "v" sounds like "b". "c", "s", and "z" often sound the same. And where I'm from, people say "perame" instead of "espérame".
This is true except for `h` which is silent unless preceded by an `s` or a `c`. In some cases `u` is silent.
That's incorrect. See for example deshonesto.
Is it then optimal to raise your kids bilingual in Japanese + English? Perhaps, perhaps.
Most of our press coverage has been in English, and I'm sure this affects the usage distribution. But we have actually been covered in about 20 different languages, and we're still way more popular in English than in those other languages.
Twenty years later, I still read lingerie as it is spelled in my head before correcting myself, since I encountered it in writing and didn't make the connection to the spoken version. So I was running around with two words in my head that mean the same thing for so long that it became a habit.
I'll blame the French for that one though.
And now we once again have two competing languages dominating the world stage: American English in the west and Chinese (or Mandarin, I guess) in the east.
I can't imagine a future in which these merge naturally, because they have no common ground to speak of.
However, realistically speaking... If we actually do manage to become a singular community, one language would win at some point. I doubt it'll happen while we're alive, but it'll surely be an interesting topic for historians of the future... As long as we survive long enough I guess
As far as merging, Mandarin borrows phonetically sometimes but isn't nearly as eager as Japanese, which has found quite a bit of loanword usage. Which is to say, I think it's possible.
Surely Mandarin is not making much headway into India, Korea, Japan, or most other countries in the east.
If not, I didn't claim it either.
But you already made up your mind in misinterpreting my message so I'll desist.
English is my second language. At first I found English quite easy: non-gendered nouns, simpler tenses, etc. But after 25 years of using it, it still surprises me at times. If you see a word for the first time you can guess how to pronounce it, but you can never be certain without consulting a dictionary.
You also make it much, much harder for future generations to read the vast wealth of historical literature.
Clearly rather than orthographic reform, we need pronunciation reform. No one seems to give this proper consideration whenever I suggest it though...
It would be very interesting to see research on the inverse: native English speakers who experience dyslexia in other languages. Also research on non-symbolic language dual learning (e.g., German speakers who learn English).
It just seems so for native speakers as they know a lot more about the history of their language than others.
English speakers seem particularly susceptible to it since its lineage has been extensively studied.
It may also just seem so to me since I hang around primarily in the English speaking Internet.
But it also seems to actually be the case for English.
"In the Max Planck Institute's World Loanword Database, Mandarin Chinese has the lowest percentage of borrowings of all 41 languages studied, only 2 percent. (English, with one of the highest, has 42 percent.) In part because of the difficulty of translating alphabet-based languages into Chinese characters"
English just does have more loanwords than most other European (and Proto-Indo-European) languages. And this isn't an artefact of Spanish or German or what have you having been studied less. (Though I fully admit that giving Japanese as an example of a more monocultural language in this context is a poor choice, given how many non-native words it borrowed from Chinese.)
Japanese did (and does) use Chinese characters.
But Japanese, in fact, is a perfect example of a language that developed in a very monocultural way; it developed in a single region, and:
"Japanese is classified as a member of the Japonic languages or as a language isolate with no known living relatives" from:
A 'language isolate' is a language that has no known genealogical relationship with other languages. Japanese developed as a fairly isolated language. Only since the 1850s has Japanese borrowing of loanwords really taken off, and those are mostly limited to English words for technology and other more 'modern' objects.
> Only since the 1850s has the Japanese borrowing of loanwords really took off, and those are mostly limited to English words[...]
Are you sure you meant to say English? [ https://en.wikipedia.org/wiki/Sino-Japanese_vocabulary - though even the article you linked specifically says it has a large portion of Chinese loans ]. Chinese borrowing into Japanese started in the 4th century, so are you saying Japanese borrowings from Chinese particularly accelerated in the 1850s? (If you could find a source for this claim, it'd be appreciated - I wasn't able to find one, and it seems unlikely, given that this was the point at which Japanese started to lean more European in its borrowings.) To my knowledge, English borrowings in Japanese are modern (after WW2). I'm confused!
> Chinese and Japanese are far more monocultural, and developed somewhat linearly.
> it agglomerates Proto-Indo European, Germanic, Celtic, Italic, and others
And ultimately, Germanic, Celtic, Italic and others are all PIE languages.
By that token, you will also find in Mandarin words that come from contact with the various Chinese dialects (e.g. 尷尬) and with its Classical Chinese roots (飲食, vs. 吃 and 喝), as well as with its neighbours; e.g. reborrowing (和平 from Japanese 平和). Since modernization, you will find calques from European languages, or transliterations. It's also easy to find lots of Buddhist concepts that have entered the language.
With Japan not being a regional power for most of its early history, you have even more diverse influences. On top of the native language you have extensive borrowing from Chinese in several waves that completely transformed the language, not unlike English's borrowings from Norman, French, and Latin. And of course the Buddhist vocabulary as well. Then you have several borrowings from Portuguese traders and, from Japan's ties with Germany, more borrowings from German. And these days you have wasei-eigo: English words reinterpreted into Japanese with somewhat different meanings.
Their development only looks "linear" when viewed from a position of ignorance.