

The Upside-Down "e" - an Editor's Nightmare (1993) - rcrowell
http://www.azer.com/aiweb/categories/magazine/ai104_folder/104_articles/104_alphabet_nightmare.html

======
weilawei
"You're right. Man should not be a slave to technology. The 'tail shouldn't
wag the dog'. But that's idealistic and unreal. Technology always has its own
limitations and parameters. It's not the first time technology has shaped the
alphabet. The same story has been repeating itself since the origin of the
alphabet. Greek and Latin were clearly determined by the hammer and chisel
against marble - it's easier to carve straight lines in stone than rounded
ones. That's why so many of our capitals are based on straight lines. The
cursive Arabic was influenced by pen; cuneiform, by clay and sharp stick.
Other alphabets were created by carving on bamboo. Simply today, the
determining tools are fonts, computer keyboards, and satellite transmissions -
the only difference being that they're a bit new and unfamiliar in the
alphabet designers' hands."

~~~
__david__
I'm reminded of the English language and the "thorn" character (Þ). It used to
be the character for the "th" sound, but it fell out of favor, and the final
nail in its coffin came when the printing press came into play--the typefaces
were imported from Germany and Italy, and they didn't have thorn. The printers
substituted "y" for a while (hence "Ye olde shoppe"--it's actually pronounced
"The", not "yee").

So I would say it's not unheard of for an alphabet to be shaped by technology.
We English speakers are none the worse for wear.

------
quant18
Yes, when you ask a committee of linguists to design an orthography for you
--- you get a godawful almost-IPA "one phoneme = one grapheme" academic
exercise which is utterly impractical for everyday use. (Again, an object
lesson in the fragility of top-down-designed systems compared with ones that
evolved over centuries of use.) Azerbaijan would have had the same problem
even if it had stayed with the Cyrillic alphabet. E.g. after Kazakhstan became
independent, it had a lot of trouble trying to use Kazakh as the
administrative language due to a shortage of typewriters with the appropriate
letters.

Uzbekistan is greatly underappreciated for having done something really neat
after the Soviet breakup --- designing an orthography with no "funny letters".
They use context to distinguish the "back" and "front" versions of phonemes
like i [1]. And in the two cases where that would be too confusing (o and u),
they put an apostrophe after the letter --- i.e. u', instead of something like
ü.

[1] Turkey, which speaks a closely related language, solved the "front i" vs.
"back i" problem by making one dotted and the other dotless --- those of us
here probably know all of the toLowerCase-related bugs that caused; I think
articles about this were posted here a couple of months ago.
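
To illustrate that bug class: under Unicode's default, locale-independent case
mappings, a naive upper/lower round trip silently turns Turkish dotless ı into
dotted i. A minimal sketch in Python (the language is just for illustration):

```python
# Dotless ı (U+0131) uppercases to plain I (U+0049) under Unicode's
# default mappings, and plain I lowercases back to dotted i (U+0069) --
# so an upper/lower round trip corrupts Turkish text.
dotless_i = "\u0131"                     # ı
round_trip = dotless_i.upper().lower()   # ı -> I -> i
print(round_trip)
assert round_trip != dotless_i

# The other direction is no better: capital İ (U+0130) lowercases to
# "i" plus a combining dot above, so even the string's length changes.
dotted_cap_i = "\u0130"                  # İ
print(len(dotted_cap_i.lower()))
```

(Java's locale-sensitive String.toLowerCase() has the converse problem: under
a Turkish default locale, "I".toLowerCase() yields "ı", which famously breaks
case-insensitive comparisons of plain ASCII identifiers.)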

------
j-g-faustus
I remember we had problems with Norwegian fonts in the early 1990s - printing,
copying between computers and email would invariably mess up the æ, ø and å.
Even the computer science department of the university I attended in the
mid-90s didn't get it right. I think it took close to ten years before the
majority of printouts and emails were correct.

But compared to Azerbaijan, I understand we were lucky - at least we didn't
have to mess around with creating our own fonts :)

Good point on how tools put constraints on expression, BTW. I guess ASCII art
is another example, and the ASCII smiley :-) is still with us.

Even today mixing text and freehand drawing or images is troublesome enough
that we rarely bother. In a sense it is somewhat curious that 60 years of
computer science has not been enough to replicate the convenience of a
handwritten note...

------
techiferous
Also relevant: "The Absolute Minimum Every Software Developer Absolutely,
Positively Must Know About Unicode and Character Sets"

<http://www.joelonsoftware.com/articles/Unicode.html>

------
chiquita
Aren't they talking about U+018F and U+0259?
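
For reference, those two code points are indeed a matching case pair; a quick
check via Python's unicodedata (Python chosen purely for illustration):

```python
import unicodedata

# Name the two code points mentioned above from the Unicode character database.
for ch in ("\u018f", "\u0259"):
    print(f"U+{ord(ch):04X} {unicodedata.name(ch)}")
# U+018F is the capital schwa, U+0259 the small one.

# They map onto each other under the standard case conversions.
assert "\u018f".lower() == "\u0259"
assert "\u0259".upper() == "\u018f"
```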

Nowadays, adding that glyph to an existing font could be a very easy thing to
do. It depends a lot on the font design you're after...

Fonts released under a free license would have helped.

A nice font editor: <http://fontforge.sourceforge.net/>

------
theschwa
Who'd have known I'd cause so much trouble.

------
wendroid
Painful to read about a problem that free software could easily have solved,
even in 1993.

~~~
dalke
Really? One problem was the lack of diverse fonts with an ə in them. How does
free software solve that? Another was that different programs used different
ways to input the ə character. Free software doesn't help there - why would
they all use the same entry sequence? A third was that no keyboard supported
that common character on the home row. While key remappers exist, what should
the new standard layout be?

Had they chosen æ instead then they could have just used Latin-1 and the large
number of existing fonts, keyboards, etc. for languages like Danish which
already solved the problem.
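
The æ-vs-ə difference is concrete at the encoding level; a quick Python check
(again, just illustrative):

```python
# æ (U+00E6) is part of Latin-1, so the era's existing Latin-1 fonts,
# keyboards, and mail software already handled it.
print("æ".encode("latin-1"))      # one byte, 0xE6

# ə (U+0259) has no Latin-1 code point at all -- the encoder rejects it.
try:
    "ə".encode("latin-1")
except UnicodeEncodeError as exc:
    print("no ə in Latin-1:", exc)
```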

~~~
wendroid
Bitmap fonts are easy. I made a TrueType font in the 1990s using CorelDraw,
so I know that's not hard. Keyboard maps, again, trivial. And don't discount
the power of a felt-tip pen when it comes to marking a keyboard.

~~~
dalke
The article goes into well-written detail about why what you suggest here was
tried and didn't work: 1) they also wanted print fonts; 2) they made a bitmap
font, but it didn't scale right; 3) people want a large variety of font forms,
which they couldn't provide; 4) at the OS level, what do you map your key to
if the character code doesn't exist?; 5) applications chose different,
incompatible key bindings; and 6) how do you do all of that for an entire
country?

There's also a difference between making a font and making a good, generally
usable font - just as I could program as a 7th grader, but my programs weren't
good, generally usable ones.

