The politics of typography

Did you ever consider the political implications of the ascii standard? No? Thought so.
Someone who did is Robert Bringhurst. And these are not the only implications of typography he has considered. The title of his book, The Elements of Typographic Style, may not be sexy, not the kind of thing you would read on the bus or in bed, but, holy shit, it is! If you happen to be a typography freak, this is just heaven, but I suspect that even people who use Times New Roman and Arial (or who don’t know which fonts they’re using, which probably means they use Times New Roman and Arial) might find something to enjoy here, and even get their horizons widened, so that the next time they open a book, they may not go straight to the mental images behind the curved lines on the paper, but stop for a split second and think “hey, that’s a nice ‘g’!”
Where was I? Oh, the politics. Consider this, about the ascii character set, the standard upon which most computer typesetting is based:

The fact that such a character set was long considered adequate tells us something about the cultural narrowness of American civilization, or American technocracy, in the midst of the twentieth century.

The basic ascii set has room for 94 characters. Since c. 1980 we have had the extended ascii set, with 216 free slots — a considerable improvement. But —
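(A pedantic aside: the 94 figure is easy to check for yourself. Printable ascii runs from ‘!’ at code 33 to ‘~’ at code 126; the space at code 32 carries no visible glyph, so it is usually left out of the count. A quick Python sketch, purely for illustration:)

```python
# Printable ASCII runs from '!' (code 33) to '~' (code 126).
# Space (code 32) has no visible glyph, so it is not counted here.
printable = [chr(c) for c in range(33, 127)]
print(len(printable))                 # 94
print(printable[0], printable[-1])    # ! ~
```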

This ignores the needs of mathematicians, linguists and other specialists, and of millions of normal human beings who use the Latin alphabet for Czech, Hausa, Hungarian, Latvian, Navajo, Polish, Romanian, Turkish, Vietnamese, Welsh, Yoruba, and so on. The extended ascii character set is the alphabet not of the real world nor of the un General Assembly but of nato: a technological memento of the them-and-us mentality that thrived in the Cold War.

Bringhurst is a Canadian, of course… They ain’t so bad, them maple-lovers!
And if you thought that things have improved — well, yes, perhaps. But Bringhurst has some cold water for that burning enthusiasm too:

The rate of change in typesetting methods has been steep — perhaps it has approximated the Fibonacci series — for more than a century. Yet, like poetry and painting, storytelling and weaving, typography itself has not improved. There is no greater proof that typography is more art than engineering. Like all the arts, it is basically immune to progress, though it is not immune to change.

The ascii set has 94 characters, Gutenberg used 290 for his bible. ‘Nuff said.


11 thoughts on “The politics of typography”

  1. Well, today there is Unicode, which I think even has letters for Tolkien’s Elvish…
    Anyway, typography has gone through really big changes in the last century. It may not have improved; in fact there were several disimprovements due to the abandonment of old technology (e.g. Linotype). But with systems like TeX everybody can typeset his own book nowadays (in excellent quality, of course), which was unthinkable back then.
    And don’t talk to me about Word: I have yet to see a book or manuscript done with Word that is appealing to the eye (if you can show me a good example, though…). It’s always the same problem: things should be as simple as possible, but not simpler. Word tries to be simpler.

  2. Yeah, there’s Unicode, of course, but that would have weakened my argument, to mention that… Thanks for doing it for me. :-)
    Whether everybody can typeset their own books in TeX depends on a number of things – first of all if they have a book to typeset, and a reason to typeset it, i.e., if they need the control mechanisms that TeX can offer, and if the text-command-based method of TeX is what will most efficiently let them achieve the desired result.
    As for alternatives – well, we’ve discussed Word before. I still don’t think Word is a bad program, although you’re absolutely right that it tries so hard to be simple that it introduces plenty of ways for the user to screw things up.
    However, whether or not a page is pleasing to the eye depends on two things which have nothing to do with the tools used to produce it: the competence of the one who produces it, and the standards of the beholder, which will always be subjective. Word is used by so many people with no competence in typesetting that it’s not strange that most documents produced in Word look bad, but that’s not Word’s fault (or not Word’s fault alone; that Word has ugly fonts and a lousy page setup as defaults is, of course).
    So when you judge this document, a pdf file made from a Word document, please note only the unappealing features for which Word can be blamed.
    The crop marks indicate the actual page size, 160×240 mm (2:3 – the proportion of the perfect fifth).
    For this to be more appealing, some kind of hyphenation is needed, which I have not used: partly because I don’t have a hyphenation dictionary for the lorem ipsum language, but partly also to illustrate once again the need for manual decisions, regardless of tool (I have a hard time believing that TeX can hyphenate a long text automatically, but I’d be happy to be proven wrong).
    But when it comes to things like positioning, consistency, arrangement of elements, etc., Word has the tools to do it — they may not always be emphasised in the GUI, but they are there.

    Other alternatives, which are easier to use than TeX but which give more precise control than Word, are the full desktop publishers like QuarkXPress, FrameMaker, or for that matter InDesign (if you’re rich), the free PagePlus (which works remarkably well), and OpenOffice Writer, which has DTP features that come close to those of a real DTP program.

  3. I have a hard time believing that TeX can hyphenate a long text automatically, but I’d be happy to be proven wrong

    Indeed, you are wrong here. Don Knuth (creator of TeX) worked quite hard to create algorithms to solve two problems:
    1. Given a word, find its possible hyphenation points.
    2. Given a paragraph, find the best way to break it into individual lines.
    Of course, the second one needs the first at times, but one has to define what the best way is: hyphenation is bad, but not as bad as     too    wide    spaces    between    words, or too narrow ones [which I can’t imitate here; this is not TeX ;)]
    If your budget allows it, you may get a copy of Digital Typography by Donald Knuth, which is a collection of scientific papers he wrote about … well, digital typography, including his famous paper Breaking Paragraphs Into Lines. Or try to check out this book in your local library. (Or parse through the TeXbook, whose TeX sources are available online; I’ll tell you more if you really want to know.) Indeed, TeX’s hyphenation algorithm is quite good and has also been adopted by many other typesetting programs, including OO Writer, I think. Of course this doesn’t mean that TeX knows how to hyphenate every word; it fails (i.e. can’t hyphenate unless you tell it how) with stranger words like "galaxy," but I don’t know how to hyphenate that myself…
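    For readers who want the flavour of Knuth’s second problem without reading the paper, here is a toy sketch in Python. This is my own simplification, not Knuth’s actual algorithm (which also weighs hyphenation points, stretchable glue, and penalties); it only shows the core idea of judging the paragraph as a whole, minimizing total squared leftover space, instead of greedily filling one line at a time:

```python
def break_paragraph(words, width):
    """Break words into lines of at most `width` characters,
    minimizing the total squared leftover space over all lines.

    A toy version of the idea behind Knuth's 'Breaking Paragraphs
    Into Lines': optimize the whole paragraph, not line by line.
    """
    n = len(words)
    INF = float("inf")
    best = [0.0] + [INF] * n      # best[j] = cost of setting words[:j]
    back = [0] * (n + 1)          # back[j] = start index of the last line
    for j in range(1, n + 1):
        line_len = -1             # no space before the first word
        for i in range(j - 1, -1, -1):
            line_len += len(words[i]) + 1
            if line_len > width:
                break
            # the last line is free; earlier lines pay for leftover space
            slack = 0 if j == n else (width - line_len) ** 2
            if best[i] + slack < best[j]:
                best[j] = best[i] + slack
                back[j] = i
    lines, j = [], n              # recover the lines by walking back
    while j > 0:
        i = back[j]
        lines.append(" ".join(words[i:j]))
        j = i
    return lines[::-1]

text = "Hyphenation is bad but not as bad as too wide spaces between words".split()
for line in break_paragraph(text, 20):
    print(line)
```

    A greedy filler would pack the early lines tight and leave the last-but-one line ragged; the dynamic program spreads the slack more evenly, which is exactly the effect one sees in TeX output.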
    Your example is indeed the most beautiful thing I can remember having seen created with Word. The types do look good, no way to deny it. Still, the spaces between words are a bit irregular, but I guess this is due to the disallowed hyphenation. Anyway, I’d stop complaining (or at least really, really decrease it) if everybody used Word the way you do here. Maybe you should write a HOWTO: "How you can use Word without hurting my eyes." ;-)
    I could try to create a TeXed version of your text here, but this would take some time, and I have other things to do at the moment. (And yeah, of course, (La)TeX is more complicated than most of its alternatives, even when using LyX, which now also runs natively on Windows.)

  4. I’d like to say something more about the original topic: Computer scientists are not to blame; in fact, the ascii standard was (and partly is) quite right.
    You can learn from the Wikipedia page that Unicode isn’t well accepted everywhere; Japan especially does not seem to like it very much. How come? Partly because Japan has its own working encoding (“KJS” or something like that it is called); partly because it is a damn hard task to build an encoding scheme that really serves every writing culture in the world. ascii had its predecessor, which didn’t even have all the characters used in English-speaking nations. It was replaced by ascii, which worked quite well, but ascii was not replaced by any “next step” solution (Latin-1 is a different thing); instead, now there is Unicode. Computer scientists (I’d like to say “hackers”) like to solve problems in the general way, which Unicode undoubtedly is. (By the way, in 1992, when Bringhurst wrote that book, Unicode was already “there,” although not very widespread.)
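    (A small illustration of why the transition could be gradual: ascii survives byte-for-byte as a subset of UTF-8, Unicode’s most common encoding, while anything outside ascii simply has no ascii form at all. A Python sketch:)

```python
# ASCII text is byte-identical in ASCII and in UTF-8.
ascii_text = "Hello"
print(ascii_text.encode("ascii") == ascii_text.encode("utf-8"))  # True

# Characters outside ASCII need extra bytes in UTF-8 ...
czech = "Dvořák"
print(len(czech), "characters,", len(czech.encode("utf-8")), "bytes")

# ... and cannot be encoded as ASCII at all.
try:
    czech.encode("ascii")
except UnicodeEncodeError:
    print("no ASCII form for 'ř' or 'á'")
```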
    Finally, one last thing:

    The ascii set has 94 characters, Gutenberg used 290 for his bible. ‘Nuff said.

    This argument confuses characters and glyphs. Computers work in such a way that several characters can become one token, which is interpreted as one command, which may become one glyph. For instance, the em dash “—” above was typed by me as three hyphens. This whole web page is encoded in ascii (mostly), but could contain several hundred glyphs anyway.
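    To make the character-versus-glyph point concrete, here is a rough Python sketch of this kind of input convention. The sequences and target glyphs mirror TeX’s habits (three hyphens for an em dash, two for an en dash, backticks and straight quotes for curly ones), but the function itself is just my illustration, not any real typesetter’s code:

```python
# Several ASCII characters become one glyph, TeX-style.
# Order matters: replace the longest sequences first.
LIGATURES = [("---", "\u2014"),   # em dash
             ("--",  "\u2013"),   # en dash
             ("``",  "\u201C"),   # left double quote
             ("''",  "\u201D")]   # right double quote

def to_glyphs(ascii_text):
    for seq, glyph in LIGATURES:
        ascii_text = ascii_text.replace(seq, glyph)
    return ascii_text

src = "``Typography---the art of type''"
out = to_glyphs(src)
print(out)
print(len(src), "characters in,", len(out), "glyphs out")  # 32 in, 28 out
```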

  5. Hm. Your blog software messed up my encodings. I tried to display “ascii” as <acronym style=”font-variant: small-caps;”>ascii</acronym>, which worked in the preview but not now. Hm. Also, my nice trick with the three hyphens didn’t work, the software takes <tt> out. Did work in the preview, though.
    I don’t want to start complaining, but the preview makes writing longer comments quite hard (it’s getting slow, and is misleading at the same time). :-(
    [Now I wonder: is “:-(” a glyph? ;-)]

  6. I’m glad to be proven wrong.
    Two comments: An algorithm for setting a whole paragraph (and not just single lines) with as little extra space as possible is probably the one hyphenation-related feature which could be automated in Word but isn’t, which makes it a shortcoming. I wasn’t aware that Knuth’s algorithm was used in OOo Writer; if it is, so much the better!
    That said, I still don’t believe in single-word hyphenation from an algorithm. There are general rules, of course, which apply to most European languages, but the differences between languages are many (even within one and the same language: in Latin, the word faustus can be divided faus-tus or fau-stus, and dixit before or after the -x-, according to two different principles), and they are in most cases overruled by the natural divisions in compound words. The general principles are usually handled quite well by most word processors, whereas the exceptions (which are what will annoy readers, far more than some extra space here and there) cannot be handled without a dictionary. So I wouldn’t leave hyphenation to a program alone.
    Which is, I suppose, what you also say, with your galaxy example :-)
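    The division of labour described above (general rules handled algorithmically, exceptions looked up in a dictionary) can be sketched in a few lines of Python. The “pattern” stage below is a deliberately crude stand-in, not a real algorithm like Liang’s, and the entries in the exception table are just the examples from this thread:

```python
# Exception dictionary consulted before any algorithmic splitting.
EXCEPTIONS = {
    "faustus": ["faus", "tus"],   # or fau-stus, by the other principle
    "galaxy":  ["galaxy"],        # no safe break point: leave it whole
}

def naive_split(word):
    """Crude stand-in for a pattern algorithm: break before a
    consonant that follows a vowel. Wrong often enough to show
    why the exception dictionary is needed."""
    vowels = set("aeiouy")
    parts, start = [], 0
    for i in range(1, len(word) - 1):
        if word[i] not in vowels and word[i - 1] in vowels:
            parts.append(word[start:i])
            start = i
    parts.append(word[start:])
    return parts

def hyphenate(word):
    # Dictionary first, algorithm as the fallback.
    return EXCEPTIONS.get(word.lower(), naive_split(word))

print("-".join(hyphenate("faustus")))   # faus-tus (from the dictionary)
print("-".join(hyphenate("galaxy")))    # galaxy (left alone)
print("-".join(hyphenate("paper")))     # algorithmic fallback
```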

    About unicode: I didn’t really blame the computer guys, and, yes, as a standard created under specific limitations (seven bits gave 128 available slots, minus the characters without glyphs, like space and the control codes), ascii is probably the best solution (and it was, after all, the American standard). One might also ask: why do I need a font with 30,000 Chinese characters when I can’t read a single syllable of Chinese anyway? Instead of saying, as Bringhurst does, that the extended ascii is the character set of NATO, one might say that it’s the character set of that part of the Western world which has been in close contact during the computer age (Sweden is not a member of NATO, but Swedes can use limited character sets even more easily than the NATO members Danes and Norwegians).
    Yet, taken as a more general argument (that one can always question the reasons behind what we do, even things which seem too trivial to have one), his statements are enlightening.

    About characters and glyphs: Yes again. But… in this case, the argument is mostly concerned with variety vs. simplification. The emphasis on standardization is something for which computer scientists can be blamed (at least the kind who think redundancy is always bad, which is a good principle for programme code, but not necessarily for the outcome), or for which the users of their tools can be blamed, for not questioning this battle against redundancy (I wouldn’t want to insult any programmers out there, would I…? :-) ). When a renaissance printer used 290 glyphs (signs of differing appearance), this was not only to represent different characters (“codes” with different meaning), for which combinations within the ascii set can equally well be used, but also to represent different glyphs for the same character. It is true that some of Adobe’s fonts allow for this kind of glyph substitution, but using it is usually a cumbersome task, and there is no room even in the largest unicode version for two different lowercase Latin “a”s in the same font; they are placed, more or less randomly, in the “private use area”.
    This kind of redundant variety is indeed contributing strongly to the beauty of a printed sixteenth-century page.

  8. Hm. Sorry about the mess-up. Apparently, css style is not allowed in the comment box. I have added the correct markup now.
    And the preview thing — yes, I’ve become aware of that too. I’ve disabled it. I don’t know if it had much of a function anyway — especially if one couldn’t rely on it.
    :-( — wouldn’t that be three different glyphs, as it is written here, but which in many situations are replaced by ugly, over-friendly, yellow single-glyphs, which for obvious reasons are disabled in this blog? :-)

    Well, about them characters and glyphs: on the other hand, Hermann Zapf believed in digital typography, and fonts like Zapfino are not possible without digital technology. Zapfino also has many glyphs for one character; it’s all a question of quality. And it sure doesn’t make sense to have redundancy in encodings.
    By the way, Knuth has nice examples of “randomized” glyphs, i.e. types that have slight, random distortions; this technique makes fonts lively and warm. It’s really only a matter of quality (and I would never argue there’s much general quality in the way things get typeset with computers today; see Word, or rather the common use of it). But there are ways to typeset things beautifully with computers; in the case of Zapfino, even more beautifully than any of the old techniques could manage.

  10. what do you really mean
    why so formal, is that a blog
    is there somewhere we can talk
    this is like greek to me
    all i wanna do is talk
    sorry about your codine heiner
    bye
