
Re: More Dreary Pain-in-the-Ascii Argumentation



** Reply to note from Peter Evans Sun, 02 Nov 1997 01:20:05 +0900

> If we must use "ASCII" for anything above 127, then qualifying it
> with "US IBM" (or whatever) might reduce the risk of confusion.

What, not using "Teletype" for the 5-bit codes 0-31? The high characters are (or
originally were) called "OEM 8-bit characters" -- because typographers
supplemented the 7-bit letters and digits with line drawings or whatever
international characters the client OEM required for an application.
"ANSI" implies a standard, whereas in those days there weren't any standards above 127.

But in common parlance, what we usually mean when we say
"ASCII" is the passive, uninterpreted glyphs assigned to the 8-bit values 0-255.
Exactly what each glyph looks like (and whether the glyph attached to an
individual value varies from font to font) doesn't matter at all. The
sine qua non of these glyphs is that they are visual in nature,
and are written and read in *indivisible* 8-bit gulps; that's what
distinguishes them from binary codes, which consist of ones and zeros, are
generally interpreted, non-visual, and can contain "words" ranging in length from 1
bit to 16 bits to 128 bits to whatever. So, in general, a character ceases to be ASCII
when it is interpreted -- i.e. when it passes from being a symbol and becomes an
instruction. ASCII 13, displayed, is a musical-note glyph; interpreted, 13 decimal
(0D hex) generates a carriage return.
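
Here is a minimal sketch, in Python, of the distinction being drawn above. It
assumes the IBM PC OEM (CP437) screen font's habit of displaying code 13 as a
music-note glyph; the sample bytes and the OEM_GLYPH name are illustrative only:

    # The same 8-bit value, 13 (0x0D), read two ways: as a passive glyph,
    # and as an instruction.  (Assumes the IBM PC OEM/CP437 screen font,
    # where code 13 displays as a music-note glyph.)
    raw = bytes([72, 105, 13, 66, 121, 101])      # H i <13> B y e

    # 1) Uninterpreted: every byte is just a visible symbol, read in
    #    indivisible 8-bit gulps.
    OEM_GLYPH = {13: "\u266A"}                    # code 13 -> music-note glyph
    glyphs = "".join(OEM_GLYPH.get(b, chr(b)) for b in raw)
    print(glyphs)                                 # Hi<note>Bye -- 13 is shown, not obeyed

    # 2) Interpreted: byte 13 stops being a symbol and becomes an
    #    instruction -- a carriage return.
    text = raw.decode("ascii")
    print(repr(text))                             # 'Hi\rBye'
    print(text)                                   # the terminal obeys the CR:
                                                  # cursor returns to column 1,
                                                  # so "Bye" overwrites "Hi"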


-----------
Robert Holmgren
holmgren@xxxxxxxx
-----------