Re: Universal Standards?
Chip Richards wrote in http://www.essential.org/listproc/am-info/msg01338.html:
> And today, we have even more hardware designs to choose from, and several
> OS's which run on all of them. Life is good. It could be better, don't get
> me wrong.
I was thinking of all the different kinds of machines that ran CP/M, and of
how every x86-based machine that wasn't 100% IBM compatible got washed out
because (as I understand it) so many programs worked directly with the
hardware. Do you mean that we have more kinds of microprocessors now? It's
starting to look like that might not last much longer... and one sure
doesn't see them on shelves or in offices, at least not here in Taiwan.
Perhaps you're right, in the strict sense; in practical terms, all I can
say is I don't see it.
As for life being good, for me, life has gotten progressively worse with
every forced software switch. There were gains -- greater capabilities --
but each switch also involved *giving up* capabilities that I valued
greatly. After seven years with WordPerfect it still galls me that I have
no scrolling keys; and now, of course, even after customizing all the keys
I can in Word, not only do I have no scrolling keys, but once I invoke the
menus I can't use any of the command keys sitting under my fingertips. Step
by step, the machine I must do my work on is being crippled.
Life is not good as long as the only software I am allowed to use is M'soft
software, the only programs I ever document are Windows programs, and even
the hardware products I write manuals for are getting M'soft protocols
built into them. Not to pass myself off as some kind of saint, but I feel
the way I suspect a conscientious objector forced to work in a munitions
factory would feel.... Being a foreign worker in a country where M'soft
rules the main industry makes breaking out of this a bit more complicated
than it would be for most people, and that's a corner I backed myself into,
but the fact is, for the time being, for me, life is decidedly not good.
> It was wonderful to have dynamic on-screen instructions.
> And only a few of the best-written applications, then or now, actually have
> this feature.
Quite true. The first program I used had 'em, though, and its resources
were limited to 64K of RAM and a single 128K floppy for program, overlays,
and data.... Those instructions weren't great, but they changed with the
context, and they got me through handsomely. Then I had to switch to a
"clean screen" program, and then to a screenful of tiny, arcane pictures. I
guess I was just spoiled by CP/M.... It seems to me that in this respect
we've gone only backwards, and that's something to look into, and hard.
To get back to Mac advocacy for a second, the implied claim that a system
can be easier to use, that it represents a step forward, *simply because it
is graphical*, is patently false, and just plays into BG & Co.'s hands.
> But in the CP/M days, you could choose text, or ... text. [...]
True; I overlooked graphics. Dunno how many of the display systems used
with CP/M 2.x (and there sure were a lot) could do graphics at all.
Supporting graphics on such disparate display systems is a tall order. I
would be surprised if versions like CP/M-86 and TurboDOS had no graphics
capabilities.... I know people I can ask ;-).
Actually, I was able to print simple graphics in CP/M WordStar by fiddling
with the printer driver -- and now I do vector graphics in DrawPerfect
entirely through WordStar keystrokes (oh, I have a mouse driver, but one
day, after drawing all morning, I discovered I'd forgotten to load it ;-).
> Now, at least with the right OS, you can use text, graphics, or a mix of
> both. I prefer the latter. My most heavily-used X application is xterm.
I couldn't agree more. I'm not saying let's go back to CP/M 2.2! One system
that provides an excellent mix -- whatever capabilities it might not
provide -- is MS/PC DOS, and at the risk of being attacked from ALL sides,
I will say that that is what I currently use at home and whenever I can at
the office, and I find it satisfactory. There's better out there, but
Windows ain't it! The next step up for me will probably be Unixish, but for
now, telnetting to shell accounts will have to do while I concentrate on
getting out from under the Windows yoke at work.
DOSLynx, an early Web browser for MS/PC DOS, combines fast text-based
browsing with the ability to view graphics when desired -- and it uses
WordStar keystrokes for navigation in both documents and menus, right out
of the box! I almost cried when I discovered that.
OK, I got carried away a bit, but I'd say there's ample proof that *no*
software needs to be *dependent* on pointing devices and vendor-specific
keys -- there's nothing wrong with it *using* them, but there's everything
wrong with it *requiring* them -- and we have had a supremely ergonomic,
efficient, and easy-to-learn way of working stolen from us.
> ... and to do *everything people do today* ...
> Oh, really? Which CP/M or Apple II ftp client did you use? Which web
> browser? When was the CP/M or Apple II version of The GIMP released [...]
Point granted; see above.
> ... using standards-based non-printing main-block command keystrokes
> common to every computer on Earth ...
> Sorry, Dan, now you've gone too far. Which standards were you thinking of
Every encoding standard from six-bit ASCII to thirty-two-bit Unicode.
On every system with a Ctrl key, either the keyboard or the o.s. generates
01 hex when you press Ctrl-A, right? And 01 hex is a non-printing code that
software can use as it pleases, right? That's as much a part of ASCII as 41
hex for Shift-A, I'd say. It is true that different programs make different
*uses* of those first 32 ASCII keystrokes, and most software gives you no
recourse to change what those keystrokes do, and I object to that as much
as anyone else -- but the solution is NOT to discard or trivialize Ctrl-key
combinations and force everyone to use pointing devices and function and
arrow keys for actions as basic as moving the cursor. Absolutely not.
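The Ctrl-key arithmetic described above can be sketched in a few lines of
Python (a modern illustration only, nothing from the CP/M era): holding Ctrl
with a letter clears the two high bits of the letter's ASCII code, leaving a
non-printing control code in the range 00-1F hex, so Ctrl-A yields 01 hex
just as Shift-A yields 41 hex.

```python
# Sketch of the ASCII control-key convention: Ctrl-<letter> strips the
# two high bits of the letter's ASCII code (equivalently, AND with 1F hex),
# producing a non-printing control code software may use as it pleases.

def ctrl(key: str) -> int:
    """Return the ASCII control code generated by Ctrl-<key>."""
    return ord(key.upper()) & 0x1F

# Shift-A is 41 hex; Ctrl-A is 01 hex -- both defined by the same standard.
assert ord("A") == 0x41
assert ctrl("A") == 0x01
```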
> The closest thing to keystroke standards I've ever seen were the Wordstar
> control keys, still in use in some modern software, and the Emacs control
> keys, still widely used in lots of different packages.
For more than a decade there has been nothing even remotely similar to
WordStar or Emacs keystrokes in any mass-market application program except
WordStar itself. One or two versions of DOS Edit allowed cursor movement,
but not command invocation, with WordStar keys; whoopee-do. Get into
developers' tools, now -- Borland's IDE, Unix JOE, Emacs, etc. -- and it's
a different story. Keyboard command sets such as WordStar's and Emacs' are
*hugely* popular with programmers, and I WILL NOT REST until everyone who
slogs away at a keyboard eight hours a day can work in a similar fashion if
they so desire.
> Oops -- correction: those keystrokes were common to every computer, every
> operating system, and every application program on Earth until Apple
> REMOVED the Ctrl key, M'soft DISABLED it, IBM REPOSITIONED it, and
> WordPerfect and Lotus IGNORED it, making keyboard use a NIGHTMARE and
> FORCING people to use vendor-specific keys and pointing devices.
> Please, people, don't be misled into believing this. IBM, CDC, Univac,
> GE/Honeywell, Burroughs, Harris, DEC -- they each one thought they had A
> Better Way, and enforced it mercilessly on their customers. [...]
That got sorted out thirty years ago. As I understand it, everyone but IBM
went to ASCII fairly quickly. ISO and ANSI (then called ASA) didn't begin
working on encoding standards until 1960-61, so there was no acceptable
standard *to* follow until a few years later. (One look at an EBCDIC chart
shows why EBCDIC was not acceptable -- and IBM didn't even have a unified
version of that until standards organizations entered the picture.) I would
have to check, but I'm pretty sure some of the companies you mentioned not
only adopted ASCII early on, they actually helped formulate it (IBM, of
course, kept pushing EBCDIC). No, by the dawn of the microcomputer era it
was IBM and EBCDIC vs. The World and ASCII, that much is clear.
> I believe IBM actually invented lock-in back in the sixties. Apple are
> amateurs at it compared to IBM mainframe salesmen. Public standards in
> computing are pretty recent, relatively speaking.
I'm sure IBM and others were at it before the sixties, and IBM has long had
it down to an art. By 1984, however, you would have been *very* hard put to
find a non-ASCII microcomputer or a digital keyboard without Ctrl in the
home row. Then Apple came out with this machine providing everything we
dreamed of but lacking the one feature I valued more than anything else --
and Little Caesar immediately followed suit....
Tell Gatesey-boy and the Microdroids I will not rest. I WILL NOT REST.
> [...] As one who has been using a computer a lot longer than you, Dan, I
> find the 100-key keyboards a delight. Oh, the layout could stand some
> improvement, I agree, but having actual cursor keys (that even *work* in
> most applications!), a numeric keypad, and user-definable function keys is
> a dream come true for someone who spent hours on an IBM 026 keypunch or a
> KSR-33 Teletype. And even *those* were pretty modern machines in their day.
Relax; I didn't say anything about chopping up *your* keyboard -- only my
own. I wouldn't deny you a single key, but I'd sooner go back to the Apple
II's dinky 52-key keyboard than use the *unchangeable* key assignments
found in so many MS/PC DOS apps and those forced on all apps by Windows.
I went from manual typewriters to electrics to mag-cards to a line-display
electronic, each with more keys than the last, and I never thought about it
except to wonder why Backspace was in such a lousy place on all of them.
Then came the Apple II and WordStar. Seemed pretty neat: everything at my
fingertips, the screen to guide me.... Then it hit me that "$08," "$09,"
and "$0D" (I was teaching myself hex) were decimal 8, 9, and 13; that H, I,
and M were the eighth, ninth, and thirteenth letters of the alphabet; and
that if -- as all the books said -- ASCII was universal on microcomputers,
Ctrl-H, Ctrl-I, and Ctrl-M should be Backspace, Tab, and Return not only in
WordStar but in every program running under every o.s. on every machine I
might ever encounter... and *it was so* until well into the x86/68x era --
until programs started reading proprietary keyboard scan codes instead of
ASCII codes (or ignoring both!).
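The pattern that "hit me" above can be checked directly: because A is the
first letter, each letter's position in the alphabet equals its Ctrl-code
value, so Ctrl-H, Ctrl-I, and Ctrl-M land on ASCII 8, 9, and 13 -- Backspace,
Tab, and Carriage Return. A tiny Python sketch (again, purely illustrative):

```python
# H (8th letter), I (9th), and M (13th) generate control codes 8, 9, and 13,
# which ASCII defines as Backspace, Tab, and Carriage Return respectively.

for letter, control_char in [("H", "\b"), ("I", "\t"), ("M", "\r")]:
    position = ord(letter) - ord("A") + 1   # place in the alphabet: 8, 9, 13
    code = ord(letter) & 0x1F               # what Ctrl-<letter> produces
    assert position == code == ord(control_char)
```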
The days of EBCDIC, FIELDATA, Display Code, etc., may be firmly etched in
your mind, Chip, but they *are* well behind us. The ASCII control codes are
part of Unicode, and mappings like Ctrl-H=Backspace and Ctrl-M=Return are
pre-coded into more printers, modems, command-line interpreters, and
editors than you can shake a stick at. We *do* have standards now. They
were approved because they allow enough leeway for any reasonable company
to innovate, so insofar as they provide hard and fast rules those rules
*must be followed*; not to follow them is to take us back to the fifties
and sixties, and we must not put up with that.
I will not rest....
I don't agree with everything you say, Chip, but I must thank you for
reining me in some. Perhaps I entered the computer age at an unusual time:
when everything was incredibly open and what standards we had were pretty
religiously followed. Let us remember that there can be such times.