Re: Moderately improved map of Microsoft tying evidence
I strongly support the idea of putting together
hard factual evidence of the global strategy
implemented by MS over the years.
What is *very* difficult in this kind of task, though,
is that software *is* complex, and it takes a lot of time
to be able to assess the merits and demerits of a system
in an objective way. I am not surprised by all these mails
saying "System X is better than system Y: I tried both"...
If you do not understand the inner workings of the systems,
you can only compare individual experiences, which can vary
wildly. Few people have the knowledge (and the time) it takes
to trace the origin of a problem they have faced and to assess
the seriousness of the design error underlying it.
Actually, this is the very core of the problem: it is too
easy to state that a given design error is not an error but
"the best we know how to do in such a rapidly evolving complex
technological field". To counter such an argument you must
explain how to do better to people that will not understand
what you are talking about.
Nevertheless, I have an example that everybody knows, and maybe
somebody could try to "explain" its underlying technological
issues in language accessible to everybody.
I am talking about defragmentation utilities :-)
I have personally met lots of people who were very proud of
knowing how to use a defragmentation utility under DOS/Windows
and were very surprised to see that I had no such magic tool
on my portable running NextStep (which has a Mach-based BSD
Unix underneath, if you know what I am talking about).
The typical reaction was: "ah... the poor boy does not have
my magic tool".
I usually take the time to explain that I do not *need* it,
because Unix implements a disk management policy that avoids
fragmentation in the first place. Wait, I do not use these
words :-) I tell them a little story about how they would go
about filling and emptying different-sized drawers in their
closet to avoid leaving too many drawers half-filled. (Don't
ask me to do this exercise in English: I can get by with what
I need for my work as a computer scientist, but I am not a
novel writer :-)).
Up to now, everybody has understood, in the end, that their
defragmentation utility is not a "feature", but blatant proof
of inferior technology and of a lack of interest in the user's
needs.
Here the reactions vary, though: they have been told that a
defragmentation utility "makes your disk go faster", but not
that "the silly MS system makes your disk go slower", and
they do not all handle this information the same way.
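For those who want to see the drawers story with their own eyes,
here is a little toy simulation in Python. Everything in it (the
disk size, the file sizes, the two policies) is a made-up
illustration of the idea, not the real FAT or BSD code: it
compares a policy that grabs the first free blocks anywhere on
the disk with one that first looks for a contiguous run, and it
counts into how many pieces files end up split after a random
workload of creations and deletions.

import random

DISK = 256  # disk size in blocks; an arbitrary toy figure

def first_fit(free, n):
    # FAT-style (simplified): hand out the lowest-numbered free
    # blocks, wherever they happen to lie on the disk.
    blocks = sorted(free)[:n]
    for b in blocks:
        free.remove(b)
    return blocks

def contiguous_fit(free, n):
    # Unix-FFS-style (simplified): look for a contiguous run of n
    # free blocks first; fall back to first-fit only if none exists.
    ordered = sorted(free)
    for i in range(len(ordered) - n + 1):
        run = ordered[i:i + n]
        if run[-1] - run[0] == n - 1:   # a contiguous run
            for b in run:
                free.remove(b)
            return run
    return first_fit(free, n)

def fragments(blocks):
    # Into how many contiguous pieces is this file split?
    bs = sorted(blocks)
    return 1 + sum(1 for a, b in zip(bs, bs[1:]) if b != a + 1)

def simulate(alloc):
    random.seed(0)  # same random workload for both policies
    free, files = set(range(DISK)), {}
    for step in range(2000):
        if files and random.random() < 0.5:
            # delete a random file, returning its blocks to free
            free |= set(files.pop(random.choice(list(files))))
        else:
            n = random.randint(1, 8)    # a new file of 1-8 blocks
            if len(free) >= n:
                files[step] = alloc(free, n)
    return sum(fragments(b) for b in files.values()) / max(len(files), 1)

print("scattered allocation,  avg pieces per file:", simulate(first_fit))
print("contiguous allocation, avg pieces per file:", simulate(contiguous_fit))

Running it, you should see the "grab anything" policy splitting
the average file into noticeably more pieces than the contiguous
one: that is the whole defragmentation story in forty lines.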
Also in the category "slowing down technological evolution /
fooling the customer", we should put the braindead directory
scheme introduced in DOS and only mildly reworked in today's
FAT file systems. I mean, unless you take a hammer to your disk,
you should never find a damaged directory structure on a Unix
system (this is guaranteed by the inode data structure plus the
fsck algorithm), while I have personally seen many disks
completely ruined by "ScanDisk" (the last one this summer:
ScanDisk "repaired" the disk, and afterwards all the filenames
were turned into gibberish and all the data was lost).
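Again for the curious, here is a toy sketch in Python of why the
inode design is checkable. The structures and the two checks
below are made up for illustration and are a drastic
simplification of a real file system and of fsck; the point is
that a directory entry holds only a name and an inode number,
all the real metadata lives in the inode table, and the
redundancy between the two lets a checker detect and repair an
inconsistency instead of silently destroying your files.

from dataclasses import dataclass, field

@dataclass
class Inode:
    number: int
    link_count: int   # how many directory entries point at me
    blocks: list = field(default_factory=list)  # data block numbers

# Directory entries carry *only* a name and an inode number;
# everything else about a file lives in the inode.  Losing a name
# loses the name, not the data, and the duplication between link
# counts and entries is exactly what a checker can cross-verify.
inodes = {1: Inode(1, 1, [10, 11]),
          2: Inode(2, 2, [12])}        # inode 2 is hard-linked twice
root = [("report", 1), ("data", 2), ("backup", 2),
        ("stale", 1)]                  # "stale": leftover from a crash

def toy_fsck(inodes, entries):
    # Check 1: each inode's link count must match the number of
    # directory entries that actually reference it.
    refs = {}
    for _name, ino in entries:
        refs[ino] = refs.get(ino, 0) + 1
    for ino, inode in inodes.items():
        found = refs.get(ino, 0)
        if inode.link_count != found:
            print(f"inode {ino}: recorded {inode.link_count} links, "
                  f"found {found} entries -> repairable inconsistency")
    # Check 2: no data block may be claimed by two inodes at once.
    owner = {}
    for ino, inode in inodes.items():
        for b in inode.blocks:
            if b in owner:
                print(f"block {b} claimed by inodes {owner[b]} and {ino}")
            owner[b] = ino

toy_fsck(inodes, root)  # flags the stale entry instead of losing data

FAT, by contrast, keeps a file's metadata only in its directory
entry, so when that entry is damaged there is nothing left to
cross-check it against; that is why a "repair" can turn every
name into gibberish.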
Anybody willing to try his hand at writing a little story,
accessible to everybody, on these easy-to-verify facts?
--Roberto Di Cosmo
--------------------------------------------------------------
LIENS-DMI                  E-mail: dicosmo@dmi.ens.fr
Ecole Normale Superieure   WWW   : http://www.ens.fr/~dicosmo
45, Rue d'Ulm              Tel   : ++33-(0)1-44 32 20 40
75230 Paris CEDEX 05       Fax   : ++33-(0)1-44 32 20 80
FRANCE                     MIME/NextMail accepted
--------------------------------------------------------------