In comp.os.linux.advocacy, AeoN wrote
on Sat, 06 Nov 2004 14:34:50 -0800:
It gets weirder. I brought up that page; it was a bit thin on specifics,
though it at least raises the topic of Microsoft's selective fact-pulling.
However, the ad that came with it asks the question
"Which offers superior performance: Windows [green type]
or Linux [red type]?"
The Web server performance comparison hints at 276% better peak
performance. (The actual graph is more like 8 to 3, so the
numbers aren't too inconsistent internally. The source is
Veritest, according to the graph. A few clicks, and one gets to
the Veritest report, which has at least two tests; they also compared
Samba's performance with Microsoft's on a PowerEdge 3500. One has
to ask why they didn't run a similar comparison
using a pure Linux/NFS network as a third option; Samba is
a bit of a kludge because of the proprietary nature of CIFS/SMB,
though it's a nice, easy-to-use solution.)
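A back-of-the-envelope check (my sketch; the bar heights below are guessed from the description above, not read from the actual graph) suggests the ad's "276%" is closest to reading the ratio as a "percent of" figure rather than "percent better":

```python
# Toy consistency check: does "276% better peak performance" square
# with a chart showing roughly 8 to 3? (Bar heights are assumptions.)
windows_peak = 8.0   # assumed relative bar height for Windows
linux_peak = 3.0     # assumed relative bar height for Linux

ratio = windows_peak / linux_peak        # ~2.67x
as_percent_of = ratio * 100              # ~267% "of" Linux's figure
better_by = (ratio - 1) * 100            # ~167% "better than" Linux

print(f"{ratio:.2f}x -> {as_percent_of:.0f}% of, {better_by:.0f}% better")
```

So an 8-to-3 graph gives roughly 267% "of", close to the ad's 276%; the honest "better than" reading would be only about 167%.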
The second page brings up the TCO ad; again Windows gets green,
and is touted as 14% less expensive.
Three guesses who's sponsoring this controversy. I'm hoping
for some real unbiased numbers, but there are several problems:
 NRE is part of TCO, and is highly variable. If the staff is
expert in Unix (or, for that matter, Linux), it's a no-brainer
unless shown otherwise. If the staff knows nothing of Unix or
Linux, but is very conversant with Microsoft tools, then it's
also a no-brainer unless shown otherwise (e.g., lots of viruses).
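To make the NRE point concrete, here's a toy sketch; every number in it is invented purely for illustration, not taken from any study:

```python
# Toy TCO sketch (all figures invented): licensing is only one line
# item, and staff retraining (a form of NRE) can easily swamp it.
def tco(license_per_seat, seats, retrain_per_admin, admins,
        support_yearly, years):
    """Sum three of the many line items that go into a TCO estimate."""
    return (license_per_seat * seats
            + retrain_per_admin * admins
            + support_yearly * years)

# Hypothetical shop with Windows-expert staff weighing a Linux switch:
windows = tco(license_per_seat=300, seats=100,
              retrain_per_admin=0, admins=5,
              support_yearly=10_000, years=3)
linux = tco(license_per_seat=0, seats=100,
            retrain_per_admin=8_000, admins=5,
            support_yearly=12_000, years=3)

print(windows, linux)  # the zero-license option isn't automatically cheaper
```

With these made-up inputs the "free" system comes out more expensive; flip the staff's expertise and the result flips too, which is the point.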
 So far, my thinking on FOSS is that it might be, occasionally,
harder to *find* stuff than to construct it. This may change
as search engines improve and/or the organization of software
distribution solidifies, but there are, for instance, over
a dozen freeware charting packages. There are also several
databases, a large number of editors, and even three Java
JDKs: Sun, Kaffe, and a third one whose name escapes me
(I don't think it's gcj, though; that's just a compiler,
though a good one).
 I'm not knowledgeable enough about IT to suggest which is cheaper
to maintain after installation, and that's also somewhat
variable, as the Linspire root controversy shows. Config
of a secure system is well-known, but that doesn't mean
everyone knows it, and IT might on occasion do workarounds
rather than root-cause analysis, if the root cause turns out to
require expensive replacement of, say, a software switch,
or a lot of NRE.
 Fetches per second, as hinted at in the ad, is also highly variable.
If one has a tiny, simple page, it might take a tenth of a
millisecond. If one has a complex page that requires a database
fetch or two or many, it might take a few seconds on an *unloaded*
system, never mind a computer that's busy. Of course anything
more than about 2 seconds is bordering on poor design; anything
more than 8 seconds runs the risk of losing customers -- but there
are a fair number of factors, not the least of which is the patience
of someone downloading a huge picture (e.g., from the USGS website,
or perhaps maps.yahoo.com).
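The arithmetic behind that variability is simple enough to sketch (my numbers, not the ad's): throughput is bounded by per-page service time, so a fetches-per-second figure means little without saying what kind of page is served.

```python
# Rough sketch: the upper bound on fetch rate for a given per-page
# service time. The example times are illustrative guesses.
def fetches_per_second(page_time_seconds, workers=1):
    """Upper bound on throughput when each worker spends
    page_time_seconds serving one page."""
    return workers / page_time_seconds

static_page = 0.0001   # ~0.1 ms for a tiny, simple page
dynamic_page = 2.0     # a couple of seconds with several database fetches

print(fetches_per_second(static_page))   # ~10,000/s per worker
print(fetches_per_second(dynamic_page))  # 0.5/s per worker
```

A four-orders-of-magnitude spread from page complexity alone dwarfs any plausible OS-level difference, which is why the ad's single number is so slippery.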
So what's the real, real truth here? An interesting question all around.
I suspect Microsoft has a consulting/professional services department
to help customers (for a fee), which might be the best way to handle
this for payware. I don't know what Red Hat, SuSE, or others offer
in that space, though Cygnus for a while had a subscription service
for support of their stuff.
We've seen some of the tweaks they've done to systems in the Mindcraft
benchmarks.
It's still legal to go .sigless.