Linear Speedup of a TM

Hi All,

I am reading Computational Complexity by Papadimitriou and I have run
into the Linear Speedup Theorem.  It states that for any Turing
machine that halts in time f(n) (where n is the input size), and for
any a > 0, there exists a Turing machine that decides the same
language as the original and halts in time a*f(n) + n + 2.  The proof
idea is that m symbols of the original TM's tape can be combined into
a single symbol of the constructed TM; then, in order to simulate m
steps of the original machine, the constructed TM need only look at
the current (compound) symbol and the two adjacent ones (and it must,
of course, contain many, many new states).  Without going into more
detail: Papadimitriou claims this implies that constants make no
difference in complexity theory, which makes sense.  But he also
claims that in, for instance, a polynomial-time algorithm halting in,
say, n^2 + n + 9 steps, the linear term n makes no difference either.
I don't see how this is so.  We can make constants arbitrarily small,
even constants in front of terms: if the bound were c*n^2 we could
make a equal 1/c by picking an appropriate m.  But n will always play
a role (however insignificant that role might be).  Am I missing
something, or was Papadimitriou simply being a bit vague and
imprecise?
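To make the arithmetic concrete, here is a small sketch (my own illustration, not the TM construction itself) that evaluates the theorem's bound a*f(n) + n + 2 for f(n) = n^2 + n + 9. The "+ n" term does survive in the sped-up bound, as I suspect, but the ratio of the bound to n^2 still approaches a as n grows, which is presumably what "makes no difference" is meant to convey:

```python
# Illustrative arithmetic for the Linear Speedup Theorem, assuming
# f(n) = n^2 + n + 9 as in the example above.  This is NOT the
# machine construction, just the step-count comparison.

def original_steps(n):
    # running time of the original machine
    return n * n + n + 9

def sped_up_bound(n, a):
    # bound guaranteed by the theorem; a > 0 can be made
    # arbitrarily small by choosing a large enough block size m
    return a * original_steps(n) + n + 2

for n in (10, 100, 10_000, 1_000_000):
    b = sped_up_bound(n, a=0.01)
    # the ratio b / n^2 approaches a = 0.01 as n grows, even though
    # the linear "+ n" term never disappears from the bound
    print(n, b, b / (n * n))
```

So the linear term is still there for every finite n; it just gets swallowed by the n^2 term asymptotically.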

K-Stern (27)
6/26/2003 2:13:11 PM
comp.theory
