Much recent work in the theory of computational complexity ([Me], [FR], [S1]) is concerned with establishing “the complexity” of various recursive functions, as measured by the time or space requirements of Turing machines which compute them. In the work cited above, we also observe another phenomenon: knowing the values of certain functions makes certain other functions easier to compute than they would be without this knowledge. We could say that the auxiliary functions “help” the computation of the other functions.
For example, we may conjecture that the “polynomial-complete” problems of Cook [C], Karp [K], and Stockmeyer [S2], such as satisfiability of propositional formulas or 3-colorability of planar graphs, in fact require time proportional to $n^{\log_2 n}$ to be computed on a deterministic Turing machine. Then, since the time required to decide whether a planar graph with n nodes is 3-colorable can be lowered to a polynomial in n if we have a precomputed table of the satisfiable formulas of the propositional calculus, it is natural to say that the satisfiability problem “helps” the computation of answers to the 3-coloring problem. Similar remarks may be made for any pair of polynomial-complete problems.
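The reasoning here is the standard reduction argument, which we sketch for concreteness; the notation $A \le_m^p B$ (polynomial-time many-one reducibility) and the names $f$ and $q$ are our shorthand, not notation introduced in the text. If $A \le_m^p B$ via a function $f$ computable in time $q(n)$ for some polynomial $q$, then

\[
x \in A \iff f(x) \in B ,
\]

so a machine holding a precomputed table of $B$ decides membership in $A$ in the time $q(|x|)$ needed to compute $f(x)$, plus a single table lookup. Taking $A$ to be 3-colorability of planar graphs and $B$ the set of satisfiable propositional formulas yields exactly the situation just described.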
As a further illustration, Meyer and Stockmeyer [MS] have shown that, for a certain alphabet Σ, recognition of the set of regular expressions with squaring which are equivalent to Σ* requires Turing machine space $c^n$, for some constant c, on an infinite set of arguments. We also know that this set of regular expressions, which we call RSQ, may actually be recognized in space $d^n$ for some other constant d. Theorem 6.2 in [LMF] implies that there is some problem (not necessarily an interesting one) of complexity approximately equal to that of RSQ which does not reduce the complexity of RSQ below $c^n$; that is, it does not “help” the computation of RSQ.
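One informal reading of these examples, offered only as a sketch (the precise formalizations are developed in the body of the paper and need not coincide with this one): writing $C_A$ for the complexity of $A$, and $C_A^B$ for the complexity of $A$ when the values of $B$ are available at no cost, we might say

\[
B \ \text{``helps''} \ A \quad\Longleftrightarrow\quad C_A^B \ \text{is significantly lower than} \ C_A .
\]

In these terms the table of satisfiable formulas helps the 3-coloring problem, while the problem supplied by Theorem 6.2 of [LMF], though of complexity comparable to that of RSQ, does not help RSQ.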