Levin, Russ. Math. Surv., 25:6(1970), 83–124.
2.8.3. [12] Show that given x, y, and C(x,y), one can compute C(x) and C(y) up to an additive logarithmic term O(log C(x,y)).

Comments. Hint: use symmetry of information and upper semicomputability. Suggested by L. Fortnow.

2.8.4. [28] Let ω = ω_1 ω_2 ... be an infinite binary sequence. The entropy function H(p) is defined by H(p) = p log 1/p + (1 − p) log 1/(1 − p). Let $\lim_{n\to\infty} \frac{1}{n}\sum_{i=1}^{n} \omega_i = p$.

(a) Show that
$$C(\omega_{1:n} \mid n) \le nH\!\left(\frac{1}{n}\sum_{i=1}^{n} \omega_i\right) + \log n + c.$$
(A numerical illustration of this bound is sketched below, following Exercise 2.8.5.)

(b) Prove the following: if the ω_i's are generated by coin flips with probability p for outcome 1 (a Bernoulli process with probability p), then for all $\epsilon > 0$,
$$\Pr\left\{\omega : \left|\frac{C(\omega_{1:n} \mid n)}{n} - H(p)\right| > \epsilon\right\} \to 0$$
as n goes to infinity.

2.8.5. [26] Show that 2C(a,b,c) ≤ C(a,b) + C(b,c) + C(c,a) + O(log n).

Comments. For an application relating the 3-dimensional volume of a geometric object in Euclidean space to the 2-dimensional volumes of its projections, see the discussion in Section 6.13 on page 530. Hint: use the symmetry of information, Theorem 2.8.2. Source: D. Hammer and A.K. Shen, Theor. Comput. Syst., 31:1(1998), 1–4.
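As rough orientation on Exercise 2.8.5, one possible route combines subadditivity of C with the symmetry of information, Theorem 2.8.2; this is only a sketch, with the exact constants and conditioning details left to the exercise. All (in)equalities below hold up to O(log n) additive terms:
$$\begin{aligned}
C(a,b,c) &\le C(a,b) + C(c \mid a),\\
C(a,b,c) &\le C(b,c) + C(a \mid c),\\
C(c \mid a) + C(a \mid c) &\le C(c \mid a) + C(a) = C(c,a).
\end{aligned}$$
Adding the first two inequalities and bounding the conditional terms by the third yields 2C(a,b,c) ≤ C(a,b) + C(b,c) + C(c,a) + O(log n).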
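To make the bound of Exercise 2.8.4(a) concrete, here is a minimal numerical sketch. Since C is incomputable, the sketch substitutes the output length of an off-the-shelf compressor (zlib) as a crude, computable upper-bound stand-in for C; the parameters n and p and the helper names are illustrative choices, not from the text.

```python
# Minimal sketch for Exercise 2.8.4(a), under stated assumptions:
# C(.) is incomputable, so zlib's output length serves only as a crude,
# computable upper-bound stand-in; n, p, and helper names are illustrative.
import math
import random
import zlib

def binary_entropy(p: float) -> float:
    """H(p) = p log 1/p + (1 - p) log 1/(1 - p), in bits; H(0) = H(1) = 0."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return p * math.log2(1.0 / p) + (1.0 - p) * math.log2(1.0 / (1.0 - p))

def pack_bits(bits: list) -> bytes:
    """Pack a 0/1 list into bytes so the compressor sees a dense encoding."""
    out = bytearray()
    for i in range(0, len(bits) - len(bits) % 8, 8):
        byte = 0
        for b in bits[i:i + 8]:
            byte = (byte << 1) | b
        out.append(byte)
    return bytes(out)

random.seed(0)
n, p = 200_000, 0.1
omega = [1 if random.random() < p else 0 for _ in range(n)]  # Bernoulli(p) flips

freq = sum(omega) / n                            # (1/n) * sum of the omega_i
bound = binary_entropy(freq) + math.log2(n) / n  # right-hand side of (a), per bit
zlib_rate = 8 * len(zlib.compress(pack_bits(omega), 9)) / n

print(f"frequency of ones    : {freq:.4f}")
print(f"H(freq) + (log n)/n  : {bound:.4f} bits/bit  (bound of part (a))")
print(f"zlib compressed rate : {zlib_rate:.4f} bits/bit")
print(f"incompressible rate  : 1.0000 bits/bit")
```

The compressed length upper-bounds C(ω_{1:n} | n) only up to an additive constant, and zlib adds its own inefficiency, so its rate typically falls between H(p) and 1. Part (a) is a claim about C itself, which no particular compressor attains.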
2.9 History and References

The confluence of ideas leading to Kolmogorov complexity is analyzed in Section 1.8 through Section 1.12. Who did what, where, and when is exhaustively discussed in Section 1.13. The relevant documents are dated R.J. Solomonoff, 1960/1964, A.N. Kolmogorov, 1965, and G.J. Chaitin, 1969. According to L.A. Levin, Kolmogorov in his talks used to give credit also to A.M. Turing (for the universal Turing machine). The notion of nonoptimal complexity (as a complexity based on shortest descriptions but lacking the invariance theorem) can be attributed, in part, also to A.A. Markov [Soviet Math. Dokl., 5(1964), 922–924] and G.J. Chaitin [J. ACM, 13(1966), 547–569], but that is not a very crucial step beyond Shannon's coding concepts.

The connection between incompressibility and randomness was made explicit by Kolmogorov and later by Chaitin. Theorem 2.2.1 is due to Kolmogorov. The idea to develop an algorithmic theory of information is due to Kolmogorov, as is the notion of deficiency of randomness. Universal a priori probability (also based on the invariance theorem) is due to Solomonoff. This is treated in more detail in Chapter 4. (Solomonoff did not consider descriptional complexity itself in detail.)

In his 1965 paper, Kolmogorov mentioned the incomputability of C(x) in a somewhat vague form: "[...] the function $C_\phi(x \mid y)$ cannot be effectively calculated (generally recursive) even if it is known to be finite for all x and y." Solomonoff also suggests this in his 1964 paper: "it is clear that many of the individual terms of Eq. (1) are not 'effectively computable' in the sense of Turing [... but can be used] as the heuristic basis of various approximations." Related questions were considered by L. Löfgren [Automata Theory, E. Caianiello, ed., Academic Press, 1966, 251–268; Computer and Information Sciences II, J. Tou, ed., Academic Press, 1967, 165–175]. Theorem 1 in the latter reference demonstrates in general that for every universal function $\phi_0$, $C_{\phi_0}(x)$ is not recursive in x. (In the invariance theorem we considered only universal functions using a special type of coding.)

Despite the depth of the main idea of Kolmogorov complexity, the technical expression of the basic quantities turned out to be inaccurate in the sense that many important relationships hold only to within an error