Abstract – (Bayesian/Algorithmic) Information theory, one- and two-part compression, and measures of intelligence – David Dowe

On August 19, 2011, by Adam A. Ford

Greg Chaitin wrote in 1982: “… develop formal definitions of intelligence and measures of its various components [using algorithmic information theory, a.k.a. Kolmogorov complexity]”. We make a case for the relationship between information theory and intelligence. More specifically, we begin by introducing the Bayesian information-theoretic notion of Minimum Message Length (MML) machine learning, and then […]
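
The abstract stops short of the mechanics, but the two-part idea named in the title can be stated briefly: a hypothesis H and the data D are sent as a single message whose length is -log2 Pr(H) (the assertion of the hypothesis) plus -log2 Pr(D | H) (the data encoded assuming that hypothesis), and MML prefers the hypothesis giving the shortest total message. The Python sketch below is an illustrative toy, not material from the talk: it compares two fixed coin hypotheses given equal prior probability, and it ignores the cost of stating a continuous parameter to finite precision, which a full MML treatment would include.

# Minimal sketch (illustrative only) of the two-part message-length idea behind MML.
# A hypothesis H and data D are sent together:
#   total length = -log2 Pr(H)        (first part: state the hypothesis)
#                  -log2 Pr(D | H)    (second part: encode the data given H)
# The hypothesis giving the shortest total message is preferred.

from math import log2

def two_part_length(prior_h: float, likelihood_d_given_h: float) -> float:
    """Length in bits of a two-part message for hypothesis H and data D."""
    return -log2(prior_h) - log2(likelihood_d_given_h)

# Toy data: 20 coin flips with 15 heads, and two candidate hypotheses.
heads, tails = 15, 5

def likelihood(p: float) -> float:
    # Probability of this particular flip sequence under heads-probability p.
    # (The binomial coefficient is common to both hypotheses, so it is omitted.)
    return (p ** heads) * ((1.0 - p) ** tails)

candidates = {
    "fair coin (p = 0.5)":    (0.5, likelihood(0.5)),   # equal prior on each hypothesis
    "biased coin (p = 0.75)": (0.5, likelihood(0.75)),
}

for name, (prior, lik) in candidates.items():
    print(f"{name}: {two_part_length(prior, lik):.2f} bits")

Running this shows the biased-coin hypothesis gives the shorter two-part message for 15 heads in 20 flips (roughly 17 bits versus 21), which is the sense in which compression and inference coincide in the MML framework.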