Algorithmic Information and Randomness
Martin-Löf's 1966 discovery that the theory of computing,
and only the theory of computing, provides a satisfactory definition of the
randomness of individual binary sequences was an
early and dramatic indicator of things to come.
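One standard way to make this precise is the Levin-Schnorr theorem, recalled here for orientation (a classical result, not a claim of this page): writing K for prefix Kolmogorov complexity and S↾n for the first n bits of a binary sequence S,

\[
  \text{$S$ is Martin-Löf random} \iff (\exists c)(\forall n)\; K(S \upharpoonright n) \ge n - c.
\]

That is, a sequence is random exactly when none of its prefixes can be compressed by more than a fixed constant number of bits.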
In addition to providing hardware and software tools, computer science
has developed deep theoretical tools that are conceptually
transforming the foundations of several areas of science.
Many of these tools, like randomness, arise from the growing
convergence of theoretical computer science with information theory.
I have been involved in the development of several
types of effective fractal dimensions:
Constructive dimension (2000)
Finite-state dimension (2001, with Dai, Lathrop, and Mayordomo)
Effective strong dimension (2003, with Athreya, Hitchcock, and Mayordomo)
Mutual dimension (2013, with Case)
Conditional dimension (2016, with N. Lutz)

The first of these refines randomness and gives a new
characterization of Kolmogorov complexity. The second sheds new light
on normal sequences and data compression. The third shows that
packing dimension (an ostensibly deeper fractal dimension
developed in 1982) is an exact dual of Hausdorff dimension with
similarly useful effectivizations. The last two are dimension-theoretic
analogs of mutual information and conditional entropy.
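For orientation, the Kolmogorov complexity characterizations underlying the first and third of these are standard results in this line of work, recalled here as background rather than as new claims: for every binary sequence S,

\[
  \dim(S) = \liminf_{n \to \infty} \frac{K(S \upharpoonright n)}{n},
  \qquad
  \mathrm{Dim}(S) = \limsup_{n \to \infty} \frac{K(S \upharpoonright n)}{n},
\]

where dim(S) is the constructive dimension of S, Dim(S) is its constructive strong dimension, and K is prefix Kolmogorov complexity. In particular, every Martin-Löf random sequence has dimension 1, and a sequence is normal exactly when its finite-state dimension is 1, which is the sense in which finite-state dimension sheds light on normality and data compression.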
I am currently working with colleagues and students to extend these developments,
with applications to fractal geometry, dynamical systems, chemical reaction networks, and computational complexity.
John Hitchcock maintains online bibliographies on resource-bounded
measure and effective
fractal dimensions, with links to available papers in the latter.