Abstract
Of all the sub-disciplines of philosophy, the philosophy of science has perhaps the most privileged relationship to information theory. This relationship has been forged through a common interest in themes such as induction, probability, confirmation, simplicity, non-ad-hocness, unification and, more generally, ontology. It also has historical roots. One of the founders of algorithmic information theory (AIT), Ray Solomonoff, produced his seminal work on inductive inference as a direct result of grappling with problems he first encountered as a student of the influential philosopher of science Rudolf Carnap. There are other such historical connections between the two fields. Alas, there is no space to explore them here. Instead, this essay restricts its attention to a broad and accessible overview of the aforementioned common themes, which, given their nature, mandate an emphasis on AIT as opposed to general information theory.