Objects have the capacity to distinguish themselves from other objects, and from themselves at different times. The interaction of objects, together with the process of making distinctions, results in the transfer of a quantity that we call information. Some objects can distinguish themselves in more ways than others; these objects have a greater information capacity. The quantification of how objects distinguish themselves, and the relationship of this process to information, is the subject of this book.
As individual needs have arisen in the fields of physics, electrical engineering and computer science, diverse theories of information have been developed to serve as conceptual instruments to advance each field. Building on the foundational statistical mechanics of Maxwell and Boltzmann, an entropic theory of information was developed by Brillouin, Szilard and Schrödinger. In the field of communications engineering, Shannon formulated a theory of information using an entropy analogue. In computer science, a “shortest descriptor” theory of information was developed independently by Kolmogorov, Solomonoff and Chaitin.
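For orientation, the three measures can be stated in their standard textbook forms (standard notation, not quoted from this book):

\[
S = k_B \ln W, \qquad
H(X) = -\sum_i p_i \log_2 p_i, \qquad
K(x) = \min\{\, |p| : U(p) = x \,\},
\]

where $W$ is the number of microstates compatible with a given macrostate, $p_i$ is the probability of the $i$-th symbol of the source $X$, and $|p|$ is the length of a program $p$ that causes a fixed universal machine $U$ to output $x$. Despite their different origins, each quantity measures how many alternatives an object, a message or a string can be distinguished among.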
The considerations presented in this book are an attempt to illuminate the common and essential principles of these approaches and to propose a unifying, non-semantic theory of information. This is done in three steps: by demonstrating that the three major theories listed above can be unified under the concept of asymmetry; by deriving a general equation of information through the algebra of symmetry, namely group theory; and by making a strong case for the thesis that information is grounded in asymmetry.
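A schematic illustration of the last point (a toy example under simplifying assumptions, not the general equation derived in this book): suppose an object admits $N$ configurations and a symmetry group $G$ acts freely on them, so that configurations lying in the same orbit are mutually indistinguishable. Only the $N/|G|$ orbits can then be told apart, and the object's information capacity is

\[
I \;=\; \log_2 \frac{N}{|G|} \;=\; \log_2 N - \log_2 |G| .
\]

The more symmetric the object, the larger $|G|$ and the smaller $I$; capacity is gained precisely as symmetry is broken, which is the sense in which information is grounded in asymmetry.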