Information is a central concept in our daily life. We rely on information in order to make sense of the world: to make 'informed' decisions. We use information technology in our daily interactions with people and machines. Even though most people are perfectly comfortable with their day-to-day understanding of information, the precise definition of information, along with its properties and consequences, is not always as well understood. I want to argue in this opinion piece that a precise understanding of the concept of information is crucial to a number of scientific disciplines. Conversely, a vague understanding of the concept can lead to profound misunderstandings, both in daily life and in the technical scientific literature. My purpose is to introduce the concept of information, mathematically defined, to a broader audience, with the express intent of eliminating a number of common misconceptions that have plagued the progress of information science in different fields.

What is information? Simply put, information is that which allows you (who is in possession of that information) to make predictions with accuracy better than chance. Even though this sentence appears glib, it captures the concept of information fairly succinctly. But the concepts introduced in it need to be clarified. What do I mean by prediction? What is 'accuracy better than chance'? Predictions of what?

We all understand that information is useful. When was the last time that you found information to be counterproductive? Perhaps it was the last time you watched the news. I will argue that, when you thought the information you were given was not useful, what you were exposed to was most likely not information. That stuff, instead, was mostly entropy (with a little bit of information thrown in here or there). Entropy, in case you have not yet come across the term, is just a word we use to quantify how much is not known.

But, isn't entropy the same as information? One of the objectives of this comment is to make the distinction between the two as clear as possible. Information and entropy are two very different objects. They may have been used synonymously (even by Claude Shannon, the father of information theory, who is thus in part responsible for a persistent myth), but they are fundamentally different. If the only thing you will take away from this article is your appreciation of the difference between entropy and information, then I will have succeeded.

But let us go back to our colloquial description of what information is, in terms of predictions. Well, in general, when we make predictions, they are about a system that we do not already know. This other system can be anything: the stock market, a book, the behaviour of another person. But I have told you that we will make the concept of information mathematically precise. In that case, I have to specify this 'other system' as precisely as I can. I have to specify, in particular, which states the system can take on. This is, in most cases, not particularly difficult. If I am interested in quantifying how much I do not know about a phone book, say, I just need to tell you the number of phone numbers in it.
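The phone-book remark gestures at the standard way of quantifying 'how much is not known': if a system can be in any of N states, and you consider them all equally likely, your uncertainty is log2(N) bits. A minimal sketch in Python (the function name and the figure of 1,024 entries are illustrative, not taken from the text):

```python
import math

def uncertainty_bits(num_states: int) -> float:
    """Entropy, in bits, of a system with num_states equally likely states.

    With N equally likely alternatives, the uncertainty is log2(N): the
    number of yes/no questions needed to single out one alternative.
    """
    if num_states < 1:
        raise ValueError("need at least one possible state")
    return math.log2(num_states)

# A hypothetical phone book with 1,024 entries: before consulting it,
# your uncertainty about which entry is the right one is 10 bits.
print(uncertainty_bits(1024))  # → 10.0
```

This is the equiprobable special case of Shannon entropy; if some states are more likely than others, the uncertainty is correspondingly smaller.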