Abstract

The conventional wisdom is that the concept of information is closely related to the concept of probability. In Shannon’s information theory, information is equated to a reduction in entropy—a probabilistic concept. In this paper, a different view of information is put on the table: information is equated to restriction. More concretely, a restriction is a limitation on the values which a variable can take. The concept of a restriction is more general than that of a constraint or a probability distribution. There are three principal kinds of restrictions: possibilistic, probabilistic and bimodal. A bimodal restriction is a combination of possibilistic and probabilistic restrictions.

Underlying the restriction-centered approach to information is what may be called the Information Principle. Briefly stated, the Information Principle has two parts. (a) There are three principal types of information: possibilistic information, probabilistic information and bimodal information. Bimodal information is a combination of possibilistic information and probabilistic information. (b) Possibilistic information and probabilistic information are underivable (orthogonal), in the sense that neither is derivable from the other.

Information is all around us, and yet there is widespread unawareness of the existence of the Information Principle. In particular, what is not recognized is that possibilistic information and probabilistic information are orthogonal. An important empirical observation is that propositions in a natural language are carriers of predominantly fuzzy possibilistic and fuzzy bimodal information. Existing systems of reasoning and computation—other than fuzzy logic—do not have the capability to reason and compute with fuzzy bimodal information.
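As a toy illustration of the distinction drawn above (the membership function and probability distribution here are illustrative assumptions, not taken from the paper), a possibilistic restriction on a variable can be represented by a fuzzy membership function, while a probabilistic restriction on the same variable is a normalized distribution; the two assign values that cannot be derived from one another.

```python
# Toy sketch: two kinds of restriction on the variable "age".
# All numbers below are illustrative assumptions, not from the paper.

def middle_aged_possibility(age: float) -> float:
    """Possibilistic restriction: fuzzy membership of `age` in the
    fuzzy set 'middle-aged', modeled as a trapezoid over 40-45-55-60."""
    if age < 40 or age > 60:
        return 0.0
    if 45 <= age <= 55:
        return 1.0
    if age < 45:
        return (age - 40) / 5  # rising edge of the trapezoid
    return (60 - age) / 5      # falling edge of the trapezoid

# Probabilistic restriction: a discrete probability distribution over
# the same variable. Unlike a membership function, it must sum to 1.
age_probability = {42: 0.2, 48: 0.5, 58: 0.3}

# The possibility that age 48 is 'middle-aged' is 1.0, while its
# probability is 0.5: neither value determines the other, which is
# the sense in which the two kinds of restriction are orthogonal.
print(middle_aged_possibility(48))  # 1.0
print(age_probability[48])          # 0.5
```

A bimodal restriction, in this sketch, would attach a probability to each fuzzy granule (e.g., to 'young', 'middle-aged', 'old') rather than to individual ages.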