Entropies with and without probabilities: Applications to questionnaires

Authors:

Highlights:

Abstract

Entropy is a basic quantity in Information Theory. Since it measures the amount of uncertainty one has about an alternative, it is "conditional" upon whatever "information" has been given. New motivations for measures of uncertainty and information are provided. A more natural interpretation is given of the entropies in the "mixed theory" and of the entropies of a random vector. The proposed new approach to measuring uncertainty is illustrated with examples, in particular from the theory of questionnaires.
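As a concrete reference point for the abstract's notion of entropy as a measure of uncertainty, the following is a minimal sketch of the classical Shannon entropy of a probability distribution (the function name `shannon_entropy` is illustrative, not from the paper):

```python
import math

def shannon_entropy(probs, base=2):
    """Shannon entropy H(p) = -sum_i p_i * log(p_i).

    Terms with p_i = 0 contribute nothing, following the
    convention 0 * log(0) = 0.
    """
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A uniform distribution over four outcomes is maximally uncertain: 2 bits.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))
# A degenerate distribution (one certain outcome) carries no uncertainty: 0 bits.
print(shannon_entropy([1.0, 0.0, 0.0, 0.0]))
```

The uniform case attains the maximum log2(n) for n outcomes, while a certain outcome yields zero; conditioning on additional information can only lower (or preserve) this quantity, which is the sense in which the abstract calls entropy "conditional" on the information given.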

Keywords:

Article history: Available online 13 July 2002.

DOI: https://doi.org/10.1016/0306-4573(84)90070-0