Friday, June 12, 2009

Entropy

Entropy is a measure of the uncertainty associated with a random variable.

Measure of disorder: a measure of the disorder that exists in a system.

It is a standard of measurement.
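
For reference, the quantity H(X) used below is the Shannon entropy of a discrete random variable X with probability mass function p(x); written in LaTeX notation, with the logarithm in base 2 so the result is in bits:

H(X) = -\sum_{x} p(x) \log_2 p(x)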

Entropy can be applied to:

◎ Thermodynamics → the relationship between states and thermal energy
◎ Probability → randomness and uncertainty
◎ Information → the amount of information in a message

The degree of hidden meaning.

Suppose we want to measure how securely a piece of information can be hidden => the larger H(X), the better.

If H(X) is large, the uncertainty is large and the information content is large, so the message is more secure. (Like an indirect insinuation that says one thing but means another, it can carry a hidden meaning.)
If H(X) is small, the uncertainty is small and the information content is small, so the message is less secure and easy to guess.
If H(X) = 0, the message is certain, i.e., there is no other meaning hidden behind it. (Nothing can be hidden.)
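
As a quick check of these three cases, here is a minimal Python sketch (the entropy helper and the example probability vectors are only illustrative, not from the original note) that computes H(X) for a fixed message, a heavily biased one, and a uniform one:

import math

def entropy(probs):
    # Shannon entropy in bits of a discrete distribution given as a list of probabilities.
    return -sum(p * math.log2(p) for p in probs if p > 0)

# H(X) = 0: only one possible message, so nothing can be hidden.
print(entropy([1.0]))                      # 0.0

# Small H(X): one value dominates, so the message is easy to guess.
print(entropy([0.9, 0.05, 0.03, 0.02]))    # ~0.62 bits

# Large H(X): all four values equally likely, maximum uncertainty.
print(entropy([0.25, 0.25, 0.25, 0.25]))   # 2.0 bits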

“High Entropy” means X comes from a uniform (boring) distribution:
a histogram of the frequency distribution of the values of X would be flat, and
the values sampled from it would be all over the place.

“Low Entropy” means X comes from a varied (peaks and valleys) distribution:
a histogram of the frequency distribution of the values of X would have many lows and one or two highs, and
the values sampled from it would be more predictable.
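
To make the flat-versus-peaked picture concrete, here is a minimal Python sketch (the four symbols and the two weight vectors are made up for illustration) that samples from both kinds of distribution and prints the observed counts next to H(X):

import math
import random
from collections import Counter

def entropy(probs):
    # Shannon entropy in bits.
    return -sum(p * math.log2(p) for p in probs if p > 0)

values  = ["a", "b", "c", "d"]
uniform = [0.25, 0.25, 0.25, 0.25]   # flat histogram -> high entropy, samples all over the place
peaked  = [0.85, 0.05, 0.05, 0.05]   # one big peak   -> low entropy, samples mostly "a"

random.seed(0)
for name, weights in [("uniform", uniform), ("peaked", peaked)]:
    sample = random.choices(values, weights=weights, k=1000)
    print(name, "H =", round(entropy(weights), 2), "bits, counts:", dict(Counter(sample)))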
