[Seeking advice] The analytic relationship between the correlation coefficient and mutual information
The document "Chapter 3: The Information Entropy of Continuous Sources"
http://wenku.baidu.com/view/328a5a778e9951e79b8927ff.html
gives two analytic formulas.
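Judging from the Cover and Thomas excerpts quoted at the end of this post, these are presumably the differential entropy of a Gaussian variable and the mutual information between two jointly Gaussian variables with correlation coefficient ρ (this is my reading, not the wording of the linked document):

h(X) = (1/2) log(2πeσ²),    I(X; Y) = -(1/2) log(1 - ρ²)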
Questions:
(1) How credible are the two analytic formulas above?
(2) Could you explain their meaning in plain terms?
(3) Most importantly: please point me to recent, authoritative books and references that explain these formulas in detail.
Thank you for your guidance!
Related links:
[1] 2012-07-30, "[Seeking advice] Correlation coefficient, n-th order correlation, and mutual information"
http://blog.sciencenet.cn/blog-107667-597293.html
Excerpted from:
Thomas M. Cover, Joy A. Thomas. Elements of Information Theory (Second Edition) [M]. Hoboken, New Jersey, USA: John Wiley & Sons, Inc., 2006.
(1) Page 244
Example 8.1.2 (Normal distribution). For X ~ φ(x) = (1/√(2πσ²)) e^(-x²/(2σ²)), calculating the differential entropy in nats, we obtain
h(φ) = -∫ φ ln φ = (1/2) ln(2πeσ²) nats.
Changing the base of the logarithm, we have
h(φ) = (1/2) log₂(2πeσ²) bits.
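As a quick numerical sanity check on question (1), here is a minimal Python sketch of my own (not from the original post or the book; sigma and the sample size are arbitrary choices) that compares a Monte Carlo estimate of the differential entropy of N(0, σ²) with the closed form (1/2) ln(2πeσ²):

import numpy as np

rng = np.random.default_rng(0)
sigma = 1.5        # arbitrary standard deviation for this check
n = 1_000_000      # Monte Carlo sample size

# h(X) = E[-ln p(X)] for X ~ N(0, sigma^2): average -ln p over samples of X
x = rng.normal(0.0, sigma, size=n)
log_pdf = -0.5 * np.log(2 * np.pi * sigma**2) - x**2 / (2 * sigma**2)
h_mc = -log_pdf.mean()

# Closed form from Cover & Thomas, Example 8.1.2 (in nats)
h_closed = 0.5 * np.log(2 * np.pi * np.e * sigma**2)

print("Monte Carlo estimate:", round(h_mc, 4), "nats")
print("Closed form:        ", round(h_closed, 4), "nats")

The two numbers agree to within Monte Carlo error, which is one concrete way to build confidence in the formula without rederiving it.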
(2) Page 252
Example 8.5.1 (Mutual information between correlated Gaussian random variables with correlation ρ). For (X, Y) jointly Gaussian with correlation coefficient ρ,
I(X; Y) = -(1/2) log(1 - ρ²).
If ρ = 0, X and Y are independent and the mutual information is 0.
If ρ = ±1, X and Y are perfectly correlated and the mutual information is infinite.
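The same kind of check applies to the second formula. The sketch below (again my own illustration, with rho and the sample size chosen arbitrarily) estimates I(X; Y) by Monte Carlo from the known Gaussian densities and compares it with -(1/2) ln(1 - ρ²):

import numpy as np

rng = np.random.default_rng(0)
rho = 0.8          # correlation coefficient, arbitrary choice for this check
n = 1_000_000      # Monte Carlo sample size

# Sample (X, Y) from a standard bivariate Gaussian with correlation rho
cov = np.array([[1.0, rho], [rho, 1.0]])
xy = rng.multivariate_normal([0.0, 0.0], cov, size=n)
x, y = xy[:, 0], xy[:, 1]

# I(X;Y) = E[ ln p(x,y) - ln p(x) - ln p(y) ], evaluated with the true densities
log_joint = (-np.log(2 * np.pi * np.sqrt(1 - rho**2))
             - (x**2 - 2 * rho * x * y + y**2) / (2 * (1 - rho**2)))
log_marg_x = -0.5 * np.log(2 * np.pi) - x**2 / 2
log_marg_y = -0.5 * np.log(2 * np.pi) - y**2 / 2
i_mc = np.mean(log_joint - log_marg_x - log_marg_y)

# Closed form from Cover & Thomas, Example 8.5.1 (in nats)
i_closed = -0.5 * np.log(1 - rho**2)

print("Monte Carlo estimate:", round(i_mc, 4), "nats")
print("Closed form:        ", round(i_closed, 4), "nats")

Driving rho toward ±1 makes the estimate grow without bound, which matches the statement above that perfectly correlated Gaussian variables have infinite mutual information.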