
Word: Shannon Entropy
Definition


English Encyclopedia

Entropy (information theory)

[Figure: two coin flips. 2 shannons of entropy: for equally likely outcomes, information entropy is the log-base-2 of the number of possible outcomes; with two fair coins there are four outcomes, and the entropy is two bits.]

[Figure: the entropy H(X) (i.e. the expected surprisal) of a coin flip, measured in shannons, graphed versus the bias of the coin Pr(X = 1), where X = 1 represents a result of heads. The maximum of the curve depends on the distribution: here the entropy is at most 1 shannon, so communicating the outcome of a fair coin flip (2 possible values) requires on average at most 1 bit. The outcome of a fair die roll (6 possible values) would require on average log₂ 6 ≈ 2.585 bits.]
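As a quick check of the numbers in the captions above, here is a minimal Python sketch (not part of the original entry; the helper name entropy is illustrative) that computes the Shannon entropy of a discrete distribution:

```python
import math

def entropy(probs):
    """Shannon entropy in bits (shannons) of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Two fair coins: four equally likely outcomes -> log2(4) = 2 bits.
print(entropy([0.25] * 4))              # 2.0

# Binary entropy of a biased coin peaks at 1 bit when Pr(X = 1) = 0.5.
for p in (0.1, 0.5, 0.9):
    print(p, round(entropy([p, 1 - p]), 4))

# A fair six-sided die: log2(6) ≈ 2.585 bits on average.
print(entropy([1 / 6] * 6))             # ~2.585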

In information theory, a communication system is modeled by a transmitter, a channel, and a receiver. The transmitter produces messages that are sent through the channel, the channel modifies the message in some way, and the receiver attempts to infer which message was sent. In this context, entropy (more specifically, Shannon entropy) is the expected value (average) of the information contained in each message. Formally, for a discrete random variable X taking values x with probability p(x), H(X) = −Σ p(x) log₂ p(x); with the logarithm taken base 2, the result is measured in bits (shannons). A 'message' here can be modeled by any flow of information.
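To make the "expected information per message" reading concrete, here is a minimal sketch, assuming the empirical symbol frequencies of a message stand in for the source distribution p(x) (the function name message_entropy is illustrative, not from the original entry):

```python
import math
from collections import Counter

def message_entropy(message):
    """Per-symbol Shannon entropy (bits), estimated from observed frequencies.

    Averages the surprisal log2(1/p(x)) over symbols, with p(x) taken as the
    empirical frequency c/n of each symbol in the message.
    """
    counts = Counter(message)
    n = len(message)
    return sum((c / n) * math.log2(n / c) for c in counts.values())

print(message_entropy("aaaa"))    # 0.0    -- a constant source carries no information
print(message_entropy("aabb"))    # 1.0    -- two equally likely symbols, 1 bit each
print(message_entropy("abcdef"))  # ~2.585 -- six equally likely symbols, log2(6) bits
```

A source that always emits the same symbol has zero entropy, while a uniform source maximizes it; real message streams fall in between, which is what makes entropy a useful measure of the average information a receiver gains per message.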
