Shannon entropy (information theory)

I want to calculate the Shannon entropy. X transmits a random binary sequence (e.g. 1000110010) and Y receives it (e.g. 1000100010) with an error probability of 2%. Could someone explain to me how I can calculate the Shannon entropy?
  1 Comment
Akira Agata 2019-7-4
Do you mean 'Channel capacity' based on the Shannon-Hartley theorem assuming 2% BER?
You don't need the received binary sequence Y to calculate the Shannon entropy, which is determined by the probabilities of '0' and '1' in the transmitted binary sequence.
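A minimal MATLAB sketch of this idea (the variable names and the short example sequence are my own; the 2% figure from the question only matters for the channel-capacity interpretation, not for the source entropy):

% Transmitted binary sequence from the question, as a numeric vector
x = [1 0 0 0 1 1 0 0 1 0];

% Empirical probabilities of '1' and '0' in the transmitted sequence
p1 = mean(x == 1);
p0 = 1 - p1;

% Shannon entropy in bits per symbol: H = -sum(p .* log2(p))
% (terms with zero probability contribute 0 by convention)
p = [p0 p1];
p = p(p > 0);
H = -sum(p .* log2(p))

% If instead you want the capacity of a binary symmetric channel
% with a 2% crossover probability: C = 1 - Hb(0.02)
e  = 0.02;
Hb = -e*log2(e) - (1 - e)*log2(1 - e);
C  = 1 - Hb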


Answers (0)
