====== Entropy ======
  
Claude Shannon (1948): Entropy as a measure of surprise / uncertainty.
  
A message about an event with probability of occurrence $p$ carries $- \mathit{log}_2 p$ bits of information.
  
Example of a fair coin: $- \mathit{log}_2 0.5 = 1$
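As a quick illustration (a minimal sketch, not part of the original page), the two formulas above can be computed directly; the function names are chosen here for clarity:

```python
import math

def information_content(p):
    """Bits of information in a message about an event with probability p."""
    return -math.log2(p)

def entropy(probs):
    """Shannon entropy: expected information content over a distribution."""
    return sum(p * information_content(p) for p in probs if p > 0)

# Fair coin: each outcome carries -log2(0.5) = 1 bit
print(information_content(0.5))  # 1.0

# Entropy of a fair coin is 1 bit; a biased coin is less surprising on average
print(entropy([0.5, 0.5]))       # 1.0
print(entropy([0.9, 0.1]))       # < 1.0
```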
  
====== Mutual information ======
  • data_mining/entropie.txt
  • Last modified: 2017/09/09 12:53
  • by phreazer