====== Entropy ======
  
Claude Shannon (1948): Entropy as a measure of surprise / uncertainty.
  
A message about an event with probability of occurrence $p$ carries $- \mathit{log}_2 p$ bits of information.
  
Example of a fair coin: $- \mathit{log}_2 0.5 = 1$
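The self-information formula above can be sketched in a few lines of Python; the function name `self_information` is an illustrative choice, not standard library API:

```python
import math

def self_information(p: float) -> float:
    """Bits of information in observing an event of probability p."""
    return -math.log2(p)

# Fair coin: each outcome has probability 0.5, so one bit of information
print(self_information(0.5))   # 1.0
# Rarer events are more "surprising" and carry more bits
print(self_information(0.25))  # 2.0
```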
  
====== Mutual information ======