====== Entropy ======

Claude Shannon (1948): Entropy as a measure of surprise / uncertainty.

A message about an event with probability of occurrence p contains $- \mathit{log}_2 p$ bits of information.

Example for a fair coin: $- \mathit{log}_2 0.5 = 1$

====== Mutual information ======
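The self-information formula above can be sketched in a few lines of Python (a minimal illustration; the function name is chosen here, not part of the original page):

```python
import math

def self_information(p: float) -> float:
    # Bits of information in a message about an event
    # that occurs with probability p: -log2(p)
    return -math.log2(p)

# Fair coin: -log2(0.5) = 1 bit
print(self_information(0.5))   # → 1.0
# Rarer events carry more information: -log2(0.25) = 2 bits
print(self_information(0.25))  # → 2.0
```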