data_mining:neural_network:overfitting

==== Inverted dropout ====
  
<code python>
# Layer l = 3

keep_prob = 0.8  # probability that a unit is kept

d3 = np.random.rand(a3.shape[0], a3.shape[1]) < keep_prob  # dropout mask
a3 = np.multiply(a3, d3)  # zero out dropped activations (a3 *= d3)
a3 /= keep_prob  # e.g. 50 units => ~10 shut off; dividing by 0.8 keeps the expected value the same

Z = np.dot(W, a3) + b  # next layer's input is reduced by ~20% before rescaling
</code>
Making predictions at test time: no dropout.
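The steps above can be sketched as a small self-contained NumPy example. The helper name ''inverted_dropout_forward'' and the toy activation shapes are illustrative assumptions, not part of the original notes:

<code python>
import numpy as np

def inverted_dropout_forward(a, keep_prob, rng):
    # Hypothetical helper sketching the steps above.
    d = rng.random(a.shape) < keep_prob  # mask: True with probability keep_prob
    a = a * d                            # shut off the dropped units
    return a / keep_prob                 # rescale so the expected activation is unchanged

rng = np.random.default_rng(0)
a3 = np.ones((50, 200))                  # toy activations for a layer with 50 units
out = inverted_dropout_forward(a3, 0.8, rng)
# Surviving entries become 1/0.8 = 1.25, dropped ones become 0,
# so the mean activation stays close to 1.
</code>

At test time the mask is simply not applied: because training already divided by ''keep_prob'', the activations are correctly scaled for prediction with no extra work.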
  
  • data_mining/neural_network/overfitting.1525967529.txt.gz
  • Last modified: 2018/05/10 17:52
  • by phreazer