data_mining:neural_network:model_combination

Revisions: 2017/04/01 15:29 – [Dropout] by phreazer → 2017/08/19 22:12 (current) – [Approximating full Bayesian learning in a NN] by phreazer
More complicated and effective methods than MCMC exist: they don't need to wander the space for long.
  
If we compute the gradient of the cost function on a **random mini-batch**, we get an unbiased estimate of the full gradient, with sampling noise.
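A minimal sketch of this unbiasedness claim, assuming a hypothetical linear model with a squared-error cost (all names and data here are illustrative, not from the original page): averaging many random mini-batch gradients recovers the full-batch gradient, while any single mini-batch gradient deviates from it by sampling noise.

```python
import numpy as np

# Hypothetical setup: linear regression data and a squared-error cost.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 5))
y = X @ np.arange(1.0, 6.0) + rng.normal(size=1000)
w = np.zeros(5)  # current parameter vector

def gradient(Xb, yb, w):
    # Gradient of the mean squared error 0.5 * mean((Xb @ w - yb)**2)
    return Xb.T @ (Xb @ w - yb) / len(yb)

full_grad = gradient(X, y, w)

# Draw many random mini-batches; the mean of their gradients
# converges to the full-batch gradient (unbiasedness), while the
# spread across batches is the sampling noise.
estimates = []
for _ in range(4000):
    idx = rng.choice(len(X), size=32, replace=False)
    estimates.append(gradient(X[idx], y[idx], w))
mean_est = np.mean(estimates, axis=0)

print(np.max(np.abs(mean_est - full_grad)))  # small: estimator is unbiased
```

Any one of the 4000 mini-batch gradients is a noisy but unbiased estimate; this is what lets stochastic gradient descent work with small batches.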
  
====== Dropout ======
Ways to combine the output of multiple models:
  * MIXTURE: Combine models by averaging their output probabilities.
  * PRODUCT: Combine models by the geometric mean of their output probabilities; since the raw geometric means typically sum to less than one, renormalize, e.g. for two models with outputs $x$ and $y$: $\frac{\sqrt{x_i y_i}}{\sum_j \sqrt{x_j y_j}}$

See [[data_mining:neural_network:regularization|Regularization]]
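The two combination rules above can be sketched as follows, for two hypothetical classifiers over three classes (the probability values are made up for illustration):

```python
import numpy as np

# Output probabilities of two models over the same 3 classes.
p1 = np.array([0.7, 0.2, 0.1])
p2 = np.array([0.5, 0.4, 0.1])

# MIXTURE: arithmetic mean of the output probabilities.
mixture = (p1 + p2) / 2

# PRODUCT: geometric mean per class, renormalized so the result
# sums to one (the raw geometric means sum to less than one).
geo = np.sqrt(p1 * p2)
product = geo / geo.sum()

print(mixture)  # [0.6 0.3 0.1]
print(product)
```

Note how the product rule sharpens the combined distribution toward classes on which both models agree, while the mixture is more conservative.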