data_mining:neural_network:transfer_learning

Using pre-trained models / their trained weights as a starting point to train a model for a different data set.
  
Use a pre-trained net: initialize the last layers with random weights (e.g. create your own softmax layer) and **freeze** the parameters in the previous layers.
  
Options: Train only the new layers of the network, or even more layers.
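The freeze-and-retrain idea can be sketched in plain Python, without any ML framework. The layer below stands in for the early layers of a real pre-trained network; its weights stay frozen, and only the newly added output (softmax/sigmoid) head is trained. All weights, data, and names here are made up for illustration.

```python
import math

# Frozen weights from the "pre-trained" layer: never updated below.
FROZEN_W = [[1.0, 0.0], [0.0, 1.0], [0.5, -0.5], [-0.5, 0.5]]

def frozen_features(x):
    """Forward pass through the frozen pre-trained layer (ReLU)."""
    return [max(0.0, sum(w * xi for w, xi in zip(row, x))) for row in FROZEN_W]

# New head for the target task, initialized from scratch.
head_w = [0.0, 0.0, 0.0, 0.0]
head_b = 0.0

def predict(x):
    """Sigmoid output of the new head on top of the frozen features."""
    z = sum(w * h for w, h in zip(head_w, frozen_features(x))) + head_b
    return 1.0 / (1.0 + math.exp(-z))

# Toy data for the new task: label 1 when the first input dominates.
data = [([1.0, 0.1], 1), ([0.9, 0.0], 1), ([0.1, 1.0], 0), ([0.0, 0.8], 0)]

# Gradient descent on the logistic loss, updating ONLY the head;
# FROZEN_W is never touched.
lr = 0.5
for _ in range(500):
    for x, y in data:
        h = frozen_features(x)
        err = predict(x) - y          # dLoss/dlogit for logistic loss
        for i in range(len(head_w)):
            head_w[i] -= lr * err * h[i]
        head_b -= lr * err
```

In a real framework the same effect is achieved by marking the early layers as non-trainable (e.g. PyTorch's `requires_grad = False` or Keras' `layer.trainable = False`) rather than by hand-written update loops.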
Prereqs:
  
  * Task A and B have the same input x
  * Much data for the existing model/task A, only few data for the new model
  * Low-level features from A could be helpful for task B
  * The pre-trained model needs to generalize
  
  
Another trick:

  * Precompute the output of the frozen layers for all samples (saves computation time later)
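This precomputation trick can be sketched in plain Python: since the frozen layers never change, their output per sample is constant, so it can be computed once and cached; later training epochs read the cache instead of re-running the expensive layers. The weights and call counter are illustrative only.

```python
calls = 0  # count how often the (expensive) frozen layers actually run

def frozen_features(x):
    """Stand-in for the frozen pre-trained layers (illustrative weights)."""
    global calls
    calls += 1
    weights = [[1.0, 0.0], [0.0, 1.0]]
    return [max(0.0, sum(w * xi for w, xi in zip(row, x))) for row in weights]

samples = [[1.0, 0.1], [0.1, 1.0], [0.9, 0.0]]

# One-time precomputation over the whole data set.
feature_cache = [frozen_features(x) for x in samples]

# Many training epochs over the head: no further frozen-layer calls,
# because the cached features replace frozen_features(x).
for epoch in range(100):
    for h in feature_cache:
        pass  # ...update only the head weights using the cached h...
```

Without the cache, the frozen layers would run `epochs * len(samples)` times; with it, only `len(samples)` times.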

For a larger set of samples:

  * Only freeze the first layers, train the last few layers (and replace the softmax)

For a large set of samples:

  * Use the weights only as initialization, then train the whole net (and replace the softmax)
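These rules of thumb can be expressed as a small layer-selection helper. The layer names and sample-count thresholds below are made up for illustration; in practice they depend on the model size and how similar tasks A and B are.

```python
def layers_to_train(layers, n_samples):
    """Pick which layers to train, per the rules of thumb above
    (thresholds are illustrative, not recommendations)."""
    if n_samples < 1_000:      # few samples: freeze everything, train new head only
        return layers[-1:]
    if n_samples < 100_000:    # larger set: freeze first layers, train last few
        return layers[-3:]
    return list(layers)        # large set: weights serve only as initialization

layers = ["conv1", "conv2", "conv3", "conv4", "fc", "softmax"]
```

For example, `layers_to_train(layers, 500)` would train only the new softmax head, while a million samples would fine-tune the whole network.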
===== Image-based NNs =====
  
  • data_mining/neural_network/transfer_learning.1526934570.txt.gz
  • Last modified: 2018/05/21 22:29
  • by phreazer