data_mining:neural_network:neurons

===== Rectified Linear Neurons =====
$z=b+\sum_{i} x_{i} w_{i}$
  
$y = \begin{cases} z, & \text{if } z > 0 \\ 0, & \text{otherwise}\end{cases} = \max(0,z)$
  
Above 0 it is linear; at and below 0 the output is 0.
Faster computation, since the slope does not become very small or very large.
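
A minimal sketch of a single rectified linear neuron in Python/NumPy; the input, weight and bias values are illustrative and not part of this page:

<code python>
import numpy as np

def relu_neuron(x, w, b):
    # Weighted input plus bias: z = b + sum_i x_i * w_i
    z = b + np.dot(x, w)
    # Linear above 0, clipped to 0 otherwise: y = max(0, z)
    return np.maximum(0.0, z)

x = np.array([0.5, -1.0])    # example inputs (illustrative)
w = np.array([2.0, 0.5])     # example weights
b = 0.1                      # example bias
print(relu_neuron(x, w, b))  # z = 0.6 -> y = 0.6
</code>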
  
  
  
$y = \frac{e^{z}-e^{-z}}{e^{z}+e^{-z}}$
Centers the data around 0.

Exception: the output layer, since the output should be in [0,1].
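
A corresponding sketch for the tanh neuron, again in Python/NumPy with illustrative values:

<code python>
import numpy as np

def tanh_neuron(x, w, b):
    # Weighted input plus bias: z = b + sum_i x_i * w_i
    z = b + np.dot(x, w)
    # y = (e^z - e^-z) / (e^z + e^-z), output in (-1, 1), centered around 0
    return np.tanh(z)

x = np.array([0.5, -1.0])     # same illustrative inputs as above
w = np.array([2.0, 0.5])
b = 0.1
print(tanh_neuron(x, w, b))   # tanh(0.6) ~ 0.537
</code>
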
===== Softmax group =====
  