

Perceptron

  • Popularized by Frank Rosenblatt (late 1950s/1960s)
  • Suited to tasks with very large feature vectors

Decision Unit: Binary threshold neuron.

The bias can be learned like the weights: treat it as an extra weight on a constant input of 1.
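A minimal sketch of such a decision unit (the function name and the convention of putting the bias weight first are illustrative, not from the original notes):

```python
# Binary threshold neuron. The bias is folded in as an extra weight whose
# input is always 1, so it can be learned like any other weight.

def threshold_neuron(weights, inputs):
    """Output 1 if the weighted sum (bias included) is >= 0, else 0.

    weights[0] is the bias weight; it multiplies a constant input of 1.
    """
    extended = [1.0] + list(inputs)          # prepend the constant bias input
    activation = sum(w * x for w, x in zip(weights, extended))
    return 1 if activation >= 0 else 0
```

For example, weights `[-1.5, 1, 1]` implement AND on two binary inputs: the bias weight -1.5 means both inputs must be on to reach the threshold.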

Perceptron convergence

  • If the output is correct ⇒ no weight change.
  • If the output unit incorrectly outputs 0 ⇒ add the input vector to the weight vector.
  • If the output unit incorrectly outputs 1 ⇒ subtract the input vector from the weight vector.

This procedure finds a set of weights that gets the right answer for all training cases, if such a set exists. ⇒ Choosing the features is the important part.
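The procedure above can be sketched as follows (a toy implementation, assuming 0/1 targets and the bias-as-extra-weight convention; the function name and epoch cap are illustrative):

```python
# Perceptron convergence procedure: add the input vector on a false 0,
# subtract it on a false 1, change nothing when the output is correct.

def train_perceptron(samples, n_features, epochs=100):
    """samples: list of (inputs, target) pairs with targets 0 or 1."""
    w = [0.0] * (n_features + 1)             # +1 for the bias weight
    for _ in range(epochs):
        changed = False
        for inputs, target in samples:
            x = [1.0] + list(inputs)         # constant bias input
            out = 1 if sum(wi * xi for wi, xi in zip(w, x)) >= 0 else 0
            if out == 0 and target == 1:     # incorrectly outputs 0: add input
                w = [wi + xi for wi, xi in zip(w, x)]
                changed = True
            elif out == 1 and target == 0:   # incorrectly outputs 1: subtract input
                w = [wi - xi for wi, xi in zip(w, x)]
                changed = True
        if not changed:                      # all training cases correct: done
            break
    return w
```

On a linearly separable problem such as AND, this terminates with weights that classify every training case correctly, as the convergence statement above promises.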

Geometrical Interpretation

Weight-Space view

  • 1 dimension for each weight
  • Point represents setting of all weights
  • Each training input defines a plane (a constraint)
  • Leaving the threshold out, each training case can be represented as a hyperplane through the origin.
    • For a particular training case: Weights must lie on one side of this hyper-plane to get the answer correct.

The plane is perpendicular to the input vector. A good weight vector must lie on the correct side of the hyperplane: for a training case whose correct output is 1, the scalar product of weight vector and input vector must be positive (angle < 90°).
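The angle criterion can be illustrated numerically (the vectors here are made-up examples, not from the original notes):

```python
# Geometric picture: for a positive training case, a weight vector lies on the
# correct side of the input's hyperplane exactly when w.x > 0, i.e. when the
# angle between the weight vector and the input vector is below 90 degrees.
import math

def angle_deg(w, x):
    """Angle between vectors w and x in degrees."""
    dot = sum(wi * xi for wi, xi in zip(w, x))
    norms = math.sqrt(sum(wi ** 2 for wi in w)) * math.sqrt(sum(xi ** 2 for xi in x))
    return math.degrees(math.acos(dot / norms))

x = [1.0, 2.0]           # a training input whose correct output is 1
good_w = [2.0, 1.0]      # dot product 4 > 0: correct side of the hyperplane
bad_w = [2.0, -1.5]      # dot product -1 < 0: wrong side of the hyperplane
```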

Cone of feasible solutions

  • The good weight vectors form a hypercone with its apex at the origin.
  • Such weight vectors need not exist (the training cases may not be linearly separable).
  • The hypercone is convex: any weight vector between two good weight vectors is also good.
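The convexity claim can be checked numerically. This sketch uses AND as a toy training set with the bias-as-extra-weight convention from above; the two feasible weight vectors are made-up examples:

```python
# If two weight vectors both classify every training case correctly, so does
# any point between them: the set of feasible weight vectors is a convex cone.

def correct(w, samples):
    """True if weight vector w (bias weight first) classifies all samples."""
    return all(
        (sum(wi * xi for wi, xi in zip(w, [1.0] + list(x))) >= 0) == (t == 1)
        for x, t in samples
    )

and_samples = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
w1 = [-3.0, 2.0, 1.0]                        # one feasible weight vector
w2 = [-5.0, 3.0, 3.0]                        # another feasible weight vector
midpoint = [(a + b) / 2 for a, b in zip(w1, w2)]
```

The midpoint (and any convex combination, or positive scaling) of `w1` and `w2` stays inside the feasible cone.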

  • Last modified: 2017/02/02 23:10
  • by phreazer