Perceptron

  • Popularized by Frank Rosenblatt (1960s)
  • Used for tasks with very large feature vectors

Decision unit: binary threshold neuron. It computes a weighted sum z of its inputs and outputs 1 if z ≥ 0, otherwise 0.

The bias can be learned like the weights: add an extra input that is always 1, and the bias becomes the weight on that input.
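
A minimal sketch of such a decision unit in Python (names and numbers are illustrative, not from the original notes): the bias is folded in by giving every input vector a leading constant component of 1, so the bias is just the weight on that component.

  import numpy as np

  def threshold_neuron(x, w):
      # Binary threshold neuron with the bias trick: x[0] is always 1,
      # so w[0] acts as the learned bias.
      z = np.dot(w, x)            # weighted sum including the bias term
      return 1 if z >= 0 else 0

  x = np.array([1.0, 0.5, -1.0])  # leading 1 is the constant bias input
  w = np.array([-0.2, 1.0, 0.3])  # w[0] is the bias
  print(threshold_neuron(x, w))   # z = -0.2 + 0.5 - 0.3 = 0.0 >= 0 -> 1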

Perceptron convergence

  • If output correct ⇒ no weight changes
  • If output unit incorrectly outputs 0 ⇒ add input vector to weight vector.
  • If output unit incorrectly outputs 1 ⇒ subtract input vector from the weight vector.

This procedure is guaranteed to find a set of weights that gets the right answer for all training cases, if such a set exists. ⇒ Deciding on the features is therefore the crucial part.
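
The three update rules above translate directly into a training loop; a Python sketch follows (the AND data set and all names are illustrative assumptions, not part of the original notes):

  import numpy as np

  def train_perceptron(X, t, epochs=100):
      # X carries a leading column of 1s (bias trick); t holds 0/1 targets.
      w = np.zeros(X.shape[1])
      for _ in range(epochs):
          converged = True
          for x, target in zip(X, t):
              y = 1 if np.dot(w, x) >= 0 else 0
              if y == target:
                  continue          # output correct -> no weight change
              elif target == 1:
                  w += x            # incorrectly output 0 -> add input vector
              else:
                  w -= x            # incorrectly output 1 -> subtract input vector
              converged = False
          if converged:
              break
      return w

  # Linearly separable toy data: logical AND (first column is the bias input)
  X = np.array([[1, 0, 0], [1, 0, 1], [1, 1, 0], [1, 1, 1]], dtype=float)
  t = np.array([0, 0, 0, 1])
  w = train_perceptron(X, t)
  print([1 if np.dot(w, x) >= 0 else 0 for x in X])  # -> [0, 0, 0, 1]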

Geometrical Interpretation

  • 1 dimension for each weight
  • Point represents a setting of all weights
  • Leaving the threshold out (bias trick), each training case can be represented as a hyperplane through the origin; inputs define planes (or constraints)
    • For a particular training case: weights must lie on one side of this hyperplane to get the answer correct.

The plane goes through the origin and is perpendicular to the input vector. For a training case with correct answer 1, a good weight vector must lie on the same side of the hyperplane as the input vector, so that the scalar product of weight vector and input vector is positive (angle < 90°); for correct answer 0 it must lie on the opposite side (scalar product negative).
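
The scalar-product criterion can be checked numerically; a small sketch with made-up vectors:

  import numpy as np

  x = np.array([2.0, 1.0])       # training case with correct answer 1
  w_good = np.array([1.0, 0.5])  # angle to x below 90 degrees
  w_bad = np.array([-1.0, 0.5])  # angle to x above 90 degrees

  print(np.dot(w_good, x))  #  2.5 > 0 -> neuron outputs 1, correct
  print(np.dot(w_bad, x))   # -1.5 < 0 -> wrong side of the hyperplane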

Cone of feasible solutions

  • The good weight vectors form a hypercone with its apex at the origin
  • Such weight vectors don't need to exist (the problem may not be solvable)
  • The average of two good weight vectors is itself a good weight vector ⇒ the hypercone is convex
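
Both cone properties can be illustrated numerically (a sketch with made-up weight vectors): any positive scaling of a good weight vector stays good, and so does the average of two good weight vectors, which is what makes the feasible region a convex cone.

  import numpy as np

  x = np.array([2.0, 1.0])    # training case, correct answer 1
  w1 = np.array([1.0, 0.5])   # w1 . x = 2.5 > 0, classifies x correctly
  w2 = np.array([0.2, 2.0])   # w2 . x = 2.4 > 0, classifies x correctly

  for w in (3.0 * w1, (w1 + w2) / 2):  # scaled and averaged weight vectors
      print(np.dot(w, x) > 0)          # True: still inside the feasible cone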
