Supervised Classification Using Hebbian Linear Neural Network
Introduction
In 1949, Donald Hebb introduced a learning technique in which, unlike a fixed-threshold neuron, the network learns by updating the weights between neurons. This method is called Hebbian learning.
The key points of this method can be summarized as follows:
1. Information is stored in the connections between neurons, i.e., the weights.
2. The change in the weight between two neurons is proportional to the product of their outputs.
3. Learning takes place by repeatedly and simultaneously activating groups of weakly connected neurons.
From these key concepts, the mathematical formulation of Hebbian learning follows.
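The key points above lead to the classic Hebb rule; a standard form (the report does not show its exact formulation, so the learning rate β is an assumption here) is:

```latex
\Delta w_i = \beta \, x_i \, y
```

where x_i is the i-th input, y is the neuron output, and β is the learning rate.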
Objectives
Familiarize with the concept of supervised classification using a linear neural network (Hebbian network), and build a linear classifier via training.
Methodology
MATLAB code is used to formulate the learning problem: compute the net input function (U), derive the error (E), and update the weights when needed, repeating until the classification is correct.
The result is tested by computing the classified targets for the given data and comparing them with the true target values (t). The data distribution and the decision line are then plotted.
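The report's MATLAB code is shown only as figure images, so the following is a minimal Python sketch of the same training loop, assuming a bipolar data set and an error-driven Hebb-style update w ← w + β·t·x. The data below is hypothetical (standing in for Table 1); the initial weights and β are the report's values:

```python
import numpy as np

# Hypothetical bipolar training set (the report's Table 1 is not reproduced here)
X = np.array([[ 1.0,  1.0],
              [ 1.0, -1.0],
              [-1.0,  1.0],
              [-1.0, -1.0]])
t = np.array([1, 1, -1, -1])   # assumed targets

w = np.array([0.1, 0.2])       # initial weights from the report
beta = 0.5                     # learning rate (Beta) from the report

for epoch in range(100):
    errors = 0
    for xi, ti in zip(X, t):
        u = w @ xi                    # net input function U
        tc = 1 if u >= 0 else -1      # classified target TC
        if tc != ti:                  # error E: update only when misclassified
            w = w + beta * ti * xi    # Hebb-style weight update
            errors += 1
    if errors == 0:                   # stop once every sample is classified correctly
        break

print(w)
```

With this toy data the loop converges in a few epochs; the report's actual data required 12 iterations.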
Figure 1: Main MATLAB code.
Figure 2: Main MATLAB code (continued).
Figure 3: MATLAB code, the computation function.
Observations/Measurements/Data sources
Using the following training data set:
Table 1: Given data set.
The initial values:
w1 = 0.1, w2 = 0.2, and β = 0.5
Results and Analysis
The weighted net input function (U) for the training data, together with the classified targets (TC) used to check the model, is:
Table 2: Weighted classification function results.
The final weights:
Table 3: Final weights.
The distribution of the data:
Figure 4: Distribution of the data.
The decision line is computed by setting the U function equal to zero and solving for x1, as in the following equation. Figure 5 shows the decision line:
x1 = -(w2/w1)·x2
Figure 5: Decision line.
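Since the report's final weights (Table 3) are not reproduced here, the sketch below uses hypothetical values to illustrate how the line x1 = -(w2/w1)·x2 follows from setting U = w1·x1 + w2·x2 to zero:

```python
import numpy as np

# Hypothetical final weights (stand-ins for the report's Table 3 values)
w1, w2 = 0.6, -0.3

# Solving w1*x1 + w2*x2 = 0 for x1 gives the decision line
x2 = np.linspace(-2, 2, 5)
x1 = -(w2 / w1) * x2

# Every point on the line has zero net input U
u = w1 * x1 + w2 * x2
print(np.allclose(u, 0))   # prints True
```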
The range where the threshold can be picked is:
{-0.7 < Threshold < 0.3}
In this learning process, the optimum location of the decision line is where the two classes are completely separated from each other, which corresponds to the midpoint of the threshold range.
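The midpoint of the reported threshold range can be computed directly:

```python
# Midpoint of the reported threshold range (-0.7, 0.3)
lo, hi = -0.7, 0.3
theta = (lo + hi) / 2
print(round(theta, 1))   # prints -0.2
```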
The main contribution of Hebbian learning is the addition of weights to the classification function: assigning a different weight to each input defines how much each new piece of input information contributes to the solution.
From a geometrical point of view, Hebbian learning controls the slope of the straight line; it accounts for the slope in both the x and y directions, as in the following equation:
a·y + m·x = c
where a and m are slope coefficients: a is the coefficient in the y direction, meaning that for each unit movement in y there is a corresponding movement of a in the x direction, and vice versa for m. The weights w1 and w2 are equivalent to a and m in this straight-line equation.
The advantage of Hebbian learning is that it adds rotation to the straight line; the main disadvantage is, to some extent, the computational effort required to compute the weights.
Conclusions
A Hebbian linear neural network takes the weight of each individual input into account. In a repeated, iterative manner, the initial weights are corrected until the correct ones are found. In this experiment, it took 12 iterations to reach the correct weight parameters.