Supervised Classification Using a Hebbian Linear Neural Network with a Bias Term (W0)
Introduction
A Hebbian linear neural network with a bias term allows the decision boundary to be translated, in addition to the rotation provided by the weight parameters w1 and w2.
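To see this concretely, using the same notation as the rest of this report, the net input of the two-input network is u = w0 + w1*x1 + w2*x2, and the decision boundary is the line u = 0, i.e. x2 = -(w1/w2)*x1 - w0/w2. With w0 = 0 this line always passes through the origin, so changing w1 and w2 can only rotate it; a non-zero w0 shifts the intercept to -w0/w2, which is the translation capability mentioned above.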
Objectives
Become familiar with the concept of supervised classification using a linear neural network (Hebbian network) with a bias term.
Methodology
MATLAB code is used to formulate the learning problem: the net input function (U) is computed from the given weights and bias term, the error E is derived, and the weights are updated where applicable until the correct classification is obtained.
The result is tested by calculating the target values (t) of the given data. The data distribution and decision line are then plotted.
Figure 1: Main MATLAB code.
Figure 2: Main MATLAB code (continued).
Figure 3: MATLAB code, computation function.
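The code in Figures 1-3 appears only as screenshots, so the listing below is a minimal sketch of the procedure described in the Methodology, not the exact code from the figures. It assumes an error-driven update w <- w + beta*E*x with E = t - y and a sign activation, and the data set X, t is a hypothetical stand-in for the values of Table 1.

% Minimal sketch of the training procedure described above.
% The data set below is HYPOTHETICAL (the real one is in Table 1), and the
% error-driven update w <- w + beta*E*x is an assumption about Figures 1-3.
X = [ 1  2;            % hypothetical inputs, one row per sample: [x1 x2]
     -1 -1;
      2  1;
     -2  0 ];
t = [ 1; -1; 1; -1 ];  % hypothetical bipolar targets

w    = [1 0.1 0.2];    % initial weights [w0 w1 w2] (given values)
beta = 0.5;            % learning rate (given value)

updated = true;
while updated
    updated = false;
    for k = 1:size(X, 1)
        u = w(1) + w(2)*X(k,1) + w(3)*X(k,2);  % net input U
        y = sign(u);                           % hard-limit activation (assumed)
        E = t(k) - y;                          % error term
        if E ~= 0
            w = w + beta * E * [1, X(k,1), X(k,2)];  % update bias and weights
            updated = true;
        end
    end
end
disp(w)   % final weights [w0 w1 w2]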
Observations/Measurements/Data sources
The following training data set is used:
Table 1: Given data set.
The initial values are:
w0 = 1, w1 = 0.1, w2 = 0.2, and β = 0.5
Results and Analysis
The weighted net input function (U) for the training data is:
Table 2: Weighted net input function (U) results.
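Each entry in Table 2 is the net input u = w0 + w1*x1 + w2*x2 evaluated for one training sample (x1, x2); for example, with the initial values this is u = 1 + 0.1*x1 + 0.2*x2.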
The final weights:
Table 3: Final weights.
The distribution of the data:
Figure 4: Data distribution.
The decision line is computed by setting the net input function u to zero; solving w0 + w1*x1 + w2*x2 = 0 for x2 gives the following equation, and Figure 5 shows the resulting line:
x2 = -(w1/w2)*x1 - w0/w2
Figure 5: Decision line.
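As a rough illustration of how a plot like Figure 5 can be generated, the short MATLAB sketch below draws this line; the weight values are placeholders for the final weights of Table 3, which are not reproduced here.

% Sketch of plotting the decision line x2 = -(w1/w2)*x1 - w0/w2.
% The weights below are PLACEHOLDERS; substitute the final weights from Table 3.
w0 = 1; w1 = 0.1; w2 = 0.2;          % replace with the trained values

x1 = linspace(-3, 3, 100);           % x1 range to draw the line over
x2 = -(w1/w2)*x1 - w0/w2;            % decision line from u = 0

plot(x1, x2, 'k-', 'LineWidth', 1.5)
xlabel('x1'); ylabel('x2')
title('Decision line')
grid on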
The range from which the threshold can be picked is:
-1.4 < threshold < 0.6
Conclusions
The use of the bias term greatly reduces the number of iterations needed to reach the correct weight parameters: Hebbian learning without a bias term took about 12 iterations, while with the bias term it took only 2 iterations.