## Perceptrons and basic neural networks

By Eric Suh
Perceptrons are among the easiest structures to learn when studying neural networks. Think of a perceptron as a node in a vast, interconnected network, somewhat like a binary tree, although the network does not necessarily have a top and bottom. The links between the nodes not only show the relationship between the nodes but also transmit data, called a signal or impulse. The perceptron is a simple model of a neuron (nerve cell).
Since linking perceptrons into a network is a bit complicated, let's take a perceptron by itself. A perceptron has a number of external input links, one internal input (called a bias), a threshold, and one output link. Usually, the input values are boolean (that is, 0 or 1), but they can be any real number. The bias input is fixed at 1. All of the inputs (including the bias) have weights attached to them that scale the strength of the incoming signal.

The threshold is one of the key components of the perceptron. It determines, based on the inputs, whether the perceptron fires or not. Basically, the perceptron takes all of the weighted input values and adds them together. If the sum is above or equal to some value (called the threshold), the perceptron fires; otherwise, it outputs 0. The threshold is like a wall: if the "signal" has enough "energy" to jump over the wall, then it can keep going, but otherwise, it has to stop. Traditionally, the threshold value is represented by the Greek letter theta (θ).

The main feature of perceptrons is that they can learn from examples. This learning method is called the delta rule: each weight is adjusted in proportion to the error between the desired output and the actual output. This can be elegantly summed up as:

Δwᵢ = η (d − a) xᵢ

that is, change in weight = learning rate × (desired output − actual output) × input value, where η is the learning rate, d the desired output, a the actual output, and xᵢ the value on the i-th input. The delta rule works both if the perceptron's output is too large and if it is too small. The new weights push the weighted sum toward the correct side of the threshold, so repeated training passes move the perceptron's output closer to the desired output.
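The firing rule and the delta rule can be sketched in a few lines of code. This is a minimal illustration, not code from the article; names such as `fire` and `train` are my own, and the bias is handled as an extra input fixed at 1 so its weight absorbs the threshold:

```python
def fire(weights, inputs):
    """Return 1 if the weighted sum (bias included) reaches the threshold, else 0.

    Prepending 1.0 to the inputs makes weights[0] the bias weight, so the
    firing condition becomes: weighted sum >= 0.
    """
    total = sum(w * x for w, x in zip(weights, [1.0] + inputs))
    return 1 if total >= 0 else 0

def train(examples, learning_rate=0.1, epochs=50):
    """Delta rule: w_i += learning_rate * (desired - actual) * x_i."""
    weights = [0.0, 0.0, 0.0]  # bias weight plus one weight per input
    for _ in range(epochs):
        for inputs, desired in examples:
            actual = fire(weights, inputs)
            error = desired - actual
            for i, x in enumerate([1.0] + inputs):
                weights[i] += learning_rate * error * x
    return weights

# The OR function is one a single perceptron can learn.
or_examples = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 1)]
w = train(or_examples)
for inputs, desired in or_examples:
    assert fire(w, inputs) == desired
```

With all weights starting at zero, a few epochs are enough for the weights to settle on values that classify all four OR examples correctly.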
and if it is too small. The new Interestingly, if you graph the possible inputs on different axes
of a mathematical graph, with pluses for where the perceptron fires
and minuses where the perceptron doesn't, the weights for the perceptron
make up the For instance, in the picture above, the pluses and minuses represent the OR binary function. With a little bit of simple algebra, you can transform that equation in the diagram to the standard line form in which the weights can be seen clearly. (You get the following equation of the line if you take the firing equation and replace the "greater than or equal to" symbol with the equal sign). This equation is significant, because So, by themselves, perceptrons are a bit limited, but that is their
appeal. |
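The separating line can be checked numerically. The weights below (w₁ = w₂ = 1, θ = 0.5) are my own illustrative values, not from the article; any weights whose line passes between OR's minus and its pluses would do:

```python
# Illustrative weights and threshold for the OR function (assumed values).
w1, w2, theta = 1.0, 1.0, 0.5

def fires(x1, x2):
    # Firing condition: weighted sum reaches the threshold.
    return 1 if w1 * x1 + w2 * x2 >= theta else 0

# Rearranged boundary line: x2 = -(w1 / w2) * x1 + theta / w2,
# here x2 = -x1 + 0.5. Points on or above the line fire; points below don't.
results = [fires(x1, x2) for (x1, x2) in [(0, 0), (0, 1), (1, 0), (1, 1)]]
assert results == [0, 1, 1, 1]  # exactly the OR function
```

No single line separates XOR's outputs, so no choice of w1, w2, and theta makes this function compute XOR; that is the limitation discussed above.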