… continuous and smooth.

4. The activation function is monotonic. This property is not strictly necessary: RBF networks, whose Gaussian basis functions are not monotonic, are also a powerful class of models.

Note: A key difference between a perceptron and a neural network is that a neural network uses continuous nonlinearities in its units so that the network function is differentiable, whereas a perceptron often uses a non-differentiable activation function such as a hard threshold. Because the neural network function is differentiable with respect to the network parameters, a gradient descent method can be used in training. Moreover, a perceptron is a linear classifier, whereas a neural network, by introducing nonlinear transformations, greatly enlarges the class of models beyond the linear ones; by combining layers of perceptron-like units, a neural network can learn to classify non-linear problems through proper training.

Figure: graph of … (the figure itself is not preserved in this extract).

By assigning weights to the connections in the neural network (see the diagram above), we weight the input that arrives at each perceptron to get an output, which in turn acts as an input to the next layer of perceptrons, and so on for each layer. There are no cross-connections between units in the same layer and no backward connections from layers downstream; typically, units in layer k provide input only to units in layer k+1. This type of neural network is called a Feed-Forward Neural Network (http://en.wikipedia.org/wiki/Feedforward_neural_network); a minimal sketch of the forward pass is given below. Applications of feed-forward neural networks include data reduction, speech recognition, sensor signal processing, and ECG abnormality detection, to name a few. [9]

Back-propagation

Introduction: For a while, the neural network model was just an idea, since there were no algorithms for training the model until 1986, when Geoffrey Hinton and his co-authors [10] popularized an algorithm called back-propagation [18] (http://en.wikipedia.org/wiki/Backpropagation#Algorithm). After that, a number of other training algorithms and various configurations of neural networks were implemented.

Work procedure: Each neuron receives a signal from the neurons in the previous layer, and each of these signals i...
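To make the feed-forward pass described above concrete, here is a minimal NumPy sketch (not from the original notes): the `forward` function, the 2-3-1 layer sizes, and the random weight initialization are all assumptions chosen for illustration. Each layer applies a weight matrix to the previous layer's output and passes the result through a sigmoid, an activation that is continuous, smooth, and monotonic.

```python
import numpy as np

def sigmoid(z):
    # Continuous, smooth, monotonic activation; differentiable everywhere.
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, weights, biases):
    """Forward pass through a feed-forward network.

    Units in layer k feed only units in layer k+1 (no intra-layer or
    backward connections), so a single loop over the layers suffices.
    """
    a = x
    for W, b in zip(weights, biases):
        a = sigmoid(W @ a + b)  # weighted input from previous layer, then nonlinearity
    return a

# Hypothetical 2-3-1 network with random weights (sizes are illustrative).
rng = np.random.default_rng(0)
weights = [rng.standard_normal((3, 2)), rng.standard_normal((1, 3))]
biases = [rng.standard_normal(3), rng.standard_normal(1)]
print(forward(np.array([0.5, -1.0]), weights, biases))
```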
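Similarly, here is a hedged sketch of back-propagation with gradient descent on the XOR problem, which no linear classifier (and hence no single perceptron) can solve but a one-hidden-layer network can. The architecture (2 inputs, 4 hidden units, 1 output), the squared loss, the learning rate, and the iteration count are illustrative assumptions, not details from the source.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# XOR is not linearly separable: a single perceptron cannot learn it,
# but one hidden layer trained with back-propagation can.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.default_rng(1)
W1, b1 = rng.standard_normal((2, 4)), np.zeros(4)  # input -> hidden
W2, b2 = rng.standard_normal((4, 1)), np.zeros(1)  # hidden -> output

lr = 1.0  # assumed learning rate
for step in range(5000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Backward pass: chain rule through the smooth sigmoid units,
    # using sigma'(z) = sigma(z) * (1 - sigma(z)).
    d_out = (out - y) * out * (1 - out)   # gradient at output pre-activations (squared loss)
    d_h = (d_out @ W2.T) * h * (1 - h)    # gradient propagated back to the hidden layer

    # Gradient-descent update, possible only because every unit is differentiable.
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)

print(out.round(2))  # should approach [[0], [1], [1], [0]]
```

The key point the sketch illustrates is the one made in the note above: because every unit's activation is differentiable, the error gradient can be pushed backward layer by layer via the chain rule, which is exactly what a hard-threshold perceptron does not allow.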