A neural network resembles the brain in two respects:

1. Knowledge is acquired by the network from its environment through a learning process.
2. Interneuron connection strengths, known as synaptic weights, are used to store the acquired knowledge.
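The two points above can be made concrete with a minimal sketch of the perceptron learning rule, where the "knowledge" extracted from the training data ends up stored entirely in the weight vector. The AND-gate data, labels in {-1, +1}, and learning rate here are illustrative assumptions, not from the notes:

```python
def sign(a):
    """Perceptron threshold: +1 if a > 0, else -1."""
    return 1 if a > 0 else -1

def train_perceptron(data, epochs=100, lr=1.0):
    """Classic perceptron update: on a mistake, nudge weights toward the example."""
    w = [0.0] * len(data[0][0])
    for _ in range(epochs):
        errors = 0
        for x, y in data:
            if sign(sum(wi * xi for wi, xi in zip(w, x))) != y:
                w = [wi + lr * y * xi for wi, xi in zip(w, x)]
                errors += 1
        if errors == 0:  # converged: every example classified correctly
            break
    return w

# AND gate; the first component of each input is a constant bias term.
data = [([1, 0, 0], -1), ([1, 0, 1], -1), ([1, 1, 0], -1), ([1, 1, 1], 1)]
w = train_perceptron(data)
```

After training, the learned weights alone reproduce the AND behaviour; nothing else about the training data is retained.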
A neural network is a multistage regression or classification model represented by a network. Figure 1 shows a typical neural network, though many other architectures are possible. The same network structure applies to both regression and classification.
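The "multistage" structure can be sketched as a forward pass through one hidden layer: each stage takes a weighted sum of the previous stage's outputs and passes it through a nonlinearity. The sigmoid activation and all weight values below are assumptions chosen purely for illustration:

```python
import math

def sigmoid(a):
    """Smooth activation function (logistic), assumed here for illustration."""
    return 1.0 / (1.0 + math.exp(-a))

def forward(x, W_hidden, b_hidden, w_out, b_out):
    """Two stages: inputs -> hidden units -> single output unit."""
    hidden = [sigmoid(sum(w * xi for w, xi in zip(row, x)) + b)
              for row, b in zip(W_hidden, b_hidden)]
    return sigmoid(sum(w * h for w, h in zip(w_out, hidden)) + b_out)

# Made-up weights for a 2-input, 2-hidden-unit, 1-output network.
W_hidden = [[0.5, -0.4], [0.3, 0.8]]
b_hidden = [0.1, -0.2]
w_out = [1.2, -0.7]
b_out = 0.05
y = forward([0.6, 0.9], W_hidden, b_hidden, w_out, b_out)
```

With a sigmoid output the network's prediction lies in (0, 1), which suits the classification case; for regression the output unit would usually be left linear.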
A regression problem typically has only one unit in the output layer, but these networks can handle multiple quantitative responses in a seamless fashion. In a k-class classification problem, there are usually k target units in the output layer, each representing the probability of one of the k classes, and each coded as 0 or 1.

Activation Function
Activation function (http://en.wikipedia.org/wiki/Activation_function) is a term that is frequently used in classification.
In the perceptron, we have a "sign" function that takes the sign of a weighted sum of the input features.

Figure 1: General Structure of a Neural Network.

The sign function is of the form sign(a) = +1 if a >= 0 and -1 if a < 0; it is not continuous at 0, so we cannot take its derivative. Thus, we replace it with a smooth function sigma(a) and call it the activation function.
The choice of this function is determined by the properties of the data and the assumed distribution of the target variables, but for binary classification problems the logistic function, also known as the inverse logit or sigmoid function (http://en.wikipedia.org/wiki/Sigmoid_function), is often used:

sigma(a) = 1 / (1 + e^(-a))
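Two useful facts about the logistic function can be checked numerically: it maps any real input into (0, 1) with sigma(0) = 0.5, and its derivative has the closed form sigma(a) * (1 - sigma(a)). A small sketch (the test point a = 0.7 and the finite-difference step are arbitrary choices):

```python
import math

def sigmoid(a):
    """Logistic function: maps any real input into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-a))

# The logistic is centred at 0.5.
mid = sigmoid(0.0)

# Closed-form derivative sigma(a) * (1 - sigma(a)), checked against a
# central finite difference at an arbitrary point.
a, eps = 0.7, 1e-6
s = sigmoid(a)
analytic = s * (1 - s)
numeric = (sigmoid(a + eps) - sigmoid(a - eps)) / (2 * eps)
```

This cheap derivative is one reason the logistic is popular: gradient-based training never needs numerical differentiation.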
There are some important properties of the activation function.

1. The activation function is nonlinear. It can be shown that if the activation function of the hidden units is linear, a three-layer neural network is equivalent to a two-layer one.
2. The activation function saturates, meaning its output has a maximum and a minimum value. This property ensures that the weights remain bounded, which limits the search time during training.
3. The activation function is continuous and differentiable, so that gradient-based training methods can be applied.
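Property 1 above can be verified directly: with linear (identity) activations in the hidden layer, stacking two weight layers is the same as a single layer whose weight matrix is the product of the two. A sketch with small made-up matrices:

```python
def matvec(W, x):
    """Multiply matrix W (list of rows) by vector x."""
    return [sum(w * xi for w, xi in zip(row, x)) for row in W]

def matmul(A, B):
    """Product of two small matrices stored as lists of rows."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

W1 = [[1.0, 2.0], [0.5, -1.0]]   # input -> hidden weights (made up)
W2 = [[-1.0, 0.5], [2.0, 1.0]]   # hidden -> output weights (made up)
x = [0.3, -0.7]

deep = matvec(W2, matvec(W1, x))      # three-layer net with LINEAR hidden units
shallow = matvec(matmul(W2, W1), x)   # equivalent two-layer net, weights W2 @ W1
```

The two outputs agree, so a linear hidden layer adds no representational power; the nonlinearity is what makes the extra layer worthwhile.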