# class_11_07 - Statistical Data Mining ORIE 474 Fall 2006...


Statistical Data Mining, ORIE 474, Fall 2006. Tatiyana Apanasovich, 11/27/06. Artificial Neural Networks (Cont.)

## NN: training and overfitting

The most delicate part of neural network modeling is generalization: developing a model that is reliable in predicting future accidents. Overfitting (i.e., obtaining weights for which the squared error on the training set is so small that even random variation is accounted for) can be minimized by holding out two samples in addition to the training sample. Usually, the data set is divided into three subsets: 40% for training, 30% for validation (to prevent overfitting), and 30% for testing. Training should stop at the epoch when the error computed on the validation set begins to rise; the validation set is not used for training but only to decide when to stop. The test set is then used to see how well the model performs. This cross-validation helps to optimize the fit in three ways:

- by limiting/optimizing the number of hidden units,
- by limiting/optimizing the number of iterations, and
- by inhibiting the network's use of large weights.
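The 40/30/30 split and the stop-when-validation-error-rises rule can be sketched as follows. This is a minimal NumPy illustration: the "network" is replaced by a high-degree polynomial model trained by gradient descent (so the example stays self-contained), and the data, split sizes, and patience setting are illustrative assumptions, not part of the lecture.

```python
# Sketch: 40/30/30 split with early stopping on the validation set.
# A degree-9 polynomial fit by gradient descent stands in for a neural
# network; all data and hyperparameters here are illustrative.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: noisy sine curve.
X = rng.uniform(-1, 1, size=(200, 1))
y = np.sin(3 * X[:, 0]) + 0.3 * rng.standard_normal(200)

# 40% training, 30% validation (decides when to stop), 30% testing.
idx = rng.permutation(200)
tr, va, te = idx[:80], idx[80:140], idx[140:]

def feats(x):
    # Powers x^0 .. x^9; the flexible basis makes overfitting possible.
    return np.hstack([x ** d for d in range(10)])

Xtr, Xva, Xte = feats(X[tr]), feats(X[va]), feats(X[te])
w = np.zeros(Xtr.shape[1])

best_w, best_val = w.copy(), np.inf
patience, bad = 20, 0
for epoch in range(5000):
    grad = 2 * Xtr.T @ (Xtr @ w - y[tr]) / len(tr)
    w -= 0.05 * grad
    val_err = np.mean((Xva @ w - y[va]) ** 2)
    if val_err < best_val:
        best_val, best_w, bad = val_err, w.copy(), 0
    else:
        bad += 1
        if bad >= patience:   # validation error has begun to rise: stop
            break

# The test set is touched only once, to report final performance.
test_err = np.mean((Xte @ best_w - y[te]) ** 2)
print(f"stopped at epoch {epoch}, test MSE = {test_err:.3f}")
```

Note that the test subset plays no role in either fitting the weights or choosing the stopping epoch; it is used exactly once, which is what makes the reported error an honest estimate of future performance.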
The major advantages and disadvantages of neural networks in modeling applications are as follows.

Advantages:

- There is no need to assume an underlying data distribution, as is usually done in statistical modeling.
- Neural networks are applicable to multivariate non-linear problems.
- Transformations of the variables are automated in the computational process.

Disadvantages:

- Minimizing overfitting requires a great deal of computational effort.
- The individual relations between the input variables and the output variables are not developed by engineering judgment, so the model tends to be a black box or an input/output table without an analytical basis.
- The sample size has to be large.

## How are NNs related to statistical methods?

Most neural networks that can learn to generalize effectively from noisy data are similar to statistical methods. For example:

- Feedforward nets with no hidden layer are basically generalized linear models.
- Feedforward nets with one hidden layer are closely related …
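The first correspondence can be made concrete: a feedforward net with no hidden layer and a sigmoid output unit, trained by gradient descent on cross-entropy, is exactly logistic regression, i.e., a GLM with the logit link fit by maximum likelihood. The sketch below, with illustrative data and a made-up true weight vector, recovers the generating weights either way you view it.

```python
# A no-hidden-layer "network" with sigmoid output is logistic regression.
# Data, weights, and step size are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((500, 2))
true_w = np.array([2.0, -1.0])
p = 1 / (1 + np.exp(-(X @ true_w)))
y = (rng.uniform(size=500) < p).astype(float)

# "Network" forward pass: output = sigmoid(w . x).
# Training: gradient descent on cross-entropy, which is exactly
# maximum-likelihood estimation for the logit-link GLM.
w = np.zeros(2)
for _ in range(2000):
    pred = 1 / (1 + np.exp(-(X @ w)))
    w -= 0.1 * X.T @ (pred - y) / len(y)

print(w)  # should approach true_w = [2, -1]
```

The gradient of the cross-entropy loss, `X.T @ (pred - y)`, is the same expression as the score equation of the logistic GLM, which is why the two views coincide weight for weight.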