...standard vanilla backpropagation training.
The stock market index, as seen in Fig. 1, has a continuously rising exponential trend. If
the index value is normalized linearly and fed to the neural network, it is possible that the
normalized future values of the index will be greater than unity, so the neural network
will never be able to track those values. Hence, a non-linear normalization is required.
The function selected for normalization is the hyperbolic tangent function, as suggested in
. The actual function used is:
spx_norm = 0.5*(1+tanh((spx-spx_avg)/A))
where spx_norm is the normalized value and spx_avg is the average of the index over the
training data only. A is a constant chosen to obtain the desired data range (so as to avoid
saturation of the tanh function). The tanh function gives an output in the range [-1, 1],
which is then mapped linearly to [0, 1] (by adding 1 and then multiplying by 0.5).
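The normalization above can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' code: the index values and the choice of A (here tied to the training-data standard deviation) are assumptions made only to demonstrate the mapping into [0, 1].

```python
import numpy as np

# Hypothetical weekly index closing values (illustrative numbers only).
spx = np.array([1200.0, 1250.0, 1310.0, 1280.0, 1400.0])

# Average of the index over the training data only, as in the text.
spx_avg = spx.mean()

# A controls the spread of the argument fed into tanh; a multiple of the
# training-data standard deviation (an assumed choice) keeps the argument
# well inside tanh's near-linear region and avoids saturation.
A = 2.0 * spx.std()

# tanh maps to [-1, 1]; adding 1 and multiplying by 0.5 rescales to [0, 1].
spx_norm = 0.5 * (1.0 + np.tanh((spx - spx_avg) / A))

# All normalized values lie in [0, 1], and the ordering of the original
# index values is preserved because tanh is monotonic.
assert np.all((spx_norm >= 0.0) & (spx_norm <= 1.0))
```

Because tanh is bounded, a future index value above anything seen in training still maps inside [0, 1], merely approaching 1 asymptotically, which is what lets the network track out-of-range values.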
As a first try, such a normalization was used and the data selected was the last 325 weeks
(6 years) with the testing data being the last 50 weeks and the training ba...