rnnPres

Constructive Training of Recurrent Neural Networks
Tyler Davis
ME697: Intelligent Systems
February 16, 2010
Feed-Forward Neural Networks (FNN)
- Universal approximator
- Easy to train
- Modeling dynamic systems is not straightforward or efficient
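To make the FNN bullet concrete, here is a minimal sketch of a one-hidden-layer feed-forward network forward pass. The function name, layer sizes, and random weights are this editor's illustrative choices, not from the slides; the point is only that an FNN is a static map from input to output, with no internal state.

```python
import numpy as np

def sigmoid(z):
    # Logistic activation used in the hidden layer.
    return 1.0 / (1.0 + np.exp(-z))

def fnn_forward(x, W1, b1, W2, b2):
    # One-hidden-layer FNN: y = W2 @ sigmoid(W1 @ x + b1) + b2.
    # The output depends only on the current input x (no memory).
    h = sigmoid(W1 @ x + b1)
    return W2 @ h + b2

# Illustrative dimensions: 2 inputs, 4 hidden units, 1 output.
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(4, 2)), np.zeros(4)
W2, b2 = rng.normal(size=(1, 4)), np.zeros(1)
y = fnn_forward(np.array([0.5, -0.3]), W1, b1, W2, b2)
```

Because the network is purely feed-forward, representing a dynamic system would require feeding it a window of past inputs by hand, which is the inefficiency the slide alludes to.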
Recurrent Neural Networks (RNN)
- Universal approximator of dynamic systems
- Intuitive correlation with state-space systems
- Difficult to train
(N. Subramanya and Y. C. Shin, 2009)
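The "intuitive correlation with state-space systems" can be sketched with an Elman-style recurrent update, where the hidden activations play the role of the state vector. The weight names, dimensions, and step input below are illustrative assumptions, not taken from the slides:

```python
import numpy as np

def rnn_step(x, u, Wx, Wu, sigma=np.tanh):
    # One Elman-style update: the next state is a nonlinear function of
    # the current state x[k] and the current input u[k],
    #   x[k+1] = sigma(Wx @ x[k] + Wu @ u[k]),
    # mirroring a discrete nonlinear state-space model.
    return sigma(Wx @ x + Wu @ u)

rng = np.random.default_rng(1)
n, m = 3, 1                      # illustrative state and input dimensions
Wx = 0.5 * rng.normal(size=(n, n))  # recurrent (state-transition) weights
Wu = rng.normal(size=(n, m))        # input weights
Wy = rng.normal(size=(1, n))        # linear read-out: y[k] = Wy @ x[k]

x = np.zeros(n)
ys = []
for k in range(10):              # drive the network with a unit step input
    ys.append(Wy @ x)
    x = rnn_step(x, np.ones(m), Wx, Wu)
```

The same recurrence is also why training is hard: gradients must be propagated back through every time step, which motivates the difficulties listed on the next slide.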
RNN Training Difficulties
- Network architecture and structure must be selected
- Learning parameters via backpropagation is extremely slow
- Constructive methods for FNNs do not extend suitably to RNNs
Current training strategies:
- Expert knowledge and iteration
- Direct network optimization
An automated constructive training method for RNNs is highly desirable.
Constructive RNN Training Formulation
Using the Elman network structure, a discrete state-space equation of the form [equation not captured in preview] can be equivalently represented as [equation not captured in preview], where σ is the network's activation function.
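The two equations on this slide were images that did not survive extraction. As a hedged reconstruction, the standard Elman-network correspondence the text appears to describe (symbol names are this editor's choice, not necessarily the slide's) writes a discrete nonlinear state-space model

```latex
\begin{aligned}
x(k+1) &= f\bigl(x(k),\, u(k)\bigr), \\
y(k)   &= g\bigl(x(k)\bigr),
\end{aligned}
```

equivalently in Elman-network form as

```latex
\begin{aligned}
x(k+1) &= \sigma\!\bigl(W_x\, x(k) + W_u\, u(k)\bigr), \\
y(k)   &= W_y\, x(k),
\end{aligned}
```

where $x(k)$ is the hidden (context) state, $u(k)$ the input, $y(k)$ the output, $W_x$, $W_u$, $W_y$ the recurrent, input, and output weight matrices, and $\sigma$ the element-wise activation function. The slide's exact notation may differ.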