sequence is seen.
Applications: speech recognition
Sequence Reproduction: Generate the rest of a sequence when the network sees only part of the sequence.
Applications: time series prediction (stock market, sun…)
Temporal Association: Produce a particular output sequence in response to a specific input sequence.
Applications: speech generation

ECE 517: Reinforcement Learning in AI 9
Recurrent vs. Feedforward

The RNN input now contains a term which reflects the state of the network (the hidden unit activations) before the pattern was presented. When we present subsequent patterns, the hidden and output units' states will be a function of everything the network has seen so far. The network's behavior is based on its history, and so we must think of pattern presentation as it happens in time.
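A minimal sketch can make this concrete. The code below (illustrative names, not from the lecture) shows a single recurrent hidden layer whose state is fed back into the next update, so presenting the same input pattern twice produces different outputs:

```python
import numpy as np

# Minimal sketch of a recurrent layer. The hidden state h carries the
# network's history, so the output for the same input x depends on what
# was presented before. All weights here are random for illustration.

rng = np.random.default_rng(0)
n_in, n_hid, n_out = 3, 4, 2
W_in  = rng.normal(size=(n_hid, n_in))   # input -> hidden
W_rec = rng.normal(size=(n_hid, n_hid))  # hidden -> hidden (feedback)
W_out = rng.normal(size=(n_out, n_hid))  # hidden -> output

def step(x, h_prev):
    """One time step: the net input mixes the current pattern with the
    previous hidden activations, so the state reflects the history."""
    h = np.tanh(W_in @ x + W_rec @ h_prev)
    y = W_out @ h
    return y, h

h = np.zeros(n_hid)
x = np.ones(n_in)
y1, h = step(x, h)   # first presentation of the pattern
y2, h = step(x, h)   # same pattern again: output differs, history matters
```

A feedforward network would compute the same `y` both times; here the `W_rec @ h_prev` term is exactly the history-dependent term the text describes.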
Network Topology
Once we allow feedback connections, our network topology
becomes very free: we can connect any unit to any other,
even to itself
Two of our basic requirements for computing activations and errors in the network are now violated:
We required that before computing yi, we had to know the activations of all units in the node's posterior set.
For computing errors, we required knowledge of the errors of all units in a node's anterior set.
For an arbitrary unit in a recurrent network, we now define its activation at time t as:

yi(t) = fi(neti(t-1))
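This delayed update rule can be sketched as follows (names are illustrative, not from the lecture): because every unit's activation at time t depends only on net inputs from time t-1, all units can be updated synchronously and no ordering of posterior sets is needed, even with self-connections.

```python
import numpy as np

# Sketch of the delayed update yi(t) = fi(neti(t-1)) for an arbitrary
# topology: W[i, j] is the weight from unit j to unit i, and any unit
# may connect to any other, including itself.

def simulate(W, f, y0, steps):
    """Synchronously update every unit from the net input computed with
    the previous step's activations; no activation ordering is needed."""
    y = y0.copy()
    history = [y.copy()]
    for _ in range(steps):
        net = W @ y          # net_i(t-1): uses last step's activations
        y = f(net)           # y_i(t) = f_i(net_i(t-1))
        history.append(y.copy())
    return history

W = np.array([[0.0, 0.5],
              [0.5, 0.2]])   # unit 2 feeds back to itself
hist = simulate(W, np.tanh, np.array([1.0, 0.0]), steps=3)
```

The ordering constraint disappears because `net` is computed for all units before any activation is overwritten; time, rather than topology, sequences the computation.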