…sequence is seen.
Applications: speech recognition

Sequence Reproduction: Generate the rest of a sequence when the network sees only part of the sequence.
Applications: time series prediction (stock market, sun spots, etc.)

Temporal Association: Produce a particular output sequence in response to a specific input sequence.
Applications: speech generation

ECE 517: Reinforcement Learning in AI (slide 9)

Recurrent vs. Feedforward
- The RNN input now contains a term which reflects the state of the network (the hidden unit activations) before the pattern was seen.
- When we present subsequent patterns, the hidden and output units' states will be a function of everything the network has seen so far.
- The network's behavior is based on its history, and so we must think of pattern presentation as it happens in time.

Network Topology
- Once we allow feedback connections, our network topology becomes very free: we can connect any unit to any other, even to itself.
- Two of our basic requirements for computing activations and errors in the network are now violated:
  - Before computing y_i, we required the activations of all units in a node's posterior set.
  - For computing errors, we required knowledge of the errors of all units in a node's anterior set.
- For an arbitrary unit in a recurrent network, we therefore define its activation at time t as:

  y_i(t) = f_i(net_i(t - 1))
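The update rule above can be sketched in a few lines of Python. This is a minimal illustration, not code from the course: the weight names (W_in, W_rec), the choice of tanh as the squashing function f_i, and the zero initial state are all assumptions added for the example. Because each unit reads every unit's activation from time t-1, any topology is allowed, including self-connections.

```python
import math

def step(y_prev, x, W_in, W_rec, b):
    """One discrete-time update: y_i(t) = f_i(net_i(t-1)).
    net_i(t-1) sums the external input x and the previous
    activations of every unit feeding unit i (possibly itself)."""
    n = len(y_prev)
    y_next = []
    for i in range(n):
        net = b[i]
        net += sum(W_in[i][j] * x[j] for j in range(len(x)))
        net += sum(W_rec[i][j] * y_prev[j] for j in range(n))
        y_next.append(math.tanh(net))  # f_i = tanh (assumed)
    return y_next

def run(xs, W_in, W_rec, b):
    """Present a sequence of input patterns; the state y carries
    the network's entire history forward through time."""
    y = [0.0] * len(W_rec)  # assumed zero initial activations
    history = []
    for x in xs:
        y = step(y, x, W_in, W_rec, b)
        history.append(y)
    return history
```

Note that the recurrent term W_rec @ y_prev is exactly the extra input "which reflects the state of the network before the pattern was seen": after the first step, the output depends on every pattern presented so far, not just the current one.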