# 2_19_09_LinearRecurrentNetworks_1


A fully connected feedforward network has a vector of inputs and a vector of outputs connected by a weight matrix.

A recurrent network is a feedforward network with a recurrent synaptic weight matrix.
For a feedforward network, written component-wise and then in vector form:

$$\tau_r \frac{dv_a}{dt} = -v_a + F\!\left(\sum_{b=1}^{N} W_{ab}\, u_b\right) \qquad\Longleftrightarrow\qquad \tau_r \frac{d\mathbf{v}}{dt} = -\mathbf{v} + F(W\mathbf{u})$$

For a recurrent network:

$$\tau_r \frac{d\mathbf{v}}{dt} = -\mathbf{v} + F(W\mathbf{u} + M\mathbf{v})$$
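The recurrent-network equation above can be integrated numerically. Below is a minimal sketch (my own illustration, not from the notes) using forward-Euler integration; the matrices `W` and `M`, the time constant, and the default linear `F` are assumed values chosen so the dynamics are stable.

```python
# Sketch: Euler integration of tau_r dv/dt = -v + F(W u + M v).
# W, M, u, and the default linear F are illustrative assumptions.
import numpy as np

def simulate(W, M, u, tau_r=0.01, dt=0.001, T=0.5, F=lambda x: x):
    """Integrate tau_r dv/dt = -v + F(W u + M v) from v(0) = 0."""
    v = np.zeros(M.shape[0])
    for _ in range(int(T / dt)):
        v = v + (dt / tau_r) * (-v + F(W @ u + M @ v))
    return v

W = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.5, 0.5]])
M = 0.2 * (np.ones((3, 3)) - np.eye(3))  # weak recurrence: eigenvalues 0.4, -0.2, -0.2
u = np.array([1.0, -0.5])
v = simulate(W, M, u)
```

Because every eigenvalue of `M` is below 1, the simulation relaxes to the linear-network fixed point, which satisfies `(I - M) v = W u`.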

Assuming that the activation function $F$ is linear, that is, $F(x) = x$, and denoting the input as $\mathbf{h} = W\mathbf{u}$:

$$\tau_r \frac{d\mathbf{v}}{dt} = -\mathbf{v} + F(W\mathbf{u} + M\mathbf{v}) = -\mathbf{v} + \mathbf{h} + M\mathbf{v} = -I\mathbf{v} + M\mathbf{v} + \mathbf{h} = (M - I)\,\mathbf{v} + \mathbf{h}$$
τ r d v dt = M I ( ) v + h General solution of homogeneous equation Particular solution of full equation, including h Solution is linear combination of

The homogeneous equation describes a vector field, with curved and straight trajectories.
The straight-line solutions are the simplest. Writing the homogeneous equation as

$$\frac{d\mathbf{Y}}{dt} = A\mathbf{Y}$$

a straight-line trajectory has the form $\mathbf{Y}(t) = f(t)\,\mathbf{Y}(0)$. Substituting this ansatz:

$$\frac{df(t)}{dt}\,\mathbf{Y}(0) = A\big(f(t)\,\mathbf{Y}(0)\big) = f(t)\,A\mathbf{Y}(0)$$

so

$$\frac{1}{f(t)}\frac{df(t)}{dt} = \lambda, \qquad A\mathbf{Y}(0) = \lambda\,\mathbf{Y}(0)$$

That is, $\mathbf{Y}(0)$ must be an eigenvector of $A$ with eigenvalue $\lambda$, and $f(t) = e^{\lambda t}$.
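The straight-line property can be checked numerically. The sketch below (an illustration under assumed values, not part of the notes) starts the homogeneous system $d\mathbf{Y}/dt = A\mathbf{Y}$ on an eigenvector of a small symmetric matrix $A$ and integrates it forward; the trajectory should stay on the eigenvector line and scale by $e^{\lambda t}$.

```python
# Sketch: starting dY/dt = A Y on an eigenvector of A gives the
# straight-line solution Y(t) = exp(lambda * t) * Y(0).
# The matrix A and step sizes are illustrative assumptions.
import numpy as np

A = np.array([[-1.0, 0.5],
              [0.5, -1.0]])        # symmetric, so eigenvectors are real
lams, vecs = np.linalg.eigh(A)
lam, Y0 = lams[0], vecs[:, 0]      # one eigenpair: A @ Y0 == lam * Y0

# Euler-integrate dY/dt = A Y from Y(0) = Y0
dt, T = 1e-4, 1.0
Y = Y0.copy()
for _ in range(int(T / dt)):
    Y = Y + dt * (A @ Y)
```

After integration, `Y` should agree with `np.exp(lam * T) * Y0` up to the Euler discretization error, confirming that eigenvector initial conditions produce straight-line trajectories.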
