MULTCONS Philosophy

Multilayer Temporal Constraint Satisfaction Network

- There are fundamental limits on how accurate phonetic "labeling" can be based only on acoustic features; even with a perfect labeler, we do not speak phonetically correctly.
- Recognition requires context:
  - Local acoustic context (time-varying features) via the recurrent network / input preprocessing
  - Phoneme context
  - Word context
  - Phrase/grammar context
  - Topic/application context
- MULTCONS allows all levels of context to interact and compete before final decisions/suggestions are made.
- Based on a novel neural network approach.

Basic Neuron Node

- Net input: net = Σ w_i x_i (the weighted sum of the node's inputs)
- Node activation = f(net), where f is a squashing activation function (the original slide plots it over roughly -2 to +2)
- Learns by modifying weights after seeing training data, then generalizes on novel data (e.g., medical diagnosis); a numeric sketch appears after these notes.

Constraint Satisfaction Network

Example: free market economics
- Typically stable at an equilibrium state.
- Assume that an outside adjustment is made (e.g., the spotted owl).
- After a short amount of interaction, the network relaxes to a new stable state (a relaxation sketch appears after these notes).

MULTCONS: MUlti-layer Temporal CONstraint Satisfaction networks

[Architecture figure: Grammar Layer (GL) above Common Word Layer (WL) above Phoneme Layer (PL, ~40 phonemes), fed by a recurrent network over frequency- and time-domain inputs.]

MULTCONS Timing

[Timing figure: GL, WL, and PL unrolled over time steps t, t-1, t-2 above the phoneme estimation net; the phoneme time step is triggered by a segmentation estimation algorithm.]

Phoneme Layer Interaction

[Figure: each phoneme node receives links from preceding and following phonemes, mutual inhibition from the ~40 competing phoneme nodes in its slot, and input from the recurrent network; a small competition sketch appears after these notes.]

Phoneme Layer and Word Layer Interaction

[Figure: a common-word node (e.g., "oven") interacts with preceding and following word nodes and with its constituent phoneme nodes, which receive input from the recurrent network.]

Word Layer Sequential Excitation

[Figure: sequential excitation among common-word nodes, from preceding words to following words.]

Grammar Layer Usage

Assume possible competing words "of" and "oven" at the word layer, as in the phrase "... list of engineers ... turn on oven ...". If the word "on" is active in the word layer, then the corresponding "preposition" node of the grammar layer would be active. Since nouns typically follow prepositions, the following "noun" node in the grammar layer would be active and would excite the "oven" node in the word layer. The "of" node of the word layer would be inhibited, since its corresponding preposition node would be inhibited by the preposition node corresponding to "on". (This walkthrough is traced numerically in a sketch after these notes.)

Addition of Grammar Layer

[Figure: parts-of-speech nodes, driven by preceding and following parts of speech, added above the common-word nodes, the ~40 preceding phonemes, and the recurrent network.]

Time Windows

[Figure: sentence, word, and phoneme windows aligned at the current time t, with a decision area covering about four words.]

Sentence Layer: only competing word slots; slightly longer than or the same length as the word layer. Note that in this case the final word decision lags about 3 words behind time t. The decision can be the single best word sequence or the top n candidates. Sentence and word windows slide left on phoneme shifts, and words continue to interact until they are completely out of the window.

Word Layer: both phoneme and word slots; potential words overlap. The window is w phoneme slots long (for w = 25, it covers approximately 5 words). Note that contiguous streams of words are the best candidates; this is supported automatically by the network interaction. Also note that the left half of the word window has no vertical interaction with the PL, but continues to have horizontal interaction within the WL and vertical interaction with the SL.

Phoneme Layer: k phoneme slots, with k an odd value where 5 ≤ k ≤ 11. (The window mechanics are sketched in code after these notes.)
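The Basic Neuron Node notes define the node as a weighted sum passed through a squashing activation. A minimal numeric sketch in Python, assuming tanh as the squashing function (the slide only shows an S-shaped curve over roughly -2 to +2) and illustrative weights and inputs:

    import math

    def activation(net):
        # Squashing function; tanh is an assumption -- the slide only
        # shows a curve saturating outside roughly [-2, +2].
        return math.tanh(net)

    def node_output(weights, inputs):
        # net = sum over i of w_i * x_i, the weighted sum of inputs
        net = sum(w * x for w, x in zip(weights, inputs))
        return activation(net)

    # Two excitatory synapses and one inhibitory synapse (illustrative).
    print(node_output([0.8, 0.5, -1.0], [1.0, 0.6, 0.3]))  # f(0.8) ~ 0.66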
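The free-market example describes a network that settles to equilibrium and re-settles after an outside adjustment. A minimal relaxation sketch, assuming a tiny symmetric network; the node names, weights, and biases are illustrative, not taken from the notes:

    import math

    weights = {  # symmetric constraint weights between nodes
        ("timber_harvest", "owl_protection"): -1.0,  # competing constraints inhibit
        ("timber_harvest", "lumber_price"): 0.8,     # compatible constraints excite
    }
    bias = {"timber_harvest": 0.5, "owl_protection": 0.0, "lumber_price": 0.2}
    act = {n: 0.0 for n in bias}

    def w(a, b):
        return weights.get((a, b)) or weights.get((b, a)) or 0.0

    def relax(steps=50):
        # Repeatedly update each node from its neighbors until the
        # activations stop changing (the network "relaxes").
        for _ in range(steps):
            for n in act:
                net = bias[n] + sum(w(n, m) * act[m] for m in act if m != n)
                act[n] = math.tanh(net)

    relax()
    print("equilibrium:", act)
    bias["owl_protection"] = 1.5  # outside adjustment (the spotted-owl case)
    relax()
    print("new stable state:", act)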
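The phoneme-layer interaction uses mutual inhibition so the ~40 phoneme candidates in a slot compete. A minimal soft winner-take-all sketch; the inhibition weight, the three-candidate set, and the activation values are assumptions:

    def inhibit_step(acts, mi_weight=0.15):
        # Each phoneme node is pushed down in proportion to the total
        # activation of its rivals (MI: mutual inhibition).
        total = sum(acts.values())
        return {ph: max(0.0, a - mi_weight * (total - a))
                for ph, a in acts.items()}

    acts = {"ae": 0.6, "eh": 0.5, "ih": 0.2}
    for _ in range(5):
        acts = inhibit_step(acts)
    print(acts)  # the strongest candidate gradually suppresses the others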
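The "of"/"oven" walkthrough can be traced numerically. A minimal sketch, assuming illustrative activations and link strengths; the WGXLS/GWXLS-style links named in the comments are defined in the connection glossary below:

    word_act = {"on": 0.9, "of": 0.4, "oven": 0.4}  # competing word hypotheses

    # Word-to-grammar support (WGXLS-style): "on" activates its
    # Preposition node in the grammar layer.
    prep_act = word_act["on"]

    # Nouns typically follow prepositions, so the active Preposition node
    # excites the following Noun node (grammar-layer sequential excitation).
    noun_act = 0.8 * prep_act

    # Grammar-to-word feedback (GWXLS-style): the Noun node excites "oven",
    # while "of" loses support because its own Preposition node is
    # inhibited by the stronger Preposition node for "on".
    word_act["oven"] += 0.5 * noun_act
    word_act["of"] -= 0.5 * prep_act

    print(word_act)  # "oven" now clearly dominates "of"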
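The time-window mechanics can be summarized with nested buffers. A minimal sketch, assuming each layer is modeled as a fixed-length deque of phoneme slots; k = 7 and w = 25 follow the notes, while the sentence-window length and the slot contents are assumptions:

    from collections import deque

    K, W, S = 7, 25, 30             # phoneme, word, sentence window lengths
    phoneme_win = deque(maxlen=K)   # k odd, 5 <= k <= 11
    word_win = deque(maxlen=W)      # w = 25 covers ~5 words
    sentence_win = deque(maxlen=S)  # same as or slightly longer than the WL

    def phoneme_shift(slot):
        # All windows slide left together on each phoneme shift; a full
        # deque drops its oldest slot automatically. Final word decisions
        # are read from the left (oldest) end, lagging behind time t.
        for win in (phoneme_win, word_win, sentence_win):
            win.append(slot)
        return sentence_win[0] if len(sentence_win) == S else None

    for ph in ["t", "er", "n", "aa", "n", "ah", "v", "ax", "n"]:
        decided = phoneme_shift(ph)  # None until the sentence window fills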
Run M network time steps before accepting the next phoneme shift. Note that earlier phonemes will be gone before final word decisions are made.

[Figure: the recurrent net and phoneme time slot generation across time steps t-2, t-1, t.]

MULTCONS Connections

[Connection diagram: a grammar node, word node, phoneme-activation (PA) node, and phoneme node linked by the synapse types defined below, each edge annotated with its sign (+/-) and weight distribution.]

- FILS - Follows Intralayer Synapse. Weight based on the probability of Y given that X follows.
- ~FILS - Not Follows ILS. Weight based on the probability of Y given that X does not follow.
- PILS - Precedes Intralayer Synapse. Weight based on the probability of Y given that X precedes.
- ~PILS - Not Precedes ILS. Weight based on the probability of Y given that X does not precede.
- SE - Sequential Excitation. Weight based on a sequence of active nodes or properly bounded by silence.
- SBI - Silence Boundary Inhibition. Inhibition due to a silent area found in the middle of a word.
- MOI - Mutual Overlap Inhibition. Inhibitory weight between overlapping nodes based on activations and amount of overlap.
- MI - Mutual Inhibition. Inhibitory weight based on excitation of other phonemes.
- Bias - Offset value for a node.
- WGXLS - Word-to-Grammar eXtra Layer Synapse. Weight based on the probability of grammar node Y given that word node X is active.
- GWXLS - Grammar-to-Word eXtra Layer Synapse. Weight based on the probability of word node Y given that grammar node X is active.
- ~GWXLS - Not Grammar-to-Word eXtra Layer Synapse. Weight based on the probability of word node Y given that grammar node X is not active.
- PAW - Phoneme Activation (PA)-to-Word. Weight based on the probability of word node Y given that the combined activation of word Y's phonemes is positive.
- ~PAW - Not PA-to-Word. Weight based on the probability of word node Y given that the combined activation of word Y's phonemes is negative.
- PA Bias - Bias for the PA node.
- PPAXLS - Phon-to-PA Node eXtra Layer Synapse. Weight based on the presence of phoneme X in word Y.
- ~PPAXLS - Not Phon-to-PA Node eXtra Layer Synapse. Weight based on the absence of phoneme X in word Y.
- WPXLS - Word-to-Phon eXtra Layer Synapse. Weight based on the probability of phoneme Y given word X.
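Most of the synapse types above are defined as conditional probabilities estimated from data. A minimal sketch of how a FILS weight could be computed, assuming bigram counts and a log-odds mapping from probability to weight (the notes do not specify the actual weight function):

    import math

    def fils_weight(bigrams, x, y):
        # FILS(X -> Y): support for node Y given that X occupies the
        # following slot, i.e. an estimate of P(current = Y | next = X).
        ctx = [prev for prev, nxt in bigrams if nxt == x]
        p = (ctx.count(y) + 1) / (len(ctx) + 2)  # Laplace smoothing
        return math.log(p / (1 - p))             # assumed log-odds mapping

    # Toy (prev, next) phoneme bigrams; real weights would come from a corpus.
    bigrams = [("t", "ax"), ("ax", "n"), ("t", "ax"), ("s", "t")]
    print(fils_weight(bigrams, "ax", "t"))  # "t" often precedes "ax": > 0

~FILS would condition on X not following, and PILS/~PILS mirror the same construction over the preceding slot.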