# h08-4 - CIS6930/4930 Intro to Computational Neuroscience...


CIS6930/4930 Intro to Computational Neuroscience, Fall 2008
Homework Assignment 4. Due Tuesday 11/18/08, before class.

1. (Asymptotic Equipartition) Consider the alphabet {A, B} with probabilities p(A) = 0.27 and p(B) = 0.73. Verify that the entropy of this distribution is H = 0.841464 bits. Now consider all 2^20 = 1048576 sequences of length 20 generated from this alphabet. Choose a small ε, say 0.001, and find the number of sequences whose probabilities lie between 2^(-20(H+ε)) and 2^(-20(H-ε)) (when the sequences are generated i.i.d.). How many bits would be necessary to encode any one of these sequences, assuming you have a simple index table for them? Now compute the sum of the probabilities of these sequences. Assuming you use 20 bits to encode each of the remaining sequences, how many bits on average would be necessary to encode a string of length 20?

2. Code and test a feed-forward net of sigmoidal nodes with two input units, ten hidden units, and one output unit that learns the concept of a circle in 2D space. The concept is:
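For problem 1, the count and total probability of the (H ± ε)-typical sequences can be computed without enumerating all 2^20 strings: every sequence with the same number of A's has the same probability, so it suffices to loop over k (the number of A's) and weight by the binomial coefficient. A minimal sketch (variable names are my own, not part of the assignment):

```python
import math

p_A, p_B = 0.27, 0.73
n, eps = 20, 0.001

# Entropy of the {A, B} distribution in bits
H = -(p_A * math.log2(p_A) + p_B * math.log2(p_B))  # should be ~0.841464

# Typical-set probability bounds 2^(-n(H+eps)) and 2^(-n(H-eps))
lo, hi = 2 ** (-n * (H + eps)), 2 ** (-n * (H - eps))

count = 0        # number of typical sequences
prob_mass = 0.0  # total probability of the typical sequences
for k in range(n + 1):
    # Any i.i.d. sequence with exactly k A's has this probability
    p_seq = p_A ** k * p_B ** (n - k)
    if lo <= p_seq <= hi:
        c = math.comb(n, k)  # number of sequences with k A's
        count += c
        prob_mass += c * p_seq

print(f"H = {H:.6f}")
print(f"typical sequences: {count}, total probability: {prob_mass:.6f}")
```

With an index table, encoding any one of the `count` typical sequences takes about ceil(log2(count)) bits; note that for a very small ε the band may capture few (or even no) integer values of k, which is part of what the exercise asks you to observe.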

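For problem 2, the concept definition is blurred in this preview, so the sketch below assumes a typical formulation (label 1 inside the unit circle, 0 outside; the sampling region [-1.5, 1.5]^2, learning rate, and epoch count are also my assumptions). It implements the stated 2-10-1 sigmoidal architecture with plain backpropagation on squared error:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Assumed concept (the actual definition is blurred in the preview):
# label 1 if the point lies inside the unit circle, else 0.
def concept(X):
    return (X[:, 0] ** 2 + X[:, 1] ** 2 < 1.0).astype(float)

# 2 input units -> 10 hidden sigmoid units -> 1 sigmoid output
W1 = rng.normal(0.0, 0.5, (2, 10)); b1 = np.zeros(10)
W2 = rng.normal(0.0, 0.5, (10, 1)); b2 = np.zeros(1)

# Training data: random points in [-1.5, 1.5]^2 (assumed region)
X = rng.uniform(-1.5, 1.5, (500, 2))
y = concept(X)[:, None]

lr = 1.0
for epoch in range(5000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)      # hidden activations
    out = sigmoid(h @ W2 + b2)    # network output
    # Backward pass: squared-error loss, sigmoid derivative s*(1-s)
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    # Gradient-descent updates (gradients averaged over the batch)
    W2 -= lr * h.T @ d_out / len(X); b2 -= lr * d_out.mean(axis=0)
    W1 -= lr * X.T @ d_h / len(X);   b1 -= lr * d_h.mean(axis=0)

acc = ((out > 0.5).astype(float) == y).mean()
print(f"training accuracy: {acc:.3f}")
```

A held-out test set drawn from the same region would be the natural way to "test" the net as the assignment asks; the circle is not linearly separable, so the hidden layer is essential.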

## This note was uploaded on 01/15/2012 for the course CIS 4930 taught by Professor Staff during the Spring '08 term at University of Florida.

