# wk05 - CS195f Homework 4 (Mark Johnson and Erik Sudderth)

**CS195f Homework 4**, Mark Johnson and Erik Sudderth. Homework due at 2pm, 20th October 2009.

In this problem set, we study different approaches to linear regression using a one-dimensional dataset collected from a simulated motorcycle accident. The input variable, x, is the time in milliseconds since impact. The output variable, y, is the recorded head acceleration. The dataset is available here:

`/course/cs195f/asgn/regression/motor.mat`

We have divided the full dataset into 40 training examples (variables `Xtrain` and `Ytrain`) and 53 test examples (variables `Xtest` and `Ytest`). When fitting polynomial functions, as explored below, numerical problems can arise when the input variables take even moderate values. To minimize these, all training features should be scaled to lie in the interval [-1, +1] before fitting. Note that an equivalent scaling must then be applied to all test data. Here is an example script to get you started:

`/course/cs195f/asgn/regression/motorDemo.m`

**Question 1:**

a) Consider a polynomial basis, with functions φ_j(x) = x^j. Write a function which evaluates these polynomial functions at a vector of points x_i ∈ ℝ, for any j. In a single figure, plot φ_j(x) for -1 ≤ x ≤ 1 and j = 0, 1, 2, ..., 19. Hint: to create a dense regular grid of points at which to evaluate and plot these functions, use the `linspace` command.

b) Consider the standard linear regression model, in which observations y_i follow a Gaussian distribution centered around a linear function w of a fixed set of basis functions:

p(y_i | x_i, w, β) = N(y_i | w^T φ(x_i), β^{-1})

Here, β is the inverse variance, or precision. Define a family of regression models, each of which contains all polynomials φ_j(x) of order j ≤ M, where M is a parameter controlling model complexity. Compute maximum likelihood (ML) estimates ŵ of the regression parameters for models of order M = 0, 1, 2, ..., 19. Plot, as a function of x, the mean prediction...
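The assignment's starter script is in MATLAB (`motorDemo.m`), and `motor.mat` is course-internal, so the pipeline described above, scale training features to [-1, +1], build the polynomial basis, and fit ŵ by maximum likelihood (ordinary least squares under the Gaussian noise model), can only be sketched here. The following Python/NumPy version uses synthetic stand-in data in place of `Xtrain`/`Ytrain`; the helper names `scale_to_unit` and `poly_basis` are illustrative, not part of the course code.

```python
import numpy as np

# Synthetic stand-in for Xtrain/Ytrain from motor.mat (the real assignment
# would load these, e.g. with scipy.io.loadmat in Python).
rng = np.random.default_rng(0)
Xtrain = rng.uniform(0.0, 60.0, size=40)            # time since impact (ms)
Ytrain = np.sin(Xtrain / 10.0) + 0.1 * rng.standard_normal(40)

def scale_to_unit(x, lo, hi):
    """Affinely map the interval [lo, hi] onto [-1, +1]."""
    return 2.0 * (x - lo) / (hi - lo) - 1.0

# Fit the scaling on the TRAINING inputs only; the same lo/hi must then be
# reused for any test inputs, as the assignment notes.
lo, hi = Xtrain.min(), Xtrain.max()
Xs = scale_to_unit(Xtrain, lo, hi)

def poly_basis(x, M):
    """Design matrix Phi whose columns are phi_j(x) = x**j for j = 0..M."""
    return np.column_stack([x ** j for j in range(M + 1)])

M = 5                                               # model order (complexity)
Phi = poly_basis(Xs, M)                             # shape (40, M + 1)

# ML estimate of w under Gaussian noise = least-squares solution of Phi w ≈ y.
w_ml, *_ = np.linalg.lstsq(Phi, Ytrain, rcond=None)

# Mean prediction w^T phi(x) on a dense grid, as in the plotting questions.
grid = np.linspace(-1.0, 1.0, 200)
pred = poly_basis(grid, M) @ w_ml
```

Looping `M` over 0, ..., 19 and refitting `w_ml` for each order reproduces the family of models Question 1b asks about.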

This note was uploaded on 11/03/2009 for the course CS 195f taught by Professor Johnson during the Spring '09 term at Sanford-Brown Institute.
