
F TESTS OF GOODNESS OF FIT

Y = β1 + β2X2 + ... + βkXk + u

H0: β2 = ... = βk = 0
H1: at least one β ≠ 0

This sequence describes two F tests of goodness of fit in a multiple regression model.
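As a sketch of how such an F statistic is computed, the following fragment fits a two-regressor model by OLS and forms F = (ESS/(k−1)) / (RSS/(n−k)). The sample size, coefficients, and seed are all invented for illustration.

```python
import numpy as np

# Hypothetical data: n observations, k - 1 = 2 explanatory variables.
rng = np.random.default_rng(42)
n, k = 50, 3
X2 = rng.normal(size=n)
X3 = rng.normal(size=n)
u = rng.normal(size=n)
Y = 1.0 + 2.0 * X2 + 0.5 * X3 + u

# Fit by OLS and decompose the variation in Y.
X = np.column_stack([np.ones(n), X2, X3])
b, *_ = np.linalg.lstsq(X, Y, rcond=None)
Y_hat = X @ b
e = Y - Y_hat

TSS = np.sum((Y - Y.mean()) ** 2)
ESS = np.sum((Y_hat - Y.mean()) ** 2)
RSS = np.sum(e ** 2)

# F statistic for H0: beta_2 = ... = beta_k = 0
F = (ESS / (k - 1)) / (RSS / (n - k))
```

Under H0 the statistic has an F(k − 1, n − k) distribution; a large value leads to rejection.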
EXERCISE 3.9

3.9 Demonstrate that e̅, the sample mean of the residuals, is equal to zero in multiple regression analysis. (Note: The proof is a generalization of the proof for the simple regression model, given in Section 1.7.)
EXERCISE 3.5

3.5 Explain why the intercept in the regression of EEARN on ES is equal to zero.

This exercise relates to the Frisch-Waugh-Lovell procedure for graphing the relationship between the dependent variable and one of the explanatory variables in a multiple regression model.
GRAPHING A RELATIONSHIP IN A MULTIPLE REGRESSION MODEL

. reg EARNINGS S EXP

      Source |       SS       df       MS          F(2, 537)     =   67.54
-------------+---------------------------         Prob > F      =  0.0000
       Model |  22513.6473     2  11256.8237      R-squared     =  0.2010
    Residual |  89496.5838   537  166.660305      Adj R-squared =  0.1980
-------------+---------------------------
       Total |  [remainder of the output truncated in the source]
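The R-squared and adjusted R-squared reported in this output can be reproduced from the Model and Residual sums of squares, a quick consistency check on the table:

```python
# Recover R-squared and adjusted R-squared from the Model (ESS) and
# Residual (RSS) sums of squares in the Stata output above.
ESS = 22513.6473
RSS = 89496.5838
df_model, df_resid = 2, 537

TSS = ESS + RSS
r2 = ESS / TSS
adj_r2 = 1 - (RSS / df_resid) / (TSS / (df_model + df_resid))
```

Both values round to the figures in the output (0.2010 and 0.1980).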
MULTICOLLINEARITY

Y = 2 + 3X2 + X3
X3 = 2X2 − 1

  X2    X3     Y
  10    19    51
  11    21    56
  12    23    61
  13    25    66
  14    27    71
  15    29    76

Suppose that Y = 2 + 3X2 + X3 and that X3 = 2X2 − 1. There is no disturbance term in the equation.
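A short numerical check of this example, using the same table: because X3 is an exact linear function of X2, more than one set of coefficients fits the data perfectly, which is why the individual coefficients cannot be estimated.

```python
import numpy as np

# The table from the text: X3 = 2*X2 - 1 exactly, and Y = 2 + 3*X2 + X3
# with no disturbance term.
X2 = np.array([10, 11, 12, 13, 14, 15], dtype=float)
X3 = 2 * X2 - 1
Y = 2 + 3 * X2 + X3

# Any (a, b, c) with a - c = 1 and b + 2c = 5 fits the data perfectly,
# because substituting X3 = 2*X2 - 1 collapses Y to 1 + 5*X2.
fit1 = 2 + 3 * X2 + 1 * X3   # the coefficients stated in the text
fit2 = 3 + 1 * X2 + 2 * X3   # a different set, equally perfect
```

Both fitted series reproduce Y exactly, so the data cannot discriminate between them.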
PRECISION OF THE MULTIPLE REGRESSION COEFFICIENTS

Y = β1 + β2X2 + β3X3 + u
Ŷ = b1 + b2X2 + b3X3

σ²b2 = σ²u / [Σ(X2i − X̄2)² (1 − r²X2,X3)] = σ²u / [n MSD(X2) (1 − r²X2,X3)]

where MSD(X2) is the mean square deviation of X2 and rX2,X3 is the sample correlation between X2 and X3.

This sequence investigates the precision of the regression coefficients in a model with two explanatory variables.
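The factor 1 / (1 − r²X2,X3) in this formula is often called the variance inflation factor. A minimal sketch of how quickly it grows with the correlation:

```python
# Variance inflation factor 1 / (1 - r**2): the multiplier applied to the
# variance of b2 by correlation r between X2 and X3. It grows without
# bound as r approaches 1.
def vif(r):
    """Variance inflation factor for correlation r between the regressors."""
    return 1.0 / (1.0 - r ** 2)

factors = {r: vif(r) for r in (0.0, 0.5, 0.9, 0.99)}
```

At r = 0.9 the variance of b2 is already more than five times what it would be with uncorrelated regressors, and at r = 0.99 more than fifty times.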
ALLEVIATION OF MULTICOLLINEARITY

Possible measures for alleviating multicollinearity

σ²b2 = σ²u / [Σ(X2i − X̄2)² (1 − r²X2,X3)] = σ²u / [n MSD(X2) (1 − r²X2,X3)]

What can you do about multicollinearity?
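A minimal sketch of how two of the possible remedies act through the variance formula (all the numbers below are hypothetical): increasing the number of observations, or reducing the correlation between X2 and X3, both shrink the variance of b2.

```python
# Variance of b2 from the formula above, treating all inputs as known.
def var_b2(sigma_u2, n, msd_x2, r):
    """sigma_u2: disturbance variance; n: sample size;
    msd_x2: mean square deviation of X2; r: corr(X2, X3)."""
    return sigma_u2 / (n * msd_x2 * (1.0 - r ** 2))

base = var_b2(sigma_u2=100.0, n=50, msd_x2=4.0, r=0.9)
more_n = var_b2(sigma_u2=100.0, n=200, msd_x2=4.0, r=0.9)   # quadruple n
less_r = var_b2(sigma_u2=100.0, n=50, msd_x2=4.0, r=0.5)    # reduce correlation
```

Quadrupling n cuts the variance to a quarter; cutting the correlation from 0.9 to 0.5 achieves a similar reduction here.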
PROPERTIES OF THE MULTIPLE REGRESSION COEFFICIENTS

A.1: The model is linear in parameters and correctly specified.

Y = β1 + β2X2 + ... + βkXk + u

A.2: There does not exist an exact linear relationship among the explanatory variables in the sample.
F TEST OF GOODNESS OF FIT

Σ(Yi − Ȳ)² = Σ(Ŷi − Ȳ)² + Σei²
TSS = ESS + RSS

In an earlier sequence it was demonstrated that the sum of the squared deviations of Y about its sample mean (TSS: total sum of squares) could be decomposed into the explained sum of squares (ESS) and the residual sum of squares (RSS).
MULTIPLE REGRESSION WITH TWO EXPLANATORY VARIABLES: EXAMPLE

EARNINGS = β1 + β2S + β3EXP + u

[Figure: three-dimensional diagram with axes EARNINGS, S, and EXP; β1 is the intercept on the EARNINGS axis.]

This sequence provides a geometrical interpretation of a multiple regression model with two explanatory variables.
TESTING A HYPOTHESIS RELATING TO A REGRESSION COEFFICIENT

Model: Y = β1 + β2X + u
Null hypothesis: H0: β2 = β2⁰
Alternative hypothesis: H1: β2 ≠ β2⁰

This sequence describes the testing of a hypothesis relating to a regression coefficient.
t TEST OF A HYPOTHESIS RELATING TO A REGRESSION COEFFICIENT

s.d. of b2 known

Discrepancy between hypothetical value and sample estimate, in terms of s.d.:

z = (b2 − β2⁰) / s.d.(b2)

5% significance test: reject H0: β2 = β2⁰ if z > 1.96 or z < −1.96.
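A minimal worked example of this decision rule; the estimate, hypothetical value, and standard deviation are all invented for illustration.

```python
# Testing H0: beta2 = 1.0 when the standard deviation of b2 is known.
b2 = 1.24          # sample estimate (hypothetical)
beta2_0 = 1.0      # value of beta2 under H0
sd_b2 = 0.10       # known standard deviation of b2

z = (b2 - beta2_0) / sd_b2          # discrepancy in standard deviations
reject_at_5pct = abs(z) > 1.96      # two-sided 5% significance test
```

Here z = 2.4, which exceeds 1.96, so H0 is rejected at the 5% level.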
ONE-SIDED t TESTS

null hypothesis: H0: β2 = β2⁰
alternative hypothesis: H1: β2 = β2¹

[Figure: probability density function of b2 under H0, with 2.5% tails beyond β2⁰ − 1.96 s.d. and β2⁰ + 1.96 s.d.; β2¹ lies to the right of β2⁰.]

This sequence explains the logic of one-sided t tests.
CONFIDENCE INTERVALS

null hypothesis: H0: β2 = β2⁰

[Figure: probability density function of b2, conditional on β2 = β2⁰ being true, with 2.5% tails beyond β2⁰ − 1.96 s.d. and β2⁰ + 1.96 s.d.]
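With the standard deviation of b2 known, a 95% confidence interval is the set of hypothetical values not rejected at the 5% level: b2 ± 1.96 s.d.(b2). The numbers below are invented for illustration.

```python
# 95% confidence interval for beta2 when s.d.(b2) is known.
b2 = 1.24          # sample estimate (hypothetical)
sd_b2 = 0.10       # known standard deviation of b2

ci_low, ci_high = b2 - 1.96 * sd_b2, b2 + 1.96 * sd_b2
# A hypothetical value beta2_0 is rejected at the 5% level exactly when
# it lies outside this interval.
contains_1 = ci_low <= 1.0 <= ci_high
```

The interval here is (1.044, 1.436), so the hypothetical value 1.0 would be rejected.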
THE RANDOM COMPONENTS OF THE REGRESSION COEFFICIENTS

True model: Y = β1 + β2X + u
Fitted model: Ŷ = b1 + b2X

The regression coefficients are special types of random variable. We will demonstrate this using the simple regression model.
PRECISION OF THE REGRESSION COEFFICIENTS

Simple regression model: Y = β1 + β2X + u

[Figure: probability density function of b2, centred on β2.]

We have seen that the regression coefficients b1 and b2 are random variables.
TYPE I ERROR AND TYPE II ERROR

[Figure: hypothetical distribution of b2 under H0: β2 = β2⁰; the acceptance region for b2 at the 5% level runs from β2⁰ − 1.96 s.d. to β2⁰ + 1.96 s.d., with 2.5% rejection tails on either side.]
UNBIASEDNESS OF THE REGRESSION COEFFICIENTS

Simple regression model: Y = β1 + β2X + u

b2 = Σ(Xi − X̄)(Yi − Ȳ) / Σ(Xi − X̄)² = β2 + Σ ai ui,  where ai = (Xi − X̄) / Σ(Xj − X̄)²

We will now demonstrate that the ordinary least squares (OLS) estimator of the slope coefficient is unbiased.
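A Monte Carlo sketch of this result, with invented parameter values and seed: across repeated samples that share the same X values, the average of the OLS slope estimates is close to β2.

```python
import numpy as np

# Simulate many samples from Y = beta1 + beta2*X + u with fixed X values
# and estimate the slope in each; unbiasedness implies the average of the
# estimates should be close to beta2.
rng = np.random.default_rng(0)
beta1, beta2 = 1.0, 2.0
X = np.arange(20, dtype=float)
reps = 10_000

b2_draws = np.empty(reps)
for i in range(reps):
    u = rng.normal(size=X.size)
    Y = beta1 + beta2 * X + u
    xd, yd = X - X.mean(), Y - Y.mean()
    b2_draws[i] = (xd @ yd) / (xd @ xd)

mean_b2 = b2_draws.mean()
```

Individual estimates vary from sample to sample, but their mean settles very close to the true value of 2.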
TYPES OF REGRESSION MODEL AND ASSUMPTIONS FOR MODEL A

Types of data

Cross-sectional: observations on individuals, households, enterprises, countries, etc. at one moment in time (Chapters 1-10, Model A).
SIMPLE REGRESSION MODEL

Y = β1 + β2X

[Figure: the line Y = β1 + β2X plotted against X, with intercept β1 and observations at X1, X2, X3, X4.]

Suppose that a variable Y is a linear function of another variable X, with unknown parameters β1 and β2 that we wish to estimate.

Christopher Dougherty
DERIVING LINEAR REGRESSION COEFFICIENTS

True model: Y = β1 + β2X + u
Fitted line: Ŷ = b1 + b2X

[Figure: scatter diagram with three observations Y1, Y2, Y3 at X = 1, 2, 3; the Y axis runs from 0 to 6.]

This sequence shows how the regression coefficients for a simple regression model are derived.
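The resulting formulas can be sketched with three hypothetical observations (the data below are invented; the figure's actual values are not recoverable from the source):

```python
import numpy as np

# Three hypothetical observations, in the spirit of the figure above.
X = np.array([1.0, 2.0, 3.0])
Y = np.array([3.0, 5.0, 6.0])

# Closed-form OLS estimates for the simple regression model:
# b2 = sum((Xi - Xbar)(Yi - Ybar)) / sum((Xi - Xbar)^2)
# b1 = Ybar - b2 * Xbar
xd, yd = X - X.mean(), Y - Y.mean()
b2 = (xd @ yd) / (xd @ xd)
b1 = Y.mean() - b2 * X.mean()
```

These agree with what any least-squares routine (for example numpy's polyfit) returns for the same data.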
EXERCISE 1.16

1.16 The output below shows the result of regressing weight in 2002 on height, using EAEF Data Set 21. In 2002 the respondents were aged 37-44. Explain why R2 is lower than in the earlier regression of weight on height.
EXERCISE 1.17

1.17 The useful results in Box 1.2 are in general no longer valid if the model does not contain an intercept. Demonstrate, in particular, that e̅ will not in general be equal to zero.
GOODNESS OF FIT

Four useful results:

Σ ei = 0
mean(Ŷ) = mean(Y)
Σ Xi ei = 0
Σ Ŷi ei = 0

This sequence explains measures of goodness of fit in regression analysis. It is convenient to start by demonstrating three useful results.
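These results can be verified numerically with any OLS fit that includes an intercept; the data below are simulated for illustration.

```python
import numpy as np

# Simulated data and an OLS fit with an intercept (the results rely on it).
rng = np.random.default_rng(1)
X = rng.normal(size=30)
Y = 2.0 + 0.5 * X + rng.normal(size=30)

A = np.column_stack([np.ones_like(X), X])
b, *_ = np.linalg.lstsq(A, Y, rcond=None)
Y_hat = A @ b
e = Y - Y_hat

r1 = e.sum()                  # sum of the residuals is zero
r2 = Y_hat.mean() - Y.mean()  # mean of fitted values equals mean of Y
r3 = (X * e).sum()            # residuals are orthogonal to X
r4 = (Y_hat * e).sum()        # residuals are orthogonal to fitted values
```

All four quantities are zero up to floating-point rounding.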
EXERCISE 1.9

1.9 A researcher has international cross-section data on aggregate wages, W, aggregate profits, P, and aggregate income, Y, for a sample of n countries. By definition,

Y = W + P
EXERCISE 1.7

1.7 Derive, with a proof, the slope coefficient that would have been obtained in Exercise 1.5 if weight and height had been measured in metric units. (Note: one pound is 454 grams, and one inch is 2.54 centimetres.)
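Without reproducing the exercise's answer, the general scaling fact it relies on can be checked numerically: multiplying the dependent variable by c and the explanatory variable by a multiplies the OLS slope by c/a, here 454/2.54. The heights and weights below are invented for illustration.

```python
import numpy as np

# Hypothetical heights (inches) and weights (pounds).
height_in = np.array([65.0, 68.0, 70.0, 72.0, 75.0])
weight_lb = np.array([140.0, 155.0, 170.0, 180.0, 200.0])

def slope(x, y):
    """OLS slope of y on x (with intercept)."""
    xd, yd = x - x.mean(), y - y.mean()
    return (xd @ yd) / (xd @ xd)

b_imperial = slope(height_in, weight_lb)               # pounds per inch
b_metric = slope(height_in * 2.54, weight_lb * 454)    # grams per centimetre
# The metric slope equals the imperial slope times 454 / 2.54.
```

The same argument, applied algebraically to the OLS slope formula, is the proof the exercise asks for.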
EXERCISE 1.5

1.5 The output below shows the result of regressing the weight of the respondent in 1985, measured in pounds, on his or her height, measured in inches. Provide an interpretation of the coefficients.