and, to strengthen the empirical model, this study will use the EViews software.
3.5.1 Ordinary Least Squares (OLS)
The Ordinary Least Squares (OLS) regression model is primarily used to estimate the parameters of a linear regression (Gauss, 1829). The underlying idea is simple: the parameter estimates are the values that minimize the sum of the squared differences between the actual values and the model's fitted values. That is, by minimizing the sum of the squared errors, the best-fitting function for the data is found. The least squares method can thus be used to obtain unknown parameters efficiently while minimizing the sum of the squared errors between the fitted and the observed data, and it is typically used for curve fitting, as mentioned by Farahani (2010).
The basic principle of OLS linear regression is that the best-fitting line must minimize the sum of squared distances from each point to the line (i.e., the sum of squared residuals, RSS). The simple linear model takes the form

Yt = α + βXt + μt

where:
Yt — dependent variable
Xt — independent variable
α, β — regression parameters
μt — random error term
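
To make the minimization explicit, the standard textbook derivation (a general result, not a step reported by this study) sets the partial derivatives of the RSS with respect to α and β to zero, giving the closed-form estimators:

    \min_{\alpha,\beta}\ \mathrm{RSS} = \sum_{t=1}^{T} (Y_t - \alpha - \beta X_t)^2

    \hat{\beta} = \frac{\sum_{t=1}^{T}(X_t - \bar{X})(Y_t - \bar{Y})}{\sum_{t=1}^{T}(X_t - \bar{X})^2}, \qquad \hat{\alpha} = \bar{Y} - \hat{\beta}\bar{X}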
This research will estimate a regression model of Gross Domestic Product (GDP) per capita on the investment rate (I), net exports (NX), the inflation rate (CPI), and the unemployment rate (UER) by using OLS in the EViews software.
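
To illustrate this estimation step outside of EViews, a minimal Python sketch using the statsmodels library is shown below; the data file name (macro_data.csv) and the column labels (gdp_per_capita, I, NX, CPI, UER) are assumptions made for illustration, not the study's actual dataset.

    import pandas as pd
    import statsmodels.api as sm

    # Hypothetical dataset; file name and column labels are assumed for illustration.
    data = pd.read_csv("macro_data.csv")

    # Dependent variable: GDP per capita; regressors: I, NX, CPI, UER.
    y = data["gdp_per_capita"]
    X = sm.add_constant(data[["I", "NX", "CPI", "UER"]])  # add_constant supplies the intercept (alpha)

    # Fit the OLS model Yt = alpha + beta'Xt + error and report the estimates.
    model = sm.OLS(y, X).fit()
    print(model.summary())

The summary output also reports the t-statistics, F-statistic, and R-squared on which the diagnostic checks in Section 3.5.2 rely.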
3.5.2 Diagnostic Checking
3.5.2.1 Multicollinearity
Multicollinearity exists whenever an independent variable is highly correlated with other independent variables in a multiple regression equation. According to Farrar (1967), if the t-tests are insignificant for most or all parameters while the F-test is significant and the R-squared is high, a multicollinearity problem is present. There are, however, several approaches to detecting multicollinearity in a regression model.
Firstly, when few of the independent variables show significant individual effects on the dependent variable, yet the regression output reports a high R-squared, the model may have a multicollinearity problem.
Secondly, substantial changes in the estimated coefficients when predictors are added or removed can indicate multicollinearity. If the predictors were fully independent of each other, adding or removing one would not change the other coefficients at all; the more the predictors are intercorrelated, the more dramatically the coefficients can change.
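
A minimal sketch of this second check, assuming the same hypothetical dataset and column labels as the estimation sketch above: the full model and a model with one regressor dropped are fitted, and the shared coefficients are compared; large shifts point toward multicollinearity.

    import pandas as pd
    import statsmodels.api as sm

    data = pd.read_csv("macro_data.csv")  # hypothetical dataset, as above
    y = data["gdp_per_capita"]
    regressors = ["I", "NX", "CPI", "UER"]

    # Full model versus a model with one predictor (here CPI) removed.
    full = sm.OLS(y, sm.add_constant(data[regressors])).fit()
    reduced = sm.OLS(y, sm.add_constant(data[[r for r in regressors if r != "CPI"]])).fit()

    # Compare the coefficients the two models share; large changes suggest multicollinearity.
    for name in reduced.params.index:
        print(f"{name}: full = {full.params[name]:.4f}, reduced = {reduced.params[name]:.4f}")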
Thirdly, the Variance Inflation Factor (VIF) is a measure used to assess the severity of multicollinearity and to identify whether a multicollinearity problem exists:
VIFi = 1 / (1 − Ri²)

Moreover, according to Kelava (2008), the higher the Ri² (the R-squared obtained by regressing the i-th explanatory variable on the remaining explanatory variables), the higher the VIF value and the more severe the multicollinearity among the regressors. According to Figure 1 (Degree of correlation), if Ri² is zero, the VIF equals 1, which means that the explanatory variables have no multicollinearity; if Ri² is greater than 0.9, the VIF is greater than 10, which means there is a severe correlation (severe multicollinearity) in the model.
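
As a sketch of this check under the same assumptions as the earlier examples, the VIF of each regressor can be computed with statsmodels' variance_inflation_factor, which implements the formula above.

    import pandas as pd
    import statsmodels.api as sm
    from statsmodels.stats.outliers_influence import variance_inflation_factor

    # Hypothetical regressor matrix; column labels are assumed for illustration.
    data = pd.read_csv("macro_data.csv")
    X = sm.add_constant(data[["I", "NX", "CPI", "UER"]])

    # VIFi = 1 / (1 - Ri^2) for each explanatory variable (the intercept column is skipped).
    for i, name in enumerate(X.columns):
        if name == "const":
            continue
        vif = variance_inflation_factor(X.values, i)
        print(f"{name}: VIF = {vif:.2f}")  # VIF > 10 signals severe multicollinearity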