Using Independent Component Analysis and Support Vector Regression for Financial Time Series Forecasting

Chi-Jie Lu, Department of Industrial Engineering and Management, Ching Yung University
Tian-Shyug Lee, Graduate Institute of Management, Fu-Jen Catholic University
Hsueh-Chun Chen, Graduate Institute of Applied Statistics, Fu-Jen Catholic University

Abstract

As financial time series are inherently noisy, non-stationary and deterministically chaotic, forecasting them is one of the most challenging applications of modern time series analysis. Owing to its generalization capability and its guarantee of a unique, globally optimal solution, support vector regression (SVR) has been applied successfully to time series prediction, particularly financial time series forecasting. A key problem in modeling financial time series with SVR is their inherently high noise level; detecting and removing the noise are important but difficult tasks when building an SVR forecasting model. To alleviate the influence of noise, this research proposes a two-stage approach that integrates independent component analysis (ICA) and support vector regression for financial time series forecasting. ICA is a novel statistical signal-processing technique originally proposed to recover latent source signals from observed mixture signals without prior knowledge of the mixing mechanism. The proposed approach first applies ICA to the forecasting variables to generate independent components (ICs). After the ICs containing noise are identified and removed, the remaining ICs are used to reconstruct forecasting variables that contain less noise. SVR is then applied to the filtered (denoised) forecasting variables to build the forecasting model. The Nikkei 225 opening index is used as an illustrative example to evaluate the performance of the proposed approach. The experimental results show that the proposed model outperforms both the SVR model built on non-filtered forecasting variables and the random walk model.

Key words: Independent component analysis, Support vector regression, Financial time series forecasting, Stock index

Introduction

Financial time series are high-frequency, inherently noisy, non-stationary and deterministically chaotic (Deboeck 1994; Yaser & Atiya 1996), which makes them one of the most challenging targets for modern forecasting methods. Forecasting approaches fall broadly into qualitative methods, such as the Delphi technique, historical analogy and the jury of executive opinion, and quantitative methods, such as ARIMA, moving averages, vector autoregression, exponential smoothing and GARCH models (Antoniou & Holmes 1995; Jung & Boyd 1996; Parisi & Vasquez 2000; Balachandher et al. 2002; Kwon & Kish 2002). Artificial neural networks, particularly back-propagation neural networks (BPNN), have also been applied widely to financial forecasting (Zhang et al. 1998; Vellido et al. 1999; Lee & Chen 2002; Lee & Chiu 2002). BPNN, however, suffers from well-known drawbacks: training can become trapped in a local optimum, and the network topology must be determined by trial and error (Cao & Tay 2001; Lee et al. 2003; Cao 2003; Kim 2003).

The support vector machine (SVM), grounded in statistical learning theory (Vapnik et al. 1997; Vapnik 2000), maps the input data into a high-dimensional feature space and constructs an optimal separating hyperplane there. Because SVM training amounts to solving a convex problem, it yields a unique, globally optimal solution with good generalization ability. SVMs have been applied successfully to texture classification, image recognition, hand-written digit recognition, data mining and bioinformatics (Burbidge et al. 2001; Kim et al. 2002; Chang et al. 2003; Norinder 2003; Li et al. 2003; Shin et al. 2005). By introducing the ε-insensitivity loss function, Drucker et al. (1997) and Vapnik et al. (1997) extended the SVM to regression problems, yielding support vector regression (SVR), which has since been used widely for time series prediction (Thissen et al. 2003; Mohandes et al. 2004; Koike & Takagi 2004; Karras & Mertzios 2004).
Financial time series, however, contain a high level of noise. When SVR is trained directly on noisy data, the noise can lead the model to over-fit; conversely, approaches that aggressively suppress noisy observations risk under-fitting. Variants such as the weighted least squares SVM (Suykens et al. 2002) and the robust SVR network (Chuang et al. 2002) reduce the influence of noise and outliers by re-weighting or modifying the loss function, but they do not remove the noise embedded in the forecasting variables themselves. This research therefore adopts a signal separation technique, independent component analysis (ICA), to filter the noise out of the forecasting variables before the SVR model is built. In the proposed two-stage approach, ICA first decomposes the forecasting variables into independent components, the components dominated by noise are identified and removed, and the denoised variables reconstructed from the remaining components are used to train the SVR forecasting model. The Nikkei 225 opening index is used to evaluate the approach against an SVR model with non-filtered variables and a random walk model.

Independent component analysis

ICA is a novel statistical signal-processing technique for blind source separation (BSS) (Lee 1998; Hyvärinen & Oja 2000; Cichocki & Amari 2002). Given only observed mixture signals, ICA recovers the latent source signals as statistically independent components (ICs) (Hyvärinen et al. 2001). It has been applied widely to face recognition, speech processing and biomedical signal analysis (Bartlett & Sejnowski 1997; Park et al. 1999; Vigario et al. 2000; Jung et al. 2001; Bartlett et al. 2002; Jang et al. 2002; James & Gibson 2003; Beckmann & Smith 2004; Lin et al. 2004; Kim et al. 2004). Visser & Lee (2003) combined blind source separation with two-channel energy-based speaker activity detection to separate the speaker signal from background noise and an interfering point source. Ikeda & Toyama (2000) combined ICA with factor analysis to analyze noisy magnetoencephalography (MEG) data. Déniz et al. (2003) integrated ICA with support vector machines for face recognition. In finance, Back & Weigend (1997) applied ICA to the daily returns of 28 stocks to discover hidden structure in the data; Kiviluoto & Oja (1998) used ICA to analyze parallel financial time series; and Oja et al. (2000) combined ICA with autoregressive (AR) models, decomposing the observed series into ICs, filtering and predicting the ICs with AR models, and recombining the predictions to forecast the original series.

Formally, let X = [x_1, x_2, ..., x_M]^T be an M × N matrix of observed mixture signals, where each row x_i is a 1 × N signal. The ICA model assumes (Hyvärinen & Oja 2000)

    X = AS = Σ_{i=1}^{M} a_i s_i,    (1)

where A is an unknown M × M mixing matrix with column vectors a_i, and S = [s_1, ..., s_M]^T is the M × N source matrix whose row vectors s_i are the latent, statistically independent source signals (M ≤ N). ICA estimates an M × M demixing matrix W such that

    Y = [y_1, ..., y_M]^T = WX,    (2)

where the row vectors y_i of Y are the independent components (ICs). When W = A^{-1}, the ICs recover the source signals s_i (Hyvärinen & Oja 2000).

W is estimated by maximizing the non-Gaussianity of the components: a mixture of independent sources is closer to Gaussian than the sources themselves, so components with maximally non-Gaussian distributions correspond to the latent sources (Hyvärinen et al. 2001; David & Sanchez 2002). Non-Gaussianity can be measured by high-order cumulants or by negentropy (Hyvärinen & Oja 2000; Hyvärinen et al. 2001). For a random variable y with density p_y(η), the entropy is

    H(y) = −∫ p_y(η) log p_y(η) dη

(Cover & Thomas 1991), and the negentropy is defined as

    J(y) = H(y_gauss) − H(y),    (3)

where y_gauss is a Gaussian random variable with the same covariance as y. Since the Gaussian distribution has the largest entropy among random variables of equal variance, J(y) ≥ 0, with equality only when y is Gaussian. ICA therefore seeks the W that maximizes J(y). Because (3) requires the unknown density of y, Hyvärinen (1999) proposed the approximation

    J(y) ∝ [E{G(y)} − E{G(ν)}]^2,    (4)

where ν is a standardized Gaussian variable and G is a non-quadratic function. The FastICA fixed-point algorithm (Hyvärinen 1999) estimates the ICs by maximizing (4) and is the estimation algorithm adopted in this research.
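To make the decomposition in (1)-(2) concrete, the following minimal sketch runs FastICA on a matrix of forecasting variables. It uses scikit-learn's FastICA as a stand-in implementation of Hyvärinen's (1999) algorithm; the 4 × 794 random matrix and all variable names are illustrative placeholders, not the paper's actual data.

```python
# Minimal sketch of the ICA stage (eqs. 1-2) using scikit-learn's FastICA.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
X = rng.standard_normal((4, 794))          # rows = forecasting variables (M=4, N=794); placeholder data

ica = FastICA(n_components=4, whiten="unit-variance", random_state=0)
# scikit-learn expects samples x features, so transpose before fitting
Y = ica.fit_transform(X.T).T               # Y = WX: rows are the ICs y_i (4 x 794)
A_hat = ica.mixing_                        # estimated mixing matrix A (4 x 4), so X ≈ A Y
W = np.linalg.pinv(A_hat)                  # demixing matrix W = A^{-1}

# sanity check: X is recovered from the estimated mixing matrix and ICs
X_rec = A_hat @ Y + ica.mean_[:, None]
print(np.allclose(X, X_rec, atol=1e-6))    # True: full reconstruction is exact
```

Keeping all four ICs reproduces X exactly; the denoising step described later works by dropping selected rows of Y (and the matching columns of A_hat) before reconstructing.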
Support vector regression

The support vector machine (SVM) is grounded in statistical learning theory and the structural risk minimization inductive principle (Vapnik et al. 1997; Vapnik 2000): rather than minimizing only the empirical risk, it maximizes the margin of the decision function, and its training reduces to a convex quadratic programming (QP) problem with a unique global optimum. Drucker et al. (1997) and Vapnik et al. (1997) extended the SVM to regression by introducing the ε-insensitivity loss function, yielding support vector regression (SVR). SVR has been applied in many fields (Koike & Takagi 2004; Karras & Mertzios 2004). Thissen et al. (2003) compared SVR with ARMA models and multilayer perceptron (MLP) neural networks for time series prediction. Mohandes et al. (2004) used SVR for wind speed prediction and obtained a lower RMSE than an MLP. Pai & Lin (2005) forecast production values of the machinery industry in Taiwan and found that SVR outperformed both the seasonal ARIMA (SARIMA) model and the general regression neural network (GRNN). For financial data, Tay & Cao (2001) applied SVR to futures contracts from the Chicago Mercantile Exchange; Kim (2003) forecast the Korea composite stock price index (KOSPI), where the SVM achieved a hit ratio of 57.83% and outperformed BPNN and case-based reasoning; and Cao (2003) combined SVR experts with a self-organizing feature map network, with experiments on Santa Fe Institute time series.

Given training data (x_i, q_i), i = 1, ..., n, where the scalar response is generated as q = r(x) + δ with random error δ, SVR maps the input vector x from the input space into a high-dimensional feature space F through a mapping Φ(x) and fits the linear function (Vapnik 2000)

    f(x) = (w · Φ(x)) + b,    (5)

where w is a weight vector in F and b is the bias. Instead of minimizing the empirical risk alone, SVR uses the ε-insensitivity loss function L_ε of Vapnik et al. (1997):

    L_ε(f(x) − q) = |f(x) − q| − ε  if |f(x) − q| ≥ ε,  and 0 otherwise,

so that deviations smaller than ε are ignored. Figure 1 illustrates the resulting ε-tube around f(x): points inside the tube incur no loss, while deviations of points above and below the tube are measured by the slack variables ξ and ξ*,

    q_i − f(x_i) − ε = ξ_i (points above the tube),  f(x_i) − q_i − ε = ξ_i* (points below the tube),  i = 1, ..., n.

[Figure 1. The ε-insensitivity loss function of SVR, with the ε-tube around f(x) and the slack variables ξ_i and ξ_j*.]

The SVR parameters are obtained by trading off the empirical error against a regularization term, with regularization constant C (Vapnik 2000):

    Minimize  (1/2)‖w‖^2 + C Σ_{i=1}^{n} (ξ_i + ξ_i*)
    subject to  q_i − (w · Φ(x_i)) − b ≤ ε + ξ_i;  (w · Φ(x_i)) + b − q_i ≤ ε + ξ_i*;  ξ_i, ξ_i* ≥ 0,  i = 1, ..., n.    (6)

Introducing the Lagrange multipliers α and α*, problem (6) is solved through its dual QP problem:

    Maximize  L_d(α, α*) = −ε Σ_{i=1}^{n} (α_i* + α_i) + Σ_{i=1}^{n} (α_i* − α_i) q_i − (1/2) Σ_{i,j=1}^{n} (α_i* − α_i)(α_j* − α_j) K(x_i, x_j)
    subject to  Σ_{i=1}^{n} (α_i* − α_i) = 0;  0 ≤ α_i ≤ C;  0 ≤ α_i* ≤ C,  i = 1, ..., n,    (7)

where K(x_i, x_j) = Φ(x_i) · Φ(x_j) is a kernel function satisfying Mercer's condition (Vapnik 2000); common choices are the polynomial kernel and the radial basis function (RBF) kernel (Lin et al. 2003; Cherkassky & Ma 2004). The training points with non-zero multipliers α_i or α_i* are the support vectors. Problem (7) can be solved with standard QP techniques (Platt 1999; Joachims 1999; Vapnik 2000; Collobert & Bengio 2001; Trafalis & Ince 2002), most commonly the sequential minimal optimization (SMO) algorithm of Platt (1999).

The proposed ICA-SVR approach

In the proposed two-stage approach, ICA is first applied to the M × N matrix X of forecasting variables, where each row x_i is one forecasting variable of length N, producing the demixing matrix W and the M independent components y_i. The ICs containing noise are then identified and removed, the forecasting variables are reconstructed from the remaining ICs, and the filtered variables are used to train the SVR forecasting model.

To identify the noise ICs, this research follows the testing-and-acceptance (TnA) procedure of Cheung & Xu (2001). Removing the k-th IC, the forecasting variables are reconstructed as

    X̂ = [x̂_1, x̂_2, ..., x̂_M]^T = Σ_{i=1, i≠k}^{M} â_i y_i,  1 ≤ k ≤ M,    (8)

where X̂ is M × N and â_i is the i-th column vector of Â = W^{-1}. The difference between a reconstructed variable and the original is measured by the relative Hamming distance (RHD), which compares the directions of successive changes of the two series; the RHD lies between 0 and 2 and equals 0 when the two series always move in the same direction (Cheung & Xu 2001). The TnA procedure first accepts the single IC whose reconstruction gives the smallest RHD, then repeatedly adds, among the remaining ICs, the one whose inclusion reduces the RHD the most, until all ICs are ordered; for M ICs this requires at most M(M + 1)/2 reconstructions. The ICs accepted last contribute the least to the directions of change of the forecasting variables and are regarded as noise.

To illustrate, consider four forecasting variables, each with N = 794 observations, shown in Figure 2; applying ICA to this 4 × 794 matrix yields the four ICs IC1(y_1)-IC4(y_4) shown in Figure 3.

[Figure 2. The four forecasting variables (794 observations each).]
[Figure 3. The four independent components IC1(y_1), IC2(y_2), IC3(y_3) and IC4(y_4).]

Table 1 lists the RHD values obtained by the TnA procedure. Among the single ICs, IC1 gives the smallest RHD (1.7531) and is accepted first; adding IC3 reduces the RHD to 1.3346, adding IC2 further reduces it to 0.7662, and including all four ICs reproduces the original variables exactly (RHD = 0). The order of importance is therefore IC1, IC3, IC2, IC4.

Table 1. RHD values of the TnA procedure
ICs used in the reconstruction | RHD
IC1                            | 1.7531
IC2                            | 1.8528
IC3                            | 1.9779
IC4                            | 1.9044
IC1, IC2                       | 1.7137
IC1, IC3                       | 1.3346
IC1, IC4                       | 1.7373
IC1, IC3, IC2                  | 0.7662
IC1, IC3, IC4                  | 1.4294
IC1, IC3, IC2, IC4             | 0

[Figure 4. RHD values as the accepted ICs are added: IC1 (1.7531); IC1 and IC3 (1.3346); IC1, IC3 and IC2 (0.7662); all four ICs (0).]

IC4, accepted last, contributes the least to the shape of the forecasting variables and is treated as the noise component; since financial time series are inherently noisy (Deboeck 1994; Yaser & Atiya 1996), removing it should ease the subsequent regression. Figure 5 compares the first 201 observations of the first forecasting variable x_1 with its reconstruction from IC1, IC2 and IC3 only: the reconstruction preserves the shape of x_1 while the high-frequency fluctuations attributable to IC4 are removed.

[Figure 5. The first forecasting variable x_1 and its reconstruction from IC1, IC2 and IC3 (first 201 observations).]
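The sketch below walks through the full two-stage pipeline: decompose with FastICA, order the ICs with a greedy TnA-style search, drop the last-accepted IC via eq. (8), and fit an RBF-kernel SVR on the filtered variables. Several details are assumptions rather than the paper's specification: the sign-based form of the RHD (consistent with the 0-2 range in Table 1), aggregating the RHD across all variables, the toy data, and the placeholder SVR parameters; scikit-learn again stands in for FastICA and LIBSVM.

```python
# Sketch of the TnA noise-IC identification and the two-stage ICA-SVR forecast.
# Assumptions: sign-based RHD, total RHD summed over variables, toy data,
# placeholder SVR parameters. Not the paper's exact implementation.
import numpy as np
from sklearn.decomposition import FastICA
from sklearn.svm import SVR

def rhd(x, x_hat):
    """Mean |sign difference| of successive changes; 0 = same directions, max 2."""
    s, s_hat = np.sign(np.diff(x)), np.sign(np.diff(x_hat))
    return np.mean(np.abs(s - s_hat))

def tna_order(X, Y, A_hat, mean):
    """Greedy TnA: repeatedly accept the IC whose inclusion minimizes total RHD."""
    chosen, remaining = [], list(range(Y.shape[0]))
    while remaining:
        def total_rhd(subset):
            X_rec = A_hat[:, subset] @ Y[subset] + mean[:, None]  # partial eq. (8)
            return sum(rhd(X[i], X_rec[i]) for i in range(X.shape[0]))
        best = min(remaining, key=lambda k: total_rhd(chosen + [k]))
        chosen.append(best)
        remaining.remove(best)
    return chosen  # last entry contributes least: treated as the noise IC

rng = np.random.default_rng(1)
X = np.cumsum(rng.standard_normal((4, 794)), axis=1)       # toy forecasting variables
target = X.sum(axis=0) + 0.1 * rng.standard_normal(794)    # toy response series

ica = FastICA(n_components=4, whiten="unit-variance", random_state=0)
Y = ica.fit_transform(X.T).T
A_hat, mean = ica.mixing_, ica.mean_

order = tna_order(X, Y, A_hat, mean)
keep = order[:-1]                                           # drop the noise IC
X_filtered = A_hat[:, keep] @ Y[keep] + mean[:, None]       # eq. (8): denoised variables

# Stage two: RBF-kernel SVR on the filtered variables (C, epsilon are placeholders)
model = SVR(kernel="rbf", C=2.0, epsilon=2 ** -9)
model.fit(X_filtered.T, target)
print(order, model.predict(X_filtered.T[:5]))
```

The greedy loop mirrors Table 1: M single-IC trials, then M−1 two-IC trials, and so on, for at most M(M+1)/2 reconstructions.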
Experimental results: Nikkei 225

After the forecasting variables are filtered, the SVR model is built on them using the RBF kernel, so the regularization constant C and the tube width ε must be set (Lin et al. 2003; Cherkassky & Ma 2004). Lin et al. (2003) recommend a grid search over exponentially growing values, e.g. C = 2^{-5}, 2^{-3}, 2^{-1}, ..., 2^{15}, while Cherkassky & Ma (2004) propose selecting C and ε analytically from the training data. This research combines the two: Cherkassky & Ma's prescriptions provide starting values, and a grid search around them selects the combination with the smallest mean square error (MSE).

The proposed approach is first evaluated on the Nikkei 225 opening index. Following Lee & Chen (2002) and Lee & Chiu (2002), the opening price of day t is forecast from four forecasting variables constructed from the information available at day t − 1. In total 1,144 daily observations covering January 1999 through 2004 were collected; the first 794 observations served as the training sample and the remaining 350 as the testing sample.

Forecasting performance is evaluated by the root mean square error (RMSE), normalized mean square error (NMSE) and mean absolute difference (MAD), which measure the deviation between actual and predicted values, and by the directional symmetry (DS), correct up trend (CP) and correct down trend (CD), which measure how well the direction of change is predicted. Table 2 defines the six criteria, where A_i denotes the actual value, T_i the predicted value and N the number of observations.

Table 2. Performance criteria
RMSE = sqrt( (1/N) Σ_{i=1}^{N} (T_i − A_i)^2 )
NMSE = (1/(σ^2 N)) Σ_{i=1}^{N} (T_i − A_i)^2,  where σ^2 = (1/(N − 1)) Σ_{i=1}^{N} (A_i − Ā)^2
MAD  = (1/N) Σ_{i=1}^{N} |T_i − A_i|
DS   = (100/N) Σ_{i=1}^{N} d_i,  d_i = 1 if (A_i − A_{i−1})(T_i − T_{i−1}) ≥ 0, 0 otherwise
CP   = (100/N) Σ_{i=1}^{N} d_i,  d_i = 1 if (T_i − T_{i−1}) > 0 and (A_i − A_{i−1})(T_i − T_{i−1}) ≥ 0, 0 otherwise
CD   = (100/N) Σ_{i=1}^{N} d_i,  d_i = 1 if (T_i − T_{i−1}) < 0 and (A_i − A_{i−1})(T_i − T_{i−1}) ≥ 0, 0 otherwise

The SVR models were trained with LIBSVM (Chang & Lin 2001) after scaling the data. For the ICA-SVR model, the procedure of Cherkassky & Ma (2004) gives ε = 0.0019 ≈ 2^{-9} and C = 1.25 ≈ 2^1; Table 3 reports the MSE of the grid search around these values, and the combination (ε = 2^{-9}, C = 2^1) was adopted. Table 4 reports the corresponding search for the SVR model built on the non-filtered variables, for which (ε = 2^{-7}, C = 2^3) gave the smallest MSE.

Table 3. MSE of the ICA-SVR model under different parameter combinations
ε      | C    | Training MSE  | Testing MSE
2^{-13} | 2^{-1} | 0.0000319491 | 0.0000248156
2^{-13} | 2^1    | 0.0000291671 | 0.0000214102
2^{-13} | 2^3    | 0.0000279661 | 0.0000240622
2^{-11} | 2^{-1} | 0.0000319861 | 0.0000247999
2^{-11} | 2^1    | 0.0000291736 | 0.0000214540
2^{-11} | 2^3    | 0.0000280469 | 0.0000239275
2^{-9}  | 2^{-1} | 0.0000323087 | 0.0000273171
2^{-9}  | 2^1    | 0.0000294424 | 0.0000230685
2^{-9}  | 2^3    | 0.0000282242 | 0.0000221947
2^{-7}  | 2^{-1} | 0.0000426816 | 0.0000335855
2^{-7}  | 2^1    | 0.0000346765 | 0.0000342352
2^{-7}  | 2^3    | 0.0000319650 | 0.0000290501

Table 4. MSE of the SVR model (non-filtered variables) under different parameter combinations
ε      | C    | Training MSE  | Testing MSE
2^{-11} | 2^{-1} | 0.0000547405 | 0.0000279631
2^{-11} | 2^1    | 0.0000535331 | 0.0000304249
2^{-11} | 2^3    | 0.0000528961 | 0.0000396468
2^{-11} | 2^5    | 0.0000525349 | 0.0000469127
2^{-9}  | 2^{-1} | 0.0000546991 | 0.0000258684
2^{-9}  | 2^1    | 0.0000537528 | 0.0000284480
2^{-9}  | 2^3    | 0.0000529974 | 0.0000344395
2^{-9}  | 2^5    | 0.0000526340 | 0.0000400822
2^{-7}  | 2^{-1} | 0.0000554437 | 0.0000202124
2^{-7}  | 2^1    | 0.0000544119 | 0.0000198950
2^{-7}  | 2^3    | 0.0000538615 | 0.0000191902
2^{-7}  | 2^5    | 0.0000533897 | 0.0000202790
2^{-5}  | 2^{-1} | 0.0001088290 | 0.0001002030
2^{-5}  | 2^1    | 0.0001415800 | 0.0001092080
2^{-5}  | 2^3    | 0.0001368980 | 0.0000906426
2^{-5}  | 2^5    | 0.0001514900 | 0.0001442630

Table 5 compares the ICA-SVR model with the SVR model built on non-filtered variables and with the random walk model over the 350 testing observations. The ICA-SVR model attains the lowest deviation measures (RMSE 56.76, NMSE 0.0226, MAD 40.86) and the highest directional measures (DS 87.53%, CD 88.77%, CP 86.09%), outperforming both benchmarks on all six criteria. Figure 6 plots the actual index values against the three models' forecasts for the last 50 testing observations (301-350); the SVR forecasts deviate visibly from the actual values around observations 307, 322 and 349, while the ICA-SVR forecasts track the series more closely.

Table 5. Forecasting results for the Nikkei 225 opening index
Criterion | Random walk | SVR    | ICA-SVR
RMSE      | 137.85      | 60.53  | 56.76
NMSE      | 0.1431      | 0.0285 | 0.0226
MAD       | 105.77      | 43.71  | 40.86
DS        | 50.43%      | 83.67% | 87.53%
CD        | 54.26%      | 86.17% | 88.77%
CP        | 45.96%      | 80.75% | 86.09%

[Figure 6. Actual values of the Nikkei 225 opening index and the forecasts of the random walk, SVR and ICA-SVR models (testing observations 301-350).]

Experimental results: TAIEX

To further examine the proposed approach, a second example uses the opening prices of the Taiwan stock exchange capitalization weighted stock index (TAIEX). In total 781 daily observations covering 2003 to 2006 were collected; the first 546 observations (69.9%) were used for training and the remaining 235 (30.1%) for testing. Six forecasting variables were used, including technical indicators and related futures prices (Balachandher et al. 2002; Leigh et al. 2005): among them the 6-day relative strength indicator (RSI6), the total amount weight stock price index (TAPI), the opening price of the SGX-DT MSCI Taiwan index futures and the opening price of the TAIFEX Taiwan index futures. ICA was applied to the six variables, the noise ICs were identified and removed with the TnA procedure, and the SVR model was trained on the reconstructed variables; the parameters of the ICA-SVR and benchmark SVR models were selected by the same procedure as in the Nikkei 225 example.

Table 7 summarizes the results over the 235 testing observations. Again the ICA-SVR model gives the best values on all six criteria (RMSE 41.09, NMSE 0.0297, MAD 31.70, DS 60.15%, CD 60.44%, CP 62.61%), followed by the SVR model with non-filtered variables and then the random walk model.

Table 7. Forecasting results for the TAIEX opening index
Criterion | Random walk | SVR    | ICA-SVR
RMSE      | 53.21       | 46.60  | 41.09
NMSE      | 0.1692      | 0.0330 | 0.0297
MAD       | 39.88       | 34.63  | 31.70
DS        | 46.15%      | 55.98% | 60.15%
CD        | 45.22%      | 60.00% | 60.44%
CP        | 47.93%      | 52.10% | 62.61%

Conclusions

This research proposed a two-stage financial time series forecasting model that integrates ICA and SVR. ICA decomposes the forecasting variables into independent components; after the components dominated by noise are identified through the TnA procedure and removed, the variables are reconstructed with less noise and used to train the SVR model. In the experiments on the Nikkei 225 and TAIEX opening indexes, the proposed ICA-SVR model outperformed both the SVR model built on non-filtered forecasting variables and the random walk model on all evaluation criteria, indicating that filtering the noise out of the forecasting variables improves SVR-based financial time series forecasting.

Acknowledgement

This research was supported by the National Science Council of the Republic of China under grant NSC 95-2221-E-030-002.

References

1. Antoniou, A. and Holmes, P. "Futures Trading, Information and Spot Price Volatility: Evidence for the FTSE-100 Stock Index Futures Contract Using GARCH," Journal of Banking and Finance (19) 1995, pp. 117-129.
2. Back, A. and Weigend, A.
"Discovering Structure in Finance Using Independent Component Analysis," Proceedings of the 5th International Conference on Neural Networks in the Capital Markets, 1997, pp. 15-17.
3. Balachandher, K.G., Fauzias, M.N. and Lai, M.M. "An Examination of the Random Walk Model and Technical Trading Rules in the Malaysian Stock Market," Quarterly Journal of Business & Economics (41) 2002, pp. 81-104.
4. Bartlett, M.S. and Sejnowski, T.J. "Viewpoint Invariant Face Recognition Using Independent Component Analysis and Attractor Networks," Advances in Neural Information Processing Systems (9) 1997, pp. 817-823.
5. Bartlett, M.S., Movellan, J.R. and Sejnowski, T.J. "Face Recognition by Independent Component Analysis," IEEE Transactions on Neural Networks (13) 2002, pp. 1450-1464.
6. Beckmann, C.F. and Smith, S.M. "Probabilistic Independent Component Analysis for Functional Magnetic Resonance Imaging," IEEE Transactions on Medical Imaging (23) 2004, pp. 137-152.
7. Burbidge, R., Trotter, M., Buxton, B. and Holden, S. "Drug Design by Machine Learning: Support Vector Machines for Pharmaceutical Data Analysis," Computers & Chemistry (26) 2001, pp. 5-14.
8. Cao, L.J. "Support Vector Machines Experts for Time Series Forecasting," Neurocomputing (51) 2003, pp. 321-339.
9. Cao, L. and Tay, F.E.H. "Financial Forecasting Using Support Vector Machines," Neural Computing & Applications (10) 2001, pp. 184-192.
10. Chang, C.C. and Lin, C.J. "LIBSVM: A Library for Support Vector Machines," 2001. Software available at http://www.csie.ntu.edu.tw/~cjlin/libsvm.
11. Chang, R.F., Wu, W.J., Moon, W.K. and Chen, D.R. "Improvement in Breast Tumor Discrimination by Support Vector Machines and Speckle-Emphasis Texture Analysis," Ultrasound in Medicine and Biology (29) 2003, pp. 679-686.
12. Cheung, Y.M. and Xu, L. "Independent Component Ordering in ICA Time Series Analysis," Neurocomputing (41) 2001, pp. 145-152.
13. Chuang, C.C., Su, S.F., Jeng, J.T. and Hsiao, C.C. "Robust Support Vector Regression Networks for Function Approximation with Outliers," IEEE Transactions on Neural Networks (13) 2002, pp. 1322-1330.
14. Cichocki, A. and Amari, S.I. Adaptive Blind Signal and Image Processing: Learning Algorithms and Applications, John Wiley & Sons, New York, 2002.
15. Cherkassky, V. and Ma, Y. "Practical Selection of SVM Parameters and Noise Estimation for SVM Regression," Neural Networks (17) 2004, pp. 113-126.
16. Collobert, R. and Bengio, S. "SVMTorch: Support Vector Machines for Large-Scale Regression Problems," Journal of Machine Learning Research (1) 2001, pp. 143-160.
17. Comon, P. "Independent Component Analysis: A New Concept?" Signal Processing (36) 1994, pp. 287-314.
18. Cover, T.M. and Thomas, J.A. Elements of Information Theory, John Wiley and Sons, New York, 1991.
19. David, V. and Sanchez, A. "Frontiers of Research in BSS/ICA," Neurocomputing (49) 2002, pp. 7-23.
20. Deboeck, G.J. Trading on the Edge: Neural, Genetic, and Fuzzy Systems for Chaotic Financial Markets, Wiley, New York, 1994.
21. Déniz, O., Castrillón, M. and Hernández, M. "Face Recognition Using Independent Component Analysis and Support Vector Machines," Pattern Recognition Letters (24) 2003, pp. 2153-2157.
22. Drucker, H., Burges, C.J.C., Kaufman, L., Smola, A. and Vapnik, V.N. "Support Vector Regression Machines," Advances in Neural Information Processing Systems (9) 1997, p. 155.
23. Hyvärinen, A. "Fast and Robust Fixed-Point Algorithms for Independent Component Analysis," IEEE Transactions on Neural Networks (10) 1999, pp. 626-634.
24. Hyvärinen, A. and Oja, E.
"Independent Component Analysis: Algorithms and Applications," Neural Networks (13) 2000, pp. 411-430.
25. Hyvärinen, A., Karhunen, J. and Oja, E. Independent Component Analysis, John Wiley & Sons, New York, 2001.
26. Ikeda, S. and Toyama, K. "Independent Component Analysis for Noisy Data - MEG Data Analysis," Neural Networks (13) 2000, pp. 1063-1074.
27. James, C.J. and Gibson, O.J. "Temporally Constrained ICA: An Application to Artifact Rejection in Electromagnetic Brain Signal Analysis," IEEE Transactions on Biomedical Engineering (50) 2003, pp. 1108-1116.
28. Jang, G.J., Lee, T.W. and Oh, Y.H. "Learning Statistically Efficient Features for Speaker Recognition," Neurocomputing (49) 2002, pp. 329-348.
29. Joachims, T. "Making Large-Scale SVM Learning Practical," Advances in Kernel Methods - Support Vector Learning (Scholkopf, B., Burges, C.J.C. and Smola, A.J., eds.), MIT Press, Cambridge, MA, 1999.
30. Jung, C. and Boyd, R. "Forecasting UK Stock Prices," Applied Financial Economics (6) 1996, pp. 279-286.
31. Jung, T.P., Makeig, S., McKeown, M.J., Bell, A.J., Lee, T.W. and Sejnowski, T.J. "Imaging Brain Dynamics Using Independent Component Analysis," Proceedings of the IEEE, 2001, pp. 1107-1122.
32. Karras, D.A. and Mertzios, B.G. "Time Series Modeling of Endocardial Border Motion in Ultrasonic Images Comparing Support Vector Machines, Multilayer Perceptrons and Linear Estimation Technique," Measurement (36) 2004, pp. 331-345.
33. Kim, K.I., Jung, K., Park, S.H. and Kim, H.J. "Support Vector Machines for Texture Classification," IEEE Transactions on Pattern Analysis and Machine Intelligence (24) 2002, pp. 1542-1550.
34. Kim, K.J. "Financial Time Series Forecasting Using Support Vector Machines," Neurocomputing (55) 2003, pp. 307-319.
35. Kim, T.K., Kim, H., Hwang, W. and Kittler, J. "Independent Component Analysis in a Local Facial Residue Space for Face Recognition," Pattern Recognition (37) 2004, pp. 1873-1885.
36. Kiviluoto, K. and Oja, E. "Independent Component Analysis for Parallel Financial Time Series," Proceedings of the Fifth International Conference on Neural Information Processing, Tokyo, Japan, 1998, pp. 895-898.
37. Koike, A. and Takagi, T. "Prediction of Protein-Protein Interaction Sites Using Support Vector Machines," Protein Engineering Design & Selection (17) 2004, pp. 165-173.
38. Kwon, K.Y. and Kish, J.R. "Technical Trading Strategies and Return Predictability: NYSE," Applied Financial Economics (12) 2002, pp. 639-653.
39. Li, S., Kwok, J.T., Zhu, H. and Wang, Y. "Texture Classification Using the Support Vector Machines," Pattern Recognition (36) 2003, pp. 2883-2893.
40. Lin, Q.H., Zheng, Y.R., Yin, F. and Liang, H.L. "Speech Segregation Using Constrained ICA," Lecture Notes in Computer Science (3173) 2004, pp. 755-760.
41. Lin, C.J., Hsu, C.W. and Chang, C.C. "A Practical Guide to Support Vector Classification," Technical Report, Department of Computer Science and Information Engineering, National Taiwan University, 2003.
42. Lee, T.W. Independent Component Analysis: Theory and Application, Kluwer Academic Publishers, Boston, 1998.
43. Lee, T.S. and Chen, N.J. "Investigating the Information Content of Non-Cash-Trading Index Futures Using Neural Networks," Expert Systems with Applications (22) 2002, pp. 225-234.
44. Lee, T.S. and Chiu, C.C. "Neural Network Forecasting of an Opening Cash Price Index," International Journal of Systems Science (33) 2002, pp. 229-237.
45. Lee, T.S., Chen, N.J. and Chiu, C.C.
"Forecasting the Opening Cash Price Index Using Grey Forecasting and Neural Networks: Evidence from the SGX-DT MSCI Taiwan Index Futures Contracts," Computational Intelligence in Economics and Finance (Wang, P. and Chen, S.S., eds.), Springer, 2003.
46. Mohandes, M.A., Halawani, T.O., Rehman, S. and Hussain, A.A. "Support Vector Machines for Wind Speed Prediction," Renewable Energy (29) 2004, pp. 939-947.
47. Norinder, U. "Support Vector Machine Models in Drug Design: Applications to Transport Processes and QSAR Using Simplex Optimisations and Variable Selection," Neurocomputing (55) 2003, pp. 337-346.
48. Oja, E., Kiviluoto, K. and Malaroiu, S. "Independent Component Analysis for Financial Time Series," Proceedings of the IEEE 2000 Adaptive Systems for Signal Processing, Communications, and Control Symposium, Lake Louise, Canada, 2000, pp. 111-116.
49. Pai, P.F. and Lin, C.S. "Using Support Vector Machines in Forecasting Production Values of Machinery Industry in Taiwan," International Journal of Advanced Manufacturing Technology (27) 2005, pp. 205-210.
50. Parisi, F. and Vasquez, A. "Simple Technical Trading Rules of Stock Returns: Evidence from 1987 to 1998 in Chile," Emerging Markets Review (1) 2000, pp. 152-164.
51. Park, H.M., Jung, H.Y., Lee, T.W. and Lee, S.Y. "On Subband-based Blind Signal Separation for Noisy Speech Recognition," Electronics Letters (35) 1999, pp. 2011-2012.
52. Platt, J.C. "Fast Training of Support Vector Machines Using Sequential Minimal Optimization," Advances in Kernel Methods - Support Vector Learning (Scholkopf, B., Burges, C.J.C. and Smola, A.J., eds.), MIT Press, Cambridge, MA, 1999.
53. Shin, K.S., Lee, T.S. and Kim, H.J. "An Application of Support Vector Machines in Bankruptcy Prediction Model," Expert Systems with Applications (28) 2005, pp. 127-135.
54. Suykens, J.A.K., De Brabanter, J., Lukas, L. and Vandewalle, J. "Weighted Least Squares Support Vector Machines: Robustness and Sparse Approximation," Neurocomputing (48) 2002, pp. 85-105.
55. Tay, F.E.H. and Cao, L. "Application of Support Vector Machines in Financial Time Series Forecasting," Omega (29) 2001, pp. 309-317.
56. Thissen, U., Van Brakel, R., De Weijer, A.P., Melssen, W.J. and Buydens, L.M.C. "Using Support Vector Machines for Time Series Prediction," Chemometrics and Intelligent Laboratory Systems (69) 2003, pp. 35-49.
57. Trafalis, T.B. and Ince, H. "Benders Decomposition Technique for Support Vector Regression," Proceedings of the International Joint Conference on Neural Networks (IJCNN), Honolulu, 2002, pp. 2767-2772.
58. Vapnik, V.N., Golowich, S. and Smola, A.J. "Support Vector Method for Function Approximation, Regression Estimation, and Signal Processing," Advances in Neural Information Processing Systems (Mozer, M., Jordan, M. and Petsche, T., eds.), MIT Press, Cambridge, MA, 1997.
59. Vapnik, V.N. The Nature of Statistical Learning Theory, Springer, New York, 2000.
60. Vellido, A., Lisboa, P.J.G. and Vaughan, J. "Neural Networks in Business: A Survey of Applications (1992-1998)," Expert Systems with Applications (17) 1999, pp. 51-70.
61. Vigario, R., Sarela, J., Jousmaki, V., Hamalainen, M. and Oja, E. "Independent Component Approach to the Analysis of EEG and MEG Recordings," IEEE Transactions on Biomedical Engineering (47) 2000, pp. 589-593.
62. Visser, E. and Lee, T.W.
"Speech Enhancement Using Blind Source Separation and Two-Channel Energy Based Speaker Detection," Proceedings of the 2003 IEEE International Conference on Acoustics, Speech, and Signal Processing, 2003, pp. 884-887.
63. Yaser, S. and Atiya, A.F. "Introduction to Financial Forecasting," Applied Intelligence (6) 1996, pp. 205-213.
64. Zhang, G., Patuwo, B.E. and Hu, M.Y. "Forecasting with Artificial Neural Networks: The State of the Art," International Journal of Forecasting (14) 1998, pp. 35-62.