
Online Transductive Support Vector Machine

Nguyen, Thanh Tuan; 陳木松

E-mail: 9707397@mail.dyu.edu.tw

ABSTRACT

The support vector machine (SVM) is a machine learning method based on statistical learning theory. The SVM constructs a separating hyperplane between two classes of points such that the margin between the hyperplane and the points closest to it is maximal. However, applying the SVM to pattern classification problems has some disadvantages. First, the SVM is usually trained by supervised learning, so the model must be retrained from scratch whenever a new sample arrives. Second, an SVM trained with only a few labeled samples can still yield a well-performing classifier, but its generalization ability depends strongly on which samples are chosen for training. Moreover, labeled data are scarce and expensive to generate, while unlabeled data are often readily available in real-world applications. To overcome these problems, an online transductive SVM (OTSVM) is proposed to train the SVM model incrementally with new unlabeled data. The OTSVM combines the transductive SVM model with online learning for classification.
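
For reference, the two ingredients mentioned above can be written in their standard forms, following the conventional notation (cf. Vapnik [26] for the soft-margin SVM and Joachims [23] for the transductive extension); these are the textbook formulations, not equations reproduced from the thesis:

% Soft-margin SVM on l labeled samples (x_i, y_i), y_i in {-1, +1}:
\min_{\mathbf{w},\, b,\, \xi}\ \tfrac{1}{2}\|\mathbf{w}\|^{2} + C \sum_{i=1}^{\ell} \xi_{i}
\quad \text{subject to} \quad y_{i}\,(\mathbf{w}\cdot\mathbf{x}_{i} + b) \ge 1 - \xi_{i},\ \ \xi_{i} \ge 0,\ \ i = 1,\dots,\ell .

% Transductive SVM: the labels y_j^* of the u unlabeled samples x_j^* are optimized jointly with the hyperplane:
\min_{y_{1}^{*},\dots,y_{u}^{*},\, \mathbf{w},\, b,\, \xi,\, \xi^{*}}\ \tfrac{1}{2}\|\mathbf{w}\|^{2} + C \sum_{i=1}^{\ell} \xi_{i} + C^{*} \sum_{j=1}^{u} \xi_{j}^{*}
\quad \text{subject to} \quad y_{i}\,(\mathbf{w}\cdot\mathbf{x}_{i} + b) \ge 1 - \xi_{i},\ \ y_{j}^{*}\,(\mathbf{w}\cdot\mathbf{x}_{j}^{*} + b) \ge 1 - \xi_{j}^{*},\ \ \xi_{i},\, \xi_{j}^{*} \ge 0 .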

Unlike supervised SVM learning, in which no learning occurs when unlabeled samples are labeled, the OTSVM learns from labeled and unlabeled samples progressively. In addition, the OTSVM increases classification accuracy while keeping memory requirements and computational complexity at a manageable level. To investigate the efficiency and effectiveness of the proposed OTSVM, experiments on linearly and non-linearly separable data and on terrain classification of SAR images are carried out to compare it with supervised SVM learning, TSVM, PTSVM, and unsupervised learning. From the simulation results, we conclude that the OTSVM can maintain acceptable classification accuracy with limited labeled data and a large quantity of unlabeled data.
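
To make the labeled-plus-unlabeled workflow concrete, the following is a minimal Python sketch in which an SVM trained on a few labeled samples absorbs unlabeled samples arriving one at a time by self-labeling confident points. It only illustrates the setting described above: the retraining step and the margin_threshold confidence rule are assumptions of this sketch, whereas the thesis's OTSVM updates the existing solution incrementally through KKT bookkeeping rather than refitting from scratch.

import numpy as np
from sklearn.datasets import make_blobs
from sklearn.svm import SVC

# Two-class toy data: a handful of labeled points, the rest treated as an unlabeled stream.
X, y = make_blobs(n_samples=200, centers=2, random_state=0)
labeled_idx = np.concatenate([np.where(y == 0)[0][:5], np.where(y == 1)[0][:5]])
unlabeled_mask = np.ones(len(y), dtype=bool)
unlabeled_mask[labeled_idx] = False

X_lab, y_lab = X[labeled_idx], y[labeled_idx]
clf = SVC(kernel="linear", C=1.0).fit(X_lab, y_lab)

margin_threshold = 1.0  # assumed confidence rule: only adopt points that fall outside the margin
for x_new in X[unlabeled_mask]:
    score = clf.decision_function(x_new.reshape(1, -1))[0]
    if abs(score) >= margin_threshold:
        # Self-label the confident point and retrain; the thesis's OTSVM would instead
        # update the SVM solution incrementally instead of refitting on all points.
        X_lab = np.vstack([X_lab, x_new])
        y_lab = np.append(y_lab, int(score > 0))
        clf = SVC(kernel="linear", C=1.0).fit(X_lab, y_lab)

print("labeled + self-labeled samples:", len(y_lab))
print("accuracy on all 200 points:", clf.score(X, y))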

Keywords: support vector machine; Statistical Learning Theory; online learning; transductive support vector machine; labeled and unlabeled data

Table of Contents

CREDENTIAL AUTHORIZATION LETTER
ABSTRACT (ENGLISH)
ABSTRACT (CHINESE)
ACKNOWLEDGMENT
TABLE OF CONTENTS
TABLE OF FIGURES
LIST OF TABLES
Chapter 1. INTRODUCTION
Chapter 2. SUPPORT VECTOR MACHINES
  2.1 Linear Hard-margin Support Vector Machine
  2.2 Linear Soft-margin Support Vector Machine
  2.3 Nonlinear Support Vector Machine
  2.4 Sequential Minimal Optimization Algorithm
Chapter 3. ONLINE SUPPORT VECTOR MACHINES
  3.1 Problem Formulation
  3.2 Incremental Learning Algorithm
  3.3 Bookkeeping Procedure
  3.4 Recursive Update of the Inverse Matrix
  3.5 Decremental Unlearning Algorithm
Chapter 4. TRANSDUCTIVE SVM
  4.1 Transductive SVM Learning
  4.2 Progressive TSVM
Chapter 5. ONLINE TRANSDUCTIVE SVM
  5.1 Karush-Kuhn-Tucker Conditions
  5.2 Online Transductive Support Vector Machine
Chapter 6. SIMULATIONS
  6.1 Linearly Separable Data
  6.2 Linearly Non-Separable Data
  6.3 Terrain Classification
Chapter 7. CONCLUSIONS
APPENDIX A: Lagrange Theorem
APPENDIX B: Karush-Kuhn-Tucker Theorem
REFERENCES

REFERENCES

[1] Antoine Bordes, Seyda Ertekin, Jason Weston, and Leon Bottou. Fast kernel classifiers with online and active learning. Journal of Machine Learning Research, volume 6, pages 1579-1619, 2005.

[2] B. Gabrys and L. Petrakieva. Combining labeled and unlabelled data in the design of pattern classification systems. International Journal of Approximate Reasoning, 35(3):251-273, 2004.

[3] Bezdek, J.C., Pattern Recognition with Fuzzy Objective Function Algorithms. Plenum, New York, 1981.


[4] C.L. Liu, K. Nakashima, H. Sako, and H. Fujisawa. Handwritten digit recognition using state-of-the-art techniques. In Proc. of 8th International Workshop on Frontiers of Handwriting Recognition (IWFHR-8), pages 320–325, 2002.

[5] D. Gorgevik and D. Cakmakov, "Handwritten Digit Recognition by Combining SVM Classifiers", The International Conference on Computer as a Tool, EUROCON 2005, pp. 1393-1396, 2005.

[6] G. H. Golub and C. F. van Loan. Matrix Computations. Johns Hopkins University Press, Baltimore, London, 3rd edition, 1996.

[7] Gert Cauwenberghs, Tomaso Poggio. Incremental and Decremental Support Vector Machine Learning. In T. K. Leen, T. G. Dietterich, and V. Tresp, editors, Advances in Neural Information Processing Systems, volume 13, pages 409-415. MIT Press, 2001.

[8] H. Zha, C. Ding, M. Gu, X. He and H.D. Simon. "Spectral Relaxation for K-means Clustering," Neural Information Processing Systems, vol.14, pp. 1057-1064, Vancouver, Canada, Dec. 2001.

[9] Jinlong An, Zheng-Ou Wang, Qingxin Yang, and Zhenping Ma, "A SVM function approximation approach with good performances in interpolation and extrapolation", Proceedings of 2005 International Conference on Machine Learning and Cybernetics, pp. 1648-1653, 18-21 Aug. 2005.

[10] Joachims, T., "SVM light is an implementation of support vector machines (SVMs) in C," University of Dortmund, Collaborative Research Center on Complexity Reduction in Multivariate Data (SFB475), 2000, (http://ais.gmd.de/~thorsten/svm_light).

[11] John C. Platt. Fast training of support vector machines using sequential minimal optimization. In B. Scholkopf, C. J. C. Burges, and A. J. Smola, editors, Advances in Kernel Methods: Support Vector Learning, pages 185-208, Cambridge, MA, 1999. MIT Press.

[12] K. Bennett and A. Demiriz. Semi-supervised support vector machines. In NIPS, volume 12, 1998.

[13] L. Csato and M. Opper, Sparse Representation for Gaussian Process Models. In Adv. Neural Information Processing Systems (NIPS’2000), vol. 13, 2001.

[14] Lazarevic, A., Fiez, T., and Obradovic, Z., "A software system for spatial data analysis and modeling", Proceedings of the 33rd Annual Hawaii International Conference on System Sciences, Jan. 4-7, 2000.

[15] Luiz S. Oliveira, Robert Sabourin, Support Vector Machines for Handwritten Numerical String Recognition, Proceedings of the Ninth International Workshop on Frontiers in Handwriting Recognition (IWFHR'04), p.39-44, October 26-29, 2004.

[16] Pavel Laskov, Christian Gehl, Stefan Kruger, and Klaus-Robert Muller. Incremental Support Vector Learning: Analysis, Implementation and Applications. Journal of Machine Learning Research, volume 7, pages 1909-1932, 2006.

[17] S. Haykin. Neural Networks: A Comprehensive Foundation. Macmillan College Publishing Company, New York, 1994.

[18] S. Haykin. Neural Networks: A Comprehensive Foundation. Macmillan College Publishing Company, New York, 1994.

[19] S. T. Dumais, Using SVMs for text categorization. In IEEE Intelligent Systems Magazine, Trends and Controversies, Marti Hearst, ed., 13(4), July/August 1998.

[20] Satish, D.S., Sekhar, C.C., “Kernel based clustering and vector quantization for speech recognition”, Proceedings of the 2004 14th IEEE Signal Processing Society Workshop, pp. 315 – 324, Sept. 29, 2004.

[21] Saunders C., Gammerman A. and Vovk V. Computational efficient transductive machines, Algorithmic Learning Theory, 11th International Conference, Sydney, Australia, December 11-13, 2000, Lecture Notes in Computer Science, 1968, Springer, 325-333, 2000.

[22] Shilton, A., Palaniswami, M., Ralph, D. and Tsoi, A., “Incremental training of support vector machines,” IEEE Trans. Neural Networks 16, pp. 114-131, 2005.

[23] T. Joachims, “Transductive inference for text classification using support vector machines” in Proc. ICML, pp. 200-209, 1999.

[24] T. Joachims, Text Categorization with Support Vector Machines: Learning with Many Relevant Features. Proceedings of the European Conference on Machine Learning, Springer, 1998.

[25] Thorsten Joachims. SVM-Light: An implementation of Support Vector Machines. Department of Computer Science, Cornell University. http://svmlight.joachims.org/.

[26] V. N. Vapnik, Statistical Learning Theory. New York: Wiley, 1998.

[27] Yisong Chen, Guoping Wang, and Shihai Dong. Learning with progressive transductive support vector machine. Pattern Recognition Letters, 24, pp. 1845-1855, 2003.
