

CHAPTER 5: CONCLUSION AND FUTURE WORK

5.2 Suggestions for Future Work

1. In this thesis, we employed only the support vector machine to mitigate the influence of the Hughes phenomenon on high-dimensional data. Dimensionality reduction, including feature extraction and feature selection, is another family of techniques for mitigating this influence. Hence, future researchers may apply dimensionality-reduction techniques to address this research problem.
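
As a toy illustration of the feature-selection branch of this suggestion (not a method used in this thesis), individual bands can be ranked by a simple per-band Fisher ratio and the most discriminative ones kept. All function names and data below are hypothetical:

```python
# Hypothetical sketch: per-band Fisher-ratio feature selection, one simple
# dimensionality-reduction strategy for mitigating the Hughes phenomenon.
# All data below are toy values, not real hyperspectral samples.

def fisher_ratio(band_a, band_b):
    """Separability of one band: (mean gap)^2 / (sum of class variances)."""
    def mean(xs): return sum(xs) / len(xs)
    def var(xs):
        m = mean(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)
    return (mean(band_a) - mean(band_b)) ** 2 / (var(band_a) + var(band_b) + 1e-12)

def select_bands(class1, class2, k):
    """class1/class2: per-band lists of samples for each class;
    return the indices of the k most discriminative bands."""
    scores = [fisher_ratio(a, b) for a, b in zip(class1, class2)]
    return sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)[:k]

# Toy example: band 1 separates the two classes, band 0 does not.
class1 = [[0.9, 1.0, 1.1], [5.0, 5.1, 4.9]]   # per-band samples, class 1
class2 = [[1.0, 1.1, 0.9], [1.0, 0.9, 1.1]]   # per-band samples, class 2
print(select_bands(class1, class2, 1))          # → [1]
```

Feature extraction (e.g., PCA or the nonparametric extractions cited in the references) would instead build new features as combinations of bands, but the goal is the same: fewer dimensions per training sample.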

2. The experiments in this study show that applying spatial information in the classification process is an effective way to mitigate the influence of similar spectral properties. In the future, incorporating spatial information into other classification techniques to develop novel algorithms is an important research direction.
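
As a minimal toy illustration of injecting spatial information (not the method used in this thesis), each pixel's spectral value can be augmented with the mean of its 3×3 neighborhood before any classifier is applied; the single-band grid below is a hypothetical stand-in for a real hyperspectral image:

```python
# Hypothetical sketch: stacking a spatial feature (3x3 neighborhood mean)
# onto each pixel's spectral value. Real hyperspectral pixels would carry
# many bands; this toy image has one.

def spatial_mean(image, r, c):
    """Mean of the 3x3 neighborhood around (r, c), clipped at the border."""
    rows, cols = len(image), len(image[0])
    vals = [image[i][j]
            for i in range(max(0, r - 1), min(rows, r + 2))
            for j in range(max(0, c - 1), min(cols, c + 2))]
    return sum(vals) / len(vals)

def stack_features(image):
    """Return per-pixel (spectral, spatial) feature pairs."""
    return [[(image[r][c], spatial_mean(image, r, c))
             for c in range(len(image[0]))]
            for r in range(len(image))]

image = [[1, 1, 9],
         [1, 1, 9],
         [9, 9, 9]]
features = stack_features(image)
print(features[0][0])   # → (1, 1.0): spectral value plus neighborhood mean
```

A classifier trained on the stacked pairs sees both what a pixel looks like and what its surroundings look like, which is the intuition behind composite-kernel and Markov-random-field approaches cited in the references.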

3. As shown in Chapter 4, the reference algorithm, spectral–spatial classification based on partitional clustering techniques, is an effective approach for classifying areas with large spatial structures, but it cannot overcome the small-spatial-structure problem. The central difficulty is that the clustering techniques do not work well in areas with small spatial structures; researchers who want to focus on this problem could therefore develop a clustering algorithm that overcomes it.
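
The majority-voting step that the reference scheme uses to merge per-pixel classifier labels with a clustering partition can be sketched as follows; `pixel_labels` and `clusters` are hypothetical stand-ins for real SVM outputs and a real partitional clustering:

```python
from collections import Counter

# Minimal sketch of the majority-voting step that combines per-pixel
# classifier labels with a clustering partition, in the spirit of the
# Tarabalka et al. reference scheme. Toy labels and cluster ids only.

def majority_vote(pixel_labels, clusters):
    """Reassign every pixel the most frequent classifier label
    within its cluster."""
    votes = {}
    for label, cluster in zip(pixel_labels, clusters):
        votes.setdefault(cluster, Counter())[label] += 1
    winner = {c: counts.most_common(1)[0][0] for c, counts in votes.items()}
    return [winner[c] for c in clusters]

pixel_labels = ["road", "road", "grass", "road", "grass", "grass"]
clusters     = [0,      0,      0,       1,      1,       1]
print(majority_vote(pixel_labels, clusters))
# → ['road', 'road', 'road', 'grass', 'grass', 'grass']
```

This sketch also makes the small-structure weakness visible: a small true region absorbed into a larger cluster is simply voted away by the cluster majority, which is why better clustering of small spatial structures is the suggested direction.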

4. Other high-dimensional data with very large dimensionality, such as face recognition data, could be used to test the completeness and suitability of the automatic algorithm and SCSVM.

APPENDIX A: THE TEST OF “SECTOR” UNIT

REFERENCES

A. P. Dempster, N. M. Laird, & D. B. Rubin (1977). Maximum likelihood from incomplete data via the EM algorithm. J. R. Stat. Soc. B, 39 (1), 1–38.

B. C. Kuo & K. Y. Chang (2007). Feature Extractions for Small Sample Size Classification Problem. IEEE Trans. Geosci. Remote Sens., 45 (3), 756-764.

B. E. Boser, I. M. Guyon, & V. N. Vapnik (1992). A training algorithm for optimal margin classifiers. In Proceedings of the Fifth Annual Workshop on Computational Learning Theory (pp. 144–152). Pittsburgh, PA: ACM.

S. B. Serpico & G. Moser (2007). Extraction of spectral channels from hyperspectral images for classification purposes. IEEE Trans. Geosci. Remote Sens., 45 (2), 484–495.

B. Schölkopf, C. J. C. Burges, & A. J. Smola (1999). Advances in Kernel Methods: Support Vector Learning. Cambridge, MA: MIT Press.

B. Schölkopf, C. Burges, & V. Vapnik (1995). Extracting support data for a given task. In U. M. Fayyad & R. Uthurusamy (Eds.), Proceedings, First International Conference on Knowledge Discovery & Data Mining. Menlo Park, CA: AAAI Press.

B. Schölkopf, C. Burges, & V. Vapnik (1996). Incorporating invariances in support vector learning machines. In C. von der Malsburg, W. von Seelen, J. C. Vorbrüggen, & B. Sendhoff (Eds.), Artificial Neural Networks – ICANN'96 (pp. 47–52). Berlin: Springer Lecture Notes in Computer Science, Vol. 1112.

B. C. Kuo, C. H. Chuang, C. S. Huang, & C. C. Hung (2009). A nonparametric contextual classification based on Markov random fields. In First Workshop on Hyperspectral Image and Signal Processing: Evolution in Remote Sensing (WHISPERS '09), Grenoble.

Besag, J. (1974). Spatial Interaction and the Statistical Analysis of Lattice Systems. Journal of the Royal Statistical Society, 36 (2), 192–236.

Besag, J. (1986). On the Statistical Analysis of Dirty Pictures. Journal of the Royal Statistical Society, 48 (3), 259-302.

Bruzzone, L. & Persello, C. (2009). A novel context-sensitive semisupervised SVM classifier robust to mislabeled training samples. IEEE Trans. Geosci. Remote Sens., 47 (7), 2142-2154.

C. Cortes & V. Vapnik (1995). Support-vector networks. Machine Learning, 20, 273–297.

C. J. C. Burges & B. Schölkopf (1997). Improving the accuracy and speed of support vector learning machines. In M. Mozer, M. Jordan, & T. Petsche (Eds.), Advances in Neural Information Processing Systems 9 (pp. 375–381). Cambridge, MA: MIT Press.

C. J. C. Burges (1998). A tutorial on support vector machines for pattern recognition. Data Mining and Knowledge Discovery, 2 (2), 121–167.

Camps-Valls G. & Bruzzone L. (2005). Kernel-based methods for hyperspectral image classification. IEEE Trans. Geosci. Remote Sens., 43 (6), 1351–1362.

Camps-Valls, G., Gomez-Chova, L., Munoz-Mari, J., Vila-Frances, J., & Calpe-Maravilla, J. (2006). Composite kernels for hyperspectral image classification. IEEE Geoscience and Remote Sensing Letters, 3 (1), 93–97.

C. H. Li, C. T. Lin, B. C. Kuo, & H. S. Chu (2010). An automatic method for selecting the parameter of the RBF kernel function to support vector machines. In Proceedings of the International Geoscience and Remote Sensing Symposium 2010.

D.A. Landgrebe (2003). Signal Theory Methods in Multispectral Remote Sensing, John Wiley and Sons, Hoboken, NJ: Chichester.

Duda, R. O., Hart, P. E., & Stork, D. G. (2001). Pattern Classification (2nd ed.). New York: John Wiley & Sons.

F. Melgani & L. Bruzzone (2004). Classification of hyperspectral remote sensing images with support vector machines. IEEE Trans. Geosci. Remote Sens., 42 (8), 1778–1790.

Fukunaga, K. (1990). Introduction to Statistical Pattern Recognition (2nd ed.). San Diego, CA: Academic.

G. Ball & D. Hall (1965). ISODATA, a novel method of data analysis and classification. Stanford Univ., Stanford, CA, Tech. Rep. AD-699616.

G. Borgefors (1986). Distance transformations in digital images. Comput. Vis. Graph. Image Process., 34 (3), 344–371.

G. F. Hughes (1968). On the mean accuracy of statistical pattern recognizers. IEEE Trans. Inf. Theory, IT-14 (1), 55–63.

Geman, S. & Geman, D. (1984). Stochastic Relaxation, Gibbs Distributions, and the Bayesian Restoration of Images. IEEE Transactions on Pattern Analysis and Machine Intelligence, 6, 721–741.

J. A. Benediktsson, J. A. Palmason, & J. R. Sveinsson (2005). Classification of hyperspectral data from urban areas based on extended morphological profiles. IEEE Trans. Geosci. Remote Sens., 43 (3), 480–491.

K. P. Bennett & A. Demiriz (1998). Semi-supervised support vector machines. In Advances in Neural Information Processing Systems, vol. 10 (pp. 368–374). Cambridge, MA: MIT Press.

Kinderman, R. & Snell, J. L. (1980). Markov Random Fields and Their Applications. American Mathematical Society, 1, 1–142.

L. Bottou, C. Cortes, J. Denker, H. Drucker, I. Guyon, L. Jackel, Y. LeCun, U. Muller, E. Sackinger, P. Simard, & V. Vapnik (1994). Comparison of classifier methods: A case study in handwritten digit recognition. In Proc. Int. Conf. on Pattern Recognition, 77–87.

Li, J. & Narayanan, R.M. (2004). Integrated Spectral and Spatial Information Mining in Remote Sensing Imagery. IEEE Transaction on Geoscience and Remote Sensing, 42 (3), 673-685.

M. Fauvel, J. Chanussot, & J. A. Benediktsson (2006). Evaluation of kernels for multiclass classification of hyperspectral remote sensing data. In Proc. ICASSP, II-813–II-816.

Marconcini, M., Camps-Valls, G., & Bruzzone, L. (2009). A composite semisupervised SVM for classification of hyperspectral images. IEEE Geoscience and Remote Sensing Letters, 6 (2), 234–238.

Q. Jackson, & D.A. Landgrebe (2002). Adaptive Bayesian Contextual Classification Based on Markov Random Fields. IEEE Trans. Geosci. Remote Sens., 40 (11), 2454-2463.

R. P. W. Duin & E. Pekalska (2005). Open issues in pattern recognition. In CORES (pp. 27–42).

J. Shawe-Taylor & N. Cristianini (2004). Kernel Methods for Pattern Analysis. Cambridge: Cambridge University Press.

V. N. Vapnik (1998). Statistical Learning Theory. Hoboken, NJ: John Wiley and Sons.

V. N. Vapnik (2001). The Nature of Statistical Learning Theory (2nd ed.). New York: Springer-Verlag.

Y. Tarabalka, J. A. Benediktsson, & Jocelyn Chanussot (2009). Spectral–Spatial Classification of Hyperspectral Imagery Based on Partitional Clustering Techniques. IEEE Trans. Geosci. Remote Sens., 47 (8), 2973-2987.
