
Conclusion and Future Work

This study proposed two directions for improving semi-supervised local discriminant analysis: introducing a weighting parameter to reduce the influence of the number of unlabeled samples on classification accuracy, and applying the Voronoi-diagram concept so that the geometric properties around each training sample remain consistent, thereby raising classification accuracy.

The experimental results show that the classification accuracy of the weighted feature-extraction methods ASELD and kASELD does not decrease as the number of unlabeled samples increases; that is, ASELD and kASELD can exploit more unlabeled samples to improve the extraction result. In addition, the two feature-extraction methods developed from the Voronoi-diagram concept, kSELD and kASELD, achieve higher classification accuracy than SELD and ASELD for the same number of samples, because the k-unlabeled samples preserve the geometric consistency around the training samples. In other words, the same or similar classification performance can be obtained with fewer samples.
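The Voronoi-based selection of k-unlabeled samples can be sketched as follows. This is a minimal illustration of the idea, not the thesis's exact algorithm; the function and variable names (`select_k_unlabeled`, `train`, `unlabeled`) are hypothetical. Each unlabeled point belongs to the Voronoi cell of whichever training sample it is closest to, and only the k nearest points within each cell are kept, so the retained unlabeled samples stay geometrically consistent with the training sample around them.

```python
from math import dist  # Euclidean distance (Python 3.8+)

def select_k_unlabeled(train, unlabeled, k):
    """For each training sample, keep the k nearest unlabeled points
    that fall inside its Voronoi cell.

    A point lies in the Voronoi cell of the training sample it is
    closest to, so cell membership is just a nearest-neighbor test.
    """
    cells = {i: [] for i in range(len(train))}
    for u in unlabeled:
        # Voronoi cell membership: index of the nearest training sample.
        i = min(range(len(train)), key=lambda j: dist(train[j], u))
        cells[i].append(u)
    # Within each cell, keep only the k points nearest the cell's
    # training sample.
    for i in cells:
        cells[i].sort(key=lambda u: dist(train[i], u))
        cells[i] = cells[i][:k]
    return cells

train = [(0.0, 0.0), (10.0, 0.0)]
unlabeled = [(1.0, 0.0), (2.0, 0.0), (3.0, 0.0), (9.0, 1.0)]
picked = select_k_unlabeled(train, unlabeled, k=2)
print(picked[0])  # [(1.0, 0.0), (2.0, 0.0)]
print(picked[1])  # [(9.0, 1.0)]
```

With k = 2, the first training sample keeps only its two nearest cell members and the third candidate is discarded, which is how the method caps the number of unlabeled samples per training point.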

This study examined only the effect of the improved feature-extraction methods on classification performance, so a single classifier was used throughout; future work could integrate the proposed framework into a multiple-classifier system. On the feature-extraction side, spatial information could also be combined with other supervised feature-extraction methods, such as Nonparametric Weighted Feature Extraction (NWFE) (Kuo & Landgrebe, 2004), to further improve classification accuracy. For the Voronoi-diagram-based selection of unlabeled samples, a better strategy could be sought for handling unlabeled samples that lie on a Voronoi edge. Finally, the kernel concept could be incorporated into the proposed framework in search of better classification performance.
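As one concrete way the kernel concept mentioned above could enter the framework, the inner products used by a discriminant method can be replaced by entries of a kernel (Gram) matrix. The sketch below only shows how an RBF Gram matrix is formed; the `rbf_gram` name and the `gamma` value are illustrative assumptions, not part of this thesis.

```python
from math import exp, dist

def rbf_gram(points, gamma=1.0):
    """Gram matrix K[i][j] = exp(-gamma * ||x_i - x_j||^2) for the
    RBF kernel. Kernelized discriminant methods work with this matrix
    instead of the raw feature vectors."""
    n = len(points)
    return [[exp(-gamma * dist(points[i], points[j]) ** 2)
             for j in range(n)]
            for i in range(n)]

K = rbf_gram([(0.0, 0.0), (0.0, 1.0), (3.0, 4.0)], gamma=0.5)
print(K[0][0])  # 1.0: every point has similarity 1 with itself
```

The matrix is symmetric with ones on the diagonal, and nearby points receive entries close to 1 while distant points receive entries close to 0, which is what lets a kernelized method capture nonlinear structure.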

References

Chinese-language references

丁秀婷、黃俊平 (2009). A new model of the Voronoi diagram. Paper presented at the 6th Conference on Theory and Practice of the Operations Research Society of Taiwan, Yuanpei University of Science and Technology, Hsinchu.

朱慧珊 (2011). Semi-supervised linear discriminant analysis for high-dimensional data classification. Unpublished master's thesis, Graduate Institute of Educational Measurement and Statistics, National Taichung University of Education, Taichung.

李政軒 (2012). Fuzzy clustering based on fuzzy linear discriminant analysis and support vector machines with spatial information. Unpublished doctoral dissertation, Institute of Electrical and Control Engineering, National Chiao Tung University, Hsinchu.

紀明宏 (2007). A hierarchical recognition system for high-dimensional data classification. Unpublished master's thesis, Graduate Institute of Educational Measurement and Statistics, National Taichung University of Education, Taichung.

張光佑 (2006). An investigation of feature-extraction factors in small-sample classification problems. Unpublished master's thesis, Graduate Institute of Educational Measurement and Statistics, National Taichung University of Education, Taichung.

張偉民 (2012). A feature extraction method based on the correlation matrix. Unpublished master's thesis, Graduate Institute of Educational Measurement and Statistics, National Taichung University of Education, Taichung.

彭偉誠 (2010). Improving support vector machines with fuzzy mathematical programming. Unpublished master's thesis, Department of Electrical Engineering, National Central University, Taoyuan.

葉志雄 (2001). Meshing of unordered 3D point data and its applications. Unpublished master's thesis, Department of Mechanical Engineering, National Chung Cheng University, Chiayi.

English-language references

Acito, N., Corsini, G., & Diani, M. (2003). An unsupervised algorithm for hyperspectral image segmentation based on the Gaussian mixture model. IEEE Geoscience and Remote Sensing Symposium, 6, 3745-3747.

Bandos, T. V., Bruzzone, L., & Camps-Valls, G. (2009). Classification of hyperspectral images with regularized linear discriminant analysis. IEEE Transactions on Geoscience and Remote Sensing, 47(3), 862-873.

Benediktsson, J. A., Palmason, J. A., & Sveinsson, J. R. (2005). Classification of hyperspectral data from urban areas based on extended morphological profiles. IEEE Transactions on Geoscience and Remote Sensing, 43(3), 480-491.

Boser, B. E., Guyon, I. M., & Vapnik, V. N. (1992). A training algorithm for optimal margin classifiers. Proceedings of the Fifth Annual Workshop on Computational Learning Theory, 144-152.

Bottou, L., Cortes, C., Denker, J., Drucker, H., Guyon, I., Jackel, L., LeCun, Y., Muller, U., Sackinger, E., Simard, P., & Vapnik, V. (1994). Comparison of classifier methods: a case study in handwritten digit recognition. Proceedings of the 12th IAPR International. Conference on Pattern Recognition, 2, 77-82.

Cai, D., He, X., & Han, J. (2007). Semisupervised discriminant analysis. IEEE International Conference on Computer Vision, 1-7.

Camps-Valls, G., & Bruzzone, L. (2005). Kernel-based methods for hyperspectral image classification. IEEE Transactions on Geoscience and Remote Sensing, 43(6), 1351-1362.

Camps-Valls, G., & Bruzzone, L. (Eds.). (2009). Kernel Methods for Remote Sensing Data Analysis. New York, NY: John Wiley & Sons.

Chang, C. I. (2007). Hyperspectral Data Exploitation: Theory and Applications. Hoboken, NJ: Wiley-Interscience.

Chang, C. I., Wu, C. C., Liu, W. M., & Ouyang, Y. C. (2006). A new growing method for simplex-based endmember extraction algorithm. IEEE Transactions on Geoscience and Remote Sensing, 44(10), 2804-2819.

Cortes, C., & Vapnik, V. (1995). Support-vector networks. Machine learning, 20(3), 273-297.

Cover, T., & Hart, P. (1967). Nearest neighbor pattern classification. IEEE Transactions on Information Theory, 13(1), 21-27.

Dell’Acqua, F., Gamba, P., Ferrari, A., Palmason, J. A., & Benediktsson, J. A. (2004). Exploiting spectral and spatial information in hyperspectral urban data with high resolution. IEEE Geoscience and Remote Sensing Letters, 1(4), 322-326.

Fauvel, M., Benediktsson, J. A., Chanussot, J., & Sveinsson, J. R. (2008). Spectral and spatial classification of hyperspectral data using SVMs and morphological profiles. IEEE Transactions on Geoscience and Remote Sensing, 46(11), 3804-3814.

Fauvel, M., Chanussot, J., & Benediktsson, J. A. (2006). Kernel principal component analysis for feature reduction in hyperspectral images analysis. Proceedings of the Nordic Signal Processing Symposium, 7, 238-241.

Feris, R., Tian, Y. L., Zhai, Y., & Hampapur, A. (2008). Facial image analysis using local feature adaptation prior to learning. In Proceedings of the 8th IEEE International Conference on Automatic Face and Gesture Recognition (pp. 1-6). Amsterdam, Netherlands.

Fisher, R. A. (1936). The use of multiple measurements in taxonomic problems. Annals of Eugenics, 7, 179-188.

Fukunaga, K. (1990). Introduction to Statistical Pattern Recognition (2nd ed.). San Diego, CA: Academic Press.

Grahn, H., & Geladi, P., (2007). Techniques and Applications of Hyperspectral Image Analysis. John Wiley & Sons Ltd, Chichester, UK.

Han, P. Y., Jin, A. T. B., & Abas, F. S. (2009). Neighbourhood preserving discriminant embedding in face recognition. Journal of Visual Communication and Image Representation, 20, 532-542.

Xu, H. (2006). Modification of normalised difference water index (NDWI) to enhance open water features in remotely sensed imagery. International Journal of Remote Sensing, 27(14), 3025-3033.

Hastie, T., & Tibshirani, R. (1996). Discriminant Adaptive Nearest Neighbor Classification. IEEE Transactions on Pattern Analysis and Machine Intelligence, 18(6), 607-616.

He, X., Cai, D., Yan, S., & Zhang, H. J. (2005). Neighborhood preserving embedding. IEEE International Conference on Computer Vision, 2, 1208-1213.

Hsu, C. W., & Lin, C. J. (2002). A comparison of methods for multiclass support vector machines. IEEE Transactions on Neural Networks, 13(2), 415-425.

Hughes, G. F. (1968). On the mean accuracy of statistical pattern recognizers. IEEE Transactions on Information Theory, 14(1), 55-63.

Jackson, Q., & Landgrebe, D. A. (2002). Adaptive Bayesian contextual classification based on Markov random fields. IEEE Transactions on Geoscience and Remote Sensing, 40(11), 2454-2463.

Shawe-Taylor, J., & Cristianini, N. (2004). Kernel Methods for Pattern Analysis. New York, NY: Cambridge University Press.

Kaya, G. T., Ersoy, O. K., & Kamasak, M. E. (2011). Support vector selection and adaptation for remote sensing classification. IEEE Transactions on Geoscience and Remote Sensing, 49(6), 2071-2079.

Knerr, S., Personnaz, L., & Dreyfus, G. (1990). Single-layer learning revisited: A stepwise procedure for building and training a neural network. Neurocomputing: Algorithms, Architectures and Applications, 68, 41-50.

Kohavi, R. (1995). A study of cross-validation and bootstrap for accuracy estimation and model selection. IJCAI'95 Proceedings of the 14th international joint conference on Artificial intelligence, 2, 1137-1143.

Kuo, B. C., Chuang, C. H., Huang, C. S., & Hung, C. C. (2009). A nonparametric contextual classification based on Markov random fields. First Workshop on Hyperspectral Image and Signal Processing: Evolution in Remote Sensing, 1-4.

Kuo, B. C., Li, C. H., & Yang, J. M. (2009). Kernel nonparametric weighted feature extraction for hyperspectral image classification. IEEE Transactions on Geoscience and Remote Sensing, 47(4), 1139-1155.

Kuo, B. C., & Landgrebe, D. A. (2004). Nonparametric weighted feature extraction for classification. IEEE Transactions on Geoscience and Remote Sensing, 42(5), 1096-1105.

Landgrebe, D. A. (2003). Signal Theory Methods in Multispectral Remote Sensing. Hoboken, NJ: John Wiley and Sons.

Li, C. H., Ho, H. H., Kuo, B. C., Taur, J. S., Chu, H. S., & Wang, M. S. (in press). A semi-supervised feature extraction based on supervised and fuzzy-based linear discriminant analysis for hyperspectral image classification. Applied Mathematics & Information Sciences.

Li, C. H., Ho, H. H., Liu, Y. L., Lin, C. T., Kuo, B. C., & Taur, J. S. (2012). An automatic method for selecting the parameter of the normalized kernel function to support vector machines. Journal of Information Science and Engineering, 28(1), 1-15.

Li, C. H., Kuo, B. C., Lin, C. T., & Huang, C. S. (2012). A spatial-contextual support vector machine for remotely sensed image classification. IEEE Transactions on Geoscience and Remote Sensing, 50(3), 784-799.

Li, J., & Narayanan, R. M. (2004). Integrated spectral and spatial information mining in remote sensing imagery. IEEE Transactions on Geoscience and Remote Sensing, 42(3), 673 - 685.

Liao, W., Pizurica, A., Scheunders, P., Philips, W., & Pi, Y. (2013). Semisupervised local discriminant analysis for feature extraction in hyperspectral images. IEEE Transactions on Geoscience and Remote Sensing, 51(1), 184-198.

Okabe, A., Boots, B., & Sugihara, K. (1992). Spatial tessellations: concepts and applications of Voronoi diagrams. John Wiley & Sons, Inc., New York, NY.

Du, Q. (2007). Modified Fisher's linear discriminant analysis for hyperspectral imagery. IEEE Geoscience and Remote Sensing Letters, 4(4), 503-507.

Roweis, S. T., & Saul, L. K. (2000). Nonlinear dimensionality reduction by locally linear embedding. Science, 290(5500), 2323-2326.

Schölkopf, B., & Smola, A. J. (2001). Learning with Kernels, MIT Press, Cambridge, MA.

Sindhwani, V., Niyogi, P., & Belkin, M. (2005). Beyond the point cloud: from transductive to semi-supervised learning. International conference on Machine learning, 824-831.

Sugiyama, M., Ide, T., Nakajima, S., & Sese, J. (2010). Semi-supervised local Fisher discriminant analysis for dimensionality reduction. Machine Learning, 78(1-2), 35-61.

Theodoridis, S., & Koutroumbas, K. (2006). Pattern Recognition. San Diego, CA: Academic Press.

Voronoi, G. (1908). Nouvelles applications des paramètres continus à la théorie des formes quadratiques. Deuxième mémoire: Recherches sur les parallélloèdres primitifs. Journal für die reine und angewandte Mathematik, 134, 198-287.

Wu, K. L., Yu, J., & Yang, M. S. (2005). A novel fuzzy clustering algorithm based on a fuzzy scatter matrix with optimality tests. Pattern Recognition Letters, 26, 639-652.

Xu, J., & Yang, J. (2009). Local Graph Embedding Discriminant Analysis for Face Recognition with Single Training Sample Per Person. In Proceedings of the 2009 Chinese Conference on Pattern Recognition (pp. 1-5). Nanjing, China.

Zhang, H., Berg, A. C., Maire, M., & Malik, J. (2006). SVM-KNN: Discriminative nearest neighbor classification for visual category recognition. In Proceedings of the 2006 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (Vol. 2, pp. 2126-2136). New York, NY.

Zhu, P., Hu, Q., & Yang, Y. (2010). Weighted nearest neighbor classification via maximizing classification consistency. In Proceedings of the Rough Sets and Current Trends in Computing (pp. 347-355). Warsaw, Poland.

Melgani, F., & Bruzzone, L. (2004). Classification of hyperspectral remote sensing images with support vector machines. IEEE Transactions on Geoscience and Remote Sensing, 42(8), 1778-1790.

Wu, K. P., & Wang, S. D. (2009). Choosing the kernel parameters for support vector machines by the inter-cluster distance in the feature space. Pattern Recognition, 42(5), 710-717.

Baudat, G., & Anouar, F. (2000). Generalized discriminant analysis using a kernel approach. Neural computation, 12(10), 2385-2404.

Appendix 1: Test Items of the Educational Testing Data

1. (x^7)' =
(A) 7x^6 (B) 6x^6 (C) x^6 (D) 7x^7

2. (x^{-5/2})' =
(A) -(5/2)x (B) -(7/2)x^{-7/2} (C) -(5/2)x^{-5/2} (D) -(5/2)x^{-7/2}

3. (x^{√3})' =
(A) √3·x^{√3} (B) √3·x^{√3-1} (C) √3·x^{√2} (D) √3·x

4. (5^3)' =
(A) 3·5^2 (B) (C) 0 (D) 125

5. (x)' =
(A) (B) (C) x^{-1} (D) -x^{-1}

6. Given that f(x) and g(x) are both differentiable, (f(x)g(x))' =
(A) f'(x)g'(x) (B) f'(x) + g'(x) (C) f'(x)g(x) + f(x)g'(x) (D) f'(x) + g(x)f(x) + g'(x)

7. Given that f(x) is differentiable and g(x) = x^2, (f(x)g(x))' =
(A) f'(x)(x^2)' (B) f'(x) + (x^2)' (C) f'(x)·x^2 (D) f'(x)·x^2 + f(x)·2x

8. Given that f(x) and g(x) are both differentiable, (f(x)/g(x))' =
(A) [f'(x)g(x) - f(x)g'(x)] / g^2(x) (B) f'(x)/g'(x) (C) [f'(x)g(x) + f(x)g'(x)] / g^2(x) (D) f'(x)g(x) - f(x)g'(x)

9. Given that f(x) is differentiable and g(x) = x^2, (f(x)/g(x))' = (constructed-response item)

10. Given that f(g(x)) and g(x) are both differentiable, (f(g(x)))' =
(A) f'(g'(x)) (B) f'(g(x))·g'(x) (C) f'(x)g(x) + f(x)g'(x) (D) f(g(x))·g'(x)

11. Given that f(g(x)) is differentiable and g(x) = x^2, (f(g(x)))' = (constructed-response item)

12. Given that f(x), g(x), and h(x) are all differentiable, (f(x) + g(x)h(x))' =
(A) f'(x) + g'(x)h'(x) (B) f(x) + g'(x)h(x) + g(x)h'(x) (C) f'(x) + g'(x)h(x) + g(x)h'(x) (D) f'(x)g(x)h(x) + f(x)g'(x)h'(x)

13. Given that f(x), g(x), and h(x) are all differentiable, (f(x)g(x)/h(x))' =
(A) [f'(x)g(x)h(x) + f(x)g'(x)h(x) - f(x)g(x)h'(x)] / h^2(x)
(B) [(f'(x)·g'(x))h(x) - (f(x)·g(x))h'(x)] / h^2(x)
(C) [f'(x)g(x) - f(x)g'(x)] / h(x)
(D) [f'(x)g(x) + f(x)g'(x)] / h^2(x)

14. (x^{-2} + x^2 - x^4)' =
(A) -4x^3 (B) -2x^{-3} + 2 - 4x^3 (C) (-2x^{-3})(x^2 - x^4) + (x^{-2})(2x - 4x^3) (D) -2x^{-3} + 2x - 4x^3

15. ((1 - x)/(x + 3))' = (constructed-response item)

16. ((x^2 + 1)/(x - 2) + x^3 - x^2)' =
(A) (x^2 - 4x - 1)/(x - 2)^2 + 3x^2 - 2x
(B) (3x^2 - 4x + 1)/(x - 2)^2 + 3x^2 - 2x
(C) 3x^2
(D) (-x^3 + 2x^2 - 5x)/(x - 2)^2 + 3x^2 - 2x

17. (x - x^2 + (x^2 - 1)^{4/3})' =
(A) -2x + (4/3)(x^2 - 1)^{1/3}·2x
(B) 1 - 2x + (4/3)(x^2 - 1)^{1/3}·2x
(C) -x + (4/3)(x^2 - 1)^{1/3}·2x
(D) (1 - 2x)(x^2 - 1)^{4/3} + (x - x^2)·(4/3)(x^2 - 1)^{1/3}·2x

18. (x^2(x^2 + 1)^{10})' =
(A) 4x^2·10(x^2 + 1)^9
(B) x(x^2 + 1)^{10} + 10x^3(x^2 + 1)^9
(C) 2x(x^2 + 1)^{10} + 20x^3(x^2 + 1)^9
(D) 2x(x^2 + 1)^{10} + 10x^2(2x)^9

19. ((x^2 - x)^{10}/x^3)' = (constructed-response item)

20. ((x^2 + 1)^{10})' =
(A) 10(x^2 + 1)^9 (B) 20(x^2 + 1)^9 (C) 10(2x)^9 (D) 20x(x^2 + 1)^9

21. Given that (e^x)' = e^x, (e^x·x^2)' =
(A) e^x·2x (B) e^x·x^2 + 2e^x (C) [e^x·x(x^2 + e^2)·2x] / 2x (D) e^x·x^2 + e^x·2x

22. Given that (e^x)' = e^x, (e^x(x + 1)/x^2)' = (constructed-response item)

23. Given that f(3) = 4, g(3) = 2, f'(3) = 5, g'(3) = 3, and h(x) = f(x)g(x), then h'(3) =
(A) 15 (B) (C) 22 (D) 23
