5.1 Conclusion
This eye-tracker system operates under ordinary office lighting, with no auxiliary light source and possibly subject to outdoor light, while processing video recorded at a 480 fps sampling rate; it localizes the iris center and achieves a mean gaze-calibration error of 0.23 degrees and a mean maximum gaze-prediction error of 1.21 degrees. Architecturally, the system builds on the eyeball model proposed by Baek [17] and improves its matching method, obtaining higher accuracy and better computational efficiency; with the addition of particle swarm optimization, the system is accelerated enough to make the high-speed (480 fps) eye tracker practical and to reach the computation speed of a real-time 30 fps eye-tracker system.
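The particle swarm acceleration mentioned above can be illustrated with a minimal, generic sketch. The cost function, bounds, and parameter values below are hypothetical stand-ins for illustration only; they are not the actual matching objective or tuning used by this system.

```python
import random

def pso(cost, bounds, n_particles=20, iters=50, w=0.7, c1=1.5, c2=1.5):
    """Minimal particle swarm optimizer over a box-bounded search space."""
    dim = len(bounds)
    pos = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]               # each particle's best position
    pbest_cost = [cost(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_cost[i])
    gbest, gbest_cost = pbest[g][:], pbest_cost[g]  # swarm-wide best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # Velocity update: inertia + pull toward personal and global bests.
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                # Move and clamp to the search bounds.
                pos[i][d] = min(max(pos[i][d] + vel[i][d], bounds[d][0]), bounds[d][1])
            c = cost(pos[i])
            if c < pbest_cost[i]:
                pbest[i], pbest_cost[i] = pos[i][:], c
                if c < gbest_cost:
                    gbest, gbest_cost = pos[i][:], c
    return gbest, gbest_cost

random.seed(0)
# Toy cost: squared distance to a hypothetical "true" iris center at (12.0, 8.0),
# searched over an assumed 64x48 pixel region of interest.
center, err = pso(lambda p: (p[0] - 12.0) ** 2 + (p[1] - 8.0) ** 2,
                  bounds=[(0.0, 64.0), (0.0, 48.0)])
```

In an eye-tracking context the particles would sample candidate model parameters (e.g., an iris-center position) and the cost would score how well each candidate matches the image, so the swarm replaces an exhaustive search over the parameter grid.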
For model-parameter estimation, a novel iris-contour and circle-fitting method extracts a single-layer, continuous set of iris-contour feature points and fits the iris circle effectively. Finally, gaze prediction uses an improved mapping curve that outperforms the traditional quadratic curve.
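As a rough illustration of fitting a circle to extracted contour points, the algebraic least-squares (Kåsa) fit below reduces the problem to a 3x3 linear system. This is a generic sketch of circle fitting, not the exact fitting procedure of this thesis, and the sample points are synthetic.

```python
import math

def fit_circle(points):
    """Algebraic (Kasa) least-squares circle fit.

    Writes the circle as x^2 + y^2 + D*x + E*y + F = 0 and solves the
    normal equations for (D, E, F), from which center and radius follow.
    """
    n = float(len(points))
    sxx = sxy = syy = sx = sy = 0.0
    bx = by = bc = 0.0
    for x, y in points:
        z = -(x * x + y * y)          # right-hand side: D*x + E*y + F = z
        sxx += x * x; sxy += x * y; syy += y * y; sx += x; sy += y
        bx += x * z; by += y * z; bc += z
    # Solve the symmetric 3x3 system by Cramer's rule:
    # [sxx sxy sx][D]   [bx]
    # [sxy syy sy][E] = [by]
    # [sx  sy  n ][F]   [bc]
    det = (sxx * (syy * n - sy * sy) - sxy * (sxy * n - sy * sx)
           + sx * (sxy * sy - syy * sx))
    D = (bx * (syy * n - sy * sy) - sxy * (by * n - bc * sy)
         + sx * (by * sy - bc * syy)) / det
    E = (sxx * (by * n - bc * sy) - bx * (sxy * n - sy * sx)
         + sx * (sxy * bc - by * sx)) / det
    F = (sxx * (syy * bc - sy * by) - sxy * (sxy * bc - sx * by)
         + bx * (sxy * sy - syy * sx)) / det
    cx, cy = -D / 2.0, -E / 2.0
    r = math.sqrt(cx * cx + cy * cy - F)
    return cx, cy, r

# Synthetic "contour points" sampled from a circle with center (3, 4), radius 5.
pts = [(3 + 5 * math.cos(t), 4 + 5 * math.sin(t)) for t in (0.3, 1.1, 2.0, 2.9, 4.2, 5.5)]
cx, cy, r = fit_circle(pts)
```

Because the fit is linear, it is cheap enough to run per frame; in practice a robust wrapper such as RANSAC [19] is often placed around a fit like this to reject outlier edge points.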
5.2 Future Work
The system currently completes its image-processing computations within the available time and delivers highly accurate results, so it can serve in related cognitive experiments as a practical substitute for infrared eye trackers. Because it tolerates diverse ambient lighting, a laboratory setup avoids many of the constraints of infrared eye trackers, giving the system high flexibility and practicality. Although it already achieves high-precision gaze prediction with the head fixed, keeping the head still for long periods is difficult for participants. Building on the architecture of this work, adding an algorithm that corrects for small head-movement errors would therefore make the eye-tracker system better match the user experience.
References
[1] K. Rayner, "Eye movements in reading and information processing: 20 years of research," Psychological Bulletin, vol. 124, no. 3, pp. 372-422, Nov. 1998.
[2] Human eye. [Online] Available at: https://en.wikipedia.org/wiki/Human_eye
[3] Educational Neuroscience Laboratory, Department of Educational Psychology and Counseling, National Taiwan Normal University. [Online] Available at: http://web.ntnu.edu.tw/~696010077/
[4] Eye Movement and Reading Laboratory, Department of Psychology, National Chengchi University. [Online] Available at: http://emrlab.nccu.edu.tw/
[5] Emotion and Crime Laboratory, Institute of Cognitive Neuroscience, National Central University. [Online] Available at: http://icn.ncu.edu.tw/f.aspx
[6] Assistive Device Resources Portal, Social and Family Affairs Administration, Ministry of Health and Welfare. [Online] Available at: http://repat.sfaa.gov.tw/07product/pro_a_main.asp?id=5951
[7] SMI (SensoMotoric Instruments). [Online] Available at: http://www.smivision.com
[8] Tobii. [Online] Available at: http://www.tobii.com
[9] Utechzone Co., Ltd. [Online] Available at: http://www.utechzone.com.tw/index.aspx
[10] R. Valenti and T. Gevers, "Accurate eye center location through invariant isocentric patterns," IEEE Trans. Pattern Anal. Mach. Intell., vol. 34, no. 9, pp. 1785-1798, Sep. 2012.
[11] D. W. Hansen and Q. Ji, "In the eye of the beholder: a survey of models for eyes and gaze," IEEE Trans. Pattern Anal. Mach. Intell., vol. 32, no. 3, pp. 478-500, Mar. 2010.
[12] J. Sigut and S.-A. Sidha, “Iris center corneal reflection method for gaze tracking using visible light,” IEEE Trans. Biomed. Eng., vol. 58, no. 2, pp. 411–419, Feb. 2011.
[13] J.-G. Wang, E. Sung, and R. Venkateswarlu, “Estimating the eye gaze from one eye,”
Comput. Vis. Image Und., vol. 98, no. 1, pp. 83-103, Apr. 2005.
[14] W. Zhang, T.-N. Zhang, and S.-J. Chang, “Eye gaze estimation from the elliptical features of one iris,” Opt. Eng., vol. 50, no. 4, pp. 047003-1-9, Apr. 2011.
[15] J. Daugman, “How iris recognition works,” IEEE Trans. Circuits Syst. Video Technol., vol.
14, no. 1, pp. 21-30, Jan. 2004.
[16] K. Nishino and S. K. Nayar, “Corneal imaging system: environment from eyes,” Int. J.
Comput. Vis., vol. 70, no. 1, pp. 23-40, Oct. 2006.
[17] S.-J. Baek, K.-A. Choi, C. Ma, Y.-H. Kim, and S.-J. Ko, "Eyeball model-based iris center localization for visible image-based eye-gaze tracking systems," IEEE Trans. Consumer Electron., vol. 59, no. 2, pp. 415-421, May 2013.
[18] Canny edge detector. [Online] Available at: https://en.wikipedia.org/wiki/Canny_edge_detector
[19] RANSAC. [Online] Available at: https://en.wikipedia.org/wiki/RANSAC
[20] J. B. Roerdink, “Mathematical morphology on the sphere,” Proc. SPIE Vis. Comm. Image Process. ’90: 5th in a Series, vol. 1360, Lausanne, Switzerland, Sep. 1990, pp. 263-271.
[21] F. Lu, Y. Sugano, T. Okabe, and Y. Sato, "Adaptive linear regression for appearance-based gaze estimation," IEEE Trans. Pattern Anal. Mach. Intell., vol. 36, no. 10, pp. 2033-2046, Oct. 2014.
[22] S. M. H. Jansen, H. Kingma, and R. L. M. Peeters, "A confidence measure for real-time eye movement detection in video-oculography," in Proc. 13th Int. Conf. Biomed. Eng., 2009, pp. 335-339.
[23] Least squares. [Online] Available at: https://en.wikipedia.org/wiki/Least_squares
[24] H. C. Lee, D. T. Luong, C. W. Cho, E. C. Lee, and K. R. Park, “Gaze tracking system at a distance for controlling IPTV,” IEEE Trans. Consumer Electron., vol. 56, no. 4, pp. 2577-2583, Nov. 2010.
[25] T. Moriyama, T. Kanade, J. Xiao, and J. F. Cohn, “Meticulously detailed eye region model and its application to analysis of facial images,” IEEE Trans. Pattern Anal. Mach. Intell., vol. 28, no. 5, pp. 738-752, May 2006.
[26] Gaussian pyramid. [Online] Available at: https://en.wikipedia.org/wiki/Pyramid_(image_processing)
[27] Mean shift. [Online] Available at: https://en.wikipedia.org/wiki/Mean_shift
[28] Scale-invariant feature transform. [Online] Available at: https://en.wikipedia.org/wiki/Scale-invariant_feature_transform
[29] Sobel operator. [Online] Available at: https://en.wikipedia.org/wiki/Sobel_operator