
Chapter 6  Conclusions and Future Work

6.1 Conclusions

In this thesis, we implemented a yoga exercise teaching system, using a 3D animation engine as the development platform and combining it with domain knowledge of yoga. With the procedural animation techniques of IK, FK, and lookup tables, we designed a set of motion subprograms whose flexible composition produces the various yoga motions needed for teaching. The modular design gives the system good extensibility. To let different users tailor the exercise learning to themselves, we added character customization: motion-related parameters can be adjusted through scripts, different motion-learning scripts can be specified, and yoga-specific motion effects such as swaying and trembling were also included.
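The flexible composition of subprograms can be pictured as chaining parameterized pose operations. The following is a minimal sketch of that idea; the subprogram names, joint keys, and parameter values are hypothetical illustrations, not the system's actual code.

```python
# Illustrative sketch of composing motion subprograms; names and joint keys
# are hypothetical, not the real interfaces of the thesis system.

def raise_hands(pose, max_angle=180):
    """Subprogram: raise both arms, clamped to the character's shoulder limit."""
    new_pose = dict(pose)
    new_pose["l_shoulder"] = min(180, max_angle)
    new_pose["r_shoulder"] = min(180, max_angle)
    return new_pose

def bend_forward(pose, max_angle=90):
    """Subprogram: bend the spine forward, clamped to the character's limit."""
    new_pose = dict(pose)
    new_pose["spine"] = min(90, max_angle)
    return new_pose

# A yoga motion is described as an ordered composition of subprograms, with
# per-character parameters (e.g. taken from the customization script).
example_sequence = [
    (raise_hands, {"max_angle": 150}),   # e.g. a stiffer character's limit
    (bend_forward, {"max_angle": 45}),
]

pose = {"l_shoulder": 0, "r_shoulder": 0, "spine": 0}
for subprogram, params in example_sequence:
    pose = subprogram(pose, **params)
    print(subprogram.__name__, pose)
```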

In the experimental evaluation, we varied the joint flexibility, balance, and stability parameters one dimension at a time; the results clearly show how adjusting these motion parameters affects the character's movement. We also implemented three character models and compared them performing the same motions, observing the actual variation in the yoga poses across characters, which demonstrates that the system can be customized for different users. Finally, we conducted a user evaluation with three stages: a system operation experience, a customized-character scenario experiment, and a comparison with an instructional video. The results were above the scale midpoint in all aspects, indicating good user acceptance of the teaching system's interface, functionality, and support for yoga learning.

6.2 Future Work

From the user evaluation we obtained feedback that can be used to improve the system. For the user interface, icon buttons would make operation more intuitive, zoom controls and more playback-speed options could be added, and the display background and exercise scene could be made more attractive. For the character models, the realism of the models and motions can be further improved, and more detailed character parameters can be added. In terms of system functionality, motion parameters that aid learning, such as breathing and rhythm, could be added to improve the realism of the character and its movement. For the implementation of yoga motions, procedural animation is currently the generation method; in the future, motion capture data could be incorporated, so that motion data from real performers can be analyzed to improve the realism of the procedural animation and provide more motion variations. The motion library currently contains only five motions; exploiting the good extensibility of procedural animation, more procedures can be added to implement additional yoga motions and enrich the motion-learning content of the system. On the teaching side, customized teaching materials could be added to guide users step by step, and multiple motions could be combined into varied sequences paired with learning schedules, making the learning content more diverse and improving learning effectiveness.
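One simple way the proposed combination with motion-capture data could work is a per-joint blend between a procedurally generated pose and a captured pose. The sketch below assumes scalar joint angles for brevity (real joint rotations would normally be interpolated with quaternions); the joint names and blend weight are illustrative assumptions, not part of the current system.

```python
# Sketch: per-joint linear blend of a procedural pose with a motion-capture pose.
# Joint names and the blend weight are illustrative assumptions.

def blend_poses(procedural, mocap, weight=0.5):
    """Interpolate joint angles; weight = 0 keeps the procedural pose,
    weight = 1 uses the captured pose."""
    blended = {}
    for joint, proc_angle in procedural.items():
        cap_angle = mocap.get(joint, proc_angle)  # fall back if the joint is missing
        blended[joint] = (1.0 - weight) * proc_angle + weight * cap_angle
    return blended

procedural_pose = {"spine": 45.0, "l_shoulder": 150.0}
captured_pose = {"spine": 52.0, "l_shoulder": 158.0}
print(blend_poses(procedural_pose, captured_pose, weight=0.3))
```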

This work was implemented for yoga, but our approach can also be extended to other kinds of exercise learning, such as dancing or swimming: starting from simple procedural modules for the limbs, more complex motions and exercises can be composed. In the future, the system could also be combined with hardware, using sensors that detect the user's motion and giving appropriate feedback through pressure, lights, or similar cues while the user exercises, which would further aid exercise learning.


<?xml version="1.0" encoding="UTF-8" ?>
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <xs:element name="Character">
    <xs:complexType>
      <xs:sequence>
        <xs:element name="skeleton" type="xs:string"/>
        <xs:element name="mesh" type="xs:string"/>
        <xs:element name="Model_variable">
          <xs:complexType>
            <!-- each customization parameter may be omitted in a character script -->
            <xs:sequence minOccurs="1" maxOccurs="1">
              <xs:element name="close_feet_pitch" minOccurs="0">
                <xs:simpleType>
                  <xs:restriction base="xs:decimal">
                    <xs:minInclusive value="0"/>
                  </xs:restriction>
                </xs:simpleType>
              </xs:element>
              <xs:element name="bend_forward_max_angle" minOccurs="0">
                <xs:simpleType>
                  <xs:restriction base="xs:integer">
                    <xs:minInclusive value="45"/>
                    <xs:maxInclusive value="90"/>
                  </xs:restriction>
                </xs:simpleType>
              </xs:element>
              <xs:element name="raise_head_angle" minOccurs="0">
                <xs:simpleType>
                  <xs:restriction base="xs:integer">
                    <xs:minInclusive value="30"/>
                    <xs:maxInclusive value="90"/>
                  </xs:restriction>
                </xs:simpleType>
              </xs:element>
              <xs:element name="scoliosis_max_angle" minOccurs="0">
                <xs:simpleType>
                  <xs:restriction base="xs:integer">
                    <xs:minInclusive value="70"/>
                    <xs:maxInclusive value="120"/>
                  </xs:restriction>
                </xs:simpleType>
              </xs:element>
              <xs:element name="raise_hands_max_angle" minOccurs="0">
                <xs:simpleType>
                  <xs:restriction base="xs:integer">
                    <xs:minInclusive value="150"/>
                    <xs:maxInclusive value="180"/>
                  </xs:restriction>
                </xs:simpleType>
              </xs:element>
              <xs:element name="balance" minOccurs="0">
                <xs:simpleType>
                  <xs:restriction base="xs:integer">
                    <xs:minInclusive value="0"/>
                    <xs:maxInclusive value="5"/>
                  </xs:restriction>
                </xs:simpleType>
              </xs:element>
              <xs:element name="Stability" minOccurs="0">
                <xs:simpleType>
                  <xs:restriction base="xs:integer">
                    <xs:minInclusive value="0"/>
                    <xs:maxInclusive value="3"/>
                  </xs:restriction>
                </xs:simpleType>
              </xs:element>
            </xs:sequence>
          </xs:complexType>
        </xs:element>
      </xs:sequence>
    </xs:complexType>
  </xs:element>
</xs:schema>
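A character script can be checked against this schema before it is loaded. The following is a minimal validation sketch using lxml; the file names character_schema.xsd and medium.xml are illustrative.

```python
# Sketch: validate a character script against the character schema with lxml.
# File names are illustrative; lxml must be installed (pip install lxml).
from lxml import etree

schema = etree.XMLSchema(etree.parse("character_schema.xsd"))
script = etree.parse("medium.xml")

if schema.validate(script):
    print("character script is valid")
else:
    for error in schema.error_log:
        print("line", error.line, ":", error.message)
```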

<?xml version="1.0" encoding="UTF-8" ?>
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
  <xs:element name="balance">
    <xs:simpleType>
      <xs:restriction base="xs:nonNegativeInteger">
        <xs:minInclusive value="0"/>
        <xs:maxInclusive value="5"/>
      </xs:restriction>
    </xs:simpleType>
  </xs:element>
  <xs:element name="Stability">
    <xs:simpleType>
      <xs:restriction base="xs:nonNegativeInteger">
        <xs:minInclusive value="0"/>
        <xs:maxInclusive value="3"/>
      </xs:restriction>
    </xs:simpleType>
  </xs:element>
  <xs:element name="motion">
    <xs:complexType>
      <xs:sequence>
        <xs:element name="name" type="xs:string"/>
        <xs:choice minOccurs="0" maxOccurs="unbounded">
          <xs:element name="tips" type="xs:string"/>
        </xs:choice>
        <xs:choice minOccurs="0" maxOccurs="1">
          <xs:element ref="balance"/>
          <xs:element ref="Stability"/>
        </xs:choice>
      </xs:sequence>
      <xs:attribute name="index" type="xs:nonNegativeInteger" use="required"/>
    </xs:complexType>
  </xs:element>
  <xs:element name="Motions">
    <xs:complexType>
      <xs:choice minOccurs="1" maxOccurs="unbounded">
        <xs:element ref="motion"/>
      </xs:choice>
    </xs:complexType>
  </xs:element>
</xs:schema>
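An instance of this motion-script schema, and one way it could be read, are sketched below; the motion name, tips text, and index value are invented for illustration and do not come from the system's motion library.

```python
# Sketch: read a motion teaching script that follows the schema above.
# The embedded XML content (motion name, tips, index) is made up for illustration.
import xml.etree.ElementTree as ET

EXAMPLE = """\
<Motions>
  <motion index="0">
    <name>mountain_pose</name>
    <tips>Keep both feet together</tips>
    <balance>2</balance>
  </motion>
</Motions>
"""

root = ET.fromstring(EXAMPLE)
for motion in root.findall("motion"):
    name = motion.findtext("name")
    tips = [t.text for t in motion.findall("tips")]
    balance = motion.findtext("balance")      # present only if this motion uses balance
    stability = motion.findtext("Stability")  # otherwise Stability (or neither)
    print(motion.get("index"), name, tips, balance, stability)
```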

Appendix C: Sample Scripts for the Three Character Models

Sample script for the Medium character:

<?xml version="1.0" encoding="UTF-8" ?>
<Character>
  <skeleton>medium.bvh</skeleton>
  <mesh>medium_skeleton.mesh</mesh>
  <Model_variable>
    <Stability>2</Stability>
  </Model_variable>
</Character>

Sample script for the Thin and flexible character:

<?xml version="1.0" encoding="UTF-8" ?>
<Character>
  <skeleton>thin_small.bvh</skeleton>
  <mesh>thin_flexible_skeleton.mesh</mesh>
  <Model_variable>
    <close_feet_pitch>7</close_feet_pitch>


    <bend_forward_max_angle>90</bend_forward_max_angle>
    <raise_head_angle>70</raise_head_angle>
    <scoliosis_max_angle>90</scoliosis_max_angle>
    <raise_hands_max_angle>180</raise_hands_max_angle>
    <balance>0</balance>
    <Stability>0</Stability>
  </Model_variable>
</Character>

Sample script for the Heavy character:

<?xml version="1.0" encoding="UTF-8" ?>
<Character>
  <skeleton>Heavy.bvh</skeleton>
  <mesh>heavy.mesh</mesh>
  <Model_variable>
    <close_feet_pitch>11</close_feet_pitch>
    <bend_forward_max_angle>45</bend_forward_max_angle>
    <raise_head_angle>50</raise_head_angle>
    <scoliosis_max_angle>60</scoliosis_max_angle>
    <raise_hands_max_angle>150</raise_hands_max_angle>
    <balance>5</balance>
    <Stability>3</Stability>
  </Model_variable>
</Character>
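These per-character parameters act as limits on the generated motions: for example, a requested forward bend would stop at the character's bend_forward_max_angle (90 degrees for the Thin and flexible character, 45 degrees for the Heavy one). The sketch below illustrates that clamping; the loader, default values, and file names are assumptions, not the system's actual code.

```python
# Sketch: load a character script and clamp a requested motion angle to the
# character's limit. The loading and clamping logic is illustrative only.
import xml.etree.ElementTree as ET

DEFAULTS = {"bend_forward_max_angle": 90}  # assumed fallback when a script omits a parameter

def load_limits(path):
    """Read the Model_variable parameters of a character script into a dict."""
    root = ET.parse(path).getroot()
    limits = dict(DEFAULTS)
    variables = root.find("Model_variable")
    if variables is not None:
        for child in variables:
            limits[child.tag] = float(child.text)
    return limits

def bend_forward_angle(requested, limits):
    """Clamp the requested forward-bend angle to the character's maximum."""
    return min(requested, limits["bend_forward_max_angle"])

# e.g. requesting a 90-degree bend: the Thin and flexible character reaches 90,
# while the Heavy character (limit 45) stops at 45. File names are illustrative.
for path in ("thin_flexible.xml", "heavy.xml"):
    print(path, bend_forward_angle(90, load_limits(path)))
```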