
2. This study proposes a two-dimensional direction vector probability model that resolves the directionality problem arising when a UAV follows a line with its forward-facing camera, and prevents erroneous line detections from disturbing the line following. The model is computationally much simpler than a Markov decision process, so it can run even on aircraft with limited onboard hardware (an illustrative sketch of this kind of model is given after this list).

3. This study uses an inertial velocity suppression method to resolve the drift caused by movement inertia, preventing inertia from pushing the UAV off its original flight path. When the UAV deviates from the path, not only does accuracy suffer, but the UAV must also spend additional time and resources returning to it. The inertial velocity suppression method therefore allows the UAV to follow lines more precisely when changing direction, while also reducing the correction time and flight resources required and increasing the number of runs that can be automated.
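To illustrate the computational-simplicity argument in item 2, the following is a minimal, illustrative sketch of a direction probability model of this general kind; it is not the model defined in this thesis. The discretization into eight planar directions, the exponential-smoothing update, and the decay factor are all assumptions made for the example.

```python
import math

# Illustrative sketch only: a probability mass over a small set of planar (2D)
# direction vectors, updated from each detected line direction. The eight-way
# discretization, the smoothing update, and the decay factor are assumptions
# made for this example, not the model defined in this thesis.

# Eight unit direction vectors in the horizontal plane (assumed discretization).
DIRECTIONS = [(math.cos(k * math.pi / 4), math.sin(k * math.pi / 4)) for k in range(8)]


class DirectionProbabilityModel:
    def __init__(self, decay=0.8):
        self.decay = decay  # how quickly old evidence fades (assumed value)
        self.prob = [1.0 / len(DIRECTIONS)] * len(DIRECTIONS)

    def update(self, line_angle_rad):
        """Fold one detected line direction into the distribution."""
        dx_det, dy_det = math.cos(line_angle_rad), math.sin(line_angle_rad)
        # Weight each candidate by its (non-negative) alignment with the detection.
        weights = [max(0.0, dx * dx_det + dy * dy_det) for dx, dy in DIRECTIONS]
        total = sum(weights) or 1.0
        # Exponential smoothing: one spurious detection barely moves the distribution.
        self.prob = [self.decay * p + (1.0 - self.decay) * w / total
                     for p, w in zip(self.prob, weights)]

    def best_direction(self):
        """Return the direction vector with the highest accumulated probability."""
        best = max(range(len(DIRECTIONS)), key=lambda i: self.prob[i])
        return DIRECTIONS[best]
```

Each update is only a handful of multiplications over a fixed set of candidate directions, which is the sense in which such a model stays far cheaper than solving a Markov decision process, and the smoothing means an occasional misdetected line does not flip the chosen flight direction.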

5.2 Future Work

Based on the methods and experiments presented so far, this study identifies the following areas for improvement and extension:

1. Instability of the UAV itself:

As mentioned in Section 4.3.1, the UAV used in this study has neither indoor positioning nor additional attitude control, which makes it unstable. Future work can address both points. For indoor positioning, an indoor spatial model could be developed so that the UAV knows in real time where it is within a given space; for attitude control, a method such as PID (proportional-integral-derivative) control could be applied to the UAV to increase stability during movement (a hedged PID sketch is given after this list).

2. Improving the displacement error:

The pixel-error values currently measured in this study can serve as a target for further improvement. Adjusting the flight control according to this error could raise the on-line percentage further while reducing the pixel error even more (the PID sketch after this list illustrates one way to do this).


3. Test patterns with turns of less than 90 degrees:

As discussed in Section 4.4, the proposed method has not yet been tested on patterns whose turning angles are smaller than 90 degrees. Future experiments on such patterns could verify the feasibility, stability, and accuracy of the proposed method.

4. Outdoor applications:

The proposed method is currently applied to lines in indoor environments, where there is less interference than outdoors. Future work could apply the proposed forward-view precise line following to physical objects outdoors. For high-voltage transmission tower inspection, contour detection would extract the tower's outline and line detection would extract its contour lines, which the UAV could then follow to inspect the tower. For observing mountain rock textures, contour detection would extract the texture outlines and line detection would extract the corresponding lines, which the UAV could follow to complete the observation task (an illustrative contour and line detection sketch follows this list).
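To make the PID suggestion in item 1 and the pixel-error feedback in item 2 concrete, the following is a minimal sketch of a textbook PID loop driven by the lateral pixel error between the detected line and the image centre. The gains, output limit, frame width, and the mapping from pixel error to a lateral velocity command are placeholder assumptions, not values tuned or validated in this thesis.

```python
# Minimal PID sketch (gains, limits, and the 640-pixel frame width are
# placeholders, not values from this thesis).

class PID:
    def __init__(self, kp, ki, kd, output_limit=1.0):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.output_limit = output_limit
        self.integral = 0.0
        self.prev_error = None

    def step(self, error, dt):
        """One control update: proportional + integral + derivative terms."""
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        out = self.kp * error + self.ki * self.integral + self.kd * derivative
        # Clamp so a large error cannot saturate the velocity command.
        return max(-self.output_limit, min(self.output_limit, out))


# Hypothetical usage: keep the followed line at the horizontal centre of the frame.
pid = PID(kp=0.004, ki=0.0005, kd=0.002)   # placeholder gains
IMAGE_CENTER_X = 320                        # assuming a 640-pixel-wide frame

def lateral_command(detected_line_x, dt=0.1):
    """Map the pixel offset of the detected line to a normalized lateral velocity."""
    pixel_error = detected_line_x - IMAGE_CENTER_X
    return pid.step(pixel_error, dt)
```

The same structure applies to attitude stabilization: the error signal is simply replaced by the attitude deviation reported by the flight controller.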
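For the outdoor scenarios in item 4, the sketch below shows the kind of contour and line detection preprocessing such applications would need, using standard OpenCV calls. It assumes OpenCV 4 (where cv2.findContours returns two values); the Canny thresholds, contour-length filter, and Hough parameters are illustrative placeholders rather than values evaluated in this thesis.

```python
import cv2
import numpy as np

def contour_lines(frame_bgr):
    """Extract contours from a frame and fit straight segments to them."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)  # edge map (placeholder thresholds)
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

    # Keep only reasonably long contours and redraw them on a mask,
    # then detect straight segments on that mask.
    mask = np.zeros_like(edges)
    for c in contours:
        if cv2.arcLength(c, closed=False) > 100:  # length filter (assumed)
            cv2.drawContours(mask, [c], -1, 255, 1)

    segments = cv2.HoughLinesP(mask, 1, np.pi / 180, threshold=50,
                               minLineLength=60, maxLineGap=10)
    return [] if segments is None else [tuple(s[0]) for s in segments]
```

The resulting segments would play the same role as the indoor lines used in this study: the UAV would select one and follow it with the method proposed here.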


Appendix: Demonstration Videos

1. Video of following a straight line using the vector-field-based line-following method:

https://ppt.cc/fIogmx

2. Video of following a polyline using the vector-field-based line-following method:

https://ppt.cc/fFQFwx

3. Video of following a rectangle using the vector-field-based line-following method:

https://ppt.cc/fJdbvx

4. Video of following a straight line using the single-layer two-dimensional direction vector probability model:

https://ppt.cc/fL5cFx

5. Video of following a polyline using the single-layer two-dimensional direction vector probability model:

https://ppt.cc/fklkRx

6. Video of following a rectangle using the single-layer two-dimensional direction vector probability model:

https://ppt.cc/fTJN7x

7. Video of following a straight line using the double-layer two-dimensional direction vector probability model:

https://ppt.cc/faMdHx

8. Video of following a polyline using the double-layer two-dimensional direction vector probability model:

https://ppt.cc/fbLAPx

9. Video of following a rectangle using the double-layer two-dimensional direction vector probability model:

https://ppt.cc/fji30x

10. Video of following a straight line using the double-layer two-dimensional direction vector probability model with the inertial velocity suppression method:

https://ppt.cc/fs3pax

11. Video of following a polyline using the double-layer two-dimensional direction vector probability model with the inertial velocity suppression method:

https://ppt.cc/faD3Hx


12. Video of following a rectangle using the double-layer two-dimensional direction vector probability model with the inertial velocity suppression method:

https://ppt.cc/fkor0x