| Graduate student: | 黃子健 Huang, Tzu-Chien |
|---|---|
| Thesis title: | 以CIELAB色彩空間為基礎之動態影像伺服追蹤 (Study on Vision-Based Tracking Control Using CIELAB Color Space) |
| Advisor: | 莊智清 Juang, Jyh-Ching |
| Degree: | 碩士 Master |
| Department: | 電機資訊學院電機工程學系碩士在職專班 (College of Electrical Engineering and Computer Science, Department of Electrical Engineering, in-service master's program) |
| Year of publication: | 2011 |
| Academic year of graduation: | 99 (ROC calendar) |
| Language: | 中文 (Chinese) |
| Number of pages: | 58 |
| Keywords (Chinese): | 影像, 追蹤控制, CIELAB |
| Keywords (English): | Vision, Tracking Control, CIELAB |
This study aims to reduce the effect of illumination brightness on digital image processing by describing images in the CIELAB color space, and to design a self-propelled vehicle that tracks a specified target by means of image recognition. During target identification, the influence of brightness on the RGB and CIELAB color spaces is analyzed; the comparison shows that images represented in CIELAB avoid the misjudgments of image data caused by changes in brightness. The target can then be separated from a dynamic background and its direction of motion computed. Based on this directional information, a motion controller calculates the required actuation, which is transmitted over Bluetooth to the on-board controller so that the vehicle steers toward and follows the target. Three experimental scenarios are examined: (1) adding noise similar to the target, (2) changing the ambient light source during motion, and (3) identification while facing a point light source. The results show that the proposed method avoids the influence of brightness, obtains correct image information, and successfully tracks the target.
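The core technical idea in the abstract is to segment the target by its chromaticity, i.e. the a* and b* channels of CIELAB, while ignoring the lightness channel L*, so that changes in illumination do not disturb detection, and then to turn the target's image-plane offset into a steering command. The sketch below illustrates that idea only and is not the thesis implementation: it assumes OpenCV's 8-bit L*a*b* conversion (channels scaled to 0–255), and the threshold bounds, function names (`locate_target`, `steering_command`), and the three-way steering output are hypothetical placeholders.

```python
import cv2
import numpy as np

# Hypothetical target colour range in CIELAB (OpenCV 8-bit scaling, 0-255).
# The L* bounds are left wide open so that brightness changes do not
# affect the segmentation; only the chromatic a*/b* channels are constrained.
LAB_LOWER = np.array([0, 150, 130], dtype=np.uint8)    # L*, a*, b* lower bounds
LAB_UPPER = np.array([255, 200, 180], dtype=np.uint8)  # L*, a*, b* upper bounds

def locate_target(frame_bgr):
    """Return the target centroid (x, y) in pixels, or None if not found."""
    # Convert the camera frame from BGR (OpenCV default) to CIELAB.
    lab = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2LAB)

    # Keep only pixels whose chromaticity (a*, b*) matches the target colour.
    mask = cv2.inRange(lab, LAB_LOWER, LAB_UPPER)

    # Centroid of the segmented region via image moments.
    m = cv2.moments(mask)
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])

def steering_command(frame_bgr):
    """Map the target's horizontal offset from the image centre to a turn command."""
    centroid = locate_target(frame_bgr)
    if centroid is None:
        return "stop"
    width = frame_bgr.shape[1]
    offset = centroid[0] - width / 2.0   # > 0: target lies to the right
    if abs(offset) < 0.1 * width:
        return "forward"
    return "right" if offset > 0 else "left"
```

In a setup like the one described in the abstract, the returned command would be converted into motor forces by the motion controller and sent to the vehicle over Bluetooth; that part is omitted here.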