| Author: | 黃子軒 Huang, Tzu-Hsuan |
|---|---|
| Title: | 透視式正顎手術計劃擴增實境系統 An Orthognathic Surgical Planning System with See-through Augmented Reality |
| Advisor: | 方晶晶 Fang, Jing-Jing |
| Degree: | Master (碩士) |
| Department: | Department of Mechanical Engineering, College of Engineering (工學院 - 機械工程學系) |
| Year of publication: | 2017 |
| Academic year: | 105 |
| Language: | Chinese |
| Number of pages: | 130 |
| Keywords (Chinese): | 正顎手術計畫 (orthognathic surgical planning), 導航式咬合器系統 (navigational articulator system), 智慧眼鏡 (smart glasses), 擴增實境 (augmented reality), 校正 (calibration) |
| Keywords (English): | Orthognathic Surgical Planning, Navigational Articulator System, Smart Glasses, AR |
Nart (Navigational Articulator), the orthognathic surgical planning and navigational articulator system developed in previous work, provides physicians with three-dimensional data for examining symmetry, occlusion, cephalometry, and symmetry evaluation indices, and has been applied in more than 200 human clinical cases. When planning surgery with the Nart system, however, the physician must look up at a considerable amount of information on the computer screen while manipulating the dental casts, and the hand gripping the casts tends to move them inadvertently as the line of sight shifts, disturbing the delicate planning process. The aim of this study is to use augmented reality technology to assist physicians so that planning can proceed without looking up at the screen.
Building on the Nart system, this study develops the Nart AR (Augmented Reality) system, adding smart-glasses hardware and the Nart APP software. Information is transmitted over Wi-Fi in real time, the models and the main symmetry evaluation indices are displayed on the smart-glasses screen, and voice commands change how the images or information are presented to ease observation. We track markers mounted on the upper plate of the articulator and superimpose the computer dental-cast image on the physical dental casts with a self-developed see-through AR calibration method, VSCM. Experiments confirm that VSCM is more stable than the ARToolKit AR calibration method and requires fewer operating steps, with a mean superimposition error of 2.6 mm, roughly 50% lower than the mean error of ARToolKit. However, improvements to the smart-glasses hardware, combined with human-eye imaging theory and breakthroughs in marker-tracking algorithms, will be needed before the system can be applied more practically and effectively.
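As a rough illustration of how a mean superimposition error such as the 2.6 mm figure above could be summarized, the Python sketch below computes per-landmark deviations between overlay positions and physical positions. All coordinates and values are invented for illustration and are not measurements from this thesis.

```python
import numpy as np

# Hypothetical landmark positions (mm): where each landmark appears in the
# see-through overlay versus where it lies on the physical dental cast.
overlay_pts = np.array([[10.2, 4.1, 0.8], [22.5, 3.7, 1.1], [35.1, 5.0, 0.6]])
physical_pts = np.array([[12.0, 2.9, 0.2], [24.1, 2.5, 0.3], [33.0, 3.4, 0.1]])

# Per-landmark Euclidean deviation and its mean / standard deviation -- the kind
# of summary statistic behind a reported mean superimposition error.
deviation = np.linalg.norm(overlay_pts - physical_pts, axis=1)
print(f"mean = {deviation.mean():.2f} mm, std = {deviation.std(ddof=1):.2f} mm")

# Relative error reduction of one calibration method against another.
def error_reduction(err_new: float, err_old: float) -> float:
    return (err_old - err_new) / err_old * 100.0

# Example arithmetic: a drop from 5.2 mm to 2.6 mm is a 50% reduction.
print(f"reduction = {error_reduction(2.6, 5.2):.0f}%")
```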
Nart (Navigational Articulator) is an orthognathic surgical planning system developed by a former researcher in our laboratory. It provides physicians with a comprehensive way to evaluate not only the symmetry of the facial bones but also occlusion and cephalometry. Physicians have applied the Nart system to plan more than 200 cases. During planning, physicians must switch their sight between the dental casts on the articulator and the computer screen, and the lightly held casts may move inadvertently while they look away. The goal of this study is therefore to help physicians avoid this sight switching with the assistance of Augmented Reality (AR) technology.
The author has developed the Nart AR system on top of the Nart system by introducing smart glasses and their associated software, the Nart APP. The crucial planning information is transmitted from the Nart system to the smart glasses over Wi-Fi in real time. Models of the facial bones and 3D cephalometric information are displayed on the glasses' screen, and the user can change the displayed content through voice commands handled by the embedded speech recognition system. By tracking the markers fixed on the articulator of the Nart system, we locate the positions of the dental casts and render the see-through AR image on the tracked markers with the proposed calibration method, VSCM (Virtual Superimposed Calibration Method). Experiments show that VSCM is more stable and accurate than calibration with the open-source ARToolKit SDK: the deviation of the VSCM-calibrated see-through AR image, verified by the designed experiments, is around 2.58 ± 0.94 mm, an improvement of roughly 50% over ARToolKit.
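The marker-tracking step above can be pictured with the standard marker-pose pipeline sketched below. This is only a minimal illustration under assumed values, not the VSCM calibration itself: it estimates the pose of a square marker on the articulator from its detected image corners and then projects a few dental-cast model points into the camera view, the way a video-based overlay would be drawn. The intrinsics, marker size, corner pixels, and model points are all assumptions.

```python
import numpy as np
import cv2

# Hypothetical intrinsics of the glasses' scene camera (from a prior calibration).
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)  # assume negligible lens distortion for this sketch

# 3D corners of a 40 mm square marker (marker frame, mm) and their detected
# 2D pixel positions in the current camera frame (assumed values).
marker_3d = np.array([[-20, -20, 0], [20, -20, 0],
                      [20, 20, 0], [-20, 20, 0]], dtype=np.float64)
marker_2d = np.array([[300, 220], [360, 222],
                      [358, 282], [298, 280]], dtype=np.float64)

# Estimate the marker pose (rotation, translation) relative to the camera.
ok, rvec, tvec = cv2.solvePnP(marker_3d, marker_2d, K, dist)

# Project a few dental-cast model points (expressed in the marker frame,
# hypothetical coordinates) into the image to place the overlay.
cast_pts = np.array([[0, 0, 30], [10, 5, 30], [-10, 5, 30]], dtype=np.float64)
img_pts, _ = cv2.projectPoints(cast_pts, rvec, tvec, K, dist)
print(img_pts.reshape(-1, 2))
```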
We suggest that the smart glasses used here be improved before they are applied in medical settings. Further study of human-eye projection theory and of the marker-tracking algorithm is needed to improve the accuracy of the Nart APP.