| Author: | 施洊錩 Shih, Chien-Chang |
|---|---|
| Thesis title: | 基於機器視覺的水域水面色彩辨識技術 (Water Surface Color Recognition Technology Based on Machine Vision) |
| Advisor: | 劉建聖 Liu, Chien-Sheng |
| Degree: | Master |
| Department: | College of Engineering, Department of Mechanical Engineering |
| Year of publication: | 2024 |
| Academic year: | 112 |
| Language: | Chinese |
| Pages: | 117 |
| Chinese keywords: | 機器視覺、水面色彩辨識、色彩校正、水質監測、養殖業 |
| English keywords: | Machine vision, water surface color recognition, color correction, water quality monitoring, aquaculture industry |
This study develops a machine vision-based technology for recognizing the color of a water surface, addressing the shortcomings of traditional water quality monitoring. Water quality is critical to the success of aquaculture, but existing methods are limited by high cost, technical complexity, and environmental interference. Traditional monitoring relies on expensive sensors and complex chemical analyses; it requires trained technicians and cannot report water quality changes in real time. These methods are also easily disturbed by environmental changes such as algal proliferation, which makes the monitoring results unstable.
To solve these problems, this study proposes an innovative solution that uses machine vision and color correction to recognize and analyze water surface color. Specifically, a digital camera captures the color information of the water surface, and a custom-built electronic color correction plate is used to precisely calibrate the captured water-color images. This approach effectively restores the true color of the water body and thus provides reliable data for water quality assessment.
A complete water color recognition system was designed and built, covering the experimental apparatus, the imaging procedure, and the color correction techniques. Repeated experiments verified the system's stability and accuracy under different lighting conditions, and four methods were compared against one another. The results show that the machine vision-based approach has a clear advantage in accuracy and can offer fish farmers an efficient and economical water quality monitoring solution.
These results not only improve the accuracy and reliability of water-color images but also substantially lower the cost and the technical barriers of water quality monitoring, making the approach well suited to wide adoption in aquaculture. In future work, we hope to further optimize the technique and explore its use in other aquatic environments, providing more scientific and efficient support for water quality management and promoting the sustainable development of the aquaculture industry.
This study aims to develop a machine vision-based water surface color recognition technology to address the limitations of traditional water quality monitoring methods. Current methods are costly, complex, and prone to environmental interference; they require professional technicians and fail to provide real-time data. By using a digital camera to capture water surface colors and a custom electronic color correction plate for accurate calibration, the proposed approach effectively restores true water colors and provides reliable water quality assessment data.
We designed and constructed a complete water color recognition system and validated it through multiple experiments under various lighting conditions. The results demonstrated the system's accuracy and stability and showed that it significantly outperforms traditional methods. This efficient and economical solution reduces costs and technical barriers, making it suitable for widespread application in the aquaculture industry and promoting sustainable development by providing more scientific and efficient support for water quality management and environmental protection.
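The correction step described above can be illustrated in the abstract: if the true colors of the correction plate's patches are known, a 3×3 linear correction matrix can be fitted by least squares from the measured patch colors and then applied to every pixel. A minimal Python sketch, in which the patch values, the simulated color cast, and the `correct` helper are all hypothetical illustrations rather than the thesis's actual procedure:

```python
import numpy as np

# Hypothetical "true" sRGB values of the correction plate's patches.
reference_rgb = np.array([
    [255, 255, 255],   # white patch
    [128, 128, 128],   # gray patch
    [255,   0,   0],   # red patch
    [  0, 255,   0],   # green patch
    [  0,   0, 255],   # blue patch
    [  0,   0,   0],   # black patch
], dtype=float)

# Simulated camera measurement: an illumination color cast applied to
# every patch, which the correction should undo.
cast = np.array([[0.90, 0.05, 0.00],
                 [0.00, 1.10, 0.05],
                 [0.05, 0.00, 0.80]])
measured_rgb = reference_rgb @ cast.T

# Fit the 3x3 correction matrix M so that measured @ M ≈ reference,
# in the least-squares sense.
M, *_ = np.linalg.lstsq(measured_rgb, reference_rgb, rcond=None)

def correct(pixels):
    """Apply the estimated color correction to an (N, 3) pixel array."""
    return np.clip(pixels @ M, 0.0, 255.0)

# A water pixel as seen under the same cast, restored toward its true
# color by the fitted transform.
water_pixel = np.array([[60.0, 140.0, 90.0]]) @ cast.T
restored = correct(water_pixel)
```

In practice the true patch values would come from the electronic correction plate's specification, and the fit would use many more patches than this toy example.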
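The four recognition methods compared in the thesis are not detailed in this abstract; as one hedged illustration of the recognition step, a corrected average water-surface color could be assigned to the nearest of several reference water-color categories. Every category value below is a made-up placeholder, not calibrated data:

```python
import numpy as np

# Hypothetical water-color categories (sRGB); real reference values
# would come from field calibration, not from this sketch.
categories = {
    "green (algae-rich)":  np.array([ 70, 130,  60], dtype=float),
    "yellow-brown":        np.array([150, 120,  60], dtype=float),
    "dark brown":          np.array([ 90,  70,  40], dtype=float),
    "clear / light blue":  np.array([140, 180, 200], dtype=float),
}

def classify_water_color(mean_rgb):
    """Assign the averaged water-surface color to the nearest category
    by Euclidean distance in RGB space."""
    mean_rgb = np.asarray(mean_rgb, dtype=float)
    return min(categories, key=lambda k: np.linalg.norm(categories[k] - mean_rgb))

# Average color of a corrected water-surface region of interest.
label = classify_water_color([75, 125, 65])
```

In a real system the categories and their representative colors would be established from field measurements, and a perceptual space such as CIELAB would likely be preferred over raw RGB distance.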