| Author | 廖家麟 (Liao, Jia-Lin) |
|---|---|
| Thesis title | 基於知覺之低失真色調映射與細節強化 (Low-Distortion Perceptual-Based Tone Mapping and Detail Enhancement) |
| Advisor | 陳培殷 (Chen, Pei-Yin) |
| Degree | Master |
| Department | College of Electrical Engineering and Computer Science - Department of Computer Science and Information Engineering |
| Year of publication | 2013 |
| Academic year of graduation | 101 (ROC calendar; 2012-2013) |
| Language | English |
| Number of pages | 40 |
| Keywords (Chinese) | 高動態範圍影像, 色調映射, Naka-Rushton方程式 |
| Keywords (English) | High Dynamic Range images, Tone Mapping, Naka-Rushton equation |
High dynamic range (HDR) images are produced by multi-exposure capture, merging photographs taken at different exposure levels, so the luminance range of an HDR image far exceeds what an ordinary low dynamic range (LDR) display can reproduce. Showing an HDR image directly on a standard monitor therefore causes display problems such as over-exposed or overly dark regions. A common solution is tone mapping, which compresses the luminance of an HDR image into the displayable range of an ordinary LDR monitor; this compression is lossy. A good tone mapping operator should faithfully reproduce the luminance of the original scene rather than damage the image to obtain an exaggerated, over-saturated result. This thesis proposes a two-stage tone mapping method. In the first stage, we use the Naka-Rushton equation to implement a perceptual-based global tone mapping operator that specifically preserves detail in bright regions during the mapping. In the second stage, we analyze several detail enhancement algorithms and, to avoid halos and reduce the distortion rate, adopt the Guided Image Filter for our implementation. Finally, we evaluate the distortion of the tone-mapped results with a metric validated by psychophysical experiments; compared with previous methods, the proposed approach not only preserves bright-region detail more clearly but also achieves a lower distortion rate.
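The Naka-Rushton response function referred to above is commonly written in the following standard form; the exact variant and parameter choices used in the thesis may differ.

```latex
% Standard Naka-Rushton form (Naka and Rushton, 1966); the thesis may use
% a specific variant or parameterization of this equation.
\[
  \frac{R(I)}{R_{\max}} = \frac{I^{\,n}}{I^{\,n} + \sigma^{\,n}}
\]
% I:      input luminance
% sigma:  semi-saturation constant (adaptation level)
% n:      sensitivity exponent, often taken close to 1
% R_max:  maximum photoreceptor response
```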
High dynamic range (HDR) images span a wide range of radiances, so they cannot be displayed directly on a regular low dynamic range (LDR) monitor. The common solution to this problem is tone mapping, which compresses an HDR image into an LDR image with some distortion. A good tone mapping operator should reproduce the original scene realistically instead of damaging the original image to obtain an exaggerated result. We propose a two-stage method. The first stage is a global perceptual-based tone mapping operator that uses the Naka-Rushton equation to model visual adaptation and preserves detail in bright regions. The second stage analyzes local detail enhancement methods and adopts the Guided Image Filter to avoid halo artifacts and reduce the distortion rate efficiently. Focusing on distortion, we evaluate both stages of our method with a metric validated by psychophysical experiments; the distortion rates of both stages outperform the state of the art.
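A minimal sketch of the two-stage idea described in the abstract, assuming a Naka-Rushton-style global compression followed by the guided image filter of He et al.; the function names and parameters (`sigma`, `radius`, `eps`, `boost`, the geometric-mean adaptation level, and the luminance-ratio color handling) are illustrative assumptions, not the thesis's exact algorithm.

```python
import numpy as np
from scipy.ndimage import uniform_filter


def naka_rushton(lum, sigma=None):
    """Global compression R = L / (L + sigma).

    The semi-saturation constant defaults to the geometric mean of the
    luminance (an assumed choice for the adaptation level)."""
    if sigma is None:
        sigma = np.exp(np.mean(np.log(lum + 1e-6)))
    return lum / (lum + sigma)


def guided_filter(guide, src, radius=8, eps=1e-3):
    """Single-channel guided image filter (He et al.) built on box filters."""
    size = 2 * radius + 1
    mean_i = uniform_filter(guide, size)
    mean_p = uniform_filter(src, size)
    cov_ip = uniform_filter(guide * src, size) - mean_i * mean_p
    var_i = uniform_filter(guide * guide, size) - mean_i * mean_i
    a = cov_ip / (var_i + eps)          # local linear coefficients
    b = mean_p - a * mean_i
    return uniform_filter(a, size) * guide + uniform_filter(b, size)


def tone_map(hdr_rgb, boost=1.5):
    """Two-stage sketch: global compression, then guided-filter detail boost."""
    lum = (0.2126 * hdr_rgb[..., 0] +
           0.7152 * hdr_rgb[..., 1] +
           0.0722 * hdr_rgb[..., 2])
    compressed = naka_rushton(lum)
    base = guided_filter(compressed, compressed)   # edge-preserving base layer
    detail = compressed - base                     # halo-free detail layer
    out_lum = np.clip(base + boost * detail, 0.0, 1.0)
    ratio = out_lum / np.maximum(lum, 1e-6)        # reapply color per channel
    return np.clip(hdr_rgb * ratio[..., None], 0.0, 1.0)


# Usage: ldr = tone_map(hdr) for an HDR image given as a float array of shape (H, W, 3).
```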
On campus: full text publicly available from 2023-12-31.