| Graduate Student: | 歐陽諺 Ou, Yang-Yen |
|---|---|
| Thesis Title: | 基於深度學習與機器學習之橘色科技研究與應用 Research and Application of Orange Technology based on Deep Learning and Machine Learning |
| Advisor: | 王駿發 Wang, Jhing-Fa |
| Degree: | Doctor (博士) |
| Department: | 電機資訊學院 - 電機工程學系 Department of Electrical Engineering |
| Year of Publication: | 2020 |
| Academic Year of Graduation: | 108 |
| Language: | English |
| Number of Pages: | 99 |
| Keywords (Chinese): | 橘色科技、用藥安全、幸福感探索、情感辨識、物件偵測、深度學習、機器學習 |
| Keywords (English): | Orange Technology, Medication Safety, Happiness Discovery, Emotion Recognition, Object Detection, Deep Learning, Machine Learning |
| Access Count: | Views: 287, Downloads: 6 |
Orange Technology takes humanistic and humanitarian care as its core goal: through technological innovation, research, and development, it aims to bring people health, happiness, and humane care. Its research and applications are positioned around three main themes: health technology, happiness technology, and care technology. The main purpose of this dissertation is to explore these three research areas with deep learning and machine learning techniques.
In health technology, this work applies deep learning to medication safety. A bare-pill recognition system based on convolutional neural networks is proposed to handle multiple pills placed at random. The system consists of two stages, pill localization and pill classification: an enhanced feature pyramid network is proposed to detect multiple pills, and a convolutional neural network classifier handles the large number of visually similar classes under random placement. On a recognition task covering 612 drug classes, the localization stage achieves 96% detection accuracy, and the classification stage reaches Top-1, Top-3, and Top-5 accuracies of 82.1%, 92.4%, and 94.7%, respectively.
In happiness technology, maternal love and personal well-being are analyzed through objective measurement. Functional magnetic resonance imaging (fMRI) combined with cognitive experiments is used to observe the hemodynamic responses that human emotions evoke in brain regions. For maternal love, an fMRI database is collected with a mother-shopping behavioral experiment, and statistical analysis identifies the activated brain regions. In addition, a discriminant multivoxel analysis is proposed using a support vector machine with five region-level features, reaching 83.33% accuracy in recognizing activated regions. For personal well-being, a happiness fMRI database is built from self-concerned and other-concerned well-being tasks, and multivoxel regression analysis is used to observe the activated regions. The results show that the anterior and posterior cingulate cortex (ACC and PCC) overlaps highly and correlates positively with happiness intensity, and that the orbitofrontal cortex strongly discriminates self-concerned from other-concerned well-being.
In care technology, affective computing is applied to home care, and an emotional-factor analysis system is proposed that combines a spoken dialog summarization system with facial emotion recognition. In the spoken dialog summarization system, the dialog is transcribed by automatic speech recognition and segmented with the Chinese word-segmentation system CKIP; keywords are selected and ranked by part-of-speech tags, and point-wise mutual information is then proposed to match emotional factors. For facial emotional factors, a two-stage detection architecture is adopted and a hybrid facial emotion recognition network is proposed based on dilated convolutional neural networks and facial landmarks. The emotional-factor analysis accuracies of the spoken dialog summarization system and the facial emotion recognition system are 73.5% and 86.14%, respectively.
Orange Technology aims to advance research on humanistic and humanitarian technology. It focuses on the innovation, research, and development of technologies that bring people true health, happiness, and humane care. In this thesis, deep learning and machine learning techniques are used to explore the research areas of health technology, happiness technology, and care technology.
Firstly, this thesis addresses medication safety within health technology. Deep learning techniques are used to build an automatic drug pill detection system that handles multiple pills placed at random. The proposed system consists of a localization stage and a classification stage: an Enhanced Feature Pyramid Network (EFPN) is proposed for pill localization, and Inception-ResNet-v2 is used for pill classification. The proposed EFPN achieves over 96% accuracy in the localization experiment. In the end-to-end evaluation, the system obtains Top-1, Top-3, and Top-5 accuracies of 82.1%, 92.4%, and 94.7%, respectively.
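To make the two-stage design concrete, the sketch below pairs a classification head built on Inception-ResNet-v2 with bounding boxes assumed to come from the localization stage. The EFPN detector itself is not public, so the boxes, the `classify_pills` helper, and the ImageNet weights are illustrative assumptions rather than the thesis implementation.

```python
# Minimal sketch of the two-stage pill pipeline: boxes from a localization stage
# (e.g. the proposed EFPN) are cropped and classified by Inception-ResNet-v2.
import numpy as np
import tensorflow as tf

NUM_CLASSES = 612  # number of drug classes reported in the thesis


def build_classifier(num_classes: int = NUM_CLASSES) -> tf.keras.Model:
    """Inception-ResNet-v2 backbone with a softmax head for pill classification."""
    backbone = tf.keras.applications.InceptionResNetV2(
        include_top=False, weights="imagenet", pooling="avg")
    probs = tf.keras.layers.Dense(num_classes, activation="softmax")(backbone.output)
    return tf.keras.Model(backbone.input, probs)


def classify_pills(image: np.ndarray, boxes, classifier: tf.keras.Model):
    """Crop each detected pill region and return its top-1 class index."""
    predictions = []
    for (x1, y1, x2, y2) in boxes:  # boxes produced by the localization stage
        crop = tf.image.resize(image[y1:y2, x1:x2], (299, 299))
        crop = tf.keras.applications.inception_resnet_v2.preprocess_input(crop)
        scores = classifier(tf.expand_dims(crop, 0), training=False)
        predictions.append(int(tf.argmax(scores, axis=-1)[0]))
    return predictions
```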
In the happiness technology research, functional magnetic resonance imaging (fMRI) and cognitive experiments are used to observe the hemodynamic responses that human emotions evoke in the brain, and two kinds of happiness, maternal love and personal happiness, are examined by objective measurement. A maternal-love fMRI database is collected with a mother-shopping behavioral experiment, and statistical analysis is used to find the activated brain regions. In addition, a support vector machine with five region-level features is used to propose a discriminant multivoxel analysis, which achieves 83.33% accuracy in recognizing activated brain regions. For personal happiness discovery, self- and other-concerned well-being tasks are used to build a happiness fMRI database, and a general linear model analysis is used to identify the activated regions. The study finds a high correlation between happiness intensity and the anterior and posterior cingulate cortex (ACC and PCC). Furthermore, the orbitofrontal cortex is found to distinguish self-concerned from other-concerned happiness.
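As a rough illustration of the multivoxel analysis described above, the sketch below trains a linear support vector machine on a small trial-by-feature matrix and scores it with cross-validation. The five-feature matrix `X`, the labels `y`, and the scikit-learn pipeline are demonstration assumptions, not the thesis pipeline or its fMRI data.

```python
# Minimal sketch: linear-SVM classification of region-level fMRI features,
# evaluated with 5-fold cross-validation (toy data only).
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 5))      # e.g. 60 trials x 5 region-level features
y = rng.integers(0, 2, size=60)   # condition labels (e.g. activated vs. baseline)

clf = make_pipeline(StandardScaler(), SVC(kernel="linear", C=1.0))
scores = cross_val_score(clf, X, y, cv=5)
print(f"mean cross-validated accuracy: {scores.mean():.3f}")
```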
Lastly, artificial emotional intelligence is applied to the home-care application of care technology. A spoken dialog summarization system and a hybrid facial emotion recognition neural network are proposed for emotional-factor analysis. In the spoken dialog summarization system, the dialog semantics are captured by automatic speech recognition and CKIP (Chinese Knowledge and Information Processing) word segmentation; keywords are ranked by part-of-speech tags, and emotional factors are then matched with the proposed point-wise mutual information measure. Moreover, a hybrid facial emotion recognition neural network is proposed that combines two kinds of features, from a dilated convolutional neural network and from facial landmarks. The spoken dialog summarization system and the hybrid facial emotion recognition network achieve accuracies of 73.5% and 86.14%, respectively.
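The point-wise mutual information (PMI) matching step can be illustrated with a small, self-contained example: PMI(w, e) = log(p(w, e) / (p(w) * p(e))), estimated from keyword-emotion co-occurrence counts. The toy counts and the `pmi` helper below are hypothetical and stand in for statistics gathered from labelled dialog summaries.

```python
# Minimal sketch of PMI between a transcript keyword and an emotion category,
# estimated from (keyword, emotion) co-occurrence counts (toy data only).
import math
from collections import Counter

cooccur = Counter({("tired", "sad"): 8, ("tired", "happy"): 1,
                   ("party", "happy"): 9, ("party", "sad"): 2})
total = sum(cooccur.values())


def pmi(word: str, emotion: str) -> float:
    p_joint = cooccur[(word, emotion)] / total
    p_word = sum(c for (w, _), c in cooccur.items() if w == word) / total
    p_emotion = sum(c for (_, e), c in cooccur.items() if e == emotion) / total
    return math.log(p_joint / (p_word * p_emotion)) if p_joint > 0 else float("-inf")


print(f"PMI(tired, sad) = {pmi('tired', 'sad'):.3f}")
```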