Student ATMAJA receives the Best Student Paper and Presentation Award at The 1st Biennial International Conference on Acoustic and Vibration 2020

ATMAJA, Bagus Tris (3rd-year doctoral student, Akagi Laboratory, Human Life Design Area) received the Best Student Paper and Presentation Award at The 1st Biennial International Conference on Acoustic and Vibration 2020 (ANV2020).

This is a technical conference on acoustics, vibration, and their applications, organized by the Vibration and Acoustics Laboratory (Vibrastic Lab) of the Engineering Physics Department at the Sepuluh Nopember Institute of Technology (ITS) in Surabaya, Indonesia. The conference theme was "Sound of Indonesia," and this edition was held online. The program offered a wide range of sessions on acoustics, vibration, and their applications, and provided a valuable opportunity and venue for industry participation and exhibition.



Evaluation of Error and Correlation-Based Loss Functions For Multitask Learning Dimensional Speech Emotion Recognition

Bagus Tris Atmaja and Masato Akagi

The choice of a loss function is a critical part of machine learning. This paper evaluates two loss functions commonly used in dimensional speech emotion recognition, a regression task: error-based and correlation-based loss functions. We found that a correlation-based loss function built on the concordance correlation coefficient (CCC) resulted in better performance than error-based loss functions using mean squared error (MSE) and mean absolute error (MAE). Performance was measured as the CCC averaged over three emotional attributes. The results are consistent across two input feature sets and two datasets. Scatter plots of the test predictions under the two loss functions also confirm the results measured by CCC scores.
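As a rough illustration of the idea, the CCC loss compared in the paper can be sketched as follows. This is a minimal NumPy version using the standard definition of Lin's concordance correlation coefficient (loss = 1 - CCC); the paper's actual implementation details (framework, batching, multitask weighting) are not specified here.

```python
import numpy as np

def ccc_loss(y_true, y_pred):
    """CCC loss: 1 - concordance correlation coefficient.

    CCC = 2*cov(t, p) / (var(t) + var(p) + (mean_t - mean_p)^2)
    Perfect agreement gives CCC = 1, hence a loss of 0.
    """
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    mean_t, mean_p = y_true.mean(), y_pred.mean()
    var_t, var_p = y_true.var(), y_pred.var()
    cov = ((y_true - mean_t) * (y_pred - mean_p)).mean()
    ccc = 2.0 * cov / (var_t + var_p + (mean_t - mean_p) ** 2)
    return 1.0 - ccc
```

Unlike MSE or MAE, which penalize pointwise errors, this loss penalizes both poor linear agreement and a systematic mean shift between predictions and labels, which matches an evaluation metric that is itself CCC-based.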

I would like to thank Akagi-sensei for supervising me and MEXT for funding my research. ANV 2020 is the first biennial conference on acoustics and vibration. The conference was originally planned to be held in Bali, Indonesia; however, due to COVID-19, it was held virtually via Zoom. At this conference, I presented my work on the evaluation of correlation-based loss functions for dimensional speech emotion recognition. Since the goal is performance measured by a correlation-based score, it is reasonable to also choose a correlation-based loss function. The experimental results show the usefulness of correlation-based loss functions over error-based ones.