Movement affect estimation in motion capture data

Bibliographic Details
Title: Movement affect estimation in motion capture data
Authors: Li, Dishen
Publication Year: 2018
Description: The problem we address is that of "movement affect estimation": estimating emotional states from motion capture data given as input. Motion capture data contains just a skeleton, with no information about body type or facial expressions. The data consists of recordings of professional actors and dancers performing movements such as walking, sitting, and improvisation. Machine learning models are then built on these motion capture recordings to learn the affective states behind the movements. Overall, we conducted a series of three experiments. First, using the labels given to the actors as the ground truth, our Hidden Markov Model (HMM) classifiers reached over 70% accuracy in predicting the affective state, out of nine possible affective states. Second, we attempted recognition by establishing ground truth from human ratings: we used a continuous approach, asking university students to rate each movement on valence and arousal simultaneously. The ratings were then used as ground truth labels for supervised machine learning with stepwise linear regression, which achieved a high coefficient of determination, the performance metric used in this experiment. In our third experiment, informed by further literature review after the first two experiments, we gathered more data using a crowdsourcing platform and switched our machine learning techniques to rank-based methods. In this case, rather than assigning an absolute numerical rating to quantify the affective state of a movement, each movement is ranked relative to other movements along the dimensions of affect. Results are then analyzed using the Goodman-Kruskal Gamma. Model performance in this approach is highly dependent on movement type: consistent movement patterns lead to more consistent rankings. The approach also appears more effective for recognizing affect in postures.
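The Goodman-Kruskal Gamma named in the description is a rank-correlation statistic: Gamma = (C − D) / (C + D), where C is the number of concordant pairs and D the number of discordant pairs, with ties excluded. As a minimal sketch (not the thesis's own code), it could be computed like this:

```python
def goodman_kruskal_gamma(x, y):
    """Goodman-Kruskal Gamma between two equal-length rankings x and y.

    Counts concordant pairs (both rankings order the pair the same way)
    and discordant pairs (opposite order); tied pairs are skipped.
    """
    concordant = discordant = 0
    n = len(x)
    for i in range(n):
        for j in range(i + 1, n):
            s = (x[i] - x[j]) * (y[i] - y[j])
            if s > 0:
                concordant += 1
            elif s < 0:
                discordant += 1
            # s == 0 means a tie in x or y; the pair is excluded
    if concordant + discordant == 0:
        return 0.0
    return (concordant - discordant) / (concordant + discordant)

# Identical rankings give Gamma = 1, reversed rankings give -1:
print(goodman_kruskal_gamma([1, 2, 3, 4], [1, 2, 3, 4]))  # → 1.0
print(goodman_kruskal_gamma([1, 2, 3, 4], [4, 3, 2, 1]))  # → -1.0
```

Values near 1 indicate that two rankings of movements (e.g. by different raters along valence) largely agree, which is the kind of consistency the third experiment measured.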
Access URL: https://explore.openaire.eu/search/publication?articleId=od_______497::3ba9d6b4c3081ea02fc7496e1f42eb1e
http://summit.sfu.ca/item/18532
Rights: OPEN
Accession Number: edsair.od.......497..3ba9d6b4c3081ea02fc7496e1f42eb1e
Database: OpenAIRE