dc.contributor.author | Malatesta, L | en
dc.contributor.author | Raouzaiou, A | en
dc.contributor.author | Kollias, S | en
dc.date.accessioned | 2014-03-01T01:24:40Z |
dc.date.available | 2014-03-01T01:24:40Z |
dc.date.issued | 2006 | en
dc.identifier.issn | 1571-5736 | en
dc.identifier.uri | https://dspace.lib.ntua.gr/xmlui/handle/123456789/17385 |
dc.subject | Embodied Conversational Agent | en
dc.subject | Facial Animation | en
dc.subject | Facial Expression | en
dc.subject | Action Unit | en
dc.title | MPEG-4 facial expression synthesis based on appraisal theory | en
heal.type | journalArticle | en
heal.identifier.primary | 10.1007/0-387-34224-9_43 | en
heal.identifier.secondary | http://dx.doi.org/10.1007/0-387-34224-9_43 | en
heal.publicationDate | 2006 | en
heal.abstract | MPEG-4 facial animation parameters are used to evaluate theoretical predictions for intermediate expressions of a given emotion episode, based on Scherer's appraisal theory. MPEG-4 FAPs and action units are combined to model the effects of appraisal checks on facial expressions, and the temporal evolution of facial expressions is investigated. The results of the synthesis process can then be applied to Embodied Conversational Agents (ECAs), rendering their interaction with humans, or with other ECAs, more affective. © 2006 International Federation for Information Processing. | en
heal.journalName | IFIP International Federation for Information Processing | en
dc.identifier.doi | 10.1007/0-387-34224-9_43 | en
dc.identifier.volume | 204 | en
dc.identifier.spage | 378 | en
dc.identifier.epage | 384 | en