dc.contributor.author |
Raouzaiou, A |
en |
dc.contributor.author |
Tsapatsoulis, N |
en |
dc.contributor.author |
Karpouzis, K |
en |
dc.contributor.author |
Kollias, S |
en |
dc.date.accessioned |
2014-03-01T01:18:11Z |
|
dc.date.available |
2014-03-01T01:18:11Z |
|
dc.date.issued |
2002 |
en |
dc.identifier.issn |
1110-8657 |
en |
dc.identifier.uri |
https://dspace.lib.ntua.gr/xmlui/handle/123456789/14847 |
|
dc.subject |
Activation |
en |
dc.subject |
Facial expression |
en |
dc.subject |
MPEG-4 facial definition parameters |
en |
dc.subject |
Parameterized expression synthesis |
en |
dc.subject.classification |
Engineering, Electrical & Electronic |
en |
dc.subject.other |
Animation |
en |
dc.subject.other |
Human computer interaction |
en |
dc.subject.other |
Image analysis |
en |
dc.subject.other |
Speech |
en |
dc.subject.other |
Facial animation parameters (FAP) |
en |
dc.subject.other |
Face recognition |
en |
dc.title |
Parameterized facial expression synthesis based on MPEG-4 |
en |
heal.type |
journalArticle |
en |
heal.identifier.primary |
10.1155/S1110865702206149 |
en |
heal.identifier.secondary |
http://dx.doi.org/10.1155/S1110865702206149 |
en |
heal.language |
English |
en |
heal.publicationDate |
2002 |
en |
heal.abstract |
In the framework of MPEG-4, one can include applications where virtual agents, utilizing both textual and multisensory data, including facial expressions and nonverbal speech, help systems adapt to the actual feelings of the user. Applications of this technology are expected in educational environments, virtual collaborative workplaces, communities, and interactive entertainment. Facial animation has gained much interest within the MPEG-4 framework, with implementation details remaining an open research area (Tekalp, 1999). In this paper, we describe a method for enriching human computer interaction, focusing on analysis and synthesis of primary and intermediate facial expressions (Ekman and Friesen, 1978). To achieve this goal, we utilize facial animation parameters (FAPs) to model primary expressions and describe a rule-based technique for handling intermediate ones. A relation between FAPs and the activation parameter proposed in classical psychological studies is established, leading to parameterized facial expression analysis and synthesis notions compatible with the MPEG-4 standard. |
en |
heal.publisher |
Hindawi Publishing Corporation |
en |
heal.journalName |
EURASIP Journal on Applied Signal Processing |
en |
dc.identifier.doi |
10.1155/S1110865702206149 |
en |
dc.identifier.isi |
ISI:000179327900003 |
en |
dc.identifier.volume |
2002 |
en |
dc.identifier.issue |
10 |
en |
dc.identifier.spage |
1021 |
en |
dc.identifier.epage |
1038 |
en |