


The analysis of embodied communicative feedback in multimodal corpora – a prerequisite for behavior simulation


Jens Allwood 1)
Stefan Kopp 2)
Karl Grammer 3)
Elisabeth Ahlsén 1)
Elisabeth Oberzaucher 3)
Markus Koppensteiner 3)

Journal of Language Resources and Evaluation

1) Department of Linguistics Göteborg University, Box 200 SE-40530 Göteborg, Sweden
2) Artificial Intelligence Group Bielefeld University, P.O. 100131, D-33501 Bielefeld, Germany
3) Ludwig Boltzmann Institute for Urban Ethology - Department of Anthropology of the University of Vienna, Althanstrasse 14, 1090 Vienna, Austria


Communicative feedback refers to unobtrusive (usually short) vocal or bodily expressions whereby a recipient of information can inform a contributor about whether he/she is able and willing to communicate, to perceive the information, and to understand it. This paper presents a theory of embodied communicative feedback, describing the different dimensions and features involved. It also presents a corpus analysis, describing a first data coding and analysis method geared to finding the features postulated by the theory. The corpus analysis describes different methods and statistical procedures and discusses their applicability and the insights they can yield.


A complex behavioral feedback pattern that was repeated several times in an interaction (analyzed with THEME).

An example of a complex feedback pattern. Panel (a) shows the hierarchical organization of the pattern, (b) the duration of the different behavior elements in the pattern, and (c) the temporal distribution of the pattern over the whole sequence; the dark grey area marks the first occurrence of the pattern, the light grey area the second.

In (a) the hierarchical organization of the pattern is shown, which consists of 14 behavior events in time. Y and X denote the interactants; b and e mark the beginning and end of a behavior. The pattern starts with Y beginning (y,b,aut) and ending (y,e,aut) an automanipulation, followed by the same person ending a second automanipulative behavior (y,e,aut). This is followed by a brow raise (y,b,bra; y,e,bra) and the end of a first utterance (y,e,utt). X responds with a repeated head nod (x,b,rno) and one-word verbal feedback (x,e,giv); Y then starts a new utterance (y,b,utt), and later X starts producing an utterance as well (x,b,utt). During his utterance, X starts looking at Y (x,b,ilo), automanipulates (x,b,aut), and ends the utterance (x,e,utt). Finally, Y stops looking at X (y,e,ilo). This complex pattern occurs twice (c) in the same time configuration (b). THEME can thus reveal hidden rhythmic structures in interactions.
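The (actor, phase, behavior) triples above can be sketched in code. The following is only an illustrative encoding with hypothetical timestamps: it counts order-preserving recurrences of a given event subsequence in an event stream, which is far simpler than THEME's actual T-pattern algorithm (that algorithm additionally tests for statistically significant critical intervals between event pairs).

```python
from typing import List, Tuple

# An event is (time, actor, phase, behavior), e.g. (0.0, "y", "b", "aut"):
# interactant Y begins (b) an automanipulation. Timestamps are invented.
Event = Tuple[float, str, str, str]

def occurrences(stream: List[Event], pattern: List[Tuple[str, str, str]]) -> int:
    """Count non-overlapping, order-preserving occurrences of `pattern`
    (a sequence of (actor, phase, behavior) triples) in the event stream.
    Intervening events are allowed between pattern elements."""
    count, matched = 0, 0
    for _t, actor, phase, behavior in stream:
        if (actor, phase, behavior) == pattern[matched]:
            matched += 1
            if matched == len(pattern):
                count += 1
                matched = 0
    return count

stream: List[Event] = [
    (0.0, "y", "b", "aut"), (1.2, "y", "e", "aut"),
    (1.5, "y", "b", "bra"), (1.9, "y", "e", "bra"),
    (2.0, "y", "e", "utt"), (2.1, "x", "b", "rno"),
    (9.0, "y", "b", "aut"), (10.1, "y", "e", "aut"),
    (10.4, "y", "b", "bra"), (10.8, "y", "e", "bra"),
    (11.0, "y", "e", "utt"), (11.2, "x", "b", "rno"),
]
pattern = [("y", "b", "aut"), ("y", "e", "aut"), ("y", "e", "utt"), ("x", "b", "rno")]
print(occurrences(stream, pattern))  # 2
```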

Our analysis also shows that simple implementations of feedback, which only trigger at the end of an utterance, are not the solution for simulation: verbal feedback occurs during utterances and is combined with many small behaviors.
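This observation can be made concrete with a minimal, hypothetical check: given a speaker's utterance interval, test which listener feedback onsets fall inside it rather than after it. Function name and data are assumptions for illustration, not part of the study's coding scheme.

```python
from typing import List, Tuple

def feedback_during_utterance(utterance: Tuple[float, float],
                              feedback_times: List[float]) -> List[float]:
    """Return the feedback onsets that fall within the (start, end)
    utterance interval, i.e. overlapping feedback rather than
    end-of-utterance feedback."""
    start, end = utterance
    return [t for t in feedback_times if start <= t < end]

utterance = (0.0, 4.2)      # speaker's utterance interval in seconds (invented)
feedback = [1.3, 2.7, 4.5]  # listener feedback onsets (invented)
print(feedback_during_utterance(utterance, feedback))  # [1.3, 2.7]
```

In such hypothetical data, a model that emits feedback only at t >= 4.2 would miss most of the listener's actual responses.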
We have presented a theory of communicative feedback, describing the different dimensions involved. This theory is intended to provide the basis of a framework for analyzing embodied feedback behavior in natural interactions. We have started to design a coding scheme and a data analysis method suited to capturing some of the features that are decisive in this account (such as type of expression, relevant function, or time scale). Currently, we are investigating how the resulting multimodal corpus can be analyzed for the patterns and rules required for a predictive model of embodied feedback. We also want this model to support simulation in a state-of-the-art embodied conversational character.
The results of our empirical study suggest that feedback is a multimodal, complex, and highly dynamic process, supporting the differentiating assumptions we made in our theoretical account.

In ongoing work, we are building a computational model of feedback behavior that is informed by our theoretical and empirical work and that can be simulated and tested with the embodied agent "Max", a virtual human under development in the A.I. Group at Bielefeld University.




