A data-driven paradigm to understand multimodal communication in human-human and human-robot interaction


Abstract

Data-driven knowledge discovery is becoming a new trend across scientific fields. In light of this, the goal of the present paper is to introduce a novel framework for studying an interesting topic in cognitive and behavioral studies: multimodal communication in human-human and human-robot interaction. We present an end-to-end solution, from data capture through data coding and validation to data analysis and visualization. For data collection, we have developed a multimodal sensing system that gathers fine-grained video, audio, and human body movement data. For data analysis, we propose a hybrid solution based on visual data mining and information-theoretic measures. We suggest that this data-driven paradigm will not only lead to breakthroughs in understanding multimodal communication, but will also serve as a case study demonstrating the promise of data-intensive discovery across a wide range of research topics in cognitive and behavioral studies. © 2010 Springer-Verlag Berlin Heidelberg.
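To illustrate the information-theoretic side of such an analysis, the sketch below computes mutual information between two discretized behavioral streams (e.g., frame-by-frame coded gaze targets of two interaction partners). This is a minimal illustration of the general technique, not the authors' implementation; the variable names and example data are hypothetical.

```python
from collections import Counter
from math import log2

def mutual_information(xs, ys):
    """Mutual information I(X;Y) in bits between two equal-length
    discrete sequences (e.g., coded gaze targets per video frame)."""
    assert len(xs) == len(ys)
    n = len(xs)
    px = Counter(xs)           # marginal counts of X
    py = Counter(ys)           # marginal counts of Y
    pxy = Counter(zip(xs, ys)) # joint counts of (X, Y)
    mi = 0.0
    for (x, y), c in pxy.items():
        p_joint = c / n
        # p_joint * log2( p_joint / (p_x * p_y) )
        mi += p_joint * log2(p_joint * n * n / (px[x] * py[y]))
    return mi

# Hypothetical coded streams: which object each partner attends to per frame.
gaze_a = ["cup", "cup", "ball", "ball", "cup", "ball", "ball", "cup"]
gaze_b = ["cup", "cup", "ball", "ball", "cup", "ball", "ball", "cup"]
print(mutual_information(gaze_a, gaze_b))  # identical streams: I(X;Y) = H(X) = 1.0 bit
```

High mutual information between two partners' streams indicates coupled, coordinated behavior; near-zero values indicate the modalities vary independently.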

Citation (APA)

Yu, C., Smith, T. G., Hidaka, S., Scheutz, M., & Smith, L. B. (2010). A data-driven paradigm to understand multimodal communication in human-human and human-robot interaction. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 6065 LNCS, pp. 232–244). https://doi.org/10.1007/978-3-642-13062-5_22
