CyPhone - Mobile multimodal personal augmented reality


Abstract

Advances in multimedia, virtual reality, and immersive environments have expanded human-computer interaction beyond text and vision to include touch, gestures, voice, and 3D sound. Although individual communication modalities are well developed, the general problem of designing integrated multimodal systems is still poorly understood. Recent advances in mobile communication based on picocellular technologies allow high-bandwidth data transmission over personal surrounding networks. This technology gives greater freedom in the design of mobile multimodal 3D user interfaces, but it does not solve the design problem itself. In this paper we offer an approach to adding mobility and augmented reality to multimodal user interfaces, discuss the technology and a potential future product concept, the CyPhone, and briefly depict the general architecture and integration framework.
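The abstract does not detail the integration framework itself. Purely as an illustrative aside, the sketch below shows one common pattern for integrating multiple input modalities: grouping time-stamped events from different channels (voice, gesture, gaze) into fusion windows. All names here (ModalityEvent, fuse_events, WINDOW_S) and the 0.25 s window are hypothetical assumptions for illustration, not the CyPhone paper's actual design.

```python
# Minimal sketch of time-window event fusion for a multimodal interface.
# Names and the window size are illustrative assumptions, not taken from
# the CyPhone paper.
from dataclasses import dataclass
from typing import List

WINDOW_S = 0.25  # assumed fusion window: events this close are combined


@dataclass
class ModalityEvent:
    modality: str   # e.g. "voice", "gesture", "gaze"
    payload: str    # recognized token, e.g. a command word
    t: float        # capture timestamp in seconds


def fuse_events(events: List[ModalityEvent]) -> List[List[ModalityEvent]]:
    """Group time-ordered events whose timestamps fall within WINDOW_S."""
    groups: List[List[ModalityEvent]] = []
    for ev in sorted(events, key=lambda e: e.t):
        if groups and ev.t - groups[-1][-1].t <= WINDOW_S:
            groups[-1].append(ev)   # close enough: same multimodal act
        else:
            groups.append([ev])     # too far apart: start a new group
    return groups


if __name__ == "__main__":
    stream = [
        ModalityEvent("voice", "open", 0.10),
        ModalityEvent("gesture", "point:map", 0.20),  # fused with "open"
        ModalityEvent("voice", "zoom", 1.50),         # separate group
    ]
    for group in fuse_events(stream):
        print([f"{e.modality}:{e.payload}" for e in group])
```

A real integration framework would of course resolve conflicts and semantics across modalities, not just time proximity; this sketch only illustrates the grouping step.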

Citation (APA)

Pulli, P., Pyssysalo, T., Kuutti, K., Similä, J., Metsävainio, J. P., & Komulainen, O. (1998). CyPhone - Mobile multimodal personal augmented reality. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) (Vol. 1368, pp. 326–336). Springer Verlag. https://doi.org/10.1007/3-540-64216-1_58
