Conceiving human interaction by visualising depth data of head pose changes and emotion recognition via facial expressions


Abstract

Affective computing in general, and human activity and intention analysis in particular, comprise a rapidly growing field of research. Head pose and emotion changes present serious challenges when applied to players' training and ludology experience in serious games, to analysis of customer satisfaction with broadcast and web services, or to monitoring a driver's attention. Given the increasing prominence and utility of depth sensors, it is now feasible to perform large-scale collection of three-dimensional (3D) data for subsequent analysis. Discriminative random regression forests were selected in order to rapidly and accurately estimate head pose changes in an unconstrained environment. To complete the secondary process of recognising four universal dominant facial expressions (happiness, anger, sadness and surprise), emotion recognition via facial expressions (ERFE) was adopted. A lightweight data exchange format, JavaScript Object Notation (JSON), is then employed to manipulate the data extracted from the two aforementioned settings. Motivated by the need to generate comprehensible visual representations from different sets of data, in this paper we introduce a system capable of monitoring human activity through head pose and emotion changes, utilising an affordable 3D sensing technology (the Microsoft Kinect sensor).
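To illustrate the data-exchange step described above, the sketch below serialises a single per-frame record, combining head pose angles and a dominant expression label, into JSON and parses it back. The field names (`timestamp_ms`, `head_pose`, `emotion`) are illustrative assumptions, not the authors' actual schema.

```python
import json

# Hypothetical per-frame record combining the two data streams the abstract
# describes: head pose angles (from the regression forest) and the dominant
# facial expression (from ERFE). Field names are assumptions for illustration.
frame = {
    "timestamp_ms": 1333,
    "head_pose": {"yaw": 12.4, "pitch": -3.1, "roll": 0.8},  # degrees
    "emotion": "happiness",  # one of: happiness, anger, sadness, surprise
}

# Serialise to the lightweight JSON exchange format mentioned in the abstract.
payload = json.dumps(frame)

# A visualisation front end would parse the payload back into a structure.
decoded = json.loads(payload)
print(decoded["emotion"])  # prints "happiness"
```

A format like this lets the head-pose and emotion pipelines stay decoupled from the visualisation layer: each side only needs to agree on the JSON schema, not on a shared in-memory representation.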


Citation (APA)

Kalliatakis, G., Stergiou, A., & Vidakis, N. (2017). Conceiving human interaction by visualising depth data of head pose changes and emotion recognition via facial expressions. Computers, 6(3). https://doi.org/10.3390/computers6030025
