An Algorithm to Define Emotions Based on Facial Gestures as Automated Input in Survey Instrument
This study was carried out to help researchers who have difficulty acquiring accurate responses from dyslexic children through survey instruments. Due to the children's impairment in reading, spelling and writing, researchers who conduct studies on this particular group are unable to ga...
Main Authors: Tayib, Saifulazmi; Jamaludin, Zulikha
Format: Article
Published: American Scientific Publishers, 2016
Subjects:
Similar Items
- Multimodal fusion: Gesture and speech input in augmented reality environment
  by: Ismail, Ajune Wanis, et al.
  Published: (2015)
- Multimodal fusion: gesture and speech input in augmented reality environment
  by: Ismail, Ajune Wanis, et al.
  Published: (2014)
- Automated kinship verification and identification through human facial images: a survey
  by: Almuashi, M., et al.
  Published: (2017)
- Implementation of an automated smart home control for detecting human emotions via facial detection
  by: Lim, Teck Boon, et al.
  Published: (2015)
- A survey of hand gesture dialogue modeling for map navigation
  by: Yee, Yong Pang, et al.
  Published: (2012)