Applying multimodal data fusion to track autistic adolescents’ representational flexibility development during virtual reality-based training

Bibliographic Details
Main Authors: Jewoong Moon, Fengfeng Ke, Zlatko Sokolikj, Shayok Chakraborty
Format: Article
Language: English
Published: Elsevier 2024-01-01
Series: Computers & Education: X Reality
Online Access: http://www.sciencedirect.com/science/article/pii/S2949678024000138
Description
Summary: In our study, we harnessed multimodal data to develop a predictive model aimed at assessing the development of representational flexibility (RF) in autistic adolescents engaged in virtual reality (VR)-based cognitive skills training. Recognizing VR's potential to enhance RF through immersive 3D simulation tasks, we addressed the research gap in analyzing learners' digital interactions within this environment. This data mining study integrated diverse data sources, including behavioral cues, physiological responses, and direct interaction logs, collected from 178 training sessions with eight autistic adolescents. This comprehensive dataset, encompassing both audio and screen recordings, was analyzed using advanced machine learning techniques. Through decision-level data fusion, particularly employing the random forest algorithm, our model demonstrated enhanced accuracy in predicting RF development, surpassing single-source data approaches. This research not only contributes to the effective use of VR in educational interventions for autistic adolescents but also showcases the potential of multimodal data fusion in understanding complex cognitive skills development.
ISSN:2949-6780
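
The decision-level fusion described in the summary can be illustrated with a minimal Python sketch: one random forest is trained per modality, and the per-modality class probabilities are averaged (soft voting) to produce the fused prediction. The feature dimensions, synthetic data, and the soft-voting rule below are illustrative assumptions for exposition, not the authors' published pipeline.

    # Sketch of decision-level multimodal fusion with per-modality random forests.
    # All features are synthetic stand-ins; shapes and the fusion rule are assumptions.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    n_sessions = 178  # number of training sessions, per the summary

    # Stand-ins for the three modalities named in the abstract.
    X_behavioral = rng.normal(size=(n_sessions, 12))  # behavioral cues
    X_physio = rng.normal(size=(n_sessions, 8))       # physiological responses
    X_logs = rng.normal(size=(n_sessions, 20))        # interaction-log features
    y = rng.integers(0, 2, size=n_sessions)           # binary RF-development label (assumed)

    idx_train, idx_test = train_test_split(
        np.arange(n_sessions), test_size=0.25, random_state=0, stratify=y
    )

    # Decision-level fusion: fit one model per modality, then combine outputs.
    probas = []
    for X in (X_behavioral, X_physio, X_logs):
        clf = RandomForestClassifier(n_estimators=200, random_state=0)
        clf.fit(X[idx_train], y[idx_train])
        probas.append(clf.predict_proba(X[idx_test]))

    fused = np.mean(probas, axis=0)  # soft vote: average class probabilities
    y_pred = fused.argmax(axis=1)    # final fused prediction per held-out session
    print(f"Fused accuracy: {(y_pred == y[idx_test]).mean():.2f}")

Because fusion happens at the decision level, each modality keeps its own feature space and model; only the predicted probabilities are combined, which is what lets a fused model outperform any single-source approach when modalities carry complementary signal.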