Multimodal data as a means to understand the learning experience
Authors:
Highlights:
• We propose using multimodal data to capture learning experience.
• Multimodal data from 251 game sessions and 17 users were collected.
• Click-stream models achieve 39% error rate in predicting learning.
• Fusing multimodal data reduces the error rate by up to 6%.
• We identify the physiological features that best predict skill development.
Abstract:
Keywords: Human learning, Multimodal learning analytics, User-generated data, Skill acquisition, Multimodal data, Machine learning
Article history: Received 3 December 2018, Revised 6 February 2019, Accepted 10 February 2019, Available online 1 March 2019, Version of Record 1 March 2019.
DOI: https://doi.org/10.1016/j.ijinfomgt.2019.02.003