Paper Title

GIKT: A Graph-based Interaction Model for Knowledge Tracing

Authors

Yang Yang, Jian Shen, Yanru Qu, Yunfei Liu, Kerong Wang, Yaoming Zhu, Weinan Zhang, Yong Yu

Abstract

With the rapid development of online education, knowledge tracing (KT) has become a fundamental problem: tracing students' knowledge status and predicting their performance on new questions. Questions are often numerous in online education systems and are always associated with much fewer skills. However, the previous literature fails to involve question information together with high-order question-skill correlations, mostly limited by data sparsity and multi-skill problems. From the model perspective, previous models can hardly capture the long-term dependency of student exercise history, and cannot model the interactions between student-question and student-skill pairs in a consistent way. In this paper, we propose a Graph-based Interaction model for Knowledge Tracing (GIKT) to tackle the above problems. More specifically, GIKT utilizes a graph convolutional network (GCN) to substantially incorporate question-skill correlations via embedding propagation. Besides, considering that relevant questions are usually scattered throughout the exercise history, and that questions and skills are just different instantiations of knowledge, GIKT generalizes the degree of students' mastery of the question to the interactions between the student's current state, the student's related history exercises, the target question, and related skills. Experiments on three datasets demonstrate that GIKT achieves new state-of-the-art performance, with at least 1% absolute AUC improvement.
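The embedding-propagation idea mentioned in the abstract can be illustrated with a minimal NumPy sketch: a bipartite question-skill graph is treated as one joint adjacency matrix, and a single GCN layer refines each node's embedding by averaging its neighbors' embeddings and applying a linear transform. The toy relation matrix, dimensions, and function names below are hypothetical illustrations, not the paper's actual architecture.

```python
import numpy as np

def gcn_layer(embeddings, adj, weight):
    """One GCN propagation step: mean-aggregate neighbor embeddings,
    then apply a linear transform followed by ReLU."""
    deg = adj.sum(axis=1, keepdims=True)
    deg[deg == 0] = 1.0            # guard isolated nodes against division by zero
    agg = (adj @ embeddings) / deg  # mean over graph neighbors
    return np.maximum(agg @ weight, 0.0)  # ReLU nonlinearity

rng = np.random.default_rng(0)
num_questions, num_skills, dim = 4, 2, 8

# Hypothetical question-skill relations: entry (i, j) = 1 means
# question i is tagged with skill j (multi-skill questions allowed).
relations = np.array([[1, 0],
                      [1, 1],
                      [0, 1],
                      [0, 1]], dtype=float)

# Build a symmetric adjacency over the joint question + skill node set.
n = num_questions + num_skills
adj = np.zeros((n, n))
adj[:num_questions, num_questions:] = relations
adj[num_questions:, :num_questions] = relations.T

emb = rng.normal(size=(n, dim))       # initial question/skill embeddings
weight = rng.normal(size=(dim, dim))  # layer parameters
out = gcn_layer(emb, adj, weight)
print(out.shape)  # (6, 8): refined embeddings for 4 questions + 2 skills
```

After propagation, each question embedding mixes in information from its skills (and, with more layers, from other questions sharing those skills), which is how high-order question-skill correlations enter the downstream prediction.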
