Paper Title


KGTuner: Efficient Hyper-parameter Search for Knowledge Graph Learning

Paper Authors

Yongqi Zhang, Zhanke Zhou, Quanming Yao, Yong Li

Abstract


While hyper-parameters (HPs) are important for knowledge graph (KG) learning, existing methods fail to search them efficiently. To solve this problem, we first analyze the properties of different HPs and measure their transfer ability from a small subgraph to the full graph. Based on the analysis, we propose an efficient two-stage search algorithm, KGTuner, which efficiently explores HP configurations on a small subgraph in the first stage and transfers the top-performing configurations to the large full graph for fine-tuning in the second stage. Experiments show that our method consistently finds better HPs than the baseline algorithms within the same time budget, achieving a 9.1% average relative improvement for four embedding models on large-scale KGs in the Open Graph Benchmark.
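The two-stage idea in the abstract can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: `evaluate`, `sample_config`, and the toy scoring function are all assumptions standing in for actually training a KG embedding model and reporting a validation metric such as MRR.

```python
import random

def evaluate(config, graph_scale):
    # Hypothetical stand-in objective; a real run would train a KG embedding
    # model on a graph of the given scale and return validation performance.
    lr, dim = config["lr"], config["dim"]
    return -abs(lr - 0.01) * graph_scale + dim / 1000.0

def sample_config(rng):
    # Sample one HP configuration from a small illustrative search space.
    return {"lr": rng.choice([0.001, 0.01, 0.1]),
            "dim": rng.choice([100, 200, 500])}

def two_stage_search(n_explore=30, top_k=3, seed=0):
    rng = random.Random(seed)
    # Stage 1: cheap exploration of many configurations on a small subgraph
    # (graph_scale=0.1 models the reduced evaluation cost).
    candidates = [sample_config(rng) for _ in range(n_explore)]
    ranked = sorted(candidates,
                    key=lambda c: evaluate(c, graph_scale=0.1),
                    reverse=True)
    # Stage 2: transfer only the top-k configurations to the full graph
    # (graph_scale=1.0) and keep the best one after fine-tuning there.
    return max(ranked[:top_k], key=lambda c: evaluate(c, graph_scale=1.0))

best = two_stage_search()
print(best)
```

The key design point the sketch captures is the cost asymmetry: most evaluations happen on the cheap subgraph, and only a handful of promising configurations ever touch the expensive full graph.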
