Paper Title

HyperTree Proof Search for Neural Theorem Proving

Authors

Guillaume Lample, Marie-Anne Lachaux, Thibaut Lavril, Xavier Martinet, Amaury Hayat, Gabriel Ebner, Aurélien Rodriguez, Timothée Lacroix

Abstract

We propose an online training procedure for a transformer-based automated theorem prover. Our approach leverages a new search algorithm, HyperTree Proof Search (HTPS), inspired by the recent success of AlphaZero. Our model learns from previous proof searches through online training, allowing it to generalize to domains far from the training distribution. We report detailed ablations of our pipeline's main components by studying performance on three environments of increasing complexity. In particular, we show that with HTPS alone, a model trained on annotated proofs manages to prove 65.4% of a held-out set of Metamath theorems, significantly outperforming the previous state of the art of 56.5% by GPT-f. Online training on these unproved theorems increases accuracy to 82.6%. With a similar computational budget, we improve the state of the art on the Lean-based miniF2F-curriculum dataset from 31% to 42% proving accuracy.
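The "hypertree" in HTPS refers to the AND/OR structure of proof search: a goal is proved if *any* candidate tactic succeeds (OR), while a tactic succeeds only if *all* of the subgoals it generates are proved (AND). A minimal sketch of this semantics on a toy environment, where `policy` and the integer-valued "goals" are purely illustrative stand-ins and not the paper's actual model or API:

```python
def policy(goal):
    """Toy policy: enumerate candidate (tactic, subgoals) pairs.

    A goal is an integer n; "close" proves n directly when n <= 1,
    and "split" reduces n to two smaller subgoals."""
    tactics = []
    if goal <= 1:
        tactics.append(("close", []))  # no subgoals: proves the goal outright
    else:
        tactics.append(("split", [goal // 2, goal - goal // 2]))
    return tactics

def prove(goal, depth=0, max_depth=32):
    """Return True if the goal is provable within max_depth.

    OR over tactics: any successful tactic proves the goal.
    AND over subgoals: a tactic succeeds only if every subgoal is proved."""
    if depth > max_depth:
        return False
    for _tactic, subgoals in policy(goal):
        if all(prove(g, depth + 1, max_depth) for g in subgoals):
            return True
    return False
```

The actual HTPS algorithm replaces this exhaustive recursion with a guided search: a learned policy proposes tactics, a learned critic estimates the provability of open goals, and visit statistics are backed up through the hypertree, in the spirit of AlphaZero's tree search.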
