Paper Title


Automated Transpilation of Imperative to Functional Code using Neural-Guided Program Synthesis (Extended Version)

Authors

Benjamin Mariano, Yanju Chen, Yu Feng, Greg Durrett, Isil Dillig

Abstract


While many mainstream languages such as Java, Python, and C# increasingly incorporate functional APIs to simplify programming and improve parallelization/performance, there are no effective techniques that can be used to automatically translate existing imperative code to functional variants using these APIs. Motivated by this problem, this paper presents a transpilation approach based on inductive program synthesis for modernizing existing code. Our method is based on the observation that the overwhelming majority of source/target programs in this setting satisfy an assumption that we call trace-compatibility: not only do the programs share syntactically identical low-level expressions, but these expressions also take the same values in corresponding execution traces. Our method leverages this observation to design a new neural-guided synthesis algorithm that (1) uses a novel neural architecture called cognate grammar network (CGN) and (2) leverages a form of concolic execution to prune partial programs based on intermediate values that arise during a computation. We have implemented our approach in a tool called NGST2 and use it to translate imperative Java and Python code to functional variants that use the Stream and functools APIs respectively. Our experiments show that NGST2 significantly outperforms several baselines and that our proposed neural architecture and pruning techniques are vital for achieving good results.
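To make the setting concrete, the following is a hypothetical example (not taken from the paper) of the kind of imperative-to-functional transpilation the abstract describes, using Python's `functools`. Both function names are illustrative. Note how the two versions share syntactically identical low-level expressions (`x % 2 == 0`, `x * x`); under the paper's trace-compatibility assumption, such shared expressions also take the same values in corresponding execution traces, which is what enables pruning partial programs by comparing intermediate values.

```python
from functools import reduce

# Imperative source program: sum of squares of the even elements.
def sum_even_squares_imperative(xs):
    total = 0
    for x in xs:
        if x % 2 == 0:
            total += x * x
    return total

# A functional variant of the kind the transpiler would synthesize,
# expressed with functools.reduce. The low-level expressions
# `x % 2 == 0` and `x * x` are shared verbatim with the source.
def sum_even_squares_functional(xs):
    return reduce(
        lambda acc, x: acc + x * x if x % 2 == 0 else acc,
        xs,
        0,
    )
```

On any input list the two programs agree, e.g. both map `[1, 2, 3, 4]` to `4 + 16 = 20`; an inductive synthesizer like the one described would use such input/output agreement (plus intermediate trace values) to validate candidate functional programs.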
