Paper Title

Boosting Data Reduction for the Maximum Weight Independent Set Problem Using Increasing Transformations

Authors

Alexander Gellner, Sebastian Lamm, Christian Schulz, Darren Strash, Bogdán Zaválnij

Abstract

Given a vertex-weighted graph, the maximum weight independent set problem asks for a pair-wise non-adjacent set of vertices such that the sum of their weights is maximum. The branch-and-reduce paradigm is the de facto standard approach to solve the problem to optimality in practice. In this paradigm, data reduction rules are applied to decrease the problem size. These data reduction rules ensure that given an optimum solution on the new (smaller) input, one can quickly construct an optimum solution on the original input. We introduce new generalized data reduction and transformation rules for the problem. A key feature of our work is that some transformation rules can increase the size of the input. Surprisingly, these so-called increasing transformations can simplify the problem and also open up the reduction space to yield even smaller irreducible graphs later throughout the algorithm. In experiments, our algorithm computes significantly smaller irreducible graphs on all except one instance, solves more instances to optimality than previously possible, is up to two orders of magnitude faster than the best state-of-the-art solver, and finds higher-quality solutions than heuristic solvers DynWVC and HILS on many instances. While the increasing transformations are only efficient enough for preprocessing at this time, we see this as a critical initial step towards a new branch-and-transform paradigm.
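To make the idea of a data reduction rule concrete, here is a minimal sketch (not the paper's implementation, and none of its new rules) of one classic MWIS reduction, often called neighborhood removal: if a vertex's weight is at least the total weight of its neighbors, some optimum solution contains it, so the vertex and its entire neighborhood can be deleted and the vertex's weight added to a running offset. The function name, the adjacency-dict representation, and the `forced` bookkeeping are illustrative choices; the offset plus the forced vertices show how an optimum solution of the reduced graph lifts back to one on the original graph.

```python
def neighborhood_removal(adj, weight):
    """Apply the neighborhood-removal reduction exhaustively.

    adj:    {v: set(neighbors)} -- undirected, no self-loops
    weight: {v: positive weight}

    Returns (reduced_adj, reduced_weight, offset, forced), where
    `forced` lists vertices known to lie in some optimum solution
    of the original graph, and any optimum solution of the reduced
    graph plus `forced` has total weight (reduced optimum + offset).
    """
    adj = {v: set(nb) for v, nb in adj.items()}   # work on copies
    weight = dict(weight)
    offset, forced = 0, []
    changed = True
    while changed:
        changed = False
        for v in list(adj):
            if v not in adj:
                continue  # already deleted in this pass
            if weight[v] >= sum(weight[u] for u in adj[v]):
                # v joins the solution: delete v and all its neighbors.
                offset += weight[v]
                forced.append(v)
                for u in list(adj[v]) + [v]:
                    for x in adj.pop(u, ()):
                        adj.get(x, set()).discard(u)
                    weight.pop(u, None)
                changed = True
    return adj, weight, offset, forced
```

On a path a–b–c with weights 1, 5, 1, the rule fires on b (5 ≥ 1 + 1), forcing b into the solution and leaving an empty reduced graph with offset 5; on a triangle of unit weights it does not apply at all, which is exactly the situation where further (possibly size-increasing) transformations become interesting.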
