Paper Title
Constrained Monotonic Neural Networks
Paper Authors
Paper Abstract
Wider adoption of neural networks in many critical domains such as finance and healthcare is being hindered by the need to explain their predictions and to impose additional constraints on them. The monotonicity constraint is one of the most requested properties in real-world scenarios and is the focus of this paper. One of the oldest ways to construct a monotonic fully connected neural network is to constrain the signs of its weights. Unfortunately, this construction does not work with popular non-saturated activation functions, as it can only approximate convex functions. We show this shortcoming can be fixed by constructing two additional activation functions from a typical non-saturated monotonic activation function and employing each of them on a subset of the neurons. Our experiments show that this approach to building monotonic neural networks achieves better accuracy than other state-of-the-art methods, while being the simplest in the sense of having the fewest parameters, and requiring no modifications to the learning procedure or post-learning steps. Finally, we prove it can approximate any continuous monotone function on a compact subset of $\mathbb{R}^n$.
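The abstract's core idea can be illustrated with a small sketch. This is not the paper's exact formulation, only a minimal numpy illustration under two common assumptions: weights are made non-negative by taking their absolute value (one standard way to enforce the sign constraint), the base activation is ReLU, and its concave mirror $-\rho(-x)$ serves as the additional activation applied to the remaining neurons. Because every partial derivative of each layer is then non-negative and every activation is monotone, the composed network is monotone non-decreasing in its input, yet it can bend both upward (convex units) and downward (concave units).

```python
import numpy as np

def relu(x):
    """Convex monotone activation: can only bend upward."""
    return np.maximum(x, 0.0)

def relu_mirror(x):
    """Concave mirror of ReLU, -relu(-x): can only bend downward."""
    return -np.maximum(-x, 0.0)

def monotone_layer(x, W, b, split):
    """A fully connected layer that is non-decreasing in x.

    |W| makes all weights non-negative, so the pre-activation is
    non-decreasing in every input coordinate. The first `split`
    output units use convex ReLU, the rest its concave mirror,
    giving the layer both kinds of curvature. (Illustrative
    sketch only -- not the paper's exact construction.)
    """
    z = x @ np.abs(W) + b
    out = np.empty_like(z)
    out[..., :split] = relu(z[..., :split])
    out[..., split:] = relu_mirror(z[..., split:])
    return out

# Usage: a tiny 1-in, 1-out network with random (unconstrained) weights.
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(1, 8)), rng.normal(size=8)
W2, b2 = rng.normal(size=(8, 1)), rng.normal(size=1)

xs = np.linspace(-3.0, 3.0, 201).reshape(-1, 1)
h = monotone_layer(xs, W1, b1, split=4)
ys = h @ np.abs(W2) + b2          # non-negative output weights keep monotonicity

# The output is non-decreasing along the input grid.
assert np.all(np.diff(ys[:, 0]) >= -1e-12)
```

Note that monotonicity holds for any weight values, so ordinary unconstrained gradient descent can be used for training, which is what the abstract means by requiring no modifications to the learning procedure.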