Paper Title
Hardware Implementation of Hyperbolic Tangent Function using Catmull-Rom Spline Interpolation
Paper Authors
Paper Abstract
Deep neural networks yield state-of-the-art results in many computer vision and human-machine interface tasks such as object recognition and speech recognition. Since these networks are computationally expensive, customized accelerators are designed to achieve the required performance at lower cost and power. One of the key building blocks of these neural networks is the non-linear activation function, such as the sigmoid, hyperbolic tangent (tanh), and ReLU. A low-complexity, accurate hardware implementation of the activation function is required to meet the performance and area targets of neural network accelerators. This paper presents an implementation of the tanh function using Catmull-Rom spline interpolation. State-of-the-art results are achieved with this method using comparatively smaller logic area.
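The abstract only names the technique, so the following is a minimal floating-point software sketch of approximating tanh with uniform Catmull-Rom spline interpolation over a small lookup table of tanh samples. It is not the paper's fixed-point hardware design; the input range X_MAX, knot spacing STEP, and the helper catmull_rom_tanh are illustrative assumptions introduced here.

import numpy as np

X_MAX = 4.0   # assumed: saturate the approximation to +/-1 beyond this range
STEP = 0.25   # assumed knot spacing
# Knots extended by one point on each side so every interior segment has
# the four neighbouring samples that Catmull-Rom interpolation needs.
knots_x = np.arange(-X_MAX - STEP, X_MAX + 2 * STEP, STEP)
knots_y = np.tanh(knots_x)   # exact tanh samples stored at the knots

def catmull_rom_tanh(x: float) -> float:
    """Approximate tanh(x) on one uniform Catmull-Rom segment of the table."""
    if x >= X_MAX:
        return 1.0
    if x <= -X_MAX:
        return -1.0
    # Locate the segment [knots_x[i1], knots_x[i1 + 1]] containing x.
    i1 = int((x - knots_x[0]) // STEP)
    t = (x - knots_x[i1]) / STEP          # local parameter in [0, 1)
    p0, p1, p2, p3 = knots_y[i1 - 1 : i1 + 3]
    # Uniform Catmull-Rom basis, evaluated in Horner form.
    return 0.5 * (2.0 * p1
                  + t * ((p2 - p0)
                  + t * ((2.0 * p0 - 5.0 * p1 + 4.0 * p2 - p3)
                  + t * (3.0 * (p1 - p2) + p3 - p0))))

if __name__ == "__main__":
    for v in (-2.0, -0.5, 0.3, 1.7):
        print(f"x={v:+.2f}  spline={catmull_rom_tanh(v):+.6f}  numpy={np.tanh(v):+.6f}")

In a hardware realization the same idea would typically use fixed-point knot values in a small ROM and a few multiply-add stages per output, but those implementation details are the subject of the paper itself and are not reproduced here.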