Diminished-1 Fermat Number Transform for Integer Convolutional Neural Networks

Zhu Baozhou, Nauman Ahmed, Johan Peltenburg, Koen Bertels, Zaid Al-Ars

Research output: Chapter in Book/Conference proceedings/Edited volume › Conference contribution › Scientific › peer-review

2 Citations (Scopus)


Convolutional Neural Networks (CNNs) are a widely used class of deep artificial neural networks. However, training large CNNs to produce state-of-the-art results can take a long time, and the compute time of the inference stage for trained networks must also be reduced to make them usable in real-time applications. To achieve this, reduced-precision integer number formats such as INT8 and INT16 are used to create Integer Convolutional Neural Networks (ICNNs), allowing them to be deployed on mobile devices or embedded systems. In this paper, the Diminished-1 Fermat Number Transform (DFNT), i.e., the Fermat Number Transform (FNT) combined with the diminished-1 number representation, is proposed to accelerate ICNNs by exploiting algebraic properties of integer convolution. This is achieved by performing the convolution step as diminished-1 point-wise products between DFNT-transformed feature maps, which can be reused multiple times in the calculation. Since representing and computing all the integers in the ring of integers modulo the Fermat number 2^b + 1 for the FNT would require b+1 bits, the diminished-1 number representation is used to enable exact and efficient calculation. Using DFNT, integer convolution is implemented on a general-purpose processor, showing a speedup of 2-3x with typical parameter configurations and better scalability, without any round-off error compared to the baseline.
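The core idea described in the abstract, convolution as point-wise products between transformed sequences in a ring modulo a Fermat number, can be illustrated with a small sketch. This is not the paper's diminished-1 implementation; it is a naive number-theoretic transform modulo the Fermat prime 2^16 + 1 = 65537 (the generator 3 and the O(N^2) transform are illustrative choices, not from the paper), showing that the transform-multiply-inverse pipeline reproduces exact integer cyclic convolution with no round-off error:

```python
F = 2**16 + 1  # Fermat prime F_4 = 65537; all arithmetic is exact in Z/FZ

def ntt(a, root):
    """Naive O(N^2) number-theoretic transform of a modulo F."""
    n = len(a)
    return [sum(a[j] * pow(root, i * j, F) for j in range(n)) % F
            for i in range(n)]

def fnt_cyclic_conv(x, h):
    """Cyclic convolution of x and h via point-wise products in the transform domain."""
    n = len(x)
    assert len(h) == n and (F - 1) % n == 0  # need an n-th root of unity mod F
    w = pow(3, (F - 1) // n, F)              # 3 is a primitive root mod 65537
    X, H = ntt(x, w), ntt(h, w)              # transformed sequences can be reused
    Y = [(a * b) % F for a, b in zip(X, H)]  # point-wise products
    w_inv = pow(w, F - 2, F)                 # inverses via Fermat's little theorem
    n_inv = pow(n, F - 2, F)
    return [(v * n_inv) % F for v in ntt(Y, w_inv)]

print(fnt_cyclic_conv([1, 2, 3, 4], [5, 6, 7, 8]))  # [66, 68, 66, 60]
```

The result matches the direct cyclic convolution exactly, since all intermediate values stay in the ring. The diminished-1 refinement in the paper additionally represents each ring element x as x-1 so that the 2^b + 1 residues fit efficiently in b bits.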

Original language: English
Title of host publication: 2019 IEEE 4th International Conference on Big Data Analytics (ICBDA)
Editors: Sheng-Uei Guan, Kang Zhang, Jiannong Cao
Place of Publication: Piscataway, NJ, USA
Number of pages: 6
ISBN (Electronic): 978-1-7281-1282-4
ISBN (Print): 978-1-7281-1283-1
Publication status: Published - 2019
Event: 4th IEEE International Conference on Big Data Analytics, ICBDA 2019 - Suzhou, China
Duration: 15 Mar 2019 - 18 Mar 2019


Conference: 4th IEEE International Conference on Big Data Analytics, ICBDA 2019


  • CNNs
  • Computation Complexity
  • Diminished-1 representation
  • Fermat Number Transform


