Jumping Shift: A Logarithmic Quantization Method for Low-Power CNN Acceleration

Longxing Jiang, David Aledo, Rene van Leuken

Research output: Chapter in Book/Conference proceedings/Edited volume › Conference contribution › Scientific › peer-review



Logarithmic quantization for Convolutional Neural Networks (CNNs): a) fits typical weight and activation distributions well, and b) allows the multiplication operation to be replaced by a shift operation that can be implemented with fewer hardware resources. We propose a new quantization method named Jumping Log Quantization (JLQ). The key idea of JLQ is to extend the quantization range by adding a coefficient parameter “s” to the power-of-two exponent $(2^{sx+i})$. This quantization strategy skips some of the values of standard logarithmic quantization. In addition, we develop a small hardware-friendly optimization called weight de-zero: zero-valued weights, which cannot be realized by a single shift operation, are all replaced with logarithmic weights to reduce hardware resources with almost no accuracy loss. To implement the Multiply-and-Accumulate (MAC) operation (needed to compute convolutions) when the weights are JLQ-ed and de-zeroed, a new Processing Element (PE) has been developed. This new PE uses a modified barrel shifter that can efficiently avoid the skipped values. Resource utilization, area, and power consumption of the new PE on its own are reported. We find that JLQ performs better than other state-of-the-art logarithmic quantization methods when the bit width of the operands becomes very small.
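The exponent grid $2^{sx+i}$ and the weight de-zero replacement described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: the exact rounding, exponent-clamping, and de-zero rules are assumptions here, and the parameter names `s`, `i`, and `x` simply follow the abstract's notation.

```python
import math

def jlq_quantize(w, s=2, i=0, x_bits=3):
    """Hedged sketch of Jumping Log Quantization (JLQ).

    Maps a real weight w to sign * 2^(s*x + i), where the integer x is
    stored in x_bits bits. With s > 1, the exponent grid "jumps" over
    some power-of-two levels of standard logarithmic quantization,
    extending the covered range for the same number of exponent bits.
    """
    x_min = -(2 ** (x_bits - 1))      # e.g. -4 for a 3-bit signed x
    x_max = 2 ** (x_bits - 1) - 1     # e.g. +3
    if w == 0.0:
        # Weight de-zero (sketch): replace an exact zero with the
        # smallest-magnitude representable level, so every weight can
        # be applied with a single shift in the MAC datapath.
        return 2.0 ** (s * x_min + i), x_min
    sign = math.copysign(1.0, w)
    # Nearest point on the jumped exponent grid s*x + i.
    x = round((math.log2(abs(w)) - i) / s)
    x = max(x_min, min(x, x_max))     # clamp to the representable range
    return sign * 2.0 ** (s * x + i), x
```

With `s = 1` this reduces to plain logarithmic quantization (e.g. `jlq_quantize(0.5, s=1)` returns `0.5` with `x = -1`); with `s = 2` every other power of two is skipped, so `0.2` snaps to `0.25` ($2^{-2}$, `x = -1`). In hardware, multiplying an activation by such a weight is a single (signed-exponent) shift, which is what motivates the modified barrel shifter in the paper's PE.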
Original language: English
Title of host publication: Proceedings of the 2023 Design, Automation & Test in Europe Conference & Exhibition (DATE)
Place of publication: Piscataway
Number of pages: 6
ISBN (Print): 979-8-3503-9624-9
Publication status: Published - 2023
Event: DATE 2023: Design, Automation & Test in Europe Conference & Exhibition - Antwerp, Belgium
Duration: 17 Apr 2023 - 19 Apr 2023


Conference: DATE 2023

Bibliographical note

Green Open Access added to TU Delft Institutional Repository 'You share, we take care!' - Taverne project https://www.openaccess.nl/en/you-share-we-take-care
Otherwise, as indicated in the copyright section: the publisher is the copyright holder of this work, and the author uses Dutch legislation to make this work public.


Keywords

  • Convolutional Neural Network
  • Low-power hardware acceleration
  • Logarithmic Quantization
  • FPGA


