Universitätsbibliothek Heidelberg
Status: bibliographic record

Availability
Location: ---
Copies: ---
heiBIB
 Online resource
Authors: Shakibhamedan, Salar [author]
 Amirafshar, Nima [author]
 Baroughi, Ahmad Sedigh [author]
 Shahhoseini, Hadi Shahriar [author]
 Taherinejad, Nima [author]
Title: ACE-CNN
Subtitle: approximate carry disregard multipliers for energy-efficient CNN-based image classification
Statement of responsibility: Salar Shakibhamedan, Nima Amirafshar, Graduate Student Member, IEEE, Ahmad Sedigh Baroughi, Hadi Shahriar Shahhoseini, and Nima Taherinejad, Member, IEEE
Year (online): 2024
Year: May 2024
Extent: 14 pages
Illustrations: illustrations
Footnotes: Published: 1 March 2024; viewed on 22 July 2024
Source title: Contained in: Institute of Electrical and Electronics Engineers, IEEE transactions on circuits and systems. 1, Regular papers
Source place: New York, NY : Institute of Electrical and Electronics Engineers, 2004
Source year: 2024
Source volume/issue: 71 (2024), no. 5, May, pages 2280-2293
Source ISSN: 1558-0806
Abstract: This paper presents the design and development of the Signed Carry Disregard Multiplier (SCDM8), a family of signed approximate multipliers tailored for integration into Convolutional Neural Networks (CNNs). Extensive experiments were conducted on popular pre-trained CNN models, including VGG16, VGG19, ResNet101, ResNet152, MobileNetV2, InceptionV3, and ConvNeXt-T, to evaluate the trade-off between accuracy and approximation. The results demonstrate that ACE-CNN outperforms other configurations, offering a favorable balance between accuracy and computational efficiency. In our experiments, when applied to VGG16, SCDM8 achieves an average reduction in power consumption of 35% with a marginal decrease in accuracy of only 1.5%. Similarly, when incorporated into ResNet152, SCDM8 yields an energy saving of 42% while sacrificing only 1.8% in accuracy. ACE-CNN provides the first approximate version of ConvNeXt, which yields up to 72% energy improvement at the price of less than 1.3% Top-1 accuracy. These results highlight the suitability of SCDM8 as an approximation method across various CNN models. Our analysis shows that ACE-CNN outperforms state-of-the-art approaches in accuracy, energy efficiency, and computation precision for image classification tasks in CNNs. Our study investigated the resiliency of CNN models to approximate multipliers, revealing that ResNet101 demonstrated the highest resiliency with an average difference in accuracy of 0.97%, whereas LeNet5 Inspired-CNN exhibited the lowest resiliency with an average difference of 2.92%. These findings aid in selecting energy-efficient approximate multipliers for CNN-based systems and contribute to the development of energy-efficient deep learning systems by offering an effective approximation technique for multipliers in CNNs.
The proposed SCDM8 family of approximate multipliers opens new avenues for efficient deep learning applications, enabling significant energy savings with virtually no loss in accuracy.
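The abstract describes multipliers that save energy by disregarding carries during multiplication. This record does not reproduce the paper's actual circuit design, so the following is only a minimal illustrative sketch of the general carry-disregard idea under an assumed reading: partial products are summed column by column, and carries generated in the `k` least significant columns are dropped. The function name, the parameter `k`, and the bit-level scheme are hypothetical; the paper's signed SCDM8 variants are not modeled here.

```python
def carry_disregard_mult(a, b, k, width=8):
    """Unsigned width-bit multiply that drops carries from the k
    least significant partial-product columns (illustrative only;
    not the SCDM8 design from the paper)."""
    # Build the column sums of the partial-product array.
    cols = [0] * (2 * width)
    for i in range(width):
        if (b >> i) & 1:
            for j in range(width):
                if (a >> j) & 1:
                    cols[i + j] += 1
    # Resolve columns LSB-first; carries born in columns < k are dropped,
    # which is where the (hypothetical) energy saving would come from.
    result, carry = 0, 0
    for pos in range(2 * width):
        s = cols[pos] + carry
        result |= (s & 1) << pos
        carry = (s >> 1) if pos >= k else 0
    return result
```

With `k = 0` the multiplier is exact; increasing `k` drops more carries and can only under-approximate the exact product, trading accuracy for the switching activity that carry propagation would otherwise cost.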
DOI: doi:10.1109/TCSI.2024.3369230
URL: Please note: this is a bibliographic record. Full-text access for university members is available here only if the library subscribes to the corresponding journal/collected volume, or if the title is open access.

free of charge: full text: https://doi.org/10.1109/TCSI.2024.3369230
 free of charge: full text: https://www.webofscience.com/api/gateway?GWVersion=2&SrcAuth=DynamicDOIArticle&SrcApp=WOS&KeyAID=10.1109%2FTCSI.2024.336 ...
 DOI: https://doi.org/10.1109/TCSI.2024.3369230
Medium: Online resource
Language: eng
Subject headings: 4-2 COMPRESSORS
 approximate multiplier
 Computer architecture
 convolutional neural network
 Convolutional neural networks
 Delays
 DESIGN
 Energy efficiency
 Hardware
 image classification
 Image classification
 Task analysis
K10plus-PPN:1895988691
Links: → Journal

Permanent link to this record (bookmarkable): https://katalog.ub.uni-heidelberg.de/titel/69236300