Open Access

ARTICLE

An Optimized Deep-Learning-Based Low Power Approximate Multiplier Design

M. Usharani1,*, B. Sakthivel2, S. Gayathri Priya3, T. Nagalakshmi4, J. Shirisha5

1 Department of Electronics and Communication Engineering, Er. Perumal Manimekalai College of Engineering, Konneripalli, Hosur, 635117, India
2 Department of Electronics and Communication Engineering, Pandian Saraswathi Yadav Engineering College, Sivagangai, Tamilnadu, 630561, India
3 Department of Electronics and Communication Engineering, R.M.D Engineering College, Gummidipundi, Tamilnadu, 601206, India
4 Department of Computer Science and Engineering, Koneru Lakshmaiah Education Foundation, Vaddeswaram, Guntur, 522502, India
5 Department of Electronics and Communication Engineering, Malla Reddy Engineering College, Hyderabad, Telangana, 500015, India

* Corresponding Author: M. Usharani. Email: email

Computer Systems Science and Engineering 2023, 44(2), 1647-1657. https://doi.org/10.32604/csse.2023.027744

Abstract

Approximate computing is a popular technique for reducing power consumption, used in applications such as image processing, video processing, multimedia and data mining. Approximate computing is chiefly realised in arithmetic circuits, particularly multipliers. The multiplier is the most essential element in approximate computing, and overall power consumption depends largely on its performance. Researchers have worked on approximate multipliers for power reduction for decades, yet designing a low power approximate multiplier remains difficult: achieving low power together with a minimal error rate and high accuracy is a major challenge for the digital design industry. To overcome these issues, Deep Learning (DL) approaches are applied to digital circuits for higher accuracy. In recent times, DL has delivered high learning and prediction accuracy in several fields, so the Long Short-Term Memory (LSTM) network, a popular time-series DL method, is used in this work for approximate computing. To provide an optimal solution, the LSTM is combined with the meta-heuristic Jellyfish Search optimisation technique to design an input-aware deep-learning-based approximate multiplier (DLAM). In this work, the jellyfish-optimised LSTM model is used to enhance the error-metric performance of the approximate multiplier: the optimal hyperparameters of the LSTM model are identified by Jellyfish Search optimisation, and this fine-tuning yields an LSTM with higher accuracy. The pre-trained LSTM model is then used to generate approximate design libraries for different truncation levels as a function of area, delay, power and error metrics. Experimental results on an 8-bit multiplier with an image processing application show that the proposed approximate multiplier achieves superior area and power reduction with very good error rates.
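To make the truncation idea concrete, the following is a minimal Python sketch (not the authors' exact circuit) of truncation-based approximate multiplication: the partial-product bits below a chosen column t of an 8-bit unsigned multiply are discarded, trading accuracy for the area and power that those columns would cost in hardware. The function name and the test operands are illustrative only.

```python
def approx_mul_truncated(a: int, b: int, t: int) -> int:
    """Approximate a*b for 8-bit unsigned operands by zeroing the
    partial-product bits whose weight is below 2**t."""
    assert 0 <= a < 256 and 0 <= b < 256 and 0 <= t <= 8
    result = 0
    for i in range(8):                  # iterate over the bits of b
        if (b >> i) & 1:
            pp = a << i                 # partial product a * 2**i
            pp &= ~((1 << t) - 1)       # drop columns below bit position t
            result += pp
    return result

if __name__ == "__main__":
    exact = 173 * 219
    for t in range(5):
        approx = approx_mul_truncated(173, 219, t)
        print(f"t={t}: approx={approx}, error={exact - approx}")
```

With t = 0 the result is exact; as t grows, error increases while the corresponding hardware shrinks. In the paper's flow, the pre-trained LSTM predicts area, delay, power and error metrics for each truncation level, so a design library can be built and the cheapest design meeting an error budget selected.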
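The jellyfish-optimised LSTM can likewise be pictured as a standard metaheuristic loop. Below is a compact sketch of Jellyfish Search optimisation tuning two LSTM hyperparameters (hidden units and learning rate); it follows the commonly published JSO update rules (ocean-current, passive and active motions under a time-control function), not necessarily the paper's exact variant, and evaluate_lstm is a hypothetical stand-in for training the LSTM and returning validation error.

```python
import numpy as np

rng = np.random.default_rng(0)
LB = np.array([16.0, 1e-4])        # lower bounds: hidden units, learning rate
UB = np.array([256.0, 1e-1])       # upper bounds

def evaluate_lstm(x):
    # Hypothetical fitness: replace with real LSTM training + validation error.
    hidden, lr = int(x[0]), x[1]
    return (hidden - 96) ** 2 / 1e4 + (np.log10(lr) + 2.5) ** 2

def jellyfish_search(pop_size=10, iters=30):
    X = LB + rng.random((pop_size, 2)) * (UB - LB)    # initial population
    fit = np.array([evaluate_lstm(x) for x in X])
    best_i = fit.argmin()
    best, best_fit = X[best_i].copy(), fit[best_i]
    for t in range(1, iters + 1):
        c = abs((1 - t / iters) * (2 * rng.random() - 1))  # time-control function
        for i in range(pop_size):
            if c >= 0.5:                              # follow the ocean current
                trend = best - 3 * rng.random() * X.mean(axis=0)
                Xn = X[i] + rng.random(2) * trend
            elif rng.random() > 1 - c:                # passive motion in the swarm
                Xn = X[i] + 0.1 * rng.random(2) * (UB - LB)
            else:                                     # active motion toward fitter peer
                j = rng.integers(pop_size)
                step = X[j] - X[i] if fit[j] < fit[i] else X[i] - X[j]
                Xn = X[i] + rng.random(2) * step
            Xn = np.clip(Xn, LB, UB)
            fn = evaluate_lstm(Xn)
            if fn < fit[i]:                           # greedy replacement
                X[i], fit[i] = Xn, fn
                if fn < best_fit:
                    best, best_fit = Xn.copy(), fn
    return best

print("best (hidden units, learning rate):", jellyfish_search())
```

The time-control function shifts the swarm from global exploration (following the current) early on to local exploitation (passive/active motions) later, which is what makes JSO a reasonable fit for tuning a small number of LSTM hyperparameters.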

Cite This Article

APA Style
Usharani, M., Sakthivel, B., Priya, S.G., Nagalakshmi, T., Shirisha, J. (2023). An optimized deep-learning-based low power approximate multiplier design. Computer Systems Science and Engineering, 44(2), 1647-1657. https://doi.org/10.32604/csse.2023.027744
Vancouver Style
Usharani M, Sakthivel B, Priya SG, Nagalakshmi T, Shirisha J. An optimized deep-learning-based low power approximate multiplier design. Comput Syst Sci Eng. 2023;44(2):1647-1657 https://doi.org/10.32604/csse.2023.027744
IEEE Style
M. Usharani, B. Sakthivel, S.G. Priya, T. Nagalakshmi, and J. Shirisha, “An Optimized Deep-Learning-Based Low Power Approximate Multiplier Design,” Comput. Syst. Sci. Eng., vol. 44, no. 2, pp. 1647-1657, 2023. https://doi.org/10.32604/csse.2023.027744



Copyright © 2023 The Author(s). Published by Tech Science Press.
This work is licensed under a Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.