TY - CHAP
T1 - Energy Efficient LSTM Accelerator with e-FPGAs for XAI Based Text Classification
AU - Oommen, Abu Thomas
AU - Guha, Krishnendu
N1 - Publisher Copyright:
© The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd. 2024.
PY - 2024
Y1 - 2024
N2 - The present era has witnessed the increasing use of reconfigurable hardware, or field programmable gate arrays (FPGAs), as hardware accelerators for intelligent applications. Existing explainable artificial intelligence (XAI) based applications offer low latency but high power consumption and are hence not energy efficient. In this article, we consider an XAI based text classifier that utilizes an LSTM model. Initially, we discuss the evolution of AI from symbolic approaches to deep learning, emphasizing the importance of addressing the computational demands and energy efficiency of deep learning models like LSTMs. We propose the use of embedded FPGAs, or e-FPGAs, as hardware accelerators in the system design. For an XAI based text classifier that uses LSTM, we identify the various functional units and rank them by power consumption. We then map them to the available e-FPGAs, which are partitioned into virtual portions that house the various power- and time-consuming functional units. We analyze how the throughput and power consumption of the system vary with increasing e-FPGA resources. Experimental results show that throughput increases while power consumption decreases when the functional units are mapped to the e-FPGA resources, thus enhancing the energy efficiency of the system.
AB - The present era has witnessed the increasing use of reconfigurable hardware, or field programmable gate arrays (FPGAs), as hardware accelerators for intelligent applications. Existing explainable artificial intelligence (XAI) based applications offer low latency but high power consumption and are hence not energy efficient. In this article, we consider an XAI based text classifier that utilizes an LSTM model. Initially, we discuss the evolution of AI from symbolic approaches to deep learning, emphasizing the importance of addressing the computational demands and energy efficiency of deep learning models like LSTMs. We propose the use of embedded FPGAs, or e-FPGAs, as hardware accelerators in the system design. For an XAI based text classifier that uses LSTM, we identify the various functional units and rank them by power consumption. We then map them to the available e-FPGAs, which are partitioned into virtual portions that house the various power- and time-consuming functional units. We analyze how the throughput and power consumption of the system vary with increasing e-FPGA resources. Experimental results show that throughput increases while power consumption decreases when the functional units are mapped to the e-FPGA resources, thus enhancing the energy efficiency of the system.
KW - Energy efficiency
KW - Explainable AI (XAI)
KW - Field-Programmable Gate Arrays (FPGAs)
KW - Hardware acceleration
KW - LSTM
KW - Text classification
UR - https://www.scopus.com/pages/publications/85206978837
U2 - 10.1007/978-981-97-6489-1_15
DO - 10.1007/978-981-97-6489-1_15
M3 - Chapter
AN - SCOPUS:85206978837
SN - 9789819764884
T3 - Lecture Notes in Networks and Systems
SP - 203
EP - 217
BT - Proceedings of International Conference on Data, Electronics and Computing - ICDEC 2023
A2 - Das, Nibaran
A2 - Bhattacharjee, Debotosh
A2 - Khan, Ajoy Kumar
A2 - Mandal, Swagata
A2 - Krejcar, Ondrej
PB - Springer Science and Business Media Deutschland GmbH
T2 - 2nd International Conference on Data, Electronics, and Computing, ICDEC 2023
Y2 - 15 December 2023 through 16 December 2023
ER -