Peer-Reviewed Article

LATIM: Loading-Aware Offline Training Method for Inverter-Based Memristive Neural Networks

2021; Institute of Electrical and Electronics Engineers; Volume: 68; Issue: 10; Language: English

DOI

10.1109/tcsii.2021.3072289

ISSN

1558-3791

Authors

Shaghayegh Vahdat, Mehdi Kamal, Ali Afzali‐Kusha, Massoud Pedram

Topic(s)

Ferroelectric and Negative Capacitance Devices

Abstract

In this brief, we present a high-accuracy training method for inverter-based memristive neural networks (IM-NNs). The method, which relies on accurate modeling of the circuit element characteristics, is called LATIM (Loading-Aware offline Training method for Inverter-based Memristive NNs). In LATIM, an approximation method is proposed to estimate the effective load of the memristive crossbar (the synapses), while two NNs are utilized to predict the voltage transfer characteristic (VTC) of the inverters (the activation functions). The efficacy of the proposed method is compared with recent offline training methods for IM-NNs, namely PHAX and RIM. Simulation results reveal that LATIM predicts the output voltage of IM-NNs with, on average, 14× (6×) and 29× (4×) smaller error on the MNIST and Fashion-MNIST datasets, respectively, compared to the PHAX (RIM) method. In addition, IM-NNs trained by LATIM consume, on average, 62% and 53% less energy than those trained by PHAX and RIM, respectively, owing to proper sizing of the inverters.
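The abstract describes using NNs as surrogates for the inverters' voltage transfer characteristics so that offline training already accounts for real circuit behavior. The sketch below illustrates that general idea only; it is not the paper's implementation, and all shapes, hyperparameters, and the synthetic VTC data are hypothetical placeholders.

```python
import numpy as np

# Conceptual sketch: fit a tiny MLP to (v_in, v_out) samples of an inverter's
# VTC, then use it as the activation of a crossbar layer during offline
# training. Real workflows would fit SPICE-simulated VTC data instead.

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class VTCSurrogate:
    """One-hidden-layer regressor approximating v_out = f(v_in)."""

    def __init__(self, hidden=16, lr=0.05, seed=0):
        rng = np.random.default_rng(seed)
        self.w1 = rng.normal(0.0, 0.5, (1, hidden))
        self.b1 = np.zeros(hidden)
        self.w2 = rng.normal(0.0, 0.5, (hidden, 1))
        self.b2 = np.zeros(1)
        self.lr = lr

    def forward(self, v_in):
        self.h = sigmoid(v_in @ self.w1 + self.b1)
        return self.h @ self.w2 + self.b2

    def fit(self, v_in, v_out, epochs=2000):
        for _ in range(epochs):
            err = self.forward(v_in) - v_out          # (N, 1) residual, MSE loss
            # Plain gradient descent through the two layers.
            grad_w2 = self.h.T @ err / len(v_in)
            grad_b2 = err.mean(axis=0)
            dh = (err @ self.w2.T) * self.h * (1.0 - self.h)
            grad_w1 = v_in.T @ dh / len(v_in)
            grad_b1 = dh.mean(axis=0)
            self.w2 -= self.lr * grad_w2
            self.b2 -= self.lr * grad_b2
            self.w1 -= self.lr * grad_w1
            self.b1 -= self.lr * grad_b1

# Hypothetical VTC samples: an inverting characteristic switching near 0.5 V.
v_in = np.linspace(0.0, 1.0, 101).reshape(-1, 1)
v_out = 1.0 - sigmoid(20.0 * (v_in - 0.5))            # stand-in for SPICE data

vtc = VTCSurrogate()
vtc.fit(v_in, v_out)

# During offline NN training, the surrogate replaces an idealized activation,
# so the learned weights reflect the inverter's actual response.
crossbar_column_voltages = np.array([[0.3], [0.55], [0.8]])
print(vtc.forward(crossbar_column_voltages))
```

In the same spirit, the crossbar's effective load could be folded into the forward pass as a correction on the column voltages before the surrogate activation is applied; the specific approximation used for that load is detailed in the paper itself.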
