Back‐End and Flexible Substrate Compatible Analog Ferroelectric Field‐Effect Transistors for Accurate Online Training in Deep Neural Network Accelerators
Online training of deep neural networks (DNNs) can be significantly accelerated by performing in situ vector-matrix multiplication in a crossbar array of analog memories. However, training accuracy often suffers from nonideal synaptic properties such as nonlinearity, asymmetry, limited bit precision, and a limited dynamic weight-update range within a constrained power budget. Herein, a fully scalable process is reported for digital and analog ferroelectric memory transistors offering both volatile and nonvolatile data retention and 16 reproducible conducting states. Network training experiments with these ferroelectric field-effect transistors show >96% classification accuracy on the Modified National Institute of Standards and Technology (MNIST) handwritten-digit dataset, highlighting their potential for implementation in scaled DNN architectures.
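The in situ vector-matrix multiplication described above can be sketched in simulation: the input vector acts as a set of voltages applied to crossbar rows, and each synapse's conductance is restricted to one of the 16 reproducible states. The following is a minimal illustrative sketch, assuming an evenly spaced quantization of the weight range; all function names and the quantization scheme are hypothetical and do not come from the paper.

```python
import numpy as np

# Illustrative sketch (assumption): a crossbar layer whose analog weights
# are snapped to 16 reproducible conductance states before each
# vector-matrix multiply. This is not the authors' code.

N_STATES = 16  # number of reproducible conducting states per device

def quantize_weights(w, w_min=-1.0, w_max=1.0, n_states=N_STATES):
    """Snap continuous weights to the nearest of n_states evenly spaced levels."""
    levels = np.linspace(w_min, w_max, n_states)
    idx = np.abs(w[..., None] - levels).argmin(axis=-1)
    return levels[idx]

def crossbar_vmm(x, w_analog):
    """One in situ vector-matrix multiply: input 'voltages' x times quantized conductances."""
    return x @ quantize_weights(w_analog)

rng = np.random.default_rng(0)
w = rng.uniform(-1, 1, size=(784, 10))   # one MNIST-sized layer: 784 inputs, 10 classes
x = rng.uniform(0, 1, size=(1, 784))     # one flattened 28x28 input image
y = crossbar_vmm(x, w)
print(y.shape)
```

In a training loop, the nonidealities the abstract lists (nonlinearity, asymmetry, limited precision) would enter through this quantization step and through state-dependent update rules, which is why the number and reproducibility of conductance states matter for accuracy.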