Long Short-Term Memory Neural Networks For Modeling Nonlinear Electronic Components

2021 
This article presents a new macromodeling approach for nonlinear electronic components and circuits based on long short-term memory (LSTM) neural networks. LSTM offers a more efficient training process than conventional recurrent neural network (RNN) training: conventional RNN structures suffer from the vanishing-gradient problem during training, which LSTM addresses in an efficient way. To train the proposed structure, input and output waveforms of the original circuit, called training waveforms, are obtained from simulation tools or measurements. Model creation with the proposed method requires no information about the internals of the components; input–output training waveforms are sufficient to construct the model. The numerical results in this article show that the proposed method is more efficient than RNN techniques for modeling components and packages in terms of both speed and accuracy. The findings suggest that the proposed method significantly reduces training time in comparison with conventional state-of-the-art modeling techniques. Furthermore, the simulation time of the model obtained with the proposed technique is shorter than that of both conventional models (such as SPICE models) used in circuit simulation tools and models obtained with the conventional RNN method. Three practical examples, namely, an audio amplifier, the Texas Instruments (TI) SN74AHCT540 device, and a MOS inverter, demonstrate the validity of the proposed macromodeling approach.
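The mechanism the abstract alludes to, the LSTM cell's additive cell-state update that mitigates the vanishing-gradient problem of plain RNNs, can be sketched as a single forward step driven by a sampled input waveform. This is a minimal NumPy illustration, not the authors' implementation; the weight shapes, gate ordering, and the toy sine-wave excitation are assumptions for demonstration only.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, b):
    """One LSTM time step.

    x      : input sample vector, shape (D,)
    h_prev : previous hidden state, shape (H,)
    c_prev : previous cell state, shape (H,)
    W      : stacked gate weights, shape (4*H, D+H)
    b      : stacked gate biases, shape (4*H,)
    """
    H = h_prev.shape[0]
    z = W @ np.concatenate([x, h_prev]) + b
    i = sigmoid(z[:H])         # input gate
    f = sigmoid(z[H:2*H])      # forget gate
    o = sigmoid(z[2*H:3*H])    # output gate
    g = np.tanh(z[3*H:])       # candidate cell update
    # Additive update: gradients flow through c largely unattenuated,
    # which is why LSTM trains more reliably than a plain RNN.
    c = f * c_prev + i * g
    h = o * np.tanh(c)
    return h, c

# Drive the cell with a toy input waveform (a 5 Hz sine burst),
# standing in for the training waveforms described in the abstract.
rng = np.random.default_rng(0)
D, H = 1, 8                                 # assumed toy dimensions
W = rng.standard_normal((4 * H, D + H)) * 0.1
b = np.zeros(4 * H)
h, c = np.zeros(H), np.zeros(H)
t = np.linspace(0.0, 1.0, 50)
waveform = np.sin(2 * np.pi * 5 * t)
for sample in waveform:
    h, c = lstm_step(np.array([sample]), h, c, W, b)
```

In a full macromodel, an output layer mapping `h` to the circuit's output waveform would be trained against the measured or simulated responses; here only the recurrence itself is shown.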