Physics-based simulations using technology computer-aided design (TCAD) offer high accuracy but suffer from exceedingly slow computation and significant license costs (especially when high parallelism is unavoidable, as in Monte Carlo analysis), rendering large-scale design-space explorations infeasible. We therefore employ machine learning (ML) algorithms, trained on accurate datasets produced by TCAD, to massively accelerate multidomain ferroelectric FET (FeFET) simulations and take TCAD out of our framework's loop. Part of this study explores approaches to predicting $I$–$V$ characteristics using pure ML or ML augmented with simple physics models. The huge speedup ($13\,600\times$) obtained through our ML-based modeling enabled unprecedented analysis of device-to-device and cycle-to-cycle variability for FeFET technology. Furthermore, we demonstrate how analyses that are computationally infeasible with TCAD, which would take years to complete (e.g., read-disturbance pulses with 200 million time steps), became feasible for the first time.
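As an illustration of the pure-ML branch mentioned above, the minimal sketch below fits a small neural-network regressor to (bias, polarization-state) samples and then evaluates drain current with a cheap forward pass instead of a TCAD run. The feature set, log-current target, and scikit-learn model are assumptions made for this sketch only, not the paper's actual architecture or training data; in the study the samples would come from TCAD multidomain FeFET simulations.

```python
# Minimal sketch (not the paper's implementation): train an ML surrogate that
# predicts FeFET drain current from bias conditions and a domain-state feature,
# replacing TCAD in the inner simulation loop. Features and model are assumptions.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)

# Placeholder for a TCAD-generated dataset: columns = [V_GS, V_DS, P_avg],
# target = drain current I_D. A smooth toy function stands in for TCAD data
# purely so the example runs end to end.
n = 5000
X = np.column_stack([
    rng.uniform(-1.0, 3.0, n),   # gate-source voltage (V)
    rng.uniform(0.0, 1.0, n),    # drain-source voltage (V)
    rng.uniform(-1.0, 1.0, n),   # mean ferroelectric polarization (normalized)
])
vth = 0.6 - 0.4 * X[:, 2]                      # toy threshold shift with polarization state
i_d = 1e-6 * np.log1p(np.exp(8.0 * (X[:, 0] - vth))) * np.tanh(4.0 * X[:, 1])
y = np.log10(i_d + 1e-15)                      # regress log-current to cover the dynamic range

# Surrogate model: standardized inputs feeding a small MLP regressor.
surrogate = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0),
)
surrogate.fit(X, y)

# Once trained, each I-V evaluation is a fast forward pass instead of a TCAD solve.
query = np.array([[1.2, 0.5, 0.3]])            # V_GS, V_DS, P_avg
print("predicted I_D [A]:", 10 ** surrogate.predict(query)[0])
```

Regressing the logarithm of the current is one common way to handle the subthreshold-to-on-state dynamic range; the physics-augmented variant discussed in the paper would instead let the ML model correct or parameterize a simple analytical device model rather than predict the current directly.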