Deep-learning cell-delay modeling for static timing analysis
Author's Department
Electronics & Communications Engineering Department
Document Type
Research Article
Publication Title
Ain Shams Engineering Journal
Publication Date
Spring 2-1-2023
DOI
10.1016/j.asej.2022.101828
Abstract
Standard cell delays are characterized using delay and transition-time tables together with voltage waveforms. More accurate models explode cell library size and degrade design flow performance. The proposed deep-learning non-linear delay model (DL-NLDM) outperformed the 7×7 NLDM-LUT in average percentage error, reaching errors of at most 1.4% relative to SPICE, and outperformed the non-standard 100×100 NLDM-LUT in maximum percentage error. The proposed DL autoencoder-based waveform compression outperformed singular value decomposition by 1.79×. Additionally, a novel DL waveform-delay model (DL-WFDM) models cell delays using encoded waveforms instead of delay and transition times. DL-WFDM outperformed DL-NLDM in maximum delay percentage error.
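The abstract describes DL-NLDM as a learned replacement for interpolation over a 7×7 NLDM lookup table. As a rough illustration only, the following is a minimal sketch of how such a model could be set up; the network shape, sizes, and variable names are assumptions for illustration and are not taken from the paper.

```python
# Hypothetical DL-NLDM-style sketch (architecture and sizes are assumptions):
# a small MLP maps input transition time and output load to cell delay and
# output transition time, in place of interpolating a 7x7 NLDM lookup table.
import torch
import torch.nn as nn

class DelayMLP(nn.Module):
    def __init__(self, hidden: int = 32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2, hidden),    # inputs: [input slew, output load]
            nn.ReLU(),
            nn.Linear(hidden, hidden),
            nn.ReLU(),
            nn.Linear(hidden, 2),    # outputs: [cell delay, output slew]
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

# Fit against SPICE-characterized samples (random placeholders stand in for
# real characterization data here).
model = DelayMLP()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

slew_load = torch.rand(256, 2)       # placeholder (slew, load) pairs
spice_targets = torch.rand(256, 2)   # placeholder (delay, slew) targets

for _ in range(100):
    optimizer.zero_grad()
    loss = loss_fn(model(slew_load), spice_targets)
    loss.backward()
    optimizer.step()
```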
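The abstract also mentions autoencoder-based waveform compression and DL-WFDM, which predicts delays from encoded waveforms. Below is a hedged sketch of how a waveform autoencoder of that kind could look; the sample count, latent size, and layer choices are assumptions, not the authors' network.

```python
# Hypothetical waveform-compression autoencoder sketch (sizes are assumptions):
# a sampled output-voltage waveform is encoded into a short latent vector and
# reconstructed; the latent code acts as a compact waveform representation,
# analogous to keeping only a few singular-value components.
import torch
import torch.nn as nn

N_SAMPLES = 64   # assumed number of voltage samples per waveform
LATENT = 4       # assumed latent-code size

class WaveformAE(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(N_SAMPLES, 32), nn.ReLU(), nn.Linear(32, LATENT)
        )
        self.decoder = nn.Sequential(
            nn.Linear(LATENT, 32), nn.ReLU(), nn.Linear(32, N_SAMPLES)
        )

    def forward(self, waveform: torch.Tensor) -> torch.Tensor:
        return self.decoder(self.encoder(waveform))

# In a DL-WFDM-style setup, the encoder output would feed a delay model that
# predicts cell delay directly from encoded waveforms rather than from
# (delay, transition-time) pairs.
```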
First Page
1
Last Page
8
Recommended Citation
Waseem Raslan, Yehea Ismail, Deep-learning cell-delay modeling for static timing analysis, Ain Shams Engineering Journal, Volume 14, Issue 1, 2023, 101828, ISSN 2090-4479, https://doi.org/10.1016/j.asej.2022.101828