
Other publications by persons with the same name

Latency-aware Spatial-wise Dynamic Networks. NeurIPS, (2022)
Algorithm-Hardware Co-Design for Energy-Efficient A/D Conversion in ReRAM-Based Accelerators. DATE, pp. 1-6. IEEE, (2024)
Tailor: removing redundant operations in memristive analog neural network accelerators. DAC, pp. 1009-1014. ACM, (2022)
Reducing Overfitting in Deep Convolutional Neural Networks Using Redundancy Regularizer. ICANN (2), volume 10614 of Lecture Notes in Computer Science, pp. 49-55. Springer, (2017)
Enabling High-Quality Uncertainty Quantification in a PIM Designed for Bayesian Neural Network. HPCA, pp. 1043-1055. IEEE, (2022)
Crane: Mitigating Accelerator Under-utilization Caused by Sparsity Irregularities in CNNs. IEEE Trans. Computers, 69 (7): 931-943 (2020)
PTQ4ViT: Post-training Quantization for Vision Transformers with Twin Uniform Quantization. ECCV (12), volume 13672 of Lecture Notes in Computer Science, pp. 191-207. Springer, (2022)
A Closer Look at Time Steps is Worthy of Triple Speed-Up for Diffusion Model Training. CoRR, (2024)
Rapid Configuration of Asynchronous Recurrent Neural Networks for ASIC Implementations. HPEC, pp. 1-6. IEEE, (2021)
34.3 A 22nm 64kb Lightning-Like Hybrid Computing-in-Memory Macro with a Compressed Adder Tree and Analog-Storage Quantizers for Transformer and CNNs. ISSCC, pp. 570-572. IEEE, (2024)