Other publications by persons with the same name

- TPE: A High-Performance Edge-Device Inference with Multi-level Transformational Mechanism., , , , , , , and . AICAS, pp. 1-5. IEEE, (2023)
- CPE: An Energy-Efficient Edge-Device Training with Multi-dimensional Compression Mechanism., , , , , , and . DAC, pp. 1-6. IEEE, (2023)
- CIMFormer: A 38.9TOPS/W-8b Systolic CIM-Array Based Transformer Processor with Token-Slimmed Attention Reformulating and Principal Possibility Gathering., , , , , , , , , and . A-SSCC, pp. 1-3. IEEE, (2023)
- Ayaka: A Versatile Transformer Accelerator With Low-Rank Estimation and Heterogeneous Dataflow., , , , , , , , , and 3 other author(s). IEEE J. Solid State Circuits, 59 (10): 3342-3356 (October 2024)
- A 28nm 27.5TOPS/W Approximate-Computing-Based Transformer Processor with Asymptotic Sparsity Speculating and Out-of-Order Computing., , , , , , , , , and 1 other author(s). ISSCC, pp. 1-3. IEEE, (2022)
- A 28nm 276.55TFLOPS/W Sparse Deep-Neural-Network Training Processor with Implicit Redundancy Speculation and Batch Normalization Reformulation., , , , , , , , and . VLSI Circuits, pp. 1-2. IEEE, (2021)
- Trainer: An Energy-Efficient Edge-Device Training Processor Supporting Dynamic Weight Pruning., , , , , , , , and . IEEE J. Solid State Circuits, 57 (10): 3164-3178 (2022)
- A 28nm 49.7TOPS/W Sparse Transformer Processor with Random-Projection-Based Speculation, Multi-Stationary Dataflow, and Redundant Partial Product Elimination., , , , , , , , , and 3 other author(s). A-SSCC, pp. 1-3. IEEE, (2023)
- CIMFormer: A Systolic CIM-Array-Based Transformer Accelerator With Token-Pruning-Aware Attention Reformulating and Principal Possibility Gathering., , , , , , , , , and 1 other author(s). IEEE J. Solid State Circuits, 59 (10): 3317-3329 (October 2024)