Author of the publication

Please choose a person to relate this publication to

To distinguish between persons with the same name, the academic degree and the title of an important publication will be displayed. You can also use the button next to the name to display some publications already assigned to the person.


Other publications of authors with the same name

Extremely Low-bit Convolution Optimization for Quantized Neural Network on Modern Computer Architectures., , , , , , , , , and 1 other author(s). ICPP, pages 38:1-38:12. ACM, (2020)

Efficient Bitwidth Search for Practical Mixed Precision Neural Network., , , , , and . CoRR, (2020)

Mimose: An Input-Aware Checkpointing Planner for Efficient Training on GPU., , , , , , , , , and 1 other author(s). CoRR, (2022)

Once Quantization-Aware Training: High Performance Extremely Low-bit Architecture Search., , , , , , , , and . ICCV, pages 5320-5329. IEEE, (2021)

Incorporating Convolution Designs into Visual Transformers., , , , , and . ICCV, pages 559-568. IEEE, (2021)

Outlier Suppression: Pushing the Limit of Low-bit Transformer Language Models., , , , , , , and . NeurIPS, (2022)

Diversifying Sample Generation for Accurate Data-Free Quantization., , , , , , , , and . CVPR, pages 15658-15667. Computer Vision Foundation / IEEE, (2021)

NNLQP: A Multi-Platform Neural Network Latency Query and Prediction System with An Evolving Database., , , , and . ICPP, pages 78:1-78:14. ACM, (2022)

BRECQ: Pushing the Limit of Post-Training Quantization by Block Reconstruction., , , , , , , , and . ICLR, OpenReview.net, (2021)

QDrop: Randomly Dropping Quantization for Extremely Low-bit Post-Training Quantization., , , , and . ICLR, OpenReview.net, (2022)