Author of the publication

BiPFT: Binary Pre-trained Foundation Transformer with Low-Rank Estimation of Binarization Residual Polynomials.

AAAI, page 16094-16102. AAAI Press, (2024)

Other publications of authors with the same name

Ant Colony Algorithm and Simulated Annealing Algorithm Based Process Route Optimization. ES, page 102-107. IEEE, (2014)
The BINGO Project I: Baryon Acoustic Oscillations from Integrated Neutral Gas Observations, and 36 other author(s). (2021). cite arxiv:2107.01633. Comment: 24 pages, 15 figures, 3 tables. Submitted to A&A.
The BINGO Project VI: HI Halo Occupation Distribution and Mock Building, and 12 other author(s). (2021). cite arxiv:2107.01638. Comment: 16 pages, 20 figures, 1 table. Accepted for publication in A&A.
The BINGO Project II: Instrument Description, and 34 other author(s). (2021). cite arxiv:2107.01634. Comment: 12 pages, 16 figures, 4 tables. Submitted to A&A.
The BINGO Project III: Optical design and optimisation of the focal plane, and 19 other author(s). (2021). cite arxiv:2107.01635. Comment: 20 pages, 22 figures, 3 tables. Submitted to A&A.
Language-Independent Representor for Neural Machine Translation. CoRR, (2018)
Synchronous Speech Recognition and Speech-to-Text Translation with Interactive Decoding. CoRR, (2019)
Touch Editing: A Flexible One-Time Interaction Approach for Translation. AACL/IJCNLP, page 1-11. Association for Computational Linguistics, (2020)
Structurally Comparative Hinge Loss for Dependency-Based Neural Text Representation. ACM Trans. Asian Low Resour. Lang. Inf. Process., 19 (4): 58:1-58:19 (2020)
Unified Prompt Learning Makes Pre-Trained Language Models Better Few-Shot Learners. ICASSP, page 1-5. IEEE, (2023)