Efficient Informed Proposals for Discrete Distributions via Newton's Series Approximation.

, , , , and . AISTATS, volume 206 of Proceedings of Machine Learning Research, pp. 7288-7310. PMLR, (2023)


Other publications by persons with the same name

Balance is Essence: Accelerating Sparse Training via Adaptive Gradient Correction., , , , and . CoRR, (2023)

ReWOO: Decoupling Reasoning from Observations for Efficient Augmented Language Models., , , , , and . CoRR, (2023)

Efficient Informed Proposals for Discrete Distributions via Newton's Series Approximation., , , , and . AISTATS, volume 206 of Proceedings of Machine Learning Research, pp. 7288-7310. PMLR, (2023)

Accelerating Dataset Distillation via Model Augmentation., , , , , , , , and . CVPR, pp. 11950-11959. IEEE, (2023)

Towards Reliable Rare Category Analysis on Graphs via Individual Calibration., , , and . KDD, pp. 2629-2638. ACM, (2023)

Dynamic Sparse Training via Balancing the Exploration-Exploitation Trade-off., , , , , , and . DAC, pp. 1-6. IEEE, (2023)

Calibrating the Rigged Lottery: Making All Tickets Reliable., , , and . ICLR, OpenReview.net, (2023)

Neurogenesis Dynamics-inspired Spiking Neural Network Training Acceleration., , , , , , , , , and . DAC, pp. 1-6. IEEE, (2023)

Rethinking Data Distillation: Do Not Overlook Calibration., , , , , , and . ICCV, pp. 4912-4922. IEEE, (2023)