iii) However, if you were to use your same Gaussian decoder to model data that is itself Gaussian, you'd find that the VAE learns to ignore the latent code!
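A brief sketch of why this happens, under the usual assumptions of a standard normal prior and the ELBO objective (this illustration is my own, not taken from the entries below): if a Gaussian decoder can already match the data marginal without reading z, the KL term pushes the approximate posterior onto the prior, so the optimum carries no information in the latent code.

```latex
% ELBO for a single datapoint x, with encoder q_phi(z|x), decoder p_theta(x|z),
% and prior p(z) = N(0, I):
\mathcal{L}(x)
  = \mathbb{E}_{q_\phi(z \mid x)}\!\bigl[\log p_\theta(x \mid z)\bigr]
  - \mathrm{KL}\!\bigl(q_\phi(z \mid x) \,\|\, p(z)\bigr)

% If the data are themselves Gaussian, x ~ N(mu*, Sigma*), a Gaussian decoder can
% reach the maximum-likelihood fit while ignoring z entirely:
p_\theta(x \mid z) = \mathcal{N}(x;\, \mu^*, \Sigma^*) \quad \text{for every } z

% The reconstruction term is then already optimal, and the KL term is driven to zero
% by collapsing the approximate posterior onto the prior:
q_\phi(z \mid x) = p(z) \;\Rightarrow\; \mathrm{KL} = 0

% At this joint optimum the ELBO equals the true marginal log-likelihood, yet z carries
% no information about x -- the "VAE ignores the latent code" behaviour described above.
```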
S. Mežnar, S. Džeroski, and L. Todorovski (2023). arXiv:2302.09893. Comment: 35 pages, 11 tables, 7 multi-part figures; Machine Learning (Springer) and journal track of ECML/PKDD 2023.
O. Rybkin, K. Daniilidis, and S. Levine (2020). arXiv:2006.13202. Comment: International Conference on Machine Learning (ICML), 2021. Project website: https://orybkin.github.io/sigma-vae/.
D. Kingma and M. Welling (2014). 2nd International Conference on Learning Representations, ICLR 2014, Banff, AB, Canada, April 14-16, 2014, Conference Track Proceedings.