@inproceedings{citeulike:401262,
abstract = {The generative aspect model is an extension of the multinomial
model for text that allows word probabilities to vary stochastically across
documents. Previous results with aspect models have been promising, but
hindered by the computational difficulty of carrying out inference and
learning. This paper demonstrates that the simple variational methods of
Blei et al. (2001) can lead to inaccurate inferences and biased learning
for the generative aspect model. We develop an alternative approach that
leads to higher accuracy at comparable cost. An extension of
Expectation-Propagation is used for inference and then embedded in an EM
algorithm for learning. Experimental results are presented for both
synthetic and real data sets.},
added-at = {2006-06-16T10:34:37.000+0200},
author = {Minka, Thomas and Lafferty, John},
biburl = {https://www.bibsonomy.org/bibtex/27680d22f1349a4f96ddbf178d9547e1c/ldietz},
booktitle = {Proceedings of the Eighteenth Conference on Uncertainty in Artificial Intelligence},
citeulike-article-id = {401262},
comment = {"It is found that the variational methods can lead to inaccurate
inferences and biased learning, while Expectation-Propagation gives results
that are more true to the model." --- re-read!},
interhash = {b4be319c9424a60c9d3bec9fe535b3ed},
intrahash = {7680d22f1349a4f96ddbf178d9547e1c},
keywords = {topicinference socialnets},
pages = {352--359},
priority = {4},
timestamp = {2006-06-16T10:34:37.000+0200},
title = {Expectation-Propagation for the Generative Aspect Model},
url = {http://research.microsoft.com/~minka/papers/aspect/minka-aspect.pdf},
year = {2002}
}