A. Golan, G. Judge, and J. Perloff. Journal of the American Statistical Association, 91(434):841--853 (June 1996).

The classical maximum entropy (ME) approach to estimating the unknown parameters of a multinomial discrete choice problem, which is equivalent to the maximum likelihood multinomial logit (ML) estimator, is generalized. The generalized maximum entropy (GME) model includes noise terms in the multinomial information constraints. Each noise term is modeled as the mean of a finite set of a priori known points in the interval [-1, 1] with unknown probabilities, where no parametric assumptions about the error distribution are made. A GME model for the multinomial probabilities and for the distributions associated with the noise terms is derived by maximizing the joint entropy of the multinomial and noise distributions under the assumption of independence. The GME formulation reduces to the ME in the limit as the sample grows large or when no noise is included in the entropy maximization. Further, even though the GME and the logit estimators are conceptually different, the dual GME model is related to a generalized class of logit models. In finite samples, modeling the noise terms along with the multinomial probabilities results in a gain of efficiency of the GME over the ME and ML. In addition to analytical results, some extensions, sampling experiments, and an example based on real-world data are presented.
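As a rough illustration of the primal GME program the abstract describes, the sketch below maximizes the joint entropy of the choice probabilities and the noise weights, subject to the moment constraints and adding-up constraints, with each noise term expressed as the mean of fixed support points in [-1, 1]. This is a minimal sketch under stated assumptions, not the authors' implementation: the dimensions (N, J, K, M), the support vector v, the synthetic data, and the use of scipy's SLSQP solver are all illustrative choices.

```python
# Minimal primal GME sketch for multinomial choice (illustrative, not the
# paper's implementation). Variables: choice probabilities p (N x J) and
# noise weights w (N x J x M) over a priori support points v in [-1, 1].
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

N, J, K, M = 30, 3, 2, 3            # observations, choices, covariates, support points
v = np.array([-1.0, 0.0, 1.0])      # a priori known noise support points in [-1, 1]

# Tiny synthetic data set (hypothetical): a logit-style generating process.
X = np.column_stack([np.ones(N), rng.normal(size=N)])   # design matrix (N x K)
beta = np.array([[0.0, 0.0], [1.0, 0.5], [-0.5, 1.0]])  # true coefficients (J x K)
U = X @ beta.T + rng.gumbel(size=(N, J))
y = np.eye(J)[U.argmax(axis=1)]     # one-hot observed choices (N x J)

nP, nW = N * J, N * J * M           # counts of p- and w-variables

def unpack(z):
    p = z[:nP].reshape(N, J)
    w = z[nP:].reshape(N, J, M)
    return p, w

def neg_joint_entropy(z):
    # Minimize the negative of H(p) + H(w), i.e., maximize the joint entropy.
    p, w = unpack(z)
    return np.sum(p * np.log(p)) + np.sum(w * np.log(w))

def eq_constraints(z):
    p, w = unpack(z)
    e = w @ v                               # noise means: e_ij = sum_m v_m w_ijm
    moment = (X.T @ (y - p - e)).ravel()    # K*J noisy moment conditions
    add_p = p.sum(axis=1) - 1.0             # choice probabilities sum to one
    add_w = w.sum(axis=2).ravel() - 1.0     # noise weights sum to one
    return np.concatenate([moment, add_p, add_w])

# Start from uniform distributions; bound away from zero so log() is defined.
z0 = np.concatenate([np.full(nP, 1.0 / J), np.full(nW, 1.0 / M)])
res = minimize(neg_joint_entropy, z0, method="SLSQP",
               bounds=[(1e-8, 1.0)] * (nP + nW),
               constraints={"type": "eq", "fun": eq_constraints},
               options={"maxiter": 500})
p_hat, w_hat = unpack(res.x)
print("converged:", res.success, "| mean |y - p_hat|:", np.abs(y - p_hat).mean())
```

Dropping the noise weights w and the e term from the constraints recovers the classical ME (pure moment) problem, which matches the abstract's observation that GME reduces to ME when no noise is included.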