Ranking with Large Margin Principle: Two Approaches
A. Shashua and A. Levin. In Advances in Neural Information Processing Systems, 2003.
Abstract
We discuss the problem of ranking instances with the use of a “large margin” principle. We introduce two main approaches: the first is the “fixed margin” policy in which the margin of the closest neighboring classes is maximized, which turns out to be a direct generalization of SVM to ranking learning. The second approach allows for different margins, where the sum of margins is maximized. This approach is shown to reduce to \nu-SVM when the number of classes is two. Both approaches are optimal in size of n, where n is the total number of training examples. Experiments performed on visual classification and “collaborative filtering” show that both approaches outperform existing ordinal regression algorithms applied for ranking and multiclass SVM applied to general multiclass classification.
Description
Two SVM formulations of ranking, or rather ordinal regression, are introduced: one that maximizes the individual margins separating the different classes, and one that maximizes the sum of the margins. There might be a way to introduce these types of losses in the MMMF formulation.
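The "fixed margin" idea can be illustrated with a small sketch: learn a single projection direction and a set of ordered thresholds so that each class falls in its own interval, with a unit margin around every threshold. This is a minimal subgradient-descent toy on synthetic data, not the paper's actual solver; the loss structure (hinge penalties on both sides of each threshold, as in the fixed-margin formulation) is the only part taken from the source, and all names and parameters here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 2-D data: three ordered classes separated along the first axis.
X = np.vstack([rng.normal([0, 0], 0.3, (30, 2)),
               rng.normal([2, 0], 0.3, (30, 2)),
               rng.normal([4, 0], 0.3, (30, 2))])
y = np.repeat([0, 1, 2], 30)

def fit_ordinal_svm(X, y, n_classes=3, lam=1e-3, lr=0.01, epochs=500):
    """Subgradient descent on a fixed-margin ordinal hinge loss (toy sketch).

    Learns a direction w and ordered thresholds b[0] < b[1]; an example of
    class r should satisfy b[r-1] + 1 <= w.x <= b[r] - 1 where applicable.
    """
    n = len(X)
    w = np.zeros(X.shape[1])
    b = np.array([1.0, 3.0])  # initial thresholds, kept ordered below
    for _ in range(epochs):
        gw = lam * w
        gb = np.zeros_like(b)
        for xi, r in zip(X, y):
            s = w @ xi
            if r < n_classes - 1 and s > b[r] - 1:   # upper-margin violation
                gw += xi / n
                gb[r] -= 1 / n
            if r > 0 and s < b[r - 1] + 1:           # lower-margin violation
                gw -= xi / n
                gb[r - 1] += 1 / n
        w -= lr * gw
        b -= lr * gb
        b.sort()  # enforce ordered thresholds
    return w, b

def predict(w, b, X):
    # Rank of a point = number of thresholds below its projected score.
    return np.searchsorted(b, X @ w)

w, b = fit_ordinal_svm(X, y)
acc = np.mean(predict(w, b, X) == y)
print(f"training accuracy: {acc:.2f}")
```

The second ("sum of margins") formulation would instead let each threshold carry its own margin and maximize their total, which changes the objective but not the threshold-based prediction rule sketched here.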
@inproceedings{shashua2003ranking,
abstract = {We discuss the problem of ranking instances with the use of a “large margin” principle. We introduce two main approaches: the first is the “fixed margin” policy in which the margin of the closest neighboring classes is maximized, which turns out to be a direct generalization of SVM to ranking learning. The second approach allows for different margins, where the sum of margins is maximized. This approach is shown to reduce to \nu-SVM when the number of classes is two. Both approaches are optimal in size of n, where n is the total number of training examples. Experiments performed on visual classification and “collaborative filtering” show that both approaches outperform existing ordinal regression algorithms applied for ranking and multiclass SVM applied to general multiclass classification.},
added-at = {2012-12-25T14:52:11.000+0100},
author = {Shashua, Amnon and Levin, Anat},
biburl = {https://www.bibsonomy.org/bibtex/284662e39711650b021786f8908bef03c/nosebrain},
booktitle = {Advances in Neural Information Processing Systems},
description = {Two SVM formulations of ranking, or rather ordinal regression, are introduced: one that maximizes the individual margins separating the different classes, and one that maximizes the sum of the margins. There might be a way to introduce these types of losses in the MMMF formulation.},
editor = {NIPS},
interhash = {20c44188b32fe08a6ea3c5f63731099b},
intrahash = {84662e39711650b021786f8908bef03c},
keywords = {learning oc ranking sum},
timestamp = {2013-03-16T16:01:16.000+0100},
title = {Ranking with Large Margin Principle: Two Approaches},
url = {http://books.nips.cc/papers/files/nips15/AA58.pdf},
year = 2003
}