Abstract
Logistic regression (LR) is widely applied as a powerful classification method in various fields, and a variety of optimization methods have been developed for it. To cope with large-scale problems, an optimization method for LR that is efficient in both computational cost and memory usage is required. In this paper, we propose an efficient optimization method based on non-linear conjugate gradient (CG) descent. In each CG iteration, the proposed method employs an optimized step size computed without exhaustive line search, which significantly reduces the number of iterations and makes the whole optimization process fast. In addition, on the basis of this CG-based optimization scheme, a novel optimization method for kernel logistic regression (KLR) is proposed. Unlike ordinary KLR methods, which optimize the linear coefficients, the proposed method directly optimizes the kernel-based classifier, naturally formulated as a linear combination of sample kernel functions, in the reproducing kernel Hilbert space (RKHS). We also propose multiple-kernel logistic regression (MKLR) built on the KLR optimization. MKLR effectively combines multiple types of kernels, optimizing the kernel weights within the logistic regression framework. All the proposed methods rely on CG-based optimization and matrix-matrix computation, which is easily parallelized, e.g., by multi-thread programming. In experiments on multi-class classification with various datasets, the proposed methods exhibit favorable performance in terms of classification accuracy and computation time.
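The core idea, nonlinear CG with a step size obtained in closed form rather than by exhaustive line search, can be sketched for plain binary logistic regression. This is a minimal illustrative sketch, not the paper's exact method: the step-size rule (a one-step Newton estimate along the search direction) and the Polak-Ribiere direction update are standard choices assumed here, and the function name `logreg_cg` is hypothetical.

```python
import numpy as np

def sigmoid(z):
    # Clip to avoid overflow in exp for large |z|.
    return 1.0 / (1.0 + np.exp(-np.clip(z, -30.0, 30.0)))

def logreg_cg(X, y, n_iter=100, tol=1e-6):
    """Binary logistic regression fitted by nonlinear CG.

    The step size along each search direction is a one-step Newton
    estimate (assumed here in place of the paper's optimized step size),
    so no exhaustive line search is performed.
    """
    n, d = X.shape
    w = np.zeros(d)
    g = X.T @ (sigmoid(X @ w) - y)      # gradient of the negative log-likelihood
    p_dir = -g                          # initial search direction: steepest descent
    for _ in range(n_iter):
        p = sigmoid(X @ w)
        s = p * (1.0 - p)               # diagonal Hessian weights
        Xd = X @ p_dir
        curv = Xd @ (s * Xd)            # curvature d^T H d along the direction
        if curv <= 0.0:
            break
        alpha = -(g @ p_dir) / curv     # closed-form Newton step along p_dir
        w = w + alpha * p_dir
        g_new = X.T @ (sigmoid(X @ w) - y)
        if np.linalg.norm(g_new) < tol:
            break
        # Polak-Ribiere+ conjugacy coefficient (restart when negative).
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))
        p_dir = -g_new + beta * p_dir
        g = g_new
    return w
```

Because the step size comes from a single curvature computation per iteration, each iteration costs only a few matrix-vector products, which is the kind of matrix computation the abstract notes is easily parallelized.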