@inproceedings{Lee2018DeepAM,
abstract = {We propose Deep Asymmetric Multitask Feature Learning (Deep-AMTFL)
which can learn deep representations shared across multiple tasks while
effectively preventing negative transfer that may happen in the feature
sharing process. Specifically, we introduce an asymmetric autoencoder term
that allows reliable predictors for the easy tasks to have high contribution
to the feature learning while suppressing the influences of unreliable
predictors for more difficult tasks. This allows the learning of less noisy
representations, and enables unreliable predictors to exploit knowledge from
the reliable predictors via the shared latent features. Such asymmetric
knowledge transfer through shared features is also more scalable and
efficient than inter-task asymmetric transfer. We validate our Deep-AMTFL
model on multiple benchmark datasets for multitask learning and image
classification, on which it significantly outperforms existing symmetric and
asymmetric multitask learning models, by effectively preventing negative
transfer in deep feature learning.},
added-at = {2018-08-27T22:17:25.000+0200},
author = {Lee, Haebeom and Yang, Eunho and Hwang, Sung Ju},
biburl = {https://www.bibsonomy.org/bibtex/25810725dff1fb5959b0466f42ff89ec3/dallmann},
booktitle = {ICML},
description = {Deep Asymmetric Multi-task Feature Learning - Semantic Scholar},
interhash = {75c6477f695c4b32e28328bc85b31744},
intrahash = {5810725dff1fb5959b0466f42ff89ec3},
keywords = {asymmetric deep_learning multitask vision},
timestamp = {2018-08-27T22:17:25.000+0200},
title = {Deep Asymmetric Multi-task Feature Learning},
year = 2018
}