
Deep Forest: Towards An Alternative to Deep Neural Networks.

Zhi-Hua Zhou and Ji Feng.
(2017). arXiv:1702.08835. 7 pages, 5 figures.

Abstract

In this paper, we propose gcForest, a decision tree ensemble approach whose performance is highly competitive with deep neural networks. In contrast to deep neural networks, which require great effort in hyper-parameter tuning, gcForest is much easier to train. In fact, even when gcForest is applied to data from different domains, excellent performance can be achieved with almost the same hyper-parameter settings. The training process of gcForest is efficient and scalable: in our experiments, its training time on a PC is comparable to that of deep neural networks running with GPU facilities, and the efficiency advantage may become more apparent because gcForest naturally lends itself to parallel implementation. Furthermore, in contrast to deep neural networks, which require large-scale training data, gcForest can work well even when only small-scale training data are available. Moreover, as a tree-based approach, gcForest should be easier to analyze theoretically than deep neural networks.
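Since the abstract describes gcForest only at a high level, the following is a minimal, illustrative sketch of the cascade-forest idea it builds on, written against scikit-learn. It is not the authors' implementation (which also includes multi-grained scanning): the level composition of one random forest plus one extra-trees forest, the tree counts, the 3-fold out-of-fold class vectors, and the early-stopping rule are all assumptions made for illustration.

    # Illustrative sketch of a gcForest-style cascade forest (assumptions noted above).
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier, ExtraTreesClassifier
    from sklearn.model_selection import cross_val_predict
    from sklearn.metrics import accuracy_score

    def train_cascade(X_train, y_train, X_val, y_val, max_levels=5, seed=0):
        """Grow cascade levels until validation accuracy stops improving."""
        levels, best_acc = [], 0.0
        aug_train, aug_val = X_train, X_val
        for level in range(max_levels):
            # Each level holds one random forest and one extremely randomized
            # forest; the paper's gcForest also uses completely-random tree forests.
            forests = [
                RandomForestClassifier(n_estimators=100, random_state=seed + level),
                ExtraTreesClassifier(n_estimators=100, random_state=seed + level),
            ]
            train_vecs, val_vecs = [], []
            for f in forests:
                # Out-of-fold class-probability vectors keep the augmented
                # features from simply memorizing the training labels.
                train_vecs.append(cross_val_predict(
                    f, aug_train, y_train, cv=3, method="predict_proba"))
                f.fit(aug_train, y_train)
                val_vecs.append(f.predict_proba(aug_val))
            # The next level sees the original features plus every forest's class vector.
            aug_train = np.hstack([X_train] + train_vecs)
            aug_val = np.hstack([X_val] + val_vecs)
            acc = accuracy_score(y_val, np.mean(val_vecs, axis=0).argmax(axis=1))
            if acc <= best_acc:
                break  # stop growing once validation accuracy plateaus
            levels.append(forests)
            best_acc = acc
        return levels, best_acc

    # Example usage on synthetic data (illustrative only).
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split

    X, y = make_classification(n_samples=600, n_features=20, random_state=0)
    X_tr, X_va, y_tr, y_va = train_test_split(X, y, test_size=0.3, random_state=0)
    cascade, acc = train_cascade(X_tr, y_tr, X_va, y_va)
    print(f"cascade levels: {len(cascade)}, validation accuracy: {acc:.3f}")

The out-of-fold class vectors and the stop-when-accuracy-plateaus rule reflect the general cascade-forest recipe; the specific hyper-parameter values here are placeholders, consistent with the abstract's claim that gcForest needs little tuning across datasets.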

Tags

Users of this resource

  • @marcsaric
  • @dblp

Comments and reviews

  • @giannis81
    8 years ago (last updated 8 years ago)
    Interesting
  • @giannis81
    @giannis81 8 years ago
    Interesting, I'd say...
  • @marcsaric
    8 years ago (last updated 8 years ago)