Abstract
Recent studies and industry practice build data-center-scale computer systems to meet the high storage and processing demands of data-intensive and compute-intensive applications, such as web search. The Map-Reduce programming model is one of the most popular programming paradigms on these systems. In this paper, we report our experiences and insights gained from implementing three data-intensive and compute-intensive tasks whose characteristics differ from those in previous studies: a large-scale machine learning computation, a physical simulation task, and a digital media processing task. We identify desirable features of the Map-Reduce model and areas where it can be improved. Our goal is to better understand such large-scale computation and data processing in order to design better support for them.