C. Bauer. Proceedings of the 29th ACM Conference on User Modeling, Adaptation and Personalization, pages 323-325. ACM, (June 2021)
DOI: 10.1145/3450613.3457122
Abstract
When evaluating personalized or adaptive systems, we frequently rely on one single evaluation objective and one single method. This leaves us with “blind spots”. A comprehensive evaluation may require a thoughtful integration of multiple methods. This tutorial (i) demonstrates the wide variety of dimensions to be evaluated, (ii) outlines the methodological approaches to evaluate these dimensions, (iii) pinpoints the blind spots when using only one approach, (iv) demonstrates the benefits of multi-method evaluation, and (v) outlines the basic options for how multiple methods can be integrated into one evaluation design. Participants familiarize themselves with the wide spectrum of opportunities for how adaptive or personalized systems may be evaluated, and have the opportunity to come up with evaluation designs that comply with the four basic options of multi-method evaluation. The ultimate learning objective is to stimulate critical reflection on one’s own evaluation practices and those of the community at large.
%0 Conference Paper
%1 Bauer_2021
%A Bauer, Christine
%B Proceedings of the 29th ACM Conference on User Modeling, Adaptation and Personalization
%D 2021
%I ACM
%K adaptive evaluation multimethods myown recsys
%P 323-325
%R 10.1145/3450613.3457122
%T Multi-Method Evaluation of Adaptive Systems
%U https://doi.org/10.1145/3450613.3457122
%X When evaluating personalized or adaptive systems, we frequently rely on one single evaluation objective and one single method. This leaves us with “blind spots”. A comprehensive evaluation may require a thoughtful integration of multiple methods. This tutorial (i) demonstrates the wide variety of dimensions to be evaluated, (ii) outlines the methodological approaches to evaluate these dimensions, (iii) pinpoints the blind spots when using only one approach, (iv) demonstrates the benefits of multi-method evaluation, and (v) outlines the basic options for how multiple methods can be integrated into one evaluation design. Participants familiarize themselves with the wide spectrum of opportunities for how adaptive or personalized systems may be evaluated, and have the opportunity to come up with evaluation designs that comply with the four basic options of multi-method evaluation. The ultimate learning objective is to stimulate critical reflection on one’s own evaluation practices and those of the community at large.
%@ 9781450383660
@inproceedings{Bauer_2021,
abstract = {When evaluating personalized or adaptive systems, we frequently rely on one single evaluation objective and one single method. This leaves us with “blind spots”. A comprehensive evaluation may require a thoughtful integration of multiple methods. This tutorial (i) demonstrates the wide variety of dimensions to be evaluated, (ii) outlines the methodological approaches to evaluate these dimensions, (iii) pinpoints the blind spots when using only one approach, (iv) demonstrates the benefits of multi-method evaluation, and (v) outlines the basic options for how multiple methods can be integrated into one evaluation design. Participants familiarize themselves with the wide spectrum of opportunities for how adaptive or personalized systems may be evaluated, and have the opportunity to come up with evaluation designs that comply with the four basic options of multi-method evaluation. The ultimate learning objective is to stimulate critical reflection on one’s own evaluation practices and those of the community at large.},
added-at = {2021-06-22T00:40:31.000+0200},
author = {Bauer, Christine},
biburl = {https://www.bibsonomy.org/bibtex/2e68ef4ecf19311bf524641faf58773f1/bauerc},
booktitle = {Proceedings of the 29th {ACM} Conference on User Modeling, Adaptation and Personalization},
doi = {10.1145/3450613.3457122},
eventdate = {25 June},
eventtitle = {29th {ACM} Conference on User Modeling, Adaptation and Personalization},
interhash = {1f36742f8d7b2038947c984a8d13104f},
intrahash = {e68ef4ecf19311bf524641faf58773f1},
isbn = {9781450383660},
keywords = {adaptive evaluation multimethods myown recsys},
language = {English},
month = jun,
pages = {323--325},
publisher = {{ACM}},
series = {UMAP 2021},
timestamp = {2021-06-22T00:40:31.000+0200},
title = {Multi-Method Evaluation of Adaptive Systems},
url = {https://doi.org/10.1145/3450613.3457122},
venue = {Utrecht, The Netherlands},
year = 2021
}