%0 Conference Paper
%1 VoMaGrKo-ECSA-Dependencies
%A Voneva, Sonya
%A Mazkatli, Manar
%A Grohmann, Johannes
%A Koziolek, Anne
%B Companion of the 14th European Conference on Software Architecture (ECSA 2020)
%C Cham
%D 2020
%E Muccini, Henry
%E Avgeriou, Paris
%E Buhnova, Barbora
%E Cámara, Javier
%E Caporuscio, Mauro
%E Franzago, Mirco
%E Koziolek, Anne
%E Scandurra, Patrizia
%E Trubiani, Catia
%E Weyns, Danny
%E Zdun, Uwe
%I Springer
%K descartes myown parametric_dependencies performance prediction statistical_estimation_and_machine_learning t_workshop
%P 228--240
%R 10.1007/978-3-030-59155-7_17
%T Optimizing Parametric Dependencies for Incremental Performance Model Extraction
%U https://doi.org/10.1007/978-3-030-59155-7_17
%V 1269
%X Model-based performance prediction in agile software development promises to evaluate design alternatives and to reduce the cost of performance tests. To minimize the differences between real software and its performance model, parametric dependencies are introduced. They express how the performance model parameters (such as loop iteration counts, branch transition probabilities, resource demands, and external service call arguments) depend on influencing factors such as the input data.
Existing approaches to model-based performance prediction in agile software development have two major shortcomings: they are either costly, because they do not update the performance models automatically after each commit, or they do not consider parametric dependencies more complex than linear ones.
This work extends an approach for the continuous integration of performance models during agile development. Our extension optimizes the learning of parametric dependencies with a genetic programming algorithm in order to detect non-linear dependencies.
The case study results show that using genetic programming enables the detection of more complex dependencies and improves the accuracy of the updated performance model.
@inproceedings{VoMaGrKo-ECSA-Dependencies,
abstract = {Model-based performance prediction in agile software development promises to evaluate design alternatives and to reduce the cost of performance tests. To minimize the differences between real software and its performance model, parametric dependencies are introduced. They express how the performance model parameters (such as loop iteration counts, branch transition probabilities, resource demands, and external service call arguments) depend on influencing factors such as the input data.
Existing approaches to model-based performance prediction in agile software development have two major shortcomings: they are either costly, because they do not update the performance models automatically after each commit, or they do not consider parametric dependencies more complex than linear ones.
This work extends an approach for the continuous integration of performance models during agile development. Our extension optimizes the learning of parametric dependencies with a genetic programming algorithm in order to detect non-linear dependencies.
The case study results show that using genetic programming enables the detection of more complex dependencies and improves the accuracy of the updated performance model.},
added-at = {2020-09-23T04:00:22.000+0200},
address = {Cham},
author = {Voneva, Sonya and Mazkatli, Manar and Grohmann, Johannes and Koziolek, Anne},
biburl = {https://www.bibsonomy.org/bibtex/24e13e3a98dc62154bae8ef35f561c39c/se-group},
booktitle = {Companion of the 14th European Conference on Software Architecture ({ECSA} 2020)},
doi = {10.1007/978-3-030-59155-7_17},
editor = {Muccini, Henry and Avgeriou, Paris and Buhnova, Barbora and C{\'{a}}mara, Javier and Caporuscio, Mauro and Franzago, Mirco and Koziolek, Anne and Scandurra, Patrizia and Trubiani, Catia and Weyns, Danny and Zdun, Uwe},
interhash = {4bfc513020f6d8a0da3b7560debf4d96},
intrahash = {4e13e3a98dc62154bae8ef35f561c39c},
keywords = {descartes myown parametric_dependencies performance prediction statistical_estimation_and_machine_learning t_workshop},
month = {September},
pages = {228--240},
publisher = {Springer},
series = {Communications in Computer and Information Science},
timestamp = {2021-01-12T13:50:58.000+0100},
title = {Optimizing Parametric Dependencies for Incremental Performance Model Extraction},
url = {https://doi.org/10.1007/978-3-030-59155-7_17},
volume = 1269,
year = 2020
}