
Optimizing Parametric Dependencies for Incremental Performance Model Extraction

, , , and . In Companion of the 14th European Conference on Software Architecture (ECSA 2020), volume 1269 of Communications in Computer and Information Science, pages 228--240. Cham: Springer, September 2020.
DOI: 10.1007/978-3-030-59155-7_17

Abstract

Model-based performance prediction in agile software development promises to evaluate design alternatives and to reduce the cost of performance tests. To minimize the differences between real software and its performance model, parametric dependencies are introduced. They express how performance model parameters (such as loop iteration counts, branch transition probabilities, resource demands, and external service call arguments) depend on influencing factors such as the input data. Existing approaches to model-based performance prediction in agile software development have two major shortcomings: they are either costly because they do not update the performance models automatically after each commit, or they consider only linear parametric dependencies. This work extends an approach for the continuous integration of performance models during agile development. Our extension optimizes the learning of parametric dependencies with a genetic programming algorithm so that non-linear dependencies can be detected. The case study results show that using genetic programming enables detecting more complex dependencies and improves the accuracy of the updated performance model.
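To illustrate the idea of learning a parametric dependency with genetic programming, the sketch below evolves an expression tree over an input parameter n to fit observed resource demands, a symbolic-regression style search that can recover non-linear forms such as a quadratic one. This is not the authors' implementation: it is a deliberately simplified, mutation-only Python sketch with assumed names (observed_demand, run_gp) and an assumed quadratic ground-truth dependency, shown only to make the mechanism concrete.

import math
import operator
import random

# Assumed example: monitored resource demand as a function of an input
# parameter n (e.g., payload size). Ground truth here is quadratic, a
# dependency that a purely linear learner would miss.
def observed_demand(n):
    return 0.5 * n * n + 3.0 * n + 7.0

SAMPLES = [(float(n), observed_demand(n)) for n in range(1, 21)]

OPS = [(operator.add, "+"), (operator.sub, "-"), (operator.mul, "*")]
TERMINALS = ["n"] + [float(c) for c in range(0, 10)]

def random_tree(depth=3):
    # Grow a random expression tree of bounded depth.
    if depth == 0 or random.random() < 0.3:
        return random.choice(TERMINALS)
    op = random.choice(OPS)
    return (op, random_tree(depth - 1), random_tree(depth - 1))

def evaluate(tree, n):
    if tree == "n":
        return n
    if isinstance(tree, float):
        return tree
    op, left, right = tree
    return op[0](evaluate(left, n), evaluate(right, n))

def fitness(tree):
    # Mean squared error against the monitored samples (lower is better).
    err = 0.0
    for n, y in SAMPLES:
        v = evaluate(tree, n)
        if not math.isfinite(v):
            return float("inf")
        err += (v - y) ** 2
    return err / len(SAMPLES)

def mutate(tree, depth=2):
    # Replace a randomly chosen subtree with a freshly grown one.
    if not isinstance(tree, tuple) or random.random() < 0.2:
        return random_tree(depth)
    op, left, right = tree
    if random.random() < 0.5:
        return (op, mutate(left, depth), right)
    return (op, left, mutate(right, depth))

def render(tree):
    if tree == "n":
        return "n"
    if isinstance(tree, float):
        return str(tree)
    op, left, right = tree
    return "(" + render(left) + " " + op[1] + " " + render(right) + ")"

def run_gp(pop_size=200, generations=40):
    # Elitist, mutation-only evolution of candidate dependency expressions.
    population = [random_tree() for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness)
        survivors = population[: pop_size // 4]
        population = survivors + [mutate(random.choice(survivors))
                                  for _ in range(pop_size - len(survivors))]
    return min(population, key=fitness)

if __name__ == "__main__":
    best = run_gp()
    print("learned dependency:", render(best))
    print("mean squared error:", fitness(best))

A full genetic programming setup as used in the paper would typically add crossover between expression trees and richer operator and terminal sets; the mutation-only loop above is kept minimal so the search remains easy to follow.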


Tags

community