Perf-AL: Performance prediction for configurable software through adversarial learning

Publisher:
ACM
Publication Type:
Conference Proceeding
Citation:
International Symposium on Empirical Software Engineering and Measurement, 2020, pp. 1-11
Issue Date:
2020-10-05
© 2020 IEEE Computer Society. All rights reserved.

Context: Many software systems are highly configurable, and different configuration options can lead to widely varying system performance. Measuring performance exhaustively is impractical because the number of possible option combinations grows exponentially. Goal: Predict software performance from a small sample of configurations. Method: This paper proposes PERF-AL, which addresses the problem via adversarial learning. A generator network, combined with several regularization techniques (L1 regularization, L2 regularization, and dropout), outputs predicted values that are as close to the ground-truth labels as possible. A discriminator network then tries to distinguish the generator's predictions from the ground-truth value distribution. The generator and the discriminator compete with each other, iteratively refining the prediction model until its predicted values converge towards the ground-truth distribution. Results: (i) The proposed method achieves the same level of prediction accuracy as existing approaches while using fewer training samples; (ii) evaluation on seven real-world datasets shows that the approach outperforms state-of-the-art methods, further advancing performance prediction for configurable software. Conclusion: Experimental results on seven public real-world datasets demonstrate that PERF-AL outperforms state-of-the-art software performance prediction methods.
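To make the Method concrete, the sketch below shows one way such a generator–discriminator pair could be wired up for performance regression. It is a minimal illustration, assuming a PyTorch setup; the class names, layer sizes, loss weights, and regularization coefficients are assumptions for exposition, not the authors' published implementation.

```python
# Illustrative Perf-AL-style adversarial regression sketch (assumed PyTorch setup).
# All names, sizes, and coefficients below are hypothetical, not the paper's code.
import torch
import torch.nn as nn


class Generator(nn.Module):
    """Maps a configuration vector to a predicted performance value."""
    def __init__(self, n_options, hidden=64, dropout=0.2):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_options, hidden), nn.ReLU(),
            nn.Dropout(dropout),              # dropout regularization
            nn.Linear(hidden, 1),
        )

    def forward(self, x):
        return self.net(x)


class Discriminator(nn.Module):
    """Scores whether a performance value looks like a real measurement."""
    def __init__(self, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(1, hidden), nn.ReLU(),
            nn.Linear(hidden, 1), nn.Sigmoid(),
        )

    def forward(self, y):
        return self.net(y)


def train_step(gen, disc, g_opt, d_opt, x, y_true, l1=1e-4, l2=1e-4):
    """One adversarial update: x is a batch of configurations, y_true the measured performance."""
    bce, mse = nn.BCELoss(), nn.MSELoss()

    # Discriminator: separate real measurements from generated predictions.
    d_opt.zero_grad()
    y_fake = gen(x).detach()
    d_loss = bce(disc(y_true), torch.ones_like(y_true)) + \
             bce(disc(y_fake), torch.zeros_like(y_fake))
    d_loss.backward()
    d_opt.step()

    # Generator: fit the labels, fool the discriminator, and stay L1/L2-regularized.
    g_opt.zero_grad()
    y_pred = gen(x)
    reg = sum(l1 * p.abs().sum() + l2 * p.pow(2).sum() for p in gen.parameters())
    g_loss = mse(y_pred, y_true) + bce(disc(y_pred), torch.ones_like(y_pred)) + reg
    g_loss.backward()
    g_opt.step()
    return d_loss.item(), g_loss.item()
```

Repeating this step lets the generator's predictions drift towards the ground-truth value distribution while the regularizers keep the model small enough to learn from few configuration samples, which mirrors the competition described in the abstract.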