Selected as one of the Best Papers. Appears in Proceedings of the 20th IEEE International Conference on Software Maintenance (ICSM 2004). An extended version is invited for submission to a special issue of IEEE Transactions on Software Engineering (TSE).

Checking Inside the Black Box: Regression Testing Based on Value Spectra Differences

Tao Xie    David Notkin
Department of Computer Science & Engineering
University of Washington
Seattle, WA 98195, USA
E-mail: {taoxie,notkin}@cs.washington.edu

Abstract

Comparing behaviors of program versions has become an important task in software maintenance and regression testing. Traditional regression testing strongly focuses on black-box comparison of program outputs. Program spectra have recently been proposed to characterize a program's behavior inside the black box. Comparing program spectra of program versions offers insights into the internal behavior differences between versions. In this paper, we present a new class of program spectra, value spectra, which enriches the existing program spectra family. We compare the value spectra of an old version and a new version to detect internal behavior deviations in the new version. We use a deviation-propagation call tree to present the deviation details. Based on the deviation-propagation call tree, we propose two heuristics to locate deviation roots, which are program locations that trigger the behavior deviations. We have conducted an experiment on seven C programs to evaluate our approach. The results show that our approach can effectively expose program behavior differences between versions even when their program outputs are the same, and that it reports deviation roots with high accuracy for most programs.

1. Introduction

Regression testing retests a program after it is modified. In addition to validating newly added functionality, regression testing compares the behavior of a new version to the behavior of an old version to assure that no regression faults are introduced. When the outputs produced by the two versions differ, regression faults are exposed. However, even if a variable-value difference arises immediately after a new faulty statement is executed, the fault might not propagate to the observable outputs because of information loss or hiding effects. Checking inside the black box has been used to expose faults, complementing the traditional black-box output-checking approach. Runtime assertion checking [2, 15] or inferred invariant checking [4, 6] is used to validate that certain properties inside the black box are satisfied during program execution. Recently, program spectra have been proposed to characterize a program's behavior inside the black box [14]. Structural program spectra, such as branch, path, data dependence, and execution trace spectra, have been proposed in the literature [3, 7, 14]....
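To make the non-propagation problem concrete, consider the following minimal C sketch. This is our own illustration, not code from the paper; the function names, the bonus constant, and the chosen input are hypothetical. The new version computes a different internal value than the old version, but the difference is discarded before it reaches the output, so black-box output comparison cannot expose the regression fault, whereas comparing internal variable values at function exit would.

#include <stdio.h>

/* Old version: maps a score to a pass/fail flag. */
int classify_old(int score) {
    int adjusted = score + 5;   /* internal value: 75 for score = 70 */
    return adjusted >= 60;      /* information loss: only pass/fail survives */
}

/* New version with a regression fault: the bonus regressed from 5 to 4. */
int classify_new(int score) {
    int adjusted = score + 4;   /* internal value deviates: 74 for score = 70 */
    return adjusted >= 60;      /* ...yet the output is still 1, hiding the fault */
}

int main(void) {
    int score = 70;
    /* Black-box check: both outputs are 1, so the fault stays unexposed. */
    printf("old: %d, new: %d\n", classify_old(score), classify_new(score));
    /* A value-spectra-style check would instead compare the internal
       variable values (75 vs. 74) at function exit and flag the deviation. */
    return 0;
}

For the input score = 70 both versions print 1, so an output-based regression test passes; only an inside-the-black-box comparison of the internal values reveals the deviation.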