DSpot: Test Amplification for Automatic Assessment of Computational Diversity

Benoit Baudry 1 Simon Allier 1 Marcelino Rodriguez-Cancio 1 Martin Monperrus 2
1 DiverSe - Diversity-centric Software Engineering
Inria Rennes – Bretagne Atlantique, IRISA-D4 - LANGAGE ET GÉNIE LOGICIEL
2 SPIRALS - Self-adaptation for distributed services and large software systems
Inria Lille - Nord Europe, CRIStAL - Centre de Recherche en Informatique, Signal et Automatique de Lille (CRIStAL) - UMR 9189
Abstract: In this work, we characterize a new form of software diversity: the existence of a set of variants that (i) all share the same API, (ii) all behave the same according to an input-output based specification, and (iii) exhibit observable differences when they run outside the specified input space. We quantify computational diversity as the dissimilarity between execution traces on inputs that are outside the specified domain. Our technique relies on test amplification. We propose source code transformations on test cases to explore the input domain and systematically sense the observation domain. We run our experiments on 472 variants of 7 open-source, large, and thoroughly tested Java classes. Our test amplification multiplies the number of input points in the test suite by ten and is effective at detecting software diversity.
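
The sketch below illustrates the kind of test amplification the abstract describes: an existing test's input literals are altered so the execution leaves the specified input domain, and extra observation points are added to sense the resulting state. It is a minimal illustration, not DSpot's actual implementation; the class under test (BoundedCounter) and all test names are hypothetical.

    // Minimal sketch of input amplification plus added observation points.
    // Assumes JUnit 4 on the classpath; BoundedCounter is a hypothetical class.
    import static org.junit.Assert.assertEquals;
    import org.junit.Test;

    public class BoundedCounterTest {

        // Hypothetical class under test: specified only for inputs in [0, 100].
        static class BoundedCounter {
            private int value;
            void add(int delta) { value += delta; }
            int get() { return value; }
        }

        // Original test: stays inside the specified input domain.
        @Test
        public void addWithinSpec() {
            BoundedCounter c = new BoundedCounter();
            c.add(10);
            assertEquals(10, c.get());
        }

        // Amplified variant, in the spirit of the transformations described above:
        // (i) input literals are mutated to reach outside the specified domain,
        // (ii) observation points record the reached state, so that execution
        //      traces of two BoundedCounter variants can be compared.
        @Test
        public void addOutsideSpec_amplified() {
            BoundedCounter c = new BoundedCounter();
            c.add(-1);                    // input outside the specified domain
            System.out.println(c.get());  // observation point: sense the state
            c.add(Integer.MAX_VALUE);     // extreme input, also out of spec
            System.out.println(c.get());  // observation point
        }
    }

Comparing the printed observations across variants that pass addWithinSpec but diverge on addOutsideSpec_amplified is one way to expose the computational diversity the paper quantifies.
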
Document type: Preprints, Working Papers, ...

https://hal.archives-ouvertes.fr/hal-01162219
Contributor: Martin Monperrus
Submitted on: Tuesday, June 9, 2015 - 8:14:53 PM
Last modification on: Wednesday, December 18, 2019 - 5:18:42 PM


Identifiers

  • HAL Id: hal-01162219, version 1
  • ARXIV: 1503.05807

Citation

Benoit Baudry, Simon Allier, Marcelino Rodriguez-Cancio, Martin Monperrus. DSpot: Test Amplification for Automatic Assessment of Computational Diversity. 2015. ⟨hal-01162219⟩
