2017 32nd IEEE/ACM International Conference on Automated Software Engineering (ASE) (2017)
Urbana, IL, USA
Oct. 30, 2017 to Nov. 3, 2017
ISBN: 978-1-5386-3976-4
pp: 16-26
Ke Mao, Facebook, 10 Brock Street, London, NW1 3FG, UK; CREST, University College London, Malet Place, London, WC1E 6BT, UK
Mark Harman, Facebook, 10 Brock Street, London, NW1 3FG, UK; CREST, University College London, Malet Place, London, WC1E 6BT, UK
Yue Jia, Facebook, 10 Brock Street, London, NW1 3FG, UK; CREST, University College London, Malet Place, London, WC1E 6BT, UK
ABSTRACT
We show that information extracted from crowd-based testing can enhance automated mobile testing. We introduce Polariz, which generates replicable test scripts from crowd-based testing, extracting cross-app ‘motif’ events: automatically-inferred reusable higher-level event sequences composed of lower-level observed event actions. Our empirical study used 434 crowd workers from Mechanical Turk to perform 1,350 testing tasks on 9 popular Google Play apps, each with at least 1 million user installs. The findings reveal that the crowd was able to achieve 60.5% unique activity coverage and proved to be complementary to automated search-based testing in 5 out of the 9 subjects studied. Our leave-one-out evaluation demonstrates that coverage attainment can be improved (6 out of 9 cases, with no reduction in coverage on the remaining 3) by combining crowd-based and search-based testing.
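To make the idea of cross-app ‘motif’ events more concrete, the sketch below illustrates one simple way such reusable event sequences could be mined: contiguous sub-sequences of abstracted event actions that recur in crowd-recorded traces across more than one app become motif candidates. This is an illustrative assumption only, not the Polariz implementation described in the paper; all names (mine_motifs, traces_by_app) are hypothetical.

from collections import defaultdict

def mine_motifs(traces_by_app, min_len=2, max_len=4, min_apps=2):
    """Illustrative sketch (not the Polariz algorithm): find contiguous
    event sub-sequences that recur across traces of different apps.

    traces_by_app: dict mapping app name -> list of traces, where each
    trace is a list of abstract event actions, e.g. ("tap", "search_box").
    Returns candidate motifs seen in at least `min_apps` distinct apps.
    """
    apps_seen = defaultdict(set)  # motif -> set of apps it appears in
    for app, traces in traces_by_app.items():
        for trace in traces:
            for n in range(min_len, max_len + 1):
                for i in range(len(trace) - n + 1):
                    motif = tuple(trace[i:i + n])
                    apps_seen[motif].add(app)
    return [m for m, apps in apps_seen.items() if len(apps) >= min_apps]

# Hypothetical usage with two abstracted crowd traces:
traces_by_app = {
    "app_a": [[("tap", "menu"), ("tap", "search"), ("type", "query"), ("tap", "submit")]],
    "app_b": [[("tap", "search"), ("type", "query"), ("tap", "submit"), ("swipe", "down")]],
}
print(mine_motifs(traces_by_app))

In this sketch, the shared "search, type, submit" fragment surfaces as a cross-app motif; a search-based tester could then reuse such higher-level sequences as composite actions, which is the role the paper assigns to motif events.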
INDEX TERMS
Testing, Mobile communication, Tools, Data mining, Androids, Humanoid robots, Mobile handsets
CITATION

K. Mao, M. Harman and Y. Jia, "Crowd intelligence enhances automated mobile testing," 2017 32nd IEEE/ACM International Conference on Automated Software Engineering (ASE), Urbana, IL, USA, 2017, pp. 16-26.
doi:10.1109/ASE.2017.8115614