Universally Adaptive Cross-Platform Reinforcement Learning Testing via GUI Image Understanding

08/19/2022
by   Shengcheng Yu, et al.

With the rapid development of the Internet, more and more applications (apps) are playing an important role in various aspects of the world. Among all apps, mobile apps and web apps dominate people's daily life and all industries. To tackle the challenges of ensuring app quality, many approaches have been adopted to improve app GUI testing, including random-based techniques, model-based techniques, etc. However, existing approaches still fall short in reaching high code coverage, constructing high-quality models, and achieving generalizability. Besides, current approaches are heavily dependent on the execution platform (i.e., Android, Web). Apps on different platforms share commonalities in GUI design, which inspires us to propose a platform-independent approach built on computer vision algorithms. In this paper, we propose UniRLTest, a reinforcement learning based approach that uses a universal framework with computer vision algorithms to conduct automated testing on apps from different platforms. UniRLTest extracts GUI widgets from GUI pages and characterizes the corresponding GUI layouts, embedding each GUI page as a state. UniRLTest explores apps under the guidance of a newly designed curiosity-driven strategy, which uses a Q-network to estimate the values of specific states and actions to encourage more exploration of uncovered pages, without platform dependency. The similarity between state embeddings is used to calculate the reward of each exploration step. We conduct an empirical study on 20 mobile apps and 5 web apps, and the results show that UniRLTest outperforms the baselines, especially in the exploration of new states.
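The abstract describes two ingredients: a curiosity-driven reward derived from the similarity between GUI state embeddings, and a Q-network that estimates state-action values to steer exploration toward uncovered pages. The following is a minimal sketch of how such pieces might be wired together; it is not the paper's implementation, and the embedding inputs, state keys, action space, and hyperparameters are placeholder assumptions. A tabular agent stands in for the Q-network for brevity.

```python
import numpy as np


def cosine_similarity(a, b):
    """Cosine similarity between two GUI state embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-8))


def curiosity_reward(state_embedding, visited_embeddings):
    """Curiosity-style reward: the less similar the new page embedding is to any
    previously visited page, the higher the reward (encourages novel pages)."""
    if not visited_embeddings:
        return 1.0
    max_sim = max(cosine_similarity(state_embedding, v) for v in visited_embeddings)
    return 1.0 - max_sim


class TabularQAgent:
    """Toy stand-in for a Q-network: maps (state key, action index) to a value."""

    def __init__(self, n_actions, alpha=0.5, gamma=0.9, epsilon=0.2):
        self.q = {}  # (state_key, action) -> estimated value
        self.n_actions = n_actions
        self.alpha, self.gamma, self.epsilon = alpha, gamma, epsilon

    def select_action(self, state_key):
        # Epsilon-greedy: mostly pick the highest-value action, sometimes explore.
        if np.random.rand() < self.epsilon:
            return int(np.random.randint(self.n_actions))
        values = [self.q.get((state_key, a), 0.0) for a in range(self.n_actions)]
        return int(np.argmax(values))

    def update(self, state_key, action, reward, next_state_key):
        # Standard Q-learning update toward reward + discounted best next value.
        best_next = max(self.q.get((next_state_key, a), 0.0)
                        for a in range(self.n_actions))
        old = self.q.get((state_key, action), 0.0)
        self.q[(state_key, action)] = old + self.alpha * (
            reward + self.gamma * best_next - old)
```

In an exploration loop, each executed GUI action would yield a new page, whose embedding is compared against the set of already-visited embeddings to produce the reward, after which the value estimate for the (state, action) pair is updated and the new embedding is added to the visited set.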
