Modeling Mobile Interface Tappability Using Crowdsourcing and Deep Learning

by Amanda Swearngin et al.

Tapping is an immensely important gesture in mobile touchscreen interfaces, yet people still frequently must learn which elements are tappable through trial and error. Predicting human behavior for this everyday gesture can help mobile app designers understand an important aspect of their apps' usability without having to run a user study. In this paper, we present an approach for modeling the tappability of mobile interfaces at scale. We conducted a large-scale crowdsourced data collection of interface tappability over a rich set of mobile apps and computationally investigated a variety of signifiers that people use to distinguish tappable from not-tappable elements. Based on the dataset, we developed and trained a deep neural network that predicts how likely a user is to perceive an interface element as tappable. Using the trained tappability model, we built TapShoe, a tool that automatically diagnoses mismatches between the tappability of each element as perceived by a human user (predicted by our model) and the intended or actual tappable state specified by the developer or designer. Our model achieved reasonable accuracy in matching human perception of tappable UI elements: a mean precision of 90.2% and recall of 87.0%. In an informal evaluation with 7 professional interaction designers, both the tappability model and TapShoe were well received.
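As a minimal sketch (not the authors' code), the precision and recall figures reported above can be computed by comparing the model's tappability predictions against human-perception labels; the function and example data below are hypothetical illustrations:

```python
# Sketch of the precision/recall metrics the abstract reports for the
# positive ("tappable") class. `predicted` is the model's judgement of
# whether users would perceive each UI element as tappable; `labels` is
# the human-perception ground truth. Both are illustrative, not real data.

def precision_recall(predicted, labels):
    """Precision and recall for the tappable (True) class."""
    tp = sum(p and l for p, l in zip(predicted, labels))       # true positives
    fp = sum(p and not l for p, l in zip(predicted, labels))   # false positives
    fn = sum(not p and l for p, l in zip(predicted, labels))   # false negatives
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return precision, recall

# Six hypothetical UI elements: model predictions vs. perceived tappability.
predicted = [True, True, True, False, False, True]
labels    = [True, True, False, False, True, True]
p, r = precision_recall(predicted, labels)
print(f"precision={p:.2f} recall={r:.2f}")  # precision=0.75 recall=0.75
```

The same element-wise comparison underlies TapShoe's diagnosis: elements where the model's prediction disagrees with the developer-specified tappable state are flagged as potential tappability mismatches.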



What do all these Buttons do? Statically Mining Android User Interfaces at Scale

We introduce FRONTMATTER: a tool to automatically mine both user interfa...

Saliency Prediction for Mobile User Interfaces

We introduce models for saliency prediction for mobile user interfaces. ...

EvIcon: Designing High-Usability Icon with Human-in-the-loop Exploration and IconCLIP

Interface icons are prevalent in various digital applications. Due to li...

Do we agree on user interface aesthetics of Android apps?

Context: Visual aesthetics is increasingly seen as an essential factor i...

Screen Recognition: Creating Accessibility Metadata for Mobile Applications from Pixels

Many accessibility features available on mobile platforms require applic...

Predicting Human Performance in Vertical Menu Selection Using Deep Learning

Predicting human performance in interaction tasks allows designers or de...

Never-ending Learning of User Interfaces

Machine learning models have been trained to predict semantic informatio...
