Accurate On-Device Bug Reporting for Android Applications

Team Members: Kevin Moran, Carlos Bernal-Cárdenas, Richard Bonett, Brendan Otten, Daniel Park & Denys Poshyvanyk

College of William & Mary --- SEMERU

 
 

Purpose

This project was created by the Software Engineering Maintenance and Evolution Research Unit (SEMERU) at the College of William & Mary, under the supervision of Dr. Denys Poshyvanyk.  The major goal of the ODBR project is to provide developers with a practical, automated bug-reporting tool that is capable of accurately recording and replaying a set of actions that reproduce a bug surfacing in a mobile application, directly on a physical or virtual Android device.

Video Demonstration

Publications

  • On-Device Bug Reporting for Android Applications
    ‣ Kevin Moran, Richard Bonett, Carlos Bernal-Cárdenas, Brendan Otten, Daniel Park, and Denys Poshyvanyk
    ‣ Proceedings of the 4th IEEE/ACM International Conference on Mobile Software Engineering and Systems (MOBILESoft'17), Tool Demos and Mobile Apps Track, Buenos Aires, Argentina, May 22-23, 2017 (44% Acceptance Rate)
    ‣ [pdf software | slides]

ODBR is Open Source!

We have made the code of ODBR open source, hosted on GitLab.  Code for the Java Web App is coming soon.

Please consider contributing via the button below!

 
[GitLab logo button linking to the ODBR repository]
 

ODBR

ODBR Workflow Overview

[Figure: ODBR Workflow Overview]

The ODBR Workflow: The overall architecture of our ODBR tool is illustrated in the Figure above. The entry point for a tester or developer using this tool is the ODBR Android Application, which is available as an open-source project along with further information about the project and demo videos. To use our tool, a reporter simply needs to install the ODBR app on their target virtual or physical device, select the app for which they wish to report a bug from the list of installed applications, hit the record button, and perform the actions on the device that manifest the bug. Once the ODBR app (which is running in the background) detects that no touch-based inputs have been entered for a certain period of time, it asks the user whether they have finished the report or wish to continue. When the user has finished, they have a chance to enter additional information about the bug, including a natural-language description of the expected and actual behavior, and a title for the report. Additionally, the user can view the screenshots and sensor-data traces from the device and even replay the touch events to ensure they were properly captured. This replay feature relies on a custom-written Java implementation of the sendevent utility.

Once the report has been created, the BugReport is serialized to a JSON document and sent over the web to a CouchDB server instance. A Java web application then reads the bug-report information from the JSON document and converts it into a fully expressive HTML report. From the Java web app, a developer can view the reproduction steps of the report, complete with screenshots and action descriptions, as well as download a replayable script that reproduces the actions via adb or sendevent commands.
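For illustration, below is a minimal sketch of how a recorded report could be serialized to JSON and uploaded to a CouchDB database over HTTP. The class name, the JSON field names, and the server URL are assumptions for this example rather than ODBR's actual schema; org.json is bundled with Android, so no extra dependency is needed on the device.

    import org.json.JSONArray;
    import org.json.JSONObject;

    import java.io.OutputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;
    import java.nio.charset.StandardCharsets;

    public class BugReportUploader {

        // Hypothetical CouchDB database URL; the actual ODBR endpoint may differ.
        private static final String COUCHDB_URL = "http://localhost:5984/odbr-reports";

        /** Serializes a report to JSON and POSTs it to the CouchDB database. */
        public static int upload(String title, String expected, String actual,
                                 JSONArray recordedEvents) throws Exception {
            // Build the JSON document; these field names are illustrative only.
            JSONObject doc = new JSONObject();
            doc.put("title", title);
            doc.put("expectedBehavior", expected);
            doc.put("actualBehavior", actual);
            doc.put("events", recordedEvents);

            // POST the document to CouchDB, which assigns it an _id and _rev.
            HttpURLConnection conn =
                    (HttpURLConnection) new URL(COUCHDB_URL).openConnection();
            conn.setRequestMethod("POST");
            conn.setRequestProperty("Content-Type", "application/json");
            conn.setDoOutput(true);
            try (OutputStream out = conn.getOutputStream()) {
                out.write(doc.toString().getBytes(StandardCharsets.UTF_8));
            }
            return conn.getResponseCode(); // 201 Created on success
        }
    }

CouchDB returns the generated document id and revision in the response body, which a client could parse to link the uploaded report back to the device-side record.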

ODBR App Components: The ODBR Application has four major components that aid in the collection of information during the bug reporting process: (i) the GetEvent Manager, (ii) the UI-Hierarchy Manager, (iii) the Sensor Event Manager, and (iv) the Screenshot Manager.

  • GetEvent Manager: Responsible for precisely and efficiently reading the user event stream. During the reporting process, the app creates threads that read from the underlying Linux input event streams at /dev/input/... These input events provide highly detailed information about the user's interaction with the phone's physical devices, including (where applicable) the touch screen, power button, keyboard, etc. This information is identical to what is gathered by the Android getevent utility as used by RERAN. The low-level input event streams are parsed into higher-level user interactions (e.g., swipe from (a,b) to (c,d)); the low-level events are retained to support precise analysis and replayability, while the higher-level interactions are used to summarize the report in natural language. Whenever the GetEvent Manager detects that the user is taking a new action, it notifies the applicable managers to take a screenshot and a dump of the UI hierarchy, associating these with the new interaction.
  • UI-Hierarchy Manager: Interfaces with the Android uiautomator framework to capture dumps of the Android view hierarchy in ui-dump.xml files for each new user action. Because these dump files contain the screen location of each UI component, the event information obtained from the GetEvent Manager can be used to precisely infer the UI component that the user interacted with at each step of the bug reporting process. This component also extracts attributes of the on-screen components, such as their type (e.g., button, spinner) and whether they are clickable.
  • Sensor Event Manager: Responsible for efficiently sampling the sensor input streams (e.g., accelerometer, GPS) during the bug reporting process by registering SensorEventListener instances for each sensor, which sample the sensor values at appropriate rates (a minimal sketch follows this list).
  • Screenshot Manager: Responsible for capturing an image of the screen for each new user interaction using the screencap utility included in Android.
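As a concrete illustration of how the Sensor Event Manager's sampling could work, the sketch below registers a SensorEventListener for the accelerometer through the standard Android SensorManager API. The class name, the choice of SENSOR_DELAY_UI, and the in-memory sample list are assumptions for this example; ODBR's actual implementation may record additional sensors, timestamps, and metadata.

    import android.content.Context;
    import android.hardware.Sensor;
    import android.hardware.SensorEvent;
    import android.hardware.SensorEventListener;
    import android.hardware.SensorManager;

    import java.util.ArrayList;
    import java.util.List;

    /** Minimal sketch of sampling one sensor stream during a recording session. */
    public class AccelerometerSampler implements SensorEventListener {

        private final SensorManager sensorManager;
        private final List<float[]> samples = new ArrayList<>();

        public AccelerometerSampler(Context context) {
            sensorManager = (SensorManager) context.getSystemService(Context.SENSOR_SERVICE);
        }

        /** Start listening to the accelerometer at a moderate sampling rate. */
        public void start() {
            Sensor accelerometer = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
            if (accelerometer != null) {
                sensorManager.registerListener(this, accelerometer, SensorManager.SENSOR_DELAY_UI);
            }
        }

        /** Stop listening and return the recorded (x, y, z) samples. */
        public List<float[]> stop() {
            sensorManager.unregisterListener(this);
            return samples;
        }

        @Override
        public void onSensorChanged(SensorEvent event) {
            // Store a copy of the (x, y, z) values for later inclusion in the bug report.
            samples.add(event.values.clone());
        }

        @Override
        public void onAccuracyChanged(Sensor sensor, int accuracy) {
            // Accuracy changes are not needed for this sketch.
        }
    }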

Tools used for ODBR Implementation

Below, we list the tools we used to implement the ODBR App and the Java Web App.

Technology used to implement the ODBR App:

Technology used to implement the Java Web App:

  • Bootstrap: HTML, CSS, and JavaScript framework for developing web applications.
  • MySQL: A robust relational database.