The ODBR Workflow: The overall architecture of our ODBR tool is illustrated in the Figure above. The entry point for a tester or developer using the tool is the ODBR Android application, which is available as an open-source project, along with further information about the project and demo videos. To use our tool, a reporter simply installs the ODBR app on their target virtual or physical device, selects the app for which they wish to report a bug from the list of installed applications, hits the record button, and performs the actions on the device that manifest the bug. After the ODBR app (which runs in the background) detects that no touch-based inputs have been entered for a certain period of time, it asks the user whether they have finished the report or wish to continue. Once the user has finished, they have a chance to enter additional information about the bug, including a natural language description of the expected and actual behavior, and a title for the report. Additionally, the user can view the screenshots and sensor data traces from the device and even replay the touch events to ensure they were properly captured. This replay feature relies on a custom-written Java implementation of the sendevent utility. Once the report has been created, the BugReport is serialized to a JSON document and sent over the web to a CouchDB server instance. A Java web application then reads the bug report information from the JSON document and converts it into a fully expressive HTML report. From the Java web app, a developer can view the reproduction steps of the report, complete with screenshots and action descriptions, as well as download a replayable script that reproduces the actions via adb or sendevent commands.
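To make the upload step concrete, the sketch below shows one way a finished report might be serialized to JSON before being sent to the CouchDB instance. The field names (`title`, `expectedBehavior`, `actualBehavior`) and class name are illustrative assumptions, not ODBR's actual schema:

```java
// Hypothetical sketch of serializing a finished bug report to JSON before
// upload. The actual ODBR field names and document layout may differ.
public class BugReportJson {

    /** Builds a minimal JSON document for the report's textual fields. */
    public static String toJson(String title, String expected, String actual) {
        return "{"
                + "\"title\":\"" + escape(title) + "\","
                + "\"expectedBehavior\":\"" + escape(expected) + "\","
                + "\"actualBehavior\":\"" + escape(actual) + "\""
                + "}";
    }

    // Escape backslashes and quotes so the output stays valid JSON.
    private static String escape(String s) {
        return s.replace("\\", "\\\\").replace("\"", "\\\"");
    }
}
```

A document like this could then be stored with a standard CouchDB HTTP `PUT` to `http://<host>:5984/<database>/<doc-id>`, CouchDB's document-creation API.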
ODBR App components: The ODBR application has four major components that aid in the collection of information during the bug reporting process: (i) the GetEvent Manager, (ii) the UI-Hierarchy Manager, (iii) the Sensor Event Manager, and (iv) the Screenshot Manager. The GetEvent Manager is responsible for precisely and efficiently reading the user event stream. To accomplish this, during the reporting process the app creates threads that read from the underlying Linux input event streams at /dev/input/... These input events provide highly detailed information about the user's interaction with the phone's physical devices, including (where applicable) the touch screen, power button, and keyboard. This information is identical to what is gathered using the Android getevent utility, as used by RERAN. Next, these low-level input event streams are parsed into higher-level user interactions (e.g., a swipe from (a,b) to (c,d)). The low-level input events are retained to support precise analysis and replayability, while the higher-level interactions are used to summarize the report in natural language. Whenever the GetEvent Manager detects that the user is taking a new action, it notifies the applicable managers to take a screenshot and a dump of the UI hierarchy, associating these with the new interaction. The UI-Hierarchy Manager interfaces with the Android uiautomator framework to capture dumps of the Android view hierarchy in ui-dump.xml files for each new user action. Because these dump files contain information about the screen location of each UI component, we can use the event information obtained from the previous component to precisely infer the UI component that the user interacted with at each step of the bug reporting process. This component also extracts attributes of the various components on the screen, including information such as the type (e.g., button, spinner) and whether the component is clickable.
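The translation from low-level touch traces to higher-level interactions can be sketched as follows. This is a minimal illustration, not ODBR's actual code: the displacement threshold and class name are assumptions, and a real implementation would also handle multi-touch and long presses:

```java
import java.util.List;

// Hypothetical sketch of summarizing a finger-down..finger-up trace of
// (x, y) touch points, parsed from the /dev/input event stream, into a
// higher-level user interaction. Threshold and names are illustrative.
public class TouchClassifier {

    // Assumed minimum displacement (in pixels) before a gesture counts as a swipe.
    static final int SWIPE_THRESHOLD_PX = 24;

    /** Classifies a trace of touch points as a tap or a swipe. */
    public static String classify(List<int[]> points) {
        int[] first = points.get(0);
        int[] last = points.get(points.size() - 1);
        int dx = last[0] - first[0];
        int dy = last[1] - first[1];
        // Small total displacement -> tap; otherwise a swipe.
        if (dx * dx + dy * dy < SWIPE_THRESHOLD_PX * SWIPE_THRESHOLD_PX) {
            return String.format("tap at (%d,%d)", first[0], first[1]);
        }
        return String.format("swipe from (%d,%d) to (%d,%d)",
                first[0], first[1], last[0], last[1]);
    }
}
```

Such natural-language summaries are what allow each step of the report to be described readably while the raw events remain available for replay.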
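Inferring the touched UI component amounts to a hit test of the touch coordinates against the component bounds recorded in the uiautomator dump, where each node's bounds appear as a string of the form "[x1,y1][x2,y2]". The sketch below shows this matching under that assumption; the class and method names are hypothetical:

```java
// Hypothetical sketch of matching a touch point against a UI component's
// bounds as recorded in a uiautomator ui-dump.xml file, e.g. "[0,72][1080,1920]".
public class BoundsHitTest {

    /** Returns true if the point (x, y) falls inside the given bounds string. */
    public static boolean contains(String bounds, int x, int y) {
        // "[0,72][1080,1920]" -> "0,72,1080,1920" -> four integers.
        String[] parts = bounds.replace("][", ",")
                               .replace("[", "")
                               .replace("]", "")
                               .split(",");
        int x1 = Integer.parseInt(parts[0]), y1 = Integer.parseInt(parts[1]);
        int x2 = Integer.parseInt(parts[2]), y2 = Integer.parseInt(parts[3]);
        return x >= x1 && x < x2 && y >= y1 && y < y2;
    }
}
```

Checking each dumped node this way (preferring the smallest enclosing node) yields the specific component, along with its recorded type and clickability, for every step of the report.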
The Sensor Event Manager is responsible for efficiently sampling the sensor input streams (e.g., accelerometer, GPS) during the bug reporting process. It accomplishes this by registering a SensorEventListener instance for each sensor, which samples that sensor's values at an appropriate rate. Finally, the Screenshot Manager is responsible for capturing an image of the screen for each new user interaction using the screencap utility included in Android.
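Because a sensor listener callback can fire far more often than a report needs, sampling "at an appropriate rate" amounts to keeping only readings separated by a minimum interval. The plain-Java sketch below shows that throttling logic in isolation; the class name and interval are assumptions, and in the app this check would run inside each listener's callback:

```java
// Hypothetical sketch of rate-limiting sensor samples: a reading is recorded
// only if a minimum interval has elapsed since the last kept reading.
public class SensorSampler {

    private final long minIntervalNanos;
    private long lastKeptNanos;
    private boolean first = true;

    public SensorSampler(long minIntervalMillis) {
        this.minIntervalNanos = minIntervalMillis * 1_000_000L;
    }

    /** Returns true if a sample with this timestamp should be recorded. */
    public boolean shouldRecord(long timestampNanos) {
        if (first || timestampNanos - lastKeptNanos >= minIntervalNanos) {
            first = false;
            lastKeptNanos = timestampNanos;
            return true;
        }
        return false;
    }
}
```

For example, with a 20 ms interval, a burst of callbacks arriving every 10 ms would be thinned to roughly one kept sample per 20 ms.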
Tools used for ODBR Implementation
Below we list the tools used to implement the ODBR App and the Java web app.
Technology used to implement the ODBR App:
Technology used to implement the Java Web App:
- Bootstrap: An HTML, CSS, and JavaScript framework for developing web applications.
- MySQL: A robust relational database.