Building Recognition on Android

Faculty Mentor(s)

Dr. Michael Jipping

This project is a building-recognition application implemented for the Android operating system on both phones and tablets. Using the video stream from the device's rear-facing camera, we applied a computer vision algorithm to individual frames to search for recognizable buildings. Android sensors provided the GPS coordinates of the device, which allowed us to limit the set of buildings that could potentially be on screen to those nearby. The computer vision algorithm we used was SURF (Speeded Up Robust Features). SURF uses the determinant of the Hessian matrix, computed efficiently with integral images, to locate areas of high contrast in the picture. These high-contrast areas give a good representation of the outline of the structural aspects of a building as well as its specific colors and texture. Coupling the smartphone infrastructure with the image analysis provided by SURF, we were able to offer users an interactive environment, including touch-interaction points called Hotspots. Touching a Hotspot displays more information about the building in various formats. Our application's interactive information display will be used to augment the pedagogy of courses in the digital humanities.
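The GPS-based candidate filtering described above can be sketched as follows. This is a minimal illustration, not the application's actual code: the `Building` type, the field names, and the search radius are all hypothetical, and distance is computed with the standard haversine great-circle formula.

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of GPS-based candidate filtering: keep only buildings close
// enough to the device's GPS fix to possibly appear in the camera frame,
// so the feature matcher compares against a small candidate set.
public class NearbyBuildings {
    static final double EARTH_RADIUS_M = 6_371_000.0;

    // Hypothetical building record; the real data model may differ.
    static class Building {
        final String name;
        final double lat, lon;
        Building(String name, double lat, double lon) {
            this.name = name; this.lat = lat; this.lon = lon;
        }
    }

    // Great-circle distance in meters between two lat/lon points (haversine).
    static double distanceMeters(double lat1, double lon1,
                                 double lat2, double lon2) {
        double dLat = Math.toRadians(lat2 - lat1);
        double dLon = Math.toRadians(lon2 - lon1);
        double a = Math.pow(Math.sin(dLat / 2), 2)
                 + Math.cos(Math.toRadians(lat1)) * Math.cos(Math.toRadians(lat2))
                 * Math.pow(Math.sin(dLon / 2), 2);
        return 2 * EARTH_RADIUS_M * Math.asin(Math.sqrt(a));
    }

    // Return the buildings within radiusMeters of the device position.
    static List<Building> filterNearby(List<Building> all,
                                       double deviceLat, double deviceLon,
                                       double radiusMeters) {
        List<Building> nearby = new ArrayList<>();
        for (Building b : all) {
            if (distanceMeters(deviceLat, deviceLon, b.lat, b.lon) <= radiusMeters) {
                nearby.add(b);
            }
        }
        return nearby;
    }
}
```

Only the buildings surviving this filter would then be matched against the SURF features extracted from the current camera frame.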


This material is based upon work supported by the National Science Foundation under grant No. 0851293.
