Robotic Echolocation Testbed
Dr. Miguel Abrahantes and Professor Mark Edgington
A bat can identify its position within an environment by emitting ultrasound chirps and listening to the echoes, a process known as echolocation. The biology of this process has been studied in depth, and engineers have applied ultrasound ranging to mapping and object detection. However, most engineered systems do not mimic bats, and much remains to be understood about how a bat actually processes the echoes it hears. In this project, a mobile system was developed that can precisely and reliably carry out echolocation experiments (data collection) for later analysis. A Kobuki robot was used as the base unit, providing mobility and accurate odometry. Custom shelving and mounting hardware were designed to accommodate a laptop for controlling the robot, along with a Microsoft Kinect sensor and ultrasonic transducers for taking experimental measurements. A Python-based software package was written to provide simple control of the robot and its sensors. This software works within the Robot Operating System (ROS) framework and includes high-level interfaces for controlling robot movement and for simultaneously playing and recording sounds. Each data-collection experiment consists of a sequence of movements and measurements that the robot should perform. A YAML-based specification for representing experiments was developed, so that users can create and execute experiments with simple, human-readable text files. The system we have developed will make future data collection simple, allowing us to focus on the study and analysis of echo signals.
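The abstract does not reproduce the YAML experiment format itself. As a hedged illustration only, an experiment file of the kind described might interleave movement steps with chirp-and-record steps along these lines (every key, filename, and value below is hypothetical, not the project's actual schema):

```yaml
# Hypothetical experiment file: field names are illustrative only,
# not the schema actually used by the project's software package.
experiment: corner_scan
steps:
  - move: {forward: 0.5}                            # drive 0.5 m ahead
  - chirp: {file: sweep.wav, record_secs: 0.2}      # play chirp, record echo
  - move: {rotate: 15}                              # turn 15 degrees in place
  - chirp: {file: sweep.wav, record_secs: 0.2}
```

A file like this keeps each experiment human-readable and repeatable: the same sequence of movements and measurements can be rerun exactly, which is what makes later comparison of echo recordings meaningful.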
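Because each experiment is a sequence of movements whose odometry must be tracked, the core bookkeeping can be sketched as dead reckoning over that sequence. The following is a minimal pure-Python sketch under stated assumptions: `run_sequence` and its command vocabulary are hypothetical names for illustration, and the real system drives a Kobuki base through ROS rather than updating a pose tuple in software.

```python
import math

def run_sequence(commands, pose=(0.0, 0.0, 0.0)):
    """Dead-reckon a robot pose through a list of movement commands.

    Each command is ('forward', meters) or ('turn', degrees, CCW positive).
    Returns the final (x, y, heading_radians) pose. This is a hypothetical
    stand-in for the project's ROS-based movement interface.
    """
    x, y, theta = pose
    for name, value in commands:
        if name == 'forward':
            # Advance along the current heading.
            x += value * math.cos(theta)
            y += value * math.sin(theta)
        elif name == 'turn':
            # Rotate in place; odometry tracks heading only.
            theta += math.radians(value)
        else:
            raise ValueError('unknown command: %r' % name)
    return x, y, theta

# Drive a 1 m square: four forward/turn pairs return the robot
# to its starting position.
final_pose = run_sequence([('forward', 1.0), ('turn', 90.0)] * 4)
```

Tracking the pose alongside each chirp measurement is what lets the recorded echoes later be associated with known positions in the environment.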