Computer Science Department
School of Computer Science, Carnegie Mellon University
Michael N. Dille
The LittleDog robot is a 12 degree-of-freedom quadruped developed by Boston Dynamics and selected for use in the DARPA Learning Locomotion program, in which machine learning is applied to develop controllers capable of navigating rocky terrain. At present, it is typically constrained to operate within wireless range of a host desktop computer and within a fixed workspace surrounded by a motion capture system that globally localizes the robot and specially marked terrain boards without any onboard sensing. In this thesis, we explore a variety of strategies for expanding the capabilities of this platform, with the theme of relaxing these operational constraints and the goal of enabling operation in arbitrary locations outside the fixed workspace and without a host computer. Toward this end, we start by addressing the straightforward technical issue of physical independence, demonstrating a viable onboard controller in the form of a compact single-board computer. Next, we address the lack of onboard sensing through computer vision, attaching a camera to the robot and developing the necessary procedures for calibrating it, synchronizing its data stream with existing state data, and compensating for the camera's additional weight. Using this system, we demonstrate mapping and navigation of terrains outside the motion capture system containing both planar and simple structured three-dimensional obstacles. In conjunction with this, we develop and implement several dead reckoning strategies, one including a complete kinodynamic model of ground contact, to compute odometry information enabling reasonably accurate continuous pose estimation. Finally, we conclude with a brief exploration of alternatives for local sensing and reason about extensions to more unstructured environments.
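To illustrate the flavor of the dead reckoning strategies mentioned above, the following is a minimal sketch (not the thesis's actual implementation) of planar odometry integration: body-frame velocity estimates, such as those derived from leg kinematics and ground contact, are rotated into the world frame and accumulated into a pose. The function name and parameters here are illustrative assumptions.

```python
import math

def integrate_odometry(pose, v_forward, v_lateral, omega, dt):
    """Advance a planar pose (x, y, theta) by one timestep of
    dead reckoning, given body-frame velocity estimates.

    pose      -- (x, y, theta) in the world frame
    v_forward -- estimated forward velocity in the body frame (m/s)
    v_lateral -- estimated lateral velocity in the body frame (m/s)
    omega     -- estimated yaw rate (rad/s)
    dt        -- timestep (s)
    """
    x, y, theta = pose
    # Rotate body-frame velocities into the world frame, then integrate.
    x += (v_forward * math.cos(theta) - v_lateral * math.sin(theta)) * dt
    y += (v_forward * math.sin(theta) + v_lateral * math.cos(theta)) * dt
    theta = (theta + omega * dt) % (2.0 * math.pi)
    return (x, y, theta)
```

For example, integrating a constant 1 m/s forward velocity over 100 steps of 10 ms advances the estimated pose roughly one meter along the heading direction; in practice, errors in the velocity estimates accumulate, which is why such odometry is only reasonably accurate over short traverses.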