Jonathan Juett and Benjamin Kuipers. 2016.
Learning to reach by building a representation of peri-personal space.
IEEE-RAS Int. Conf. on Humanoid Robots, 2016.
Inspired by the early spatial learning of human infants, we describe progress toward enabling a robotic learning agent to learn the structure of peri-personal space — the space immediately around the agent within which reaching and grasping take place — with minimal prior spatial knowledge.
We propose the PPS Graph representation for early knowledge of peri-personal space, a model that produces behaviors qualitatively consistent with early human reaching motions and that, through its implementation in a computational system, may shed light on how humans learn manipulation skills. Each graph node represents a visual sense vector and a proprioceptive sense vector corresponding to the same state of the world, but neither sense vector has a pre-existing interpretation in terms of a 3D model of the environment. An edge linking two nodes in the PPS Graph represents the feasibility of motion between those two states.
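The node and edge structure described above can be sketched as a small data structure. This is a minimal illustration, not the paper's implementation: the class and method names (`PPSNode`, `PPSGraph`, `add_node`, `add_edge`, `neighbors`) are hypothetical, and the sense vectors are kept as opaque tuples to reflect that neither has a pre-existing 3D interpretation.

```python
from dataclasses import dataclass, field

@dataclass
class PPSNode:
    visual: tuple    # uninterpreted visual sense vector
    proprio: tuple   # proprioceptive sense vector (e.g., joint angles)

@dataclass
class PPSGraph:
    nodes: list = field(default_factory=list)
    edges: set = field(default_factory=set)  # unordered pairs of node indices

    def add_node(self, visual, proprio):
        """Record a visual/proprioceptive pair observed in the same world state."""
        self.nodes.append(PPSNode(visual, proprio))
        return len(self.nodes) - 1

    def add_edge(self, i, j):
        """An edge asserts that direct motion between states i and j is feasible."""
        self.edges.add((min(i, j), max(i, j)))

    def neighbors(self, i):
        """States reachable from node i by a single feasible motion."""
        return [b if a == i else a for (a, b) in self.edges if i in (a, b)]
```

Keeping the two sense vectors side by side in each node, rather than fusing them into one coordinate frame, is what lets the agent operate before it has learned any 3D model.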
Learning starts with “motor babbling”: random exploration of the space of joint-angle vectors, which creates an initial PPS Graph representing arm configurations in an otherwise empty space. The next crucial step is recognizing an unusual event, such as accidentally colliding with an object and changing its position. Once a type of unusual event has been identified, the goal for learning is to identify the prerequisites for an action that achieves an event of that type.
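A babbling loop of this kind can be sketched as follows. Everything here is an assumption for illustration: the sampling range, the joint-space distance test, and the `link_radius` threshold (linking configurations that are close in joint space, on the assumption that short motions between them are feasible) stand in for however the paper actually establishes edge feasibility.

```python
import math
import random

def motor_babble(n_samples, n_joints, link_radius, seed=0):
    """Hypothetical motor-babbling sketch: sample random joint-angle
    vectors and link pairs of configurations whose joint-space distance
    falls below link_radius (assumed feasible short motions)."""
    rng = random.Random(seed)
    nodes, edges = [], set()
    for _ in range(n_samples):
        q = tuple(rng.uniform(-math.pi, math.pi) for _ in range(n_joints))
        # In the real system the matching visual sense vector would be
        # recorded from the cameras at this pose; omitted here.
        i = len(nodes)
        nodes.append(q)
        for j in range(i):
            if math.dist(q, nodes[j]) < link_radius:
                edges.add((j, i))
    return nodes, edges
```

With no prior spatial knowledge, the resulting graph is the agent's only map: reaching later amounts to graph search over these babbled states.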
We report the results of experiments on a physical Baxter robot, both on a small Learning Graph and on a much larger Sampled PPS Graph to demonstrate scalability. We show how appropriate features can be extracted from uninterpreted visual images, and that combining weakly informative features with Naïve Bayes allows our robot to plan and make reliable reaching motions. We hypothesize that a similar approach will extend these results to grasping and moving objects.
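The core of the Naïve Bayes step is that several individually weak features can jointly give a confident prediction, because their log-likelihood ratios add. The sketch below shows this combination for binary features; the feature definitions and likelihood values are illustrative assumptions, not the paper's learned statistics.

```python
import math

def reach_success_posterior(features, likelihoods, prior=0.5):
    """Naive Bayes combination of weakly informative binary features.

    features:    list of 0/1 feature values for a candidate reach.
    likelihoods: per-feature pairs (P(f=1 | success), P(f=1 | failure)),
                 assumed conditionally independent given the outcome.
    Returns the posterior probability that the reach succeeds.
    """
    log_odds = math.log(prior / (1 - prior))
    for f, (p_succ, p_fail) in zip(features, likelihoods):
        if f:
            log_odds += math.log(p_succ / p_fail)
        else:
            log_odds += math.log((1 - p_succ) / (1 - p_fail))
    return 1.0 / (1.0 + math.exp(-log_odds))
```

Each feature alone shifts the odds only slightly, but agreement across several features pushes the posterior decisively toward success or failure, which is what makes reliable reach planning possible from weak visual evidence.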