We anticipate a future in which humans and intelligent robots will collaborate on shared tasks. To achieve this vision, a robot must have sufficiently rich knowledge of the task domain, and that knowledge must be usable in ways that support effective communication between the human and the robot. Navigational space is one of the few task domains where the structure of the knowledge is sufficiently well understood for a physically embodied robot agent to be a useful collaborator, meeting genuine human needs.
We propose to develop and evaluate an intelligent robot capable of being genuinely useful to a human, and capable of natural dialog with a human about their shared task. This project is a collaboration between three accomplished research groups with overlapping as well as complementary areas of expertise.
The Hybrid Spatial Semantic Hierarchy (HSSH) is a human-inspired multi-ontology representation for knowledge of navigational space. The spatial representations in the HSSH provide for efficient incremental learning, graceful degradation under resource limitations, and natural interfaces for different kinds of human-robot interaction. Speech is a natural, though demanding, way to communicate with a robot in natural language. To maintain real-time performance, natural language understanding must be organized to minimize backtracking from early conclusions in light of later information.
This project will answer three scientific questions. (1) Can the HSSH framework, extended with real-time computer vision, express the kinds of knowledge of natural human environments that are relevant to navigation tasks? (2) Can the HSSH representation support effective natural language communication in the spatial navigation domain? (3) Can we develop effective human-robot interaction that meets the needs of a person and improves the performance of the system?
We will perform this research with two different kinds of navigational robots, each learning from its travel experiences and building an increasingly sophisticated cognitive map: (1) an intelligent robotic wheelchair that carries its human driver to desired destinations, and (2) a telepresence robot that transmits its perceptions to a remote human driver as it navigates within an environment, so that the driver can achieve virtual presence and communicate with others remotely. To inform the design process, we will conduct focus groups with potential users. We will also evaluate our implemented systems throughout the process, creating an iterative design-test cycle.
Intellectual Merit: To be successful, an intelligent robot must not only perceive the world, represent what it learns, make useful inferences and plans, and act effectively; it must also communicate effectively with other agents, and particularly with people. This confluence of grounded knowledge representation, situated natural language understanding, and human-robot interaction is intellectually fundamental, and it is the focus of our research project. Investigating this confluence requires a substantial knowledge representation, which we have for the domain of spatial mapping and navigation. Since the domain of spatial knowledge is foundational for virtually all aspects of human knowledge, we believe that its significance will be broad.
Broader Impact: The results of this project will create technologies for mobility assistance for people with disabilities in perception (blindness or low vision), cognition (developmental delay or dementia), or general frailty (old age). It will also support telepresence applications such as telecommuting, telemedicine, and search and rescue. The project includes outreach to K–12 and community college students, K–12 teachers, and the public in a number of venues.
Our intelligent wheelchairs are named Vulcan, after the only Roman god with a disability. He was also said to be the first blacksmith, engineer, and roboticist.
Vulcan 1.0 was the Intelligent Wheelchair at the University of Texas at Austin.
Vulcan 2.0 is the Intelligent Wheelchair at the University of Michigan.
Paul Foster, Collin Johnson and Benjamin Kuipers. 2023.
The Reflectance Field Map: Mapping glass and specular surfaces in dynamic environments.
IEEE Int. Conf. on Robotics and Automation (ICRA), 2023.
Collin Johnson and Benjamin Kuipers. 2018.
Socially-aware navigation using topological maps and social norm learning.
AAAI/ACM Conf. on Artificial Intelligence, Ethics, and Society (AIES), 2018.
Collin E. Johnson. 2018.
Topological Mapping and Navigation in Real-World Environments.
Doctoral dissertation, Computer Science & Engineering, University of Michigan, 2018.
Tom Williams, Collin Johnson, Matthias Scheutz and Benjamin Kuipers. 2017.
A tale of two architectures: A dual-citizenship integration of natural language and the cognitive map.
Int. Conf. Autonomous Agents and Multi-Agent Systems (AAMAS), 2017.
Jong Jin Park, Seungwon Lee and Benjamin Kuipers. 2017.
Discrete-time dynamic modeling and calibration of differential-drive mobile robots with friction.
IEEE Int. Conf. Robotics and Automation (ICRA), 2017.
Jong Jin Park. 2016.
Graceful Navigation for Mobile Robots in Dynamic and Uncertain Environments.
Doctoral dissertation, Mechanical Engineering Department, University of Michigan, 2016.
Jong Jin Park and Benjamin Kuipers. 2015.
Feedback motion planning via non-holonomic RRT* for mobile robots.
IEEE/RSJ Int. Conf. Intelligent Robots and Systems (IROS), 2015.
Grace Tsai. 2014.
On-line, incremental visual scene understanding for an indoor navigating robot.
Doctoral dissertation, Department of Electrical Engineering and Computer Science, University of Michigan, 2014.
Grace Tsai and Benjamin Kuipers. 2014.
Handling perceptual clutter for robot vision with partial model-based interpretations.
IEEE/RSJ Int. Conf. Intelligent Robots and Systems (IROS), 2014.
Grace Tsai, Collin Johnson, and Benjamin Kuipers. 2014.
Semantic visual understanding of indoor environments: from structures to opportunities for action.
Vision Meets Cognition Workshop (CVPR), 2014.
Grace Tsai and Benjamin Kuipers. 2013.
Focusing attention on visual features that matter.
British Machine Vision Conference (BMVC), 2013.
Paul Foster, Zhenghong Sun, Jong Jin Park and Benjamin Kuipers. 2013.
VisAGGE: Visible Angle Grid for Glass Environments.
IEEE Int. Conf. on Robotics and Automation (ICRA), 2013.
Jong Jin Park and Benjamin Kuipers. 2013.
Autonomous person pacing and following with Model Predictive Equilibrium Point Control.
IEEE Int. Conf. on Robotics and Automation (ICRA), 2013.
Changhai Xu, Jingen Liu and Benjamin Kuipers. 2012.
Moving object segmentation using motor signals.
European Conf. on Computer Vision (ECCV), 2012.
Grace Tsai and Benjamin Kuipers. 2012.
Dynamic visual understanding of the local environment for an indoor navigating robot.
IEEE/RSJ Int. Conf. on Intelligent Robots and Systems (IROS), 2012.
Jong Jin Park, Collin Johnson, and Benjamin Kuipers. 2012.
Robot Navigation with Model Predictive Equilibrium Point Control.
IEEE/RSJ Int. Conf. on Intelligent Robots and Systems (IROS), 2012.
Collin Johnson and Benjamin Kuipers. 2012.
Efficient search for correct and useful topological maps.
IEEE/RSJ Int. Conf. on Intelligent Robots and Systems (IROS), 2012.
Grace Tsai, Changhai Xu, Jingen Liu and Benjamin Kuipers. 2011.
Real-time indoor scene understanding using Bayesian filtering with motion cues.
Int. Conf. on Computer Vision (ICCV), 2011.
Jong Jin Park and Benjamin Kuipers. 2011.
A smooth control law for graceful motion of differential wheeled mobile robots in 2D environments.
IEEE Int. Conf. on Robotics and Automation (ICRA), 2011.
S. Gulati, C. Jhurani, B. Kuipers and R. Longoria. 2009.
A framework for planning comfortable and customizable motion of an assistive mobile robot.
IEEE/RSJ Int. Conf. on Intelligent Robots and Systems (IROS), 2009.
A. Murarka and B. Kuipers. 2009.
A stereo vision based mapping algorithm for detecting inclines, drop-offs, and obstacles for safe local navigation.
IEEE/RSJ Int. Conf. on Intelligent Robots and Systems (IROS), 2009.
Patrick Beeson, Joseph Modayil and Benjamin Kuipers. 2010.
Factoring the mapping problem: Mobile robot map-building in the Hybrid Spatial Semantic Hierarchy.
International Journal of Robotics Research 29(4): 428-459, 2010.
A. Murarka, M. Sridharan and B. Kuipers. 2008.
Detecting obstacles and drop-offs using stereo and motion cues for safe local motion.
IEEE/RSJ Int. Conf. on Intelligent Robots and Systems (IROS), 2008.
Patrick Beeson, Matt MacMahon, Joseph Modayil, Aniket Murarka, Benjamin Kuipers and Brian Stankiewicz. 2007.
Integrating multiple representations of spatial knowledge for mapping, navigation, and communication.
In Interaction Challenges for Intelligent Assistants, AAAI Spring Symposium Series, Stanford, CA.
R. C. Simpson, E. F. LoPresti and R. A. Cooper. How many people would benefit from a smart wheelchair? Journal of Rehabilitation Research and Development 45(1): 53-72, 2008.
R. C. Simpson. Smart wheelchairs: a literature review. Journal of Rehabilitation Research and Development 42(4): 423-436, 2005.