Lynn Conway and Charles Cohen



Video Mirroring and Iconic Gestures: Enhancing Basic Videophones to Provide Visual Coaching and Visual Control. In this paper we present concepts and architectures for mirroring and gesturing into remote sites when video conferencing. Mirroring enables those at one site to visually coach those at a second site by pointing at locally referenceable objects in the scene reflected back to the second site. Thus mirroring provides a way to train people at remote sites in practical tasks such as operating equipment and assembling or fixing things. We also discuss how video mirroring can be extended to enable visual control of remote mechanisms, even when using basic videophones, by using a visual interpreter at the remote site to process transmitted visual cues and derive intended control actions in the remote scene. [Lynn Conway and Charles J. Cohen, IEEE Transactions on Consumer Electronics, Vol. 44, No. 2, May 1998]


Dynamic System Representation, Generation, and Recognition of Basic Oscillatory Motion Gestures. We present a system for generation and recognition of oscillatory gestures. Inspired by gestures used in two representative human-to-human control areas, we consider a set of oscillatory (circular) motions and refine from them a 24-gesture lexicon. Each gesture is modeled as a dynamic system with added geometric constraints to allow for real-time gesture recognition using a small amount of processing time and memory. The gestures are used to control a pan-tilt camera neck. We propose extensions for use in areas such as mobile robot control and telerobotics. [Charles J. Cohen, Lynn Conway, and Dan Koditschek. 2nd International Conference on Automatic Face and Gesture Recognition, Killington, Vermont, October 1996.]
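The abstract's idea of modeling each gesture as a dynamic system can be illustrated with a minimal sketch. This is not the paper's implementation; it assumes a toy lexicon where each gesture is a planar circular oscillation distinguished only by angular frequency, and recognizes an observed trajectory by choosing the candidate oscillator model with the smallest residual error:

```python
import math

# Hypothetical sketch (not the paper's system): each gesture is a planar
# oscillator x(t) = (cos(w t), sin(w t)), and recognition picks the
# candidate frequency w whose model best matches the observed trajectory.

def sample_gesture(omega, n=100, dt=0.02):
    """Generate a noiseless circular gesture at angular frequency omega."""
    return [(math.cos(omega * i * dt), math.sin(omega * i * dt))
            for i in range(n)]

def residual(traj, omega, dt=0.02):
    """Sum of squared errors between traj and the oscillator model at omega."""
    return sum((x - math.cos(omega * i * dt)) ** 2 +
               (y - math.sin(omega * i * dt)) ** 2
               for i, (x, y) in enumerate(traj))

def recognize(traj, candidates, dt=0.02):
    """Return the candidate frequency whose oscillator best explains traj."""
    return min(candidates, key=lambda w: residual(traj, w, dt))

observed = sample_gesture(omega=3.0)
best = recognize(observed, candidates=[1.0, 2.0, 3.0, 4.0])  # best == 3.0
```

Because each candidate model is evaluated by a single pass over the trajectory, recognition of this kind needs little processing time and memory, in the spirit of the real-time constraint the abstract mentions.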

Telautonomous Systems: Projecting and Coordinating Intelligent Action at a Distance. There is a growing need for humans to perform complex remote control operations, and for the extension of the intelligence and experience of experts to distant applications. A blending of human intelligence, information technology, remote control, and intelligent autonomous systems is required. The authors have coined the term telautonomous technology (or "telautomation") for methods of producing intelligent action at a distance. This paper introduces a framework for envisioning telautomation that goes beyond autonomous control, in that it blends in human intelligence and action as appropriate; it also goes beyond teleoperation in that it incorporates as much autonomy as is possible or reasonable.

A novel method for solving one of the fundamental problems facing telautonomous systems is then discussed in this paper, namely the need to cope with time delays due to signal processing and signal propagation. New concepts, called time and position clutches, are introduced and described in detail. By allowing the time and position frames between the local user's control and the remote device being controlled to be desynchronized, these mechanisms enable substantial telemanipulation performance improvements when operating through time delays. These novel controls also yield a simple protocol for smooth handoffs of control of manipulation tasks between local operators and remote systems. [Lynn Conway, Richard A. Volz and Michael W. Walker, IEEE Transactions on Robotics and Automation, Vol. 6, No. 2, April 1990]
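The desynchronization idea behind the time clutch can be sketched in a few lines. This is a hypothetical illustration, not the paper's control system: with the clutch disengaged, the operator's commands accumulate in a buffer so the operator can run ahead of the delayed remote device, and the remote side drains the buffer at its own pace rather than forcing lockstep operation:

```python
from collections import deque

# Hypothetical sketch (not the paper's implementation): a "time clutch"
# that, when disengaged, decouples the operator's time frame from the
# remote device's by buffering the command stream.

class TimeClutch:
    def __init__(self):
        self.buffer = deque()  # commands the remote has not yet executed
        self.engaged = True    # engaged: operator is locked to remote pace

    def operator_command(self, cmd):
        """Queue a command; when disengaged, the operator need not wait."""
        self.buffer.append(cmd)
        # A real engaged mode would block until the remote confirms
        # execution; here we just report how many commands are in flight.
        return len(self.buffer)

    def remote_step(self):
        """Remote side executes the oldest pending command, if any."""
        return self.buffer.popleft() if self.buffer else None

tc = TimeClutch()
tc.engaged = False                      # desynchronize the time frames
for c in ["move", "grasp", "lift"]:
    tc.operator_command(c)              # operator runs ahead of the remote
executed = [tc.remote_step() for _ in range(3)]
# executed == ["move", "grasp", "lift"]
```

Handing off control to the remote system then amounts to letting it finish draining the buffer, which hints at why these clutches also give a simple protocol for smooth operator/remote handoffs.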