
Clickable World Interface

laser pointer interface

We have developed a novel interface for human-robot interaction and assistive mobile manipulation. The interface enables a human to intuitively and unambiguously select a 3D location in the world and communicate it to the robot. The human points at a location of interest and illuminates it ("clicks it") with an unaltered, off-the-shelf, green laser pointer. The robot detects the resulting laser spot with an omnidirectional, catadioptric camera with a narrow-band green filter. After detection, the robot moves its stereo pan/tilt camera to look at this location and estimates the location's 3D position with respect to the robot's frame of reference.
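To make the detection step concrete, the sketch below locates the brightest green spot in a single camera frame, assuming an OpenCV BGR image captured through the narrow-band green filter. The function name, channel weighting, and threshold are illustrative assumptions, not the system's actual code.

    import cv2
    import numpy as np

    def detect_laser_spot(frame_bgr, min_response=60.0):
        """Return the (x, y) pixel of the brightest green spot, or None."""
        b, g, r = cv2.split(frame_bgr.astype(np.float32))
        # Emphasize pixels that are bright in green but not in red or blue,
        # which suppresses white highlights and specular glare.
        response = g - 0.5 * (r + b)
        # Smooth so that a single noisy pixel cannot win.
        response = cv2.GaussianBlur(response, (5, 5), 0)
        _, max_val, _, max_loc = cv2.minMaxLoc(response)
        if max_val < min_response:
            return None   # no sufficiently bright spot in this frame
        return max_loc    # (x, y) in image coordinates

Once a spot is found in the omnidirectional image, the stereo pan/tilt camera is steered toward it and the 3D position is estimated by triangulation, as described above.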

Unlike previous approaches, this gesture-based pointing interface requires no instrumentation of the environment, uses an ordinary, unmodified pointing device, has low spatial error out to 3 meters, is fully mobile, and is robust enough for use in real-world applications.

a clickable world

When a user selects a 3D location, the selection triggers a robotic behavior that depends on the surrounding context. For example, if the robot has an object in its hand and detects a face near the click, it will deliver the object to the person at the selected location. In essence, virtual buttons are mapped onto the world, each with an associated behavior (see image above). The user clicks these virtual buttons by pointing at them and illuminating them with the laser pointer.
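One way to picture the virtual-button idea is as a list of 3D regions, each bound to a behavior, against which a clicked point is matched. The sketch below is illustrative only; the VirtualButton structure, its fields, and select_button are assumed names, not the interface's actual data structures.

    from dataclasses import dataclass
    from typing import Callable, Optional, Sequence
    import numpy as np

    @dataclass
    class VirtualButton:
        center: np.ndarray                      # 3D position in the robot's frame (meters)
        radius: float                           # how close a click must land to trigger it
        behavior: Callable[[np.ndarray], None]  # action to run with the clicked point

    def select_button(click_xyz: np.ndarray,
                      buttons: Sequence[VirtualButton]) -> Optional[VirtualButton]:
        """Return the closest button whose radius contains the clicked point."""
        hits = [b for b in buttons
                if np.linalg.norm(click_xyz - b.center) <= b.radius]
        if not hits:
            return None
        return min(hits, key=lambda b: np.linalg.norm(click_xyz - b.center))

A click that lands inside a button's radius would then run that button's behavior with the clicked point; a click that matches no button is ignored.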

In our object-fetching application, virtual buttons initially surround the objects in the environment. If the user illuminates an object ("clicks it"), the robot moves to the object, grasps it, and lifts it up. Once the robot has an object in its hand, a different set of virtual buttons is mapped onto the world: clicking near a person tells the robot to deliver the object to that person, clicking on a tabletop tells the robot to place the object on the table, and clicking on the floor tells the robot to move to the selected location.
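The two phases of the fetching application can be summarized as a simple context-to-behavior mapping. In the sketch below, the boolean context flags and the behavior names are illustrative; in the real system, face detection, tabletop detection, and the gripper state would supply this context.

    def behavior_for_click(holding_object, near_face, on_tabletop):
        """Map a laser click to a behavior given the robot's current context."""
        if not holding_object:
            return "fetch_object"        # phase 1: go to the clicked object, grasp it, lift it
        if near_face:
            return "deliver_to_person"   # phase 2: hand the object to the nearby person
        if on_tabletop:
            return "place_on_table"
        return "drive_to_location"       # otherwise treat the click as a navigation goal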

This project is funded by the Wallace H. Coulter Foundation as part of a Translational Research Partnership in Biomedical Engineering Award, "An Assistive Robot to Fetch Everyday Objects for People with Severe Motor Impairments".

Publications