What Have You Done?

[Left: Coco, an infant gorilla robot. Right: Cardea, a robot for mobile manipulation.]

I am deeply interested in intelligent systems that perceive and act within human environments. The following narrative gives a chronological, hyperlinked summary of my past research activities. For the cold facts you can look at my CV and publications.

At MIT I was a member of Rod Brooks's group. I initially worked on vision for the humanoid robot Cog and an independent Cog-like robotic head. This was followed by research with Cynthia Breazeal that led to our proposal for a new approach to artificial intelligence (AI) based on robots that emulate infant development. I then worked with Yuppy, a mobile pet robot, for which I focused on the actuated vision system and human-robot interaction. The Yuppy project evolved into the Coco project, which allowed me to contribute to the creation of a new robot from the initial brainstorming sessions through the design, construction, and programming. Coco (left image) ended up resembling a quadrupedal infant gorilla that could crawl around and sit up. For my Area Exam, I evaluated computational models of the rat hippocampus.

After passing my qualifying exams, I became interested in how wearable computing could be used to teach robots to behave intelligently. Paul Fitzpatrick and I created the first system to use a shoe-mounted camera to perform gait analysis, obstacle detection, and context recognition. For my dissertation, I created the first wearable system to autonomously infer a kinematic model of the wearer via body-mounted orientation sensors and a head-mounted camera. I also demonstrated a variety of applications for this fully untethered system, such as browsing the wearer's activities, acquiring image segments of the wearer's hand and manipulated objects, and discovering significant arm postures.

In parallel with my work on wearable computing, I continued to contribute to robotics research, with a growing interest in robot manipulation. I worked on mobile manipulation in built-for-human environments as part of the Cardea project. Cardea (right image), which was designed to open doors, used a Segway RMP base (a Segway platform built for robots) and an arm that was a predecessor to the arms of the humanoid robot Domo.

As a postdoctoral researcher, I worked with Aaron Edsinger and his humanoid robot Domo. We created new methods for manipulation in human environments, including novel approaches to tool use, visual-motor learning, and human-robot interaction. We also developed assistive applications that enabled Domo to help a seated person place everyday objects on a shelf or perform everyday bimanual tasks such as pouring and stirring.

These assistive applications led me to my current research, which focuses on the development of intelligent robots with autonomous capabilities for healthcare.