Kinect Gesture Recognition Empowers Surgical Robots

Purdue University is adapting Microsoft's Kinect gesture-recognition engine for a robotic nurse that recognizes hand gestures and offers assistance to surgeons during operations. The robotic nurse is expected to reduce the time it takes to perform operations.
By combining Microsoft's Kinect gesture-recognition hardware (originally designed for gaming) with a software development kit (SDK) from PrimeSense Ltd., Purdue University is developing a robotic scrub nurse to assist surgeons and other professionals.
Human scrub nurses today hand surgeons the proper surgical instrument when the surgeon makes a hand-out, palm-up gesture. Purdue's robotic scrub nurse performs the same task while watching the surgeon through a video camera, offering a hemostat in response to the open-hand gesture. The team soon plans to add voice recognition, in case the surgeon wants a scalpel, clamp, or forceps instead.
Robots at Purdue University are being trained to respond to gestures when assisting surgeons and other professionals (source: Purdue University photo).
"Voice recognition gives good performance today, but recognizing gestures has been the weak link for robotic assistants," said Purdue professor Juan Pablo Wachs. "In order to advance the state-of-the-art we added gesture recognition, which we found works much better when using the Kinect."
The researchers' first-generation scrub-nurse prototype used a standard video camera to recognize gestures. That model handled simple instructions, such as the hand-out, palm-up gesture. To indicate different instruments without voice recognition, however, the team had to train the prototype to identify gestures such as a cutting motion with the index and middle fingers to request scissors. Without the Kinect, such complex gestures could be mistaken for ordinary conversational gestures, according to Wachs.
"Kinect gives us a three-dimensional map of the surgeon's gestures, which allows us to disambiguate between symbolic gestures intended for the robot and those just used during conversation," said Wachs.
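The disambiguation Wachs describes hinges on depth information a flat video image lacks. A minimal sketch of the idea, with illustrative names and thresholds that are assumptions rather than Purdue's actual pipeline: a gesture counts as a command only when the hand is extended well in front of the body, toward the robot.

```python
# Hypothetical sketch: using depth (distance from the camera) to separate
# command gestures from conversational ones. Function name, parameters,
# and the threshold value are illustrative assumptions.

def is_command_gesture(hand_depth_mm: float, torso_depth_mm: float,
                       reach_threshold_mm: float = 300.0) -> bool:
    """Treat a gesture as a command only when the hand is extended
    well in front of the torso -- a cue a 2-D camera cannot provide."""
    return (torso_depth_mm - hand_depth_mm) >= reach_threshold_mm

# A hand held near the body reads as conversation...
print(is_command_gesture(hand_depth_mm=1450, torso_depth_mm=1500))  # False
# ...while a hand thrust toward the robot reads as a request.
print(is_command_gesture(hand_depth_mm=1100, torso_depth_mm=1500))  # True
```

A 2-D camera sees roughly the same silhouette in both cases, which is why the first-generation prototype confused symbolic and conversational gestures.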
Gesture recognition also lets the surgeon direct a nearby computer to display images relevant to the current procedure. Doing the same with a laptop would mean tapping its keys, slowing the operation and risking infection from bacteria on the keyboard. With gesture recognition, assisted by voice recognition, a surgeon could ask for the "interior view," then flip between X-ray images of the patient with "brush" gestures similar to those used on touch screens to turn pages.
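The touchless image browsing described above amounts to mapping recognized gestures onto viewer actions. A minimal sketch, assuming hypothetical gesture labels and image file names (none of which come from the Purdue system):

```python
# Hypothetical sketch of a touchless image browser: recognized gesture
# labels (illustrative names) drive the viewer, so the surgeon never
# touches a keyboard.

class ImageBrowser:
    def __init__(self, images):
        self.images = images  # ordered list of images for the procedure
        self.index = 0        # currently displayed image

    def handle_gesture(self, gesture: str) -> str:
        """Advance or rewind on 'brush' gestures; return the image shown."""
        if gesture == "brush_right" and self.index + 1 < len(self.images):
            self.index += 1
        elif gesture == "brush_left" and self.index > 0:
            self.index -= 1
        return self.images[self.index]

viewer = ImageBrowser(["xray_front.png", "xray_side.png", "xray_top.png"])
print(viewer.handle_gesture("brush_right"))  # xray_side.png
print(viewer.handle_gesture("brush_left"))   # xray_front.png
```

In practice the gesture labels would come from the recognition engine, and the same dispatch pattern extends to voice commands such as "interior view."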
Beyond surgical assistance, the research team, which includes Purdue doctoral candidates Mithun Jacob and Yu-Ting Li, plans to adapt the system to several other application areas, including coordination of emergency response activities during crisis management and disaster relief, human-robot communications, and entertainment.
The Purdue University prototype robotic scrub nurse was developed at the School of Veterinary Medicine using anthropometry algorithms that model the dimensions and proportions of the human hand, yielding highly accurate gesture recognition even under difficult lighting conditions, according to the team.
Funding was provided by the U.S. Agency for Healthcare Research and Quality.