Researchers taught a robot to suture by showing it surgery videos
Robot see, robot do.
Andrew Tarantola, June 16, 2020

Stitching a patient back together
after surgery is a vital but monotonous task for medics, often requiring them
to repeat the same simple movements over and over hundreds of times. But thanks
to a collaborative effort between Intel and the University of
California, Berkeley, tomorrow’s surgeons could offload that grunt
work to robots -- like a macro, but for automated suturing.
The UC
Berkeley team, led by Dr. Ajay Tanwani, has developed a semi-supervised
deep-learning system dubbed Motion2Vec. The
system is designed to watch publicly available videos of surgeries performed by actual
doctors, break down the surgeon's movements when suturing (needle insertion,
extraction and hand-off) and then mimic them with a high degree of
accuracy.
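To make the "semi-supervised" part concrete, here is a minimal sketch of the general idea: embed video frames as feature vectors, hand-label only a small fraction of them with the suturing sub-actions, and let an algorithm spread those labels to the rest. This is an illustrative assumption, not the Berkeley team's code; the synthetic frame embeddings and the scikit-learn LabelPropagation model stand in for Motion2Vec's actual video features and network.

```python
# Toy illustration of semi-supervised labeling of video frames, in the spirit
# of Motion2Vec (but NOT its actual architecture): propagate a handful of
# expert labels (insertion / extraction / hand-off) to many unlabeled frames.
import numpy as np
from sklearn.semi_supervised import LabelPropagation

rng = np.random.default_rng(0)
SUB_ACTIONS = ["needle_insertion", "needle_extraction", "needle_handoff"]

# Synthetic stand-in for frame embeddings: each sub-action clusters
# around its own centroid in a 16-dimensional feature space.
centroids = rng.normal(size=(len(SUB_ACTIONS), 16))
frames_per_action = 200
X = np.vstack([c + 0.3 * rng.normal(size=(frames_per_action, 16))
               for c in centroids])
true_labels = np.repeat(np.arange(len(SUB_ACTIONS)), frames_per_action)

# Semi-supervised setup: only ~5% of frames carry an expert annotation;
# the rest are marked -1 (unlabeled), mimicking sparsely labeled video.
y = np.full(len(true_labels), -1)
labeled_idx = rng.choice(len(true_labels), size=30, replace=False)
y[labeled_idx] = true_labels[labeled_idx]

model = LabelPropagation(kernel="rbf", gamma=0.5)
model.fit(X, y)

accuracy = (model.transduction_ == true_labels).mean()
print(f"Frames labeled by hand: {len(labeled_idx)} / {len(true_labels)}")
print(f"Sub-action labels recovered overall: {accuracy:.1%}")
```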
“There's a lot of appeal in learning from visual observations,
compared to traditional interfaces for learning in a static way or learning
from [mimicking] trajectories, because of the huge amount of information
content available in existing videos,” Tanwani told Engadget. When it comes to
teaching robots, a picture, apparently, is worth a thousand words.