Virtual Human Hand Simulation Holds Promise for Prosthetics, Virtual Reality, Medical Education

September 23, 2019

Working with a clinical radiologist, USC computer scientists combined visual effects techniques and medical imaging to create a precise model of the human hand in motion

USC Computer Science Professor Jernej Barbic (right) and PhD candidate Bohan Wang developed the world’s most realistic model of the human hand’s musculoskeletal system in motion. Photo/Haotian Mai.

Whatever our hands do—reaching, grabbing or manipulating objects—it always appears simple. Yet our hands are among the most complicated, and important, parts of the body.

Despite this, little is understood about the hand's complex underlying anatomy, and as a result, animating human hands has long been considered one of the most challenging problems in computer graphics.

That’s because it has been impossible to capture the internal movement of the hand in motion—until now.

Using magnetic resonance imaging (MRI) and a technique inspired by the visual effects industry, a team of USC researchers, comprising two computer scientists and a radiologist, has developed the world’s most realistic model of the human hand’s musculoskeletal system in motion.

The musculoskeletal system includes muscles, bones, tendons and joints. The breakthrough has implications not only for computer graphics, but also prosthetics, medical education, robotics and virtual reality.

“The hand is very complicated, but prior to this work, nobody had built a precise computational model for how anatomical structures inside the hand actually move as it is articulated,” said study co-author Jernej Barbic, an Andrew and Erna Viterbi Early Career Chair and Associate Professor of Computer Science.

The dataset is now available for use, free with attribution. 

Designing better prosthetic technology 

To tackle this problem, Barbic, a computer animation and physically-based simulation expert, and his PhD student, Bohan Wang, the study’s lead author, teamed up with George Matcuk, MD, an associate professor of clinical radiology at Keck School of Medicine of USC. The result: the most precise anatomically based model of the hand in motion.

“This is currently the most accurate hand animation model available and the first to combine laser scanning of the hand’s surface features and to incorporate an underlying bone rigging model based on MRI,” said Matcuk.

In addition to creating more realistic hands for computer games and CGI movies, where hands are often exposed, this system could also be used in prosthetics, to design better finger and hand prostheses.

“Understanding the motion of internal hand anatomy opens the door for biologically-inspired robotic hands that look and behave like real hands,” said Barbic.

“In the not-so-distant future, the work may contribute to the development of anatomically realistic hands and improved hand prosthetics.”

The study, titled Hand Modeling and Simulation using Stabilized Magnetic Resonance Imaging, was presented at ACM SIGGRAPH in August 2019.

A long-standing challenge

To improve realism, virtual hands should be modeled similarly to biological hands, which requires building precise anatomical and kinematic models of real human hands. But we still know surprisingly little about how bones and muscles move inside the hand.

One of the reasons is that, until now, there have been no methods to systematically acquire the motion of internal hand anatomy. Although MRI scanners can provide anatomical details, a previously unaddressed practical challenge exists: the hand must be kept perfectly still in the scanner for around 10 minutes.

“Holding the hand still in a fixed pose for 10 minutes is practically impossible,” said Barbic. “A fist is easier to hold steady, but try semi-closing your hand and you’ll find you start to shake after about a minute or two. You can’t hold it still for 10 minutes.”

To overcome this challenge, the researchers developed a manufacturing process using lifecasting materials from the special effects industry to stabilize the hand during the MRI scanning process. Lifecasting involves making a mold of the human form and then reproducing it in various media, including plastic or silicone.

Barbic, who worked on the Oscar-nominated film The Hobbit: The Desolation of Smaug, landed on the idea after seeing an inexpensive hand-cloning product in a visual effects store in Los Angeles while working on a previous project. “That was the eureka moment,” said Barbic, who had long pondered how to create more realistic virtual human hands.

First, the team used the lifecasting material to create a plastic replica of the model's hand. This replica captured extremely detailed features, down to individual pores and tiny lines on the hand's surface, and was then scanned with a laser scanner.

Then, the lifecasting process was used again, this time on the plastic hand, to create a negative 3D mold of the hand out of a rubber-like elastic material. The mold stabilizes the hand in the required pose. It was cut into two parts, and the subject then placed their real hand into the mold for MRI scanning.

With assistance from radiology expert Matcuk, a practicing medical doctor at USC, the hand was then scanned in the MRI scanner for 10 minutes. This procedure was repeated 12 times, each time in a different hand pose. Two subjects, one male and one female, were captured in this way. Now, for every pose, the researchers knew exactly where the bones, muscles and tendons were positioned.
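
For illustration only, the captured data can be thought of as a small hierarchy: each subject has a set of scanned poses, and each pose stores the segmented bone, muscle and tendon geometry. The sketch below is a hypothetical layout, not the format of the released dataset.

```python
# A hypothetical layout (not the released dataset's actual format) for the
# multi-pose captures: two subjects, each scanned in 12 poses, with the
# segmented anatomy stored per pose.
from dataclasses import dataclass, field

@dataclass
class PoseScan:
    pose_name: str                               # e.g. "fist", "semi-open" (illustrative names)
    bones: dict = field(default_factory=dict)    # bone name -> segmented mesh
    muscles: dict = field(default_factory=dict)  # muscle name -> segmented mesh
    tendons: dict = field(default_factory=dict)  # tendon name -> segmented mesh

@dataclass
class Subject:
    subject_id: str                              # one male and one female subject were captured
    poses: list = field(default_factory=list)    # 12 PoseScan entries per subject
```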

After discussing the anatomical features of the MRI scans with Matcuk, Barbic and Wang set to work building a data-driven skeleton kinematic model that captures complex real-world rotations and translations of bones in any pose.
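
To give a rough sense of what a data-driven kinematic model can look like, the sketch below blends each bone's rigid transform from the captured example poses, weighting the examples by proximity in joint-angle space. This is a simplified stand-in, not the authors' actual model; the inverse-distance weighting and the function names are assumptions made for illustration.

```python
# A simplified sketch (not the authors' actual model) of a data-driven bone
# kinematic model: each captured MRI pose contributes one rigid transform per
# bone, and a new pose blends those transforms with weights derived from
# distances in joint-angle space.
import numpy as np
from scipy.spatial.transform import Rotation

def blend_weights(query_angles, example_angles, eps=1e-6):
    """Inverse-distance weights over the captured example poses."""
    d = np.linalg.norm(np.asarray(example_angles) - np.asarray(query_angles), axis=1)
    w = 1.0 / (d + eps)
    return w / w.sum()

def blend_bone_transform(query_angles, example_angles, rotations, translations):
    """Blend one bone's example rigid transforms into a new rotation R and translation t."""
    w = blend_weights(query_angles, example_angles)
    # Weighted rotation average: top eigenvector of the weighted quaternion
    # outer-product matrix (Markley's method).
    q = Rotation.from_matrix(rotations).as_quat()             # (n, 4) quaternions
    M = np.einsum("i,ij,ik->jk", w, q, q)
    R = Rotation.from_quat(np.linalg.eigh(M)[1][:, -1]).as_matrix()
    t = w @ np.asarray(translations, dtype=float)             # weighted translation
    return R, t
```

The real model must also keep the blended bone transforms consistent with the joints; the point here is only the example-blending structure.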

They then added soft tissue simulation, using the finite element method (FEM) to compute the motion of the hand's muscles, tendons and fat tissue, consistent with the bone motion. This model, combined with the surface detail, allowed them to create a highly realistic moving hand. The hand can be animated in any motion, even movements very different from the captured poses.
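
To give a flavor of how bone motion can drive the surrounding soft tissue in an FEM setting, the toy solve below treats vertices attached to bones as prescribed (Dirichlet) boundary conditions and solves a quasi-static linear system for the free vertices. It assumes a precomputed linear-elasticity stiffness matrix K; the paper's simulation is far more sophisticated, so this is only a structural sketch.

```python
# A minimal quasi-static linear FEM sketch (a simplification, not the paper's
# method): displacements of bone-attached vertices are prescribed from the bone
# motion, and the remaining soft-tissue vertices are solved for via
# K_ff u_f = -K_fc u_c.
import numpy as np

def solve_soft_tissue(K, prescribed_idx, prescribed_disp, n_vertices):
    """K: (3n, 3n) stiffness matrix; prescribed_idx: bone-attached vertex indices;
    prescribed_disp: (m, 3) displacements of those vertices from the bone transforms."""
    all_dofs = np.arange(3 * n_vertices)
    # Constrained degrees of freedom (x, y, z of each bone-attached vertex).
    c = np.concatenate([3 * np.asarray(prescribed_idx) + k for k in range(3)])
    f = np.setdiff1d(all_dofs, c)                    # free degrees of freedom
    u = np.zeros(3 * n_vertices)
    u[c] = np.concatenate([np.asarray(prescribed_disp)[:, k] for k in range(3)])
    # Static equilibrium of the free DOFs given the prescribed ones.
    u[f] = np.linalg.solve(K[np.ix_(f, f)], -K[np.ix_(f, c)] @ u[c])
    return u.reshape(n_vertices, 3)                  # per-vertex displacement
```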

Going forward

The team, which recently received a grant from the National Science Foundation to take the work to the next stage, plans to build a public dataset of multi-pose hand MRI scans for 10 subjects over the next three years. This will be the first dataset of its kind and will enable researchers from around the world to better simulate, model and re-create human hands. The team also plans to integrate the research into education, training PhD students at USC and supporting K-12 outreach programs.

“As we refine this work, I think this could be an excellent teaching tool for my students and other doctors who need an understanding of the complex anatomy and biomechanics of the hand,” said Matcuk.

The team is currently working on giving the model a more detailed treatment of the muscles and tendons, and on making it run in real time. Right now, it takes about an hour of computation to create a minute-long simulation. Barbic and Wang hope to make the system faster without losing quality.

 
