After Dr. Carla Pugh and a resident, or doctor-in-training, cracked open the chest of a crash victim, the patient’s heart stopped. The resident squeezed the heart with her fingers, and the beats returned. But then the heart stopped again. Pugh reached in and felt something hard. She cut open the area, and a blue clot of blood gushed out. The heart resumed beating, robustly this time, allowing Pugh to send the patient to the operating room while she examined other injured patients in UW Hospital’s emergency room. After the incident last year, Pugh wondered: Why did the resident miss the clot? Maybe, she figured, the resident hadn’t been taught how a clot feels.
Pugh, director of UW Health’s clinical simulation program, is a champion of haptic technology, the science of touch. Medical students must pass oral and computer-based exams to become doctors, but they aren’t evaluated on how well they can feel a cancerous tumor in a breast, a blocked duct in a pancreas, a clot in a heart. “We don’t have a test for hands-on skills, and we desperately need one,” Pugh told an audience in San Francisco in September at the TEDMED conference, an annual gathering of innovators.
Pugh is developing anatomical models embedded with sensors and using motion-tracking systems to monitor doctors’ movements, steps that could help shift medicine toward more tactile training and testing. Her tools include sponges, balloons and squishy balls she finds at craft and fabric stores — whatever feels like a particular organ, growth or tissue layer to her. To mimic lumps in breasts that can mask tumors, for example, she uses lentils. “We’ve scoured most of the hobby shops in the city,” she said.
The National Board of Medical Examiners, which oversees the country’s medical licensing exams, agrees that hands-on simulation should become part of the process, said Dr. Kim Edward LeBlanc, executive director of the Clinical Skills Evaluation Collaboration, which administers part of the licensing exam. “We are looking at that quite heavily,” LeBlanc said. “We want to simulate abnormal (physical) findings in our exam.”

Pugh, 48, grew up in Berkeley, California, and went to Howard University College of Medicine in Washington, D.C. She was surprised that medical school and surgical training relied so much on books. “I ... found it odd that I was expected to read about what I’m supposed to do with my hands,” she told the TEDMED conference.
Late in medical school, she observed a surgery to remove a prostate gland, she recalled in an interview. “For at least two hours, all I saw was the back of the surgeon’s hand,” she said. “I’m like, are you kidding me?” She didn’t dare break protocol by talking directly to the surgeon. She asked a resident, “How are you learning what he’s doing?” To her surprise, the surgeon responded, saying she had asked a great question. “He grabbed my hand and shoved it into the pelvis,” she said. “He said, ‘Do you feel that little ball?’ ” She did feel it, and the experience gave her a better sense of what a prostate feels like — similar to a walnut, but softer, she said.
Residents — medical school graduates undergoing years of additional training before they can work on their own — get chances to feel healthy and irregular tissue, such as the clot in the emergency room patient’s heart. But in the rush of patient care, when the patient’s well-being takes precedence over training, teachable moments can vanish quickly, Pugh said. She never discussed the clot with her resident, for example, because they had to treat other patients that day and the resident later moved to a different institution. “These lost opportunities happen every hour,” Pugh said. That is why she believes simulation is the solution.

After her surgical training, she earned a PhD in education at Stanford University, where she obtained the first of two patents on medical applications of sensor technology. She became clinical director of UW Health’s clinical simulation program in 2012; the program features a $6 million facility on the first floor of UW Hospital.
The facility’s high-tech manikins — anatomical models used in health care, as opposed to storefront mannequins — enable students and health care workers to learn how to put breathing tubes in patients, insert catheters, tie stitches, lift patients out of bed, respond to emergencies and use new models of equipment such as defibrillators and surgical tools.

But it is in Pugh’s research lab, on the third floor of the hospital, where most of her innovations unfold. The lab, which has 18 researchers, is supported by two federal grants: $1.6 million from the National Institutes of Health to develop sensor-enabled breast models to quantify breast exam skills; and $2 million from the Department of Defense to examine “skills decay” — what can happen when doctors spend time away from regular work, such as for military duty. Summarizing both projects, Pugh said, “I bring mistakes to life that people don’t realize they can make.”
Using a breast model made with silicone, silicone gel, lentils, two types of cotton and a hard clay ball to mimic a tumor, Pugh evaluated experienced doctors while they conducted a breast exam. About 15 percent of the doctors missed the tumor, meaning their technique was ineffective, she said.

For the military project, she has developed “box trainers” — partial anatomical models housed in box-like cases — for four procedures: placing a central line, inserting a bladder catheter, connecting two portions of intestine and repairing a hernia.
She is testing medical residents in Wisconsin, Illinois and Minnesota who already planned to spend two years away from surgical training to do research, a proxy for military duty. Pugh is evaluating the residents at the beginning of their leave, midway through it and at the end. “We’re looking at what gets lost when you have time away,” she said. The projects could someday lead to hands-on tests of doctors’ skills, Pugh said.
Medical students must pass three computer-based exams to become doctors. To assess communication skills, the National Board of Medical Examiners added encounters with patient actors to the second exam a decade ago. Specialty groups such as the American Board of Surgery generally require a computer-based exam and an oral exam for certification, and a computer exam for re-certification every 10 years.

Hands-on assessments could be added during medical school to weed out students from certain specialties, and given to experienced doctors to make sure they keep up with the latest techniques, Pugh said. She would like to see national medical boards adopt hands-on evaluations and hospitals use them when granting privileges to doctors. It might not be easy to get experienced doctors to agree to such scrutiny, Pugh acknowledged. But if they want to ensure the best patient care, they should, she said. “It’s going to require a culture change,” she said.