In the Medical FUSION Lab, engineers are combining medical imaging and robotics to make healthcare better, safer, and more cost-effective.
While surgeons prepare a patient with end-stage renal disease for a transplant, a helicopter delivers a donor kidney to the hospital helipad.
Before the new organ can be implanted, it must undergo a rigorous evaluation to be sure it’s free of pathogens and structural problems—damaged tubules, for example—that might prevent it from functioning properly in the recipient’s body.
Conventionally, the structural analysis is done by biopsy. On this day, however, the kidney is placed beneath a robotic arm that moves a sensor just above its surface. In less time than it would take to complete and analyze several isolated biopsies, the robot has scanned the entire kidney, showing that it is in excellent condition.
This futuristic scenario may soon become a reality thanks to research under way in the Medical FUSION (Frontier Ultrasound Imaging and Robotic Instrumentation) Laboratory at WPI. Founded by Haichong (Kai) Zhang, associate professor of biomedical engineering and robotics engineering, the lab and its team of graduate students and undergraduates are at the forefront of efforts to merge medical imaging and robotics to create new applications that will make surgery and clinical practice more effective, safer, and less costly for physicians and patients.
Zhang says the lab’s mission is to look at how healthcare providers currently deliver care to see if that work can be improved by adding imaging, automation, or, more often, both. “We are interested,” he says, “in how robotic technology can work together with imaging to make healthcare more functional.”
The kidney transplantation project is a good example of this fusion. About 90,000 Americans were on waiting lists for new kidneys in 2024, while fewer than 30,000 were transplanted in all of 2023. The global shortage of kidneys suitable for transplants is due, in part, to a high rejection rate for donor organs, which, Zhang says, is exacerbated by limitations in the current evaluation method.
The solution involves imaging and robotics. In collaboration with biomedical researchers at the University of Massachusetts and the University of Oklahoma, Zhang and PhD candidate Xihan Ma are exploring the use of an imaging technique called optical coherence tomography (OCT) as a replacement for kidney biopsies. OCT uses near-infrared light that can penetrate biological tissue. By observing how different tissues reflect and change the light, the OCT scanner can create a high-resolution image of an organ’s internal structure.
Since its beam is quite narrow, a single OCT scan will not “see” much more of a kidney’s interior than can be sampled with a biopsy needle. As a first step, Zhang and Ma are working with physicians at Georgetown University Hospital on a patient study to determine whether this noninvasive snapshot is at least as good as a biopsy at evaluating a kidney’s status. The next step, Zhang says, will be to place a robot in the mix.
A robotic optical coherence tomography system hovers over a 3D-printed replica of a kidney.
Already, he and Ma have shown that a general-purpose robot, armed with an OCT device and custom algorithms, can scan an entire kidney in about five minutes. Scans of animal kidneys and one rejected human kidney showed that OCT can give doctors a complete, detailed picture of a kidney’s interior, which should give them better information than they currently have available to decide whether a kidney is transplant-ready.
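To make the coverage problem concrete: because a single OCT strip is so narrow, the robot has to sweep the beam over the whole organ in an overlapping pattern. The short Python sketch below plans a simple serpentine path over a kidney-sized footprint; the dimensions, strip width, and function names are illustrative assumptions, not the FUSION Lab's actual scanning algorithm.

```python
# Hypothetical sketch: planning a serpentine scan path so a narrow OCT
# beam covers a whole kidney surface. All dimensions and step sizes are
# illustrative, not the FUSION Lab's actual parameters.
import numpy as np

def serpentine_path(width_mm, height_mm, strip_mm):
    """Return (x, y) waypoints that sweep a rectangular region in
    back-and-forth rows spaced one OCT strip width apart."""
    ys = np.arange(0.0, height_mm + strip_mm, strip_mm)
    waypoints = []
    for i, y in enumerate(ys):
        xs = [0.0, width_mm] if i % 2 == 0 else [width_mm, 0.0]
        waypoints.extend((x, y) for x in xs)
    return waypoints

# A kidney footprint of roughly 120 x 60 mm scanned in 5 mm strips
path = serpentine_path(120.0, 60.0, 5.0)
print(len(path), "waypoints, e.g.", path[:4])
```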
While OCT is an emerging form of medical imaging, ultrasound has been widely used for many years because of its versatility and portability. Zhang and his team are looking at ways to make this imaging modality even more useful through the addition of robotics. For example, during the COVID-19 pandemic, ultrasound was the go-to tool for quick diagnosis because it could be used right in the emergency room.
“But emergency room personnel didn’t want to be exposed to infected patients longer than necessary,” Zhang says. “If a robot can do the scan for them, they can focus on diagnosis and treatment.”
With an award from the National Institutes of Health, Zhang’s lab led an international team that quickly developed a robotic scanner that can be tele-operated to safely scan patients’ lungs from a nearby station or even a remote doctor’s office. The system, first tested in Nigeria as part of a WPI-led initiative to deploy robotic solutions to African countries, was low-cost and rugged, but limited in its capabilities.
Members of the Medical FUSION Lab, from left: Ryo Murakami, Yichuan Tang, Associate Professor Kai Zhang, and Xihan Ma.
Since then, Zhang and Ma, in partnership with emergency physicians at Beth Israel Deaconess Medical Center in Boston, have developed a more sophisticated system that uses a versatile robotic arm and artificial intelligence to fully automate ultrasound lung scans. The robot is driven by a machine-learning algorithm trained on hundreds of ultrasound images of volunteers in the diagnostic medical sonography program at the Massachusetts College of Pharmacy and Health Sciences in Worcester. “When doctors look at ultrasound scans,” Ma says, “they don’t want their view to be blocked by a rib. With our AI, the robot can understand what it is seeing and automatically change the position of the scanner to get the best angle.”
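A minimal sketch of that sense-and-reposition idea follows. The classifier and robot interface here are stand-ins invented for illustration; the lab's actual model, training pipeline, and robot API are not described in this article.

```python
# Hypothetical sketch of the sense-classify-reposition loop Ma describes.
# classify_frame() and ProbeRobot are simple stand-ins, not the lab's code.
import random

def classify_frame(frame):
    # Stand-in for the trained network; here it just labels frames at random.
    return random.choice(["clear", "rib_shadow"])

class ProbeRobot:
    """Hypothetical minimal robot interface (not the lab's API)."""
    def acquire_frame(self):
        return "ultrasound_frame"             # placeholder for image data
    def nudge(self, dx_mm=2.0, dtheta_deg=3.0):
        pass                                  # small probe translation and tilt

def autonomous_lung_scan(robot, max_adjustments=20):
    for _ in range(max_adjustments):
        frame = robot.acquire_frame()
        if classify_frame(frame) == "clear":
            return frame                      # unobstructed view between the ribs
        robot.nudge()                         # reposition the probe and retry
    raise RuntimeError("no unobstructed acoustic window found")

print(autonomous_lung_scan(ProbeRobot()))
```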
While developed to scan the lungs, the system can be adapted for use on many other organs, Zhang says. One target is the thyroid gland. Doctors use ultrasound to look for thyroid nodules, lumps that are usually benign, but which, in about five percent of cases, can become cancerous and need to be removed. For these scans, the challenge is repeatability.
“If doctors spot something suspicious,” Zhang says, “they will usually ask patients to come back in three to six months for another scan. But after that much time, they are unlikely to remember exactly how they did the scan. If the position or angle is different, even slightly, it will completely change the appearance of the thyroid, making it hard to judge if the nodules are getting larger. We are trying to automate thyroid scans, because a robotic system can remember and repeat scans precisely.”
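The repeatability Zhang describes comes down to recording the probe's pose at the first visit and replaying it at the follow-up. The toy sketch below shows that record-and-replay step; the file format, coordinates, and robot commands are hypothetical.

```python
# Hypothetical sketch of why a robot makes follow-up thyroid scans
# repeatable: the probe pose (position + orientation) from the first
# visit is stored and replayed months later. Numbers are illustrative.
import json

def save_scan_pose(path, position_mm, orientation_quat):
    with open(path, "w") as f:
        json.dump({"position_mm": position_mm,
                   "orientation_quat": orientation_quat}, f)

def load_scan_pose(path):
    with open(path) as f:
        return json.load(f)

# First visit: record where the probe was when the nodule was imaged.
save_scan_pose("thyroid_baseline.json",
               position_mm=[112.4, 38.7, 96.2],
               orientation_quat=[0.0, 0.707, 0.0, 0.707])

# Follow-up visit: command the robot back to the identical pose so the
# nodule is imaged from the same angle and any growth is comparable.
pose = load_scan_pose("thyroid_baseline.json")
print("Replaying probe pose:", pose)
```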
In addition to helping doctors see and diagnose medical problems, the partnership of imaging and robotics can help make surgical procedures safer and more efficient. “We can bring the robot and the imaging device into the OR to help surgeons navigate through different anatomies,” Ma says.
In a current project, the Medical FUSION Lab is helping surgeons insert screws into the spine. Current best practices for properly aligning the screw insertion device include using an optical tracking system to create a 3D model of the spine or using a CT scanner in the operating room to guide the screw placement. The first method is highly complicated and time-consuming, while the second exposes the surgeons to regular doses of radiation.
Building on work Zhang completed as a postdoctoral fellow at Johns Hopkins University, where he also earned an MS and a PhD in computer science (his BS and MS in human health sciences are from Kyoto University in Japan), he and Ma have developed a system that uses a robot-controlled ultrasound scanner to create a dense 3D picture of the entire spine. This can be aligned with MRI or CT images made prior to surgery and used by the surgeon to plan the screw insertion. “In this way,” Ma says, “they can complete the procedure without the need for a complicated setup or exposure to radiation.”
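The key computational step in that workflow is registration: finding the rigid transform that maps the ultrasound-derived spine points onto the preoperative MRI or CT model. The sketch below illustrates one standard way to do it (a Kabsch/Procrustes fit with known point correspondences); the article does not specify the lab's actual registration method, so treat this only as a generic illustration.

```python
# Hypothetical sketch of rigid registration: find rotation R and
# translation t so that the preoperative CT points ~ R @ ultrasound points + t.
# Correspondences are assumed known (Kabsch/Procrustes); real pipelines
# typically iterate correspondence estimation as well.
import numpy as np

def rigid_align(source, target):
    """Least-squares rigid transform with target ~ R @ source + t."""
    src_c, tgt_c = source.mean(axis=0), target.mean(axis=0)
    H = (source - src_c).T @ (target - tgt_c)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = tgt_c - R @ src_c
    return R, t

# Toy data: "CT" points are a rotated, shifted copy of the "ultrasound" points.
rng = np.random.default_rng(0)
us_points = rng.normal(size=(50, 3))
angle = np.deg2rad(20)
R_true = np.array([[np.cos(angle), -np.sin(angle), 0],
                   [np.sin(angle),  np.cos(angle), 0],
                   [0, 0, 1]])
ct_points = us_points @ R_true.T + np.array([5.0, -2.0, 1.0])

R, t = rigid_align(us_points, ct_points)
print("recovered translation:", np.round(t, 3))   # ~ [5, -2, 1]
```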
A similar challenge has provided real-world, problem-solving opportunities for several teams of undergraduates advised by Zhang. For their Major Qualifying Projects, these student teams, working closely with surgeons at UMass Chan Medical School, have developed a device that can greatly simplify a procedure called percutaneous nephrolithotomy (PCNL), which is used to remove kidney stones. To perform PCNL, a surgeon inserts a needle into the kidney through which tools are introduced to grab small stones or break up and vacuum away larger ones.
“The kidney is soft and can easily move,” Zhang says, “so guiding the needle to its target is challenging.”
To visualize the needle’s progress, surgeons typically use fluoroscopy, a type of live X-ray imaging that exposes the surgeons to radiation. Ultrasound can be used instead, but to see the needle the beam must be aimed from the patient’s side. “It’s like throwing darts,” Zhang says. “You don’t want to stand to the side of the dartboard; you want to aim from straight on.”
The solution was to create a mirror from a material that reflects ultrasound waves. The mirror is arranged at a 45-degree angle to the patient’s back, and the ultrasound beam bounces off the mirror and down toward the kidney. Likewise, sound reflected by the kidney bounces off the mirror and is received by the scanner. The needle, inserted through a hole in the mirror, can be aligned perfectly with the reflected ultrasound beam. “So now we can throw a bull’s-eye to accurately guide the kidney procedure,” Zhang says.
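A quick geometric sanity check of the mirror idea: a beam launched horizontally toward a surface tilted 45 degrees reflects straight down, which is why a needle dropped through a hole in the mirror along that reflected axis stays centered in the image. The directions and numbers below are illustrative only.

```python
# Hypothetical geometry check for the 45-degree acoustic mirror: a beam
# traveling horizontally reflects straight down toward the kidney.
import numpy as np

def reflect(direction, normal):
    """Mirror reflection of a ray direction about a surface normal."""
    normal = normal / np.linalg.norm(normal)
    return direction - 2 * np.dot(direction, normal) * normal

beam = np.array([1.0, 0.0, 0.0])            # ultrasound aimed from the side
mirror_normal = np.array([1.0, 0.0, 1.0])   # plane tilted 45 degrees
print(reflect(beam, mirror_normal))         # ~[0, 0, -1]: straight down toward the kidney
```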
Like ultrasound, photoacoustic imaging uses sound to probe the structure of tissue and organs, but the sound starts out as light. It works by illuminating tissue with extremely brief pulses of laser light. Through what is known as the photoacoustic effect, the tissue warms and expands, creating pressure waves that are detected as sound in the ultrasound range. The frequency of the sound waves varies with the type of tissue (making it possible, for example, to distinguish muscle from blood vessels) and even with the state of the tissue (for example, how much oxygen blood is carrying).
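For readers who want the underlying relation: in the standard textbook model of photoacoustics (not spelled out in the article), the initial acoustic pressure generated at a point scales with how strongly the tissue there absorbs the laser pulse,

$$ p_0(\mathbf{r}) = \Gamma \, \mu_a(\mathbf{r}, \lambda) \, F(\mathbf{r}), $$

where $\Gamma$ is the Grüneisen parameter (how efficiently absorbed heat becomes pressure), $\mu_a$ is the optical absorption coefficient at the laser wavelength $\lambda$, and $F$ is the local light fluence. Because oxygenated and deoxygenated hemoglobin have different absorption spectra, imaging at more than one wavelength can reveal blood oxygenation, which is the kind of tissue-state contrast described above.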
Zhang is exploring the use of photoacoustic imaging to help surgeons with procedures that involve ablation, or the destruction of tissue. In one procedure, surgeons use radiofrequency ablation to destroy the small areas of heart muscle that conduct the errant electrical pulses responsible for atrial fibrillation.
“They must rely on their experience and intuition to decide how much tissue to ablate,” Zhang says. “They can’t see what they are doing and may ablate too much tissue or not enough. That’s why about 60% of patients who undergo ablation surgery for A-fib must undergo one or more additional procedures.”
In partnership with surgeons at the Texas Heart Institute at Baylor College of Medicine, Zhang’s team has developed a small photoacoustic sensor that can go into the heart with the ablation device to observe its effects. Since “cooked” muscle will reflect light differently than untouched tissue, “we can deploy an algorithm that detects the boundaries of the treated area so the surgeon will know if the job is complete,” he says.
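A toy version of that boundary-detection step: if ablated tissue returns a measurably different photoacoustic signal than untreated muscle, thresholding an intensity map and tracing the edge of the above-threshold region outlines the treated area. The data, threshold, and neighborhood rule below are purely illustrative; the lab's actual algorithm is not described in the article.

```python
# Hypothetical sketch of outlining the ablated ("cooked") region in a
# photoacoustic intensity map by thresholding and tracing the edge.
import numpy as np

def treated_region_mask(intensity_map, threshold):
    return intensity_map > threshold

def boundary_pixels(mask):
    """Pixels inside the region that touch at least one outside neighbor."""
    edges = []
    rows, cols = mask.shape
    for r in range(rows):
        for c in range(cols):
            if not mask[r, c]:
                continue
            neighbors = [(r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)]
            if any(not (0 <= rr < rows and 0 <= cc < cols and mask[rr, cc])
                   for rr, cc in neighbors):
                edges.append((r, c))
    return edges

# Toy map: a bright (ablated) square in the middle of dimmer tissue.
signal = np.full((20, 20), 0.2)
signal[8:13, 8:13] = 0.9
print(len(boundary_pixels(treated_region_mask(signal, 0.5))), "boundary pixels")
```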
Zhang hopes an MRI-compatible, robotically controlled photoacoustic sensor will detect biomarkers specific to aggressive prostate cancers.
With a seed grant from the Gapontsev Family Collaborative Venture Fund, which supports research at WPI in photonics and lasers, Zhang is working with Loris Fichera, associate professor of robotics engineering and an authority on laser surgery, and Andrea Arnold, associate professor of mathematical sciences, to explore the use of a similar photoacoustic sensor, guided by a robot, to monitor the success of tonsillectomies performed with laser ablation.
In another application of photoacoustic imaging, Zhang is collaborating with Gregory Fischer, professor of robotics engineering at WPI and a pioneer in the development of surgical robotic systems that can operate within MRI machines, and with surgeons at the University of Texas Southwestern Medical Center to add a new capability to a system Fischer has developed and extensively tested to assist with prostate cancer biopsies.
Fischer’s system goes into an MRI unit along with the patient to help surgeons guide the biopsy needle to suspected cancers using fresh MRI images. Zhang says MRI’s strength is its high sensitivity, which makes it good at detecting tumors. But it is less adept at distinguishing aggressive lesions that require immediate surgery from those that are slower growing. With a five-year, $1.9 million Director’s Early Independence Award from the NIH, Zhang is developing an MRI-compatible, robotically controlled photoacoustic sensor that will detect biomarkers specific to aggressive cancers to increase the likelihood that prostate cancer biopsies will produce useful results with fewer false positives.
“Photoacoustic imaging works in real time,” Zhang says, “so we get feedback faster and can make the needle insertion quicker and more precise.”
Like most of the work of the Medical FUSION Lab, the prostate cancer project takes advantage of the unique capabilities of WPI’s PracticePoint, a healthcare technology research and medical device development facility located at Gateway Park. Launched in 2020, PracticePoint has several clinical spaces, including a fully functioning operating room, an intensive care unit, and a medical imaging lab with a full-size MRI scanner, which are used by WPI faculty and students and outside companies for research, development, and testing of new medical technologies.
Zhang says the availability of PracticePoint and the chance to work with other faculty also engaged in biomedical and robotics research were important factors in his decision to join the WPI faculty. “Medical robotics is a collaborative and interdisciplinary field,” he says, “so having colleagues doing similar work, and a facility designed specifically to support that research—something most universities do not have—were big factors in leading me to believe I could be successful here.”
WPI’s focus on applied research and its success with commercializing innovative technology were also important to Zhang, who sees real-world potential in many of the projects in which he and his students are engaged. “It would be a waste of our resources and time to work on technology that clinicians do not want or need,” he says. “So, we want to make sure that our efforts are rewarded by creating solutions that will help clinicians and help patients.”