Mine Detection Software Identifies Cancer Cells
Adapted from the Office of Naval Research
Medical researchers at the University of Pennsylvania have demonstrated that software developed by a Duke University electrical engineer for finding and recognizing undersea mines can help doctors identify and classify cancer-related cells.
As it turns out, the problem that physicians encounter in analyzing images of human cells is surprisingly similar to the Navy’s challenge of finding undersea mines.
“The results are spectacular,” Larry Carin, William H. Younger Professor and chairman of electrical and computer engineering at Duke’s Pratt School of Engineering, said of the recent findings. Carin developed the technology with the support of the Office of Naval Research (ONR).
“This is not a typical Navy transition,” Carin continued. “But it is a transition to a very important medical tool used at hospitals around the world. There is a real chance this may save lives in the future. This could be a game-changer for medical research.”
The active learning software developed by Carin for the ONR allows robotic mine-hunting systems to behave more like humans when they are uncertain about how to classify an object. Using information theory, the software identifies the items it is least sure about and asks a human to provide labels for them. This feature is valuable in mine warfare, where identifying unknown objects beneath the ocean has traditionally been accomplished by sending in divers.
Similarly, when examining tissue samples, doctors must sift through hundreds of microscopic images containing millions of cells. To pinpoint specific cells of interest, they use an automated image analysis software toolkit called FARSIGHT, or Fluorescence Association Rules for Quantitative Insight. Funded by the National Institutes of Health (NIH) and the Defense Advanced Research Projects Agency (DARPA), FARSIGHT identifies cells based upon a subset of examples initially labeled by a physician.
But the resulting classifications can be erroneous because the computer assigns labels based on that small sample. Adding active learning algorithms developed by Carin makes cell identification more accurate and FARSIGHT’s performance more consistent. The enhanced toolkit also requires physicians to label fewer cell samples, because the algorithm automatically selects the most informative examples to teach the software.
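The idea behind this kind of active learning can be illustrated with a toy sketch. The snippet below uses "uncertainty sampling": a simple one-dimensional threshold classifier repeatedly asks an oracle (standing in for the human expert) to label the unlabeled point it is least sure about, i.e. the one closest to its current decision boundary. All names, the threshold model, and the 0.6 cutoff are illustrative assumptions for this sketch, not the actual algorithms in Carin’s software or in FARSIGHT.

```python
import random

def oracle(x):
    """Stand-in for the human expert: the true label is 1 if x >= 0.6."""
    return 1 if x >= 0.6 else 0

def fit_threshold(labeled):
    """Fit a 1-D threshold classifier: the midpoint between the largest
    known 0-example and the smallest known 1-example."""
    zeros = [x for x, y in labeled if y == 0]
    ones = [x for x, y in labeled if y == 1]
    return (max(zeros) + min(ones)) / 2

def most_uncertain(pool, threshold):
    """Uncertainty sampling: pick the unlabeled point closest to the
    decision boundary, i.e. the one the model is least sure about."""
    return min(pool, key=lambda x: abs(x - threshold))

random.seed(0)
pool = [random.random() for _ in range(200)]  # unlabeled "cells"

# Seed with one example of each class, as the expert's initial labels.
labeled = [(0.1, oracle(0.1)), (0.9, oracle(0.9))]

for _ in range(10):                    # query budget: 10 expert labels
    t = fit_threshold(labeled)
    x = most_uncertain(pool, t)
    pool.remove(x)
    labeled.append((x, oracle(x)))     # ask the "human" for a label

t = fit_threshold(labeled)
errors = sum(1 for x in pool if (1 if x >= t else 0) != oracle(x))
print(f"learned threshold: {t:.3f}, errors on remaining pool: {errors}")
```

Because each query lands near the current boundary, the threshold converges on the true cutoff far faster than labeling randomly chosen points would, which is why the enhanced toolkit needs fewer physician-labeled samples.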
The Pennsylvania medical team is applying Carin’s algorithms, embedded into FARSIGHT, to examine tumors from kidney cancer patients. Focusing on endothelial cells that form the blood vessels that supply the tumors with oxygen and nutrients, the research could one day improve drug treatments for different types of kidney cancer, also known as renal cell carcinoma.
“With the computer program having learned to pick out an endothelial cell, we have now automated this process, and it seems to be highly accurate,” said Dr. William Lee, an associate professor of medicine, hematology and oncology at the university who is leading the research effort. “We can begin to study the endothelial cells of human cancer, something that is not being done because it’s so difficult and time-consuming to do.”
It usually takes days, even weeks, for a pathologist to manually pick out all the endothelial cells in 100 images. The enhanced FARSIGHT toolkit can accomplish the same feat in a few hours with human-level accuracy.
“This is an important NIH-funded clinical study that we’re supporting with FARSIGHT, and Dr. Carin’s active learning system has been a great success,” said Badri Roysam, an electrical and computer engineering professor at the University of Houston and program investigator for FARSIGHT.