Experienced professionals tend to be better than rookies partly because they know how to use the tools of their trade more effectively. Cytologists who evaluate cells under a microscope become more consistent and accurate when they know how to prep their apparatus to produce clear images.
Researchers at Duke University have now given a microscope the capability to intelligently adjust its own settings, including lighting angles, colors, and patterns, to achieve optimal results when classifying healthy and malaria-infected red blood cells. The system is designed to exploit the capabilities of a digital camera rather than the human eye, and so can reach levels of accuracy that a conventionally lit microscope cannot.
“A standard microscope illuminates a sample with the same amount of light coming from all directions, and that lighting has been optimized for human eyes over hundreds of years,” said Roarke Horstmeyer, the lead researcher. “But computers can see things humans can’t. So not only have we redesigned the hardware to provide a diverse range of lighting options, we’ve allowed the microscope to optimize the illumination for itself.”
Previously, other research groups, including some at Duke, have developed computer vision algorithms that can classify cells infected with the Plasmodium falciparum parasite that causes malaria. Although effective, these algorithms still lack the consistent accuracy needed for clinical diagnostics.
This research team has now taught a computer to adjust the various parameters of a microscopy system and paired it with a deep learning classification algorithm, with results that beat both seasoned physicians and previously developed automated systems at identifying malaria-infected cells.
The new imaging system uses a novel light source that surrounds the sample from the sides and below. The computer controls which LEDs in this bowl-shaped fixture turn on and off, and which colors they emit. It was shown hundreds of samples of red blood cells, both healthy and infected with the malaria parasite, and adjusted the illumination to find the settings that best separate the two classes. After this training period, the system was about 90% accurate, compared with roughly 75% accuracy typically achieved by physicians and by existing learning algorithms.
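The core idea, jointly optimizing the illumination pattern and the classifier, can be illustrated with a toy sketch. Everything below is an assumption for illustration, not the authors' actual method: synthetic per-LED image stacks stand in for real micrographs, a simple logistic regression stands in for the deep network, and the LED weights `w` are learned alongside the classifier by letting gradients flow through the weighted illumination sum.

```python
import numpy as np

rng = np.random.default_rng(0)

K, D, N = 8, 16, 200   # LEDs, pixels per image, training samples

# Synthetic data: each sample is a stack of K per-LED images.
# Hypothetical assumption: "infected" cells (y=1) scatter extra light
# under the oblique LEDs (indices 4 and up).
def make_sample(y):
    stack = rng.normal(0.0, 0.3, size=(K, D))
    if y == 1:
        stack[4:] += 0.5
    return stack

ys = rng.integers(0, 2, size=N)
stacks = np.stack([make_sample(y) for y in ys])   # shape (N, K, D)

# Jointly learned parameters: illumination weights w, classifier (v, b).
w = np.full(K, 1.0 / K)
v = np.zeros(D)
b = 0.0
lr = 0.1

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for epoch in range(300):
    x = np.einsum("k,nkd->nd", w, stacks)   # composite images under w
    p = sigmoid(x @ v + b)                  # predicted infection probability
    g = (p - ys) / N                        # cross-entropy gradient wrt logits
    # Gradients update BOTH the classifier and the illumination weights.
    v -= lr * (x.T @ g)
    b -= lr * g.sum()
    w -= lr * np.einsum("n,nkd,d->k", g, stacks, v)

acc = ((sigmoid(np.einsum("k,nkd->nd", w, stacks) @ v + b) > 0.5) == ys).mean()
print(f"train accuracy: {acc:.2f}")
print(f"mean oblique LED weight: {w[4:].mean():.3f}, "
      f"mean direct LED weight: {w[:4].mean():.3f}")
```

In this sketch, training pushes weight onto the oblique LEDs because that is where the synthetic class signal lives, mirroring how the real system discovers which lighting angles and colors are most informative for the camera.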
The technology can be applied to other diagnostic imaging tasks, potentially automating entire processes that happen in hospital pathology labs.
Study in Biomedical Optics Express: Learned sensing: jointly optimized microscope hardware for accurate image classification
Via: Duke University