Students from Stanford University and scientists at Harvard Medical School have jointly developed an artificial intelligence model named CheXzero that can detect diseases based on X-ray scans and the language descriptions in accompanying clinical reports.
This tool is a major stride in clinical AI design because, unlike CheXzero, previous AI models developed for similar purposes have required laborious annotation of large amounts of data during the training stage. According to the study, published in Nature Biomedical Engineering, CheXzero matched the accuracy of human radiologists in detecting indications on chest X-rays.
The team working on this project has also made the code for the AI publicly available so that it can be applied to other diagnostic imaging tests such as CT scans, echocardiograms, and MRIs.
Pranav Rajpurkar, assistant professor of biomedical informatics at Harvard Medical School, led the project and hopes that similar AI models requiring minimal supervision will be instrumental in expanding access to healthcare globally, especially in countries with few specialists.
Generally, AI models need labeled datasets during training in order to learn to identify diseases correctly. For medical image-interpretation tasks, this means large-scale annotation by human clinicians, which is usually time-consuming and expensive. To label an X-ray dataset for training, even expert radiologists typically have to examine thousands of images in detail and explicitly tag each one with a diagnosis.
More recent AI models learn from unlabeled data during a “pre-training” stage, but fine-tuning on labeled data is still required afterwards to produce accurate results.
By contrast, the new model, CheXzero, does not need hand-labeled data at any point in its training; it learns independently, without external supervision. The model relies solely on chest X-rays and the physician's notes in the accompanying reports. Equipped with just this information, the AI can effectively identify diseases such as pneumonia, collapsed lungs, and lesions.
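The article does not spell out the training objective, but the description above matches contrastive image-text pretraining, in which matching X-ray/report pairs are pulled close together in a shared embedding space while mismatched pairs are pushed apart. The sketch below illustrates that general idea; the encoder architectures, dimensions, and variable names are simplified assumptions for illustration, not the published CheXzero code.

```python
# Minimal sketch of CLIP-style contrastive pretraining on paired
# chest X-rays and report text. All architectures and sizes here are
# illustrative stand-ins, not the authors' implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ImageEncoder(nn.Module):
    """Tiny CNN stand-in for the X-ray encoder."""
    def __init__(self, embed_dim=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, embed_dim),
        )
    def forward(self, x):
        return self.net(x)

class TextEncoder(nn.Module):
    """Tiny bag-of-tokens stand-in for the report-text encoder."""
    def __init__(self, vocab_size=10000, embed_dim=128):
        super().__init__()
        self.embed = nn.EmbeddingBag(vocab_size, embed_dim)
    def forward(self, token_ids):
        return self.embed(token_ids)

def contrastive_loss(img_emb, txt_emb, temperature=0.07):
    """Symmetric InfoNCE loss: an X-ray and its own report should be
    more similar than the X-ray and any other report in the batch."""
    img_emb = F.normalize(img_emb, dim=-1)
    txt_emb = F.normalize(txt_emb, dim=-1)
    logits = img_emb @ txt_emb.t() / temperature
    targets = torch.arange(len(logits))
    return (F.cross_entropy(logits, targets) +
            F.cross_entropy(logits.t(), targets)) / 2

# One illustrative training step on a random batch standing in for
# real X-rays and tokenized report text.
image_encoder, text_encoder = ImageEncoder(), TextEncoder()
optimizer = torch.optim.Adam(
    list(image_encoder.parameters()) + list(text_encoder.parameters()), lr=1e-4)

images = torch.randn(8, 1, 224, 224)        # batch of chest X-rays
reports = torch.randint(0, 10000, (8, 32))  # tokenized report text
optimizer.zero_grad()
loss = contrastive_loss(image_encoder(images), text_encoder(reports))
loss.backward()
optimizer.step()
```

The key point of this setup is that no disease labels appear anywhere: the report text itself provides the supervision signal.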
“CheXzero has successfully demonstrated that the interpretation of complex medical images is no longer dependent on large labeled datasets,” said study co-first author Ekin Tiu. “X-rays are the first of many possible settings in which this AI can assist us. In the larger context, CheXzero has the potential to be generalizable to a vast array of medical settings where data is stored in an unstructured way. CheXzero promises a future where we will be able to avoid large-scale labeling in medical machine learning.”
CheXzero was trained on a publicly available dataset of more than 350,000 chest X-rays and their corresponding clinical notes. To further test the technology, the team used two additional chest X-ray datasets, with accompanying notes, from separate institutions. This was done to verify that the model could reach the same conclusion about a chest X-ray even when the images and notes came from different sources and countries.
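Once such a model is trained, one plausible way to check a new X-ray for a finding, consistent with the label-free behavior described above, is to compare the image embedding against text prompts describing the presence and absence of a disease. The prompt wording and helper function below are hypothetical, included only to illustrate the idea:

```python
# Hedged sketch of zero-shot disease detection with a CLIP-style model:
# compare one X-ray embedding to embeddings of a positive and a negative
# text prompt. Prompts, names, and dimensions are illustrative assumptions.
import torch
import torch.nn.functional as F

def zero_shot_probability(image_emb, pos_emb, neg_emb):
    """Probability that the positive prompt (e.g. "pneumonia") matches
    the image better than the negative prompt ("no pneumonia")."""
    image_emb = F.normalize(image_emb, dim=-1)
    pos_emb = F.normalize(pos_emb, dim=-1)
    neg_emb = F.normalize(neg_emb, dim=-1)
    sims = torch.stack([image_emb @ pos_emb, image_emb @ neg_emb])
    return torch.softmax(sims, dim=0)[0]  # probability of the finding

# Hypothetical usage with embeddings produced by the trained encoders.
image_emb = torch.randn(128)  # embedding of one chest X-ray
pos_emb = torch.randn(128)    # embedding of the text "pneumonia"
neg_emb = torch.randn(128)    # embedding of the text "no pneumonia"
print(float(zero_shot_probability(image_emb, pos_emb, neg_emb)))
```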