Radiologists know that every patient is different. When reading a mammogram, it’s important to know a patient’s racial and ethnic background, health history, age, geographic location, and breast density, to name just a few factors. Understanding how these differences may affect a patient’s diagnosis is crucial. Unfortunately, many radiologists do not have the data-driven insights they need to accurately diagnose a diverse array of cases and find breast cancers earlier.
AI-based CAD offers a solution to this dilemma by giving radiology practices a diversely trained, unbiased second pair of eyes. According to multiple studies, including two 2019 articles in the Journal of Digital Imaging, use of cmTriage™ and cmAssist® can lead to a 27% increase in early cancer detection, without an increase in false-positive recalls, and a 69% reduction in false positives.

To learn more about how AI can solve some of the diversity challenges radiologists face when diagnosing patients, we had a conversation with Kevin Harris, president of CureMetrix, Inc. Here’s what he had to say.
Are there any racial or ethnic groups that are at higher risk of breast cancer?
KH: When it comes to breast cancer, there are a couple of important things to keep in mind. First, some racial and ethnic groups are statistically more likely to develop breast cancer. For example, the CDC has noted that women of Ashkenazi Jewish descent are more likely to have a BRCA gene mutation, which puts them at higher risk.
Second, some racial and ethnic groups are more likely to die from breast cancer. While African-American women have a lower lifetime risk of breast cancer, they continue to have the highest breast cancer mortality rate in the nation. This disparity is rooted in the fact that many people of color face systemic barriers to quality healthcare. These barriers can translate into more pre-existing conditions, such as obesity, that raise a woman’s risk of developing breast cancer; less access to cutting-edge treatment options; and fewer opportunities for routine mammograms that could detect breast cancer at earlier stages.
In addition, some physiological differences that can make breast cancer difficult to diagnose are more common in certain racial and ethnic groups. Women of Asian descent, for example, are more likely to have dense breasts. These differences don’t necessarily make them more prone to breast cancer, but they do make it more difficult for radiologists to find their cancers earlier, since the highly concentrated fibrous tissue found in dense breasts creates a lot of “white noise” in a mammogram.
Most doctors are aware of these differences and understand that they may need to apply additional diagnostic techniques beyond mammography, such as ultrasound, to accurately diagnose women of these backgrounds. AI-based CAD adds another layer of support: it can detect subtle patterns that a human eye might miss and flag them for a radiologist who may be reading a scan from a woman with dense breasts, or one at higher risk of developing or dying from breast cancer. To provide that service to radiologists, CureMetrix developed cmTriage™, the first FDA-cleared mammography triage AI program, and the diagnostic software cmAssist®, training its algorithms on mammogram images from women worldwide.
How does AI help radiologists more accurately read the mammograms of patients with dense breasts?
KH: One of the biggest issues with dense breasts is their ability to obscure potentially cancerous growths. A woman on the higher end of the breast density scale will have a mammogram that basically looks like white on white. The ACR BI-RADS® Atlas, 5th Edition, acknowledges just how much this affects mammogram readings by deviating from its usual “ABCD” breast density scale: it now asks radiologists to make a more subjective judgment about how much a woman’s fibrous breast tissue may be obscuring in her mammogram.
Radiologists do have a variety of tools to help them read a scan of dense breasts. They can zoom in, but because fibrous breast tissue looks so much like a potentially malignant calcification, sometimes it’s hard to see the forest for the trees. AI can look at the entire “forest” and pick out the individual “trees,” the suspicious findings that warrant a closer look.
This is important because 80% of mammograms are read by general-practice radiologists without specific training in mammography, or by doctors in other specialties, such as musculoskeletal radiology or neuroradiology. Most women aren’t able to go to a fellowship-trained mammography specialist, but AI provides the equivalent of a second opinion from that kind of specialist.
How was the CureMetrix AI trained to be able to triage and flag anomalies across a more diverse array of patients?
KH: The typical breast cancer detection rate without AI is about 5 malignant cases for every 1,000 mammogram scans. We’ve collected about 10 million images from around the world, from women of all races and ages in Latin America, Asia, and across the U.S., which in and of itself has a hugely diverse population. Our algorithm was trained on both the malignant and benign cases and then put to the test in the real world.
We knew our algorithm worked when a clinical system in Brazil started using CureMetrix. We had never collected data from Brazil, so the algorithm had to generalize from its previous training when processing the Brazilian women’s mammograms. The result: the CureMetrix AI-based CAD in Brazil outperformed the success rate we had used for our FDA submission in the U.S., even though it had never seen images from that population before, all because it had a broad and diverse enough training set.
How can AI help solve disparities in healthcare access both in the U.S. and around the world?
KH: There’s a lot to be said for how much AI can level the playing field and help catch breast cancer earlier in more women. Many developing countries don’t have as many doctors serving their populations, as many training opportunities, or as many screening programs for women. AI helps equalize care across countries. If you drop an algorithm designed to support a doctor into a remote area of an African country, where the local doctor may not have had as much specialized training, that doctor is going to be able to serve patients that much better.
There may also be cultural or religious barriers that keep women from going in for routine mammograms. In some countries, for example, there may not be any female doctors within traveling distance, and it may be taboo for a woman to expose her breasts to a male stranger, even if he is a doctor. As a result, many women in these areas delay screening until their cancer is late-stage and less survivable. In these situations, there’s an opportunity to set up mammography clinics staffed entirely by female radiology technicians, who would use AI-based CAD to flag suspicious cases; only those cases would be sent to an offsite radiologist, who could read the scans from any location. Patients would only see a doctor in person if their case warrants further screening.
AI is also a great equalizer when it comes to cost. Most health plans will reimburse for a routine mammogram. However, the millions of women in the U.S. and worldwide who do not have comprehensive healthcare coverage may not see this benefit, and cost can be a barrier to getting regular breast cancer screenings. There’s an argument to be made that AI can reduce the cost of screening for these women. If a woman has her initial screening read by an AI-based CAD algorithm at a lower cost, and is only referred to a radiologist for a more costly follow-up if her case is deemed suspicious enough, she saves time and money. Because AI also reduces the number of false positives per image (FPPI) in mammography, fewer women have to go back for unnecessary recalls, and fewer women have to shoulder the cost of childcare and time off work.
People often say: “AI doesn’t discriminate.” Could you elaborate more on what this means, and why it makes AI-based radiology tools such a benefit for a more diverse array of patients?
KH: AI both completely doesn’t discriminate and completely discriminates. Here’s what I mean: when a woman walks into a clinic, the doctor has no choice but to see the woman herself and what she looks like. If there’s any ounce of bias whatsoever in that doctor’s mind, and there probably is because studies show we’re all prone to it in the medical field, that bias may very well come through in their diagnosis. The algorithm in AI-based CAD never sees the patient and doesn’t know anything about that woman. It has no choice but to be completely unbiased in its recommendation.
On the flip side, AI can be completely biased, but not in the negative way one might assume. For example, if we took a large data set of Japanese women’s mammogram scans and trained an AI algorithm to read those really difficult, dense-breast cases from Japan, the AI would be completely biased, but in the right direction. It would be even more sensitive to anomalies in dense breasts, regardless of ethnicity, which is more helpful for a radiologist seeing a variety of patients. It’s a positive bias that saves lives.
If you’d like to learn more about how AI can help you serve a truly diverse patient population, connect with us at https://curemetrix.com/solution-summary/