Screening mammograms miss close to one in eight breast cancers. But early research suggests artificial intelligence (AI) could close this detection gap and markedly improve early diagnosis of the disease. Still, questions remain regarding how to best incorporate AI into screenings and whether it’s too soon to deploy the technology.
Already, some radiology clinics are offering AI analysis of mammograms as an add-on service at extra cost.
Mammography patients who visit RadNet facilities, for example, have the option of an additional AI screening of their images. RadNet, the largest national owner and operator of fixed-site diagnostic imaging centers in the United States with more than 370 locations, first launched its AI program in the Northeast. The company has now rolled out its product across all regions in the country.
Because the AI is not reimbursed by insurers, patients must pay a $40 out-of-pocket fee if they want the AI analysis.
“RadNet practices have identified more than 400 women whose cancer was found earlier than it would have been had the AI not been present,” said Greg Sorensen, MD, chief science officer for RadNet.
How RadNet’s AI Program Works
Patients coming to RadNet facilities for screening mammography undergo 3D high-resolution mammography that includes the use of 70-micron resolution detector technology, said Dr. Sorensen. The mammogram is reviewed by a qualified radiologist with assistance from two Food and Drug Administration–cleared AI programs, Saige-Q and Saige-Density. The radiologist then makes an interpretation.
Saige-Q is an AI tool that helps identify more suspicious mammograms by providing a quick signal to radiologists if the AI considers a given mammogram to be in a suspicious category, according to Dr. Sorensen. Saige-Density provides a density rating for each mammogram using one of the four standard categories:
- A. Almost entirely fatty
- B. Scattered areas of fibroglandular density
- C. Heterogeneously dense
- D. Extremely dense
Starting in September 2024, the FDA will require all mammogram reports to indicate density.
For patients who choose the $40 add-on service, called Enhanced Breast Cancer Detection, two other FDA-registered AI programs are also applied: Saige-Dx and Saige-Assure. These AI programs go a step further by placing marks on areas within the images that they find suspicious. Mammograms flagged as “high-suspicion” by the AI are then reviewed by a second human radiologist. The first and second radiologists confer to agree on a final diagnosis, Dr. Sorensen explained.
“Our research shows that approximately 20% more cancers are found when the safeguard review process is in place,” Dr. Sorensen said. “We also have seen [30%] decreases in recall rates” — the percentage of screening cases in which further tests are recommended by the radiologist.
Bethesda radiologist Janet Storella, MD, has used the AI program for about 3 years and said the technology has improved her screening performance.
The AI is linked to her practice’s imaging software, and radiologists have the option of turning the AI on at any time during their reading of screening mammograms, Dr. Storella explained. Some radiologists review the mammogram first and then initiate the AI, while others like Dr. Storella turn it on at the start, she said. Once initiated, the AI draws bounding boxes — or outlines — around areas that it deems suspicious.
The AI helps focus Dr. Storella’s attention on suspicious areas and grades the level of suspicion into one of four categories: high, intermediate, low, and minimal, she said.
“I find it especially useful in patients who have dense breast tissue,” said Dr. Storella, medical director of women’s imaging at Community Radiology Associates, a RadNet practice. “In these situations, the tissue on the mammogram is a field of white, and cancers are also white, so you’re looking for that little white golf ball on a sea of snow. The AI really helps hone that down to specific areas.”
About 35% of RadNet’s screening mammography patients have enrolled in the Enhanced Breast Cancer Detection program, according to RadNet data. In a recent study of nine general radiologists and nine breast imaging specialists, all radiologists improved their interpretation performance of DBT screening mammograms when reading with RadNet’s AI versus without it (average AUC [area under the receiver operating characteristic curve], 0.93 vs 0.87; difference, 0.06; 95% CI, 0.04-0.08; P < .001).
Is Mammography Ready for AI?
RadNet is among a growing number of commercial companies offering AI solutions for mammography. MammoScreen and Hologic, for example, are two other companies that provide AI programs to assist radiologists in reading screening mammograms.
“We are at the start of the AI integration into breast imaging at this point,” said Laura Heacock, MD, a breast imaging radiologist and associate professor of radiology at NYU Langone Health. “There are multiple commercial AI models now available to radiologists to use in their practice [and] there will likely be more. We’re in the transition stage where people are still deciding: Which is the best model to go with? How do I put it in my system? How do I ensure it works the way it was intended? Every practice and medical system will have a different answer to that question.”
At NYU Langone Health, researchers have been developing and studying optimal AI models for breast imaging for several years, Dr. Heacock said. Researchers thus far have developed AI models for 2D digital mammography, 3D mammograms, breast ultrasound, and breast MRI. Similar to commercial AI systems, the AI is embedded into the picture archiving and communication system (PACS) used by radiologists to review images. Radiologists press a button to launch the AI, which draws a box around suspicious areas of the image and scores the suspicion.
“I take a look at where it is on the mammogram and decide whether that fits my level of suspicion,” Dr. Heacock said. “The AI may not understand things about the mammogram like we do. For example, surgical scars look very suspicious to an AI model. But if I’m looking at a mammogram where [the patient] has had a stable scar that hasn’t changed in 10 years, I’m not concerned that the AI found it suspicious. My clinical judgment is the ultimate decider. This is just an additional piece of information that’s helpful to me.”
Research by New York University (NYU) has shown that, when used by an expert radiologist, the AI models have improved breast cancer detection in all four modalities, she said.
However, the AI has not yet launched at NYU Langone. More research is needed before deploying the technology, according to Dr. Heacock.
“At NYU, we are still testing the benefits to patients,” she said. “We know it improves cancer detection, but we want to make sure there are no drawbacks. We are still exploring the best ways to put it into effect at our institution.”
Dr. Heacock pointed to recent studies on AI in screening mammography that show promise.
An analysis of more than 80,000 women, for example, published in The Lancet Oncology in August, found that AI-supported screen reading led to a similar cancer detection rate as compared with a two-person reader system. This screening resulted in 244 screen-detected cancers, 861 recalls, and a total of 46,345 screen readings, according to the study. Standard screening resulted in 203 screen-detected cancers, 817 recalls, and a total of 83,231 screen readings.
The AI system also reduced the screen-reading workload for radiologists by 44%, the study found.
Meanwhile, a September 2023 study, published in The Lancet Digital Health, found that replacing one radiologist with AI resulted in more cancer detection without a large increase in false-positive cases. The AI led to a 4% higher, noninferior cancer detection rate, compared with radiologist double reading, the study found.
Dr. Heacock emphasized that both studies were conducted in Europe where the standard is for two radiologists to evaluate mammograms.
“That makes the results exciting, but not directly applicable to US practice just yet,” she said.
What Do the Experts Recommend?
Stamatia V. Destounis, MD, FACR, chair of the American College of Radiology (ACR) Breast Imaging Commission, said the college welcomes ongoing research into the efficacy of AI technologies and that AI may prove to be beneficial as an improved workflow tool.
The ACR has not released any guidance about the use of AI for radiologists and has no recommendation about best practices, Dr. Destounis said.
“The decisions regarding which technologies that various health systems and radiology sites choose to use are made by those facilities,” she said.
Dr. Destounis said more research is needed to demonstrate whether AI technologies help radiologists produce better results in identifying disease, injury, and illness among the general population or in specific groups — whether based on age, physical characteristics, race, ethnicity, or risk status for breast cancer.
“Also, a way to measure each AI product is needed so that we can be certain they are relatively equivalent in their efficacy and accuracy — initially and over a prolonged period of time,” she said.
No consensus or concrete recommendation exists about the use of AI in mammography screening, added Peter P. Yu, MD, FACP, FASCO, physician-in-chief at the Hartford HealthCare Cancer Institute and a member of the newly created American Society of Clinical Oncology AI task force.
One of the many discussions concerning AI is to what degree patients should be aware that AI is being used in their healthcare and whether they should be required to give consent to its use, Dr. Yu said.
If AI is used to assist radiologists with mammographic interpretation, radiologists should discuss with patients how it’s being used and explain that the ultimate reading is in the hands of their physician radiologist, he said.
“In the unlikely situation where there wasn’t a human in the loop and AI was in effect making a medical decision, the patient needs to be aware,” he said. “I’m not aware that any such situation exists today. AI is more likely to be subtly embedded in the software that operates technology, much like it is embedded in manufacturing and transportation.”
Who Will Pay for AI?
When it comes to payment, Dr. Yu said shifting the cost of AI to patients creates serious risk.
“It has enormous potential to increase health inequities,” he said. “If we believe health care is a fundamental human right, AI should inure to the benefit of all, not just those who can afford it. Healthcare should not be a luxury item; if it works, it works for all.”
In general, the issue of payment for AI remains “thorny,” Dr. Heacock noted. Currently, there’s no way for physicians to request direct reimbursement for AI reads of mammograms.
While Dr. Heacock says she is sympathetic to the companies that spend significant time and effort on their AI technology, she doesn’t think charging patients is the right solution.
“We know that many women already have difficulty in paying for mammography-related services and this is just one more charge to confuse them or that they can’t pay,” she said.
Dr. Sorensen expects that, similar to 3D mammography, payers will eventually cover RadNet’s AI technology and that patients will no longer need to pay out of pocket. One Blue Cross carrier will start covering the AI in April 2024, he said.