Sporebiotics improve functional dyspepsia symptoms

Compared with placebo, sporebiotics significantly reduced postprandial distress, epigastric pain, and several other symptoms of functional dyspepsia, reported lead author Lucas Wauters, MD, PhD, of University Hospitals Leuven (Belgium), and colleagues.

“Acid suppressive or first-line therapy with PPIs [proton pump inhibitors] for functional dyspepsia has limited efficacy and potential long-term side effects,” the investigators reported at the annual Digestive Disease Week® (DDW). “Spore-forming bacteria or sporebiotics may be effective for postprandial distress and epigastric pain or burning symptoms, offering benefits which may differ in relation to PPI intake.”
 

Sporebiotics improve variety of symptoms

To test this hypothesis, the investigators recruited 68 patients with functional dyspepsia; baseline characteristics were similar between study arms. Half of the participants (n = 34) were taking PPIs.

Patients were randomized in a 1:1 ratio to receive 2.5 × 10⁹ CFU of Bacillus coagulans MY01 and B. subtilis MY02 twice daily for 8 weeks, or matching placebo. Following this period, an additional 8-week open-label regimen was instituted, during which time all patients received sporebiotics. Throughout the study, a daily diary was used to self-report symptoms.

The primary outcome, measured at 8 weeks, was clinical response, defined as a decrease of more than 0.7 in weekly postprandial distress score among patients with a baseline score greater than 1.0. Secondary outcomes included a decrease of more than 0.5 in postprandial distress score (minimal clinical response), as well as changes in cardinal epigastric pain, cardinal postprandial distress, and other symptoms. At baseline and 8 weeks, patients taking PPIs underwent a ¹⁴C-glycocholic acid breath test to detect changes in small intestinal bacterial overgrowth.

At 8 weeks, a clinical response was observed in 48% of patients taking sporebiotics, compared with 20% of those in the placebo group (P = .03). At the same time point, 56% of patients in the treatment group had a minimal clinical response versus 27% in the control group (P = .03).

Spore-forming probiotics were also associated with significantly greater improvements in cardinal postprandial distress, cardinal epigastric pain, postprandial fullness, and upper abdominal pain. A trend toward improvement in upper abdominal bloating was also seen (P = .07).

Among patients taking PPIs, baseline rates of positivity on bile acid breath testing were similar between the sporebiotic and placebo groups, at 18% and 25%, respectively (P = .29). After 8 weeks, however, patients taking spore-forming probiotics had a significantly lower rate of bile acid breath test positivity (7% vs. 36%; P = .04), suggesting improvement in small intestinal bacterial overgrowth.

In the open-label portion of the trial, patients in the treatment group maintained improvements in postprandial distress. Patients who switched from placebo to sporebiotics had a significant reduction in postprandial distress symptoms.

At 8 weeks, sporebiotics were associated with a trend toward fewer side effects of any kind (16% vs. 33%; P = .09), while rates of GI-specific side effects were comparable between groups, at 3% and 15% for sporebiotics and placebo, respectively (P = .2).

“Spore-forming probiotics are effective and safe in patients with functional dyspepsia, decreasing both postprandial distress and epigastric pain symptoms,” the investigators concluded. “In patients [taking PPIs], sporebiotics decrease the percentage of positive bile acid breath tests, suggesting a reduction of small intestinal bacterial overgrowth.”

Results are promising, but big questions remain

Pankaj Jay Pasricha, MBBS, MD, vice chair of medicine innovation and commercialization at Johns Hopkins and director of the Johns Hopkins Center for Neurogastroenterology, Baltimore, called the results “very encouraging.”

“This [study] is the first of its kind for this condition,” Dr. Pasricha said in an interview. “It will be very interesting to see whether others can reproduce these findings, and whether [these improvements] are sustained beyond the first few weeks or months.”

He noted that determining the underlying mechanisms of action could open new lines of therapy and provide a greater understanding of the pathophysiology, which is currently lacking.

“We don’t fully understand the pathophysiology [of functional dyspepsia],” Dr. Pasricha said. “If you don’t understand the pathophysiology, then it’s difficult to identify the right molecular target to address the root cause. Instead, we use a variety of symptomatic treatments that aren’t actually addressing the root cause, but studies like this may help us gain some insight into the cause of the problem, and if it is in fact a fundamental imbalance in the intestinal microbiota, then this would be a rational approach.”

It’s unclear how sporebiotics may improve functional dyspepsia, Dr. Pasricha noted. He proposed three possible mechanisms: the bacteria could be colonizing the intestine, they could be releasing products as they pass through the intestine that have a therapeutic effect, or they may be altering bile acid metabolism in the colon or having some other effect there.

“It’s speculative on my part to say how it works,” Dr. Pasricha said. “All the dots remain to be connected. But it’s a good start, and an outstanding group of investigators.”

Dr. Wauters and colleagues reported no conflicts of interest. Dr. Pasricha disclosed a relationship with Pendulum Therapeutics.

Some nasogastric intubation procedures lead to less aerosolization than feared

Nasogastric intubation for esophageal manometry or impedance monitoring does not generate significant aerosol particles and is associated with minimal droplet spread, according to a Belgian study presented at the annual Digestive Disease Week® (DDW). These findings suggest that standard personal protective equipment and appropriate patient positioning are likely sufficient to protect health care workers from increased risk of coronavirus transmission during tube placement and removal, reported lead author Wout Verbeure, PhD, of Leuven University Hospital, Belgium, and colleagues.

“Subsequent to the COVID-19 peak, [nasogastric tube insertion and extraction] were scaled back based on the assumption that they generate respiratory aerosol particles and droplet spread,” the investigators reported. “However, there is no scientific evidence for this theory.”

To address this knowledge gap, the investigators conducted an observational study of SARS-CoV-2-negative patients that included 21 insertions and removals for high-resolution manometry (HRM), plus 12 insertions and 10 removals for 24-hour multichannel intraluminal impedance-pH monitoring (MII-pH). During the study, a Camfil City M Air Purifier was added to the examination room. The purifier was present during 13 of the 21 HRM insertions and removals, allowing for comparison of aerosol particle measurements before and after introduction of the device.
 

The mechanics of the study

Aerosol particles (0.3-10 mcm) were measured with a Particle Measuring Systems LASAIR II Particle Counter positioned 1 cm away from the patient’s face. For both procedures, measurements were taken before, during, and up to 5 minutes after each nasogastric tube placement and removal. Additional measurements were taken while the HRM examination was being conducted.

To measure droplet spread, 1% medical fluorescein in saline was applied to each patient’s nasal cavity; droplets were visualized on a white sheet covering the patient and a white apron worn by the health care worker. The patients’ masks were kept below their noses but were covering their mouths.

“During the placement and removal of the catheter, the health care worker was always standing sideways or even behind the patient, and they always stood higher relative to the patient to ensure that when there was aerosol or droplet spread, it was not in their direction,” Dr. Verbeure said during his virtual presentation.

During placement for HRM and removal for MII-pH, aerosol particles (excluding those that were 0.3 mcm) decreased significantly. Otherwise, particle counts remained stable. “This shows that these investigations do not generate additional aerosol [particles], which is good news,” Dr. Verbeure said.

When the air purifier was present, placement and examination for HRM were associated with significant reductions in aerosol particles (excluding those that were 0.3 mcm or 0.5 mcm), whereas removal caused a slight uptick in aerosol particles (excluding those that were 0.3 mcm or 0.5 mcm) that did not decline after 5 minutes. “This was actually a surprise to us, because we now had an air purifier present, and we expected an even lower number of particles,” Dr. Verbeure said.

He suggested that the purifier may have been reducing particle counts during the HRM examination, thereby lowering baseline values before removal and making small changes more noticeable; alternatively, the purifier may have been causing turbulence that spread particles during removal. Whichever explanation is correct, Dr. Verbeure noted that particle counts were never higher than at the start of the examination. Fluorescein visualization showed “surprisingly little droplet spread,” Dr. Verbeure said, apart from some contamination around the patient’s neck.

“Esophageal investigations do not seem to generate additional [aerosol] particles,” Dr. Verbeure concluded. “So wearing the recommended protective gear and also considering the right positioning of the health care worker relative to the patient is important to keep performing this daily clinical routine.” To avoid droplet spread, health care workers should “be aware of the [patient’s] neck region and the direction of the catheter,” Dr. Verbeure added.

SORTing the results

According to Mahdi Najafi, MD, associate professor in the department of anesthesiology at Tehran University of Medical Sciences, Iran, and adjunct professor at Schulich School of Medicine & Dentistry, Western University, London, Ontario, the findings offer valuable insights. “[This study] is very important for at least two reasons: The extent of using this procedure in patient care, especially in the critical care setting, and the paucity of information for COVID-19 transmission and route of transmission as well,” Dr. Najafi said in an interview.

Yet he cautioned against generalizing the results. “We cannot extend the results to all nasogastric tube intubations,” Dr. Najafi said. “There are reasons for that. The tube for manometry is delicate and flexible, while the nasogastric tube used for drainage and GI pressure release – which is used commonly in intensive care and the operating room – is larger and rather rigid. Moreover, the patient is awake and conscious for manometry while the other procedures are done in sedated or unconscious patients.”

He noted that nasogastric intubation is more challenging in unconscious patients, and often requires a laryngoscope and/or Magill forceps. “The result [of using these instruments] is coughing, which is undoubtedly the most important cause of aerosol generation,” Dr. Najafi said. “It can be regarded as a drawback to this study as well. The authors would be better to report the number and/or severity of the airway reactions during the procedures, which are the main source of droplets and aerosols.”

To reduce risk of coronavirus transmission during nasogastric intubation of unconscious patients, Dr. Najafi recommended the SORT (Sniffing position, nasogastric tube Orientation, contralateral Rotation, and Twisting movement) maneuver, which he introduced in 2016 for use in critical care and operating room settings.

“The employment of anatomical approach and avoiding equipment for intubation were devised to increase the level of safety and decrease hazards and adverse effects,” Dr. Najafi said of the SORT maneuver. “The procedure needs to be done step-by-step and as smooth as possible.”

In a recent study, the SORT maneuver was compared with nasogastric intubation using neck flexion lateral pressure in critically ill patients. The investigators concluded that the SORT maneuver is “a promising method” notable for its simple technique, and suggested that more trials are needed.

The investigators and Dr. Najafi reported no conflicts of interest.

Head-to-head trial compares ustekinumab with adalimumab in Crohn’s

For biologic-naive adults with moderate to severe Crohn’s disease, treatment with adalimumab or ustekinumab leads to similar outcomes, according to results of the head-to-head SEAVUE trial.

When lead author Bruce E. Sands, MD, of Icahn School of Medicine at Mount Sinai, New York, compared treatment arms, patients had similar rates of clinical remission at 1 year. All major secondary endpoints, such as endoscopic remission, were comparable, as were safety profiles, Dr. Sands reported at the annual Digestive Disease Week® (DDW).

“From my perspective, this is an important study,” Dr. Sands wrote in a virtual chat following his presentation. “We need more head-to-head studies!”

Results from the SEAVUE trial come almost 2 years after Dr. Sands reported findings of another head-to-head IBD trial: VARSITY, which demonstrated the superiority of vedolizumab over adalimumab among patients with moderate to severe ulcerative colitis.

The multicenter, double-blinded SEAVUE trial involved 386 biologic-naive patients with moderate to severe Crohn’s disease who had failed corticosteroids or immunomodulators. All patients had Crohn’s Disease Activity Index (CDAI) scores ranging from 220 to 450 and at least one ulcer detected on baseline ileocolonoscopy.

Participants were randomized in a 1:1 ratio to receive monotherapy with either subcutaneous adalimumab (citrate-free; 160 mg at baseline, 80 mg at week 2, then 40 mg every 2 weeks) or ustekinumab, which was given first intravenously at a dose of 6 mg/kg, then subcutaneously at 90 mg every 8 weeks.

The primary endpoint was clinical remission at week 52, defined by a CDAI score less than 150. Major secondary endpoints included clinical response, corticosteroid-free remission, endoscopic remission, remission in patient-reported CDAI components, and clinical remission at week 16.

Results were statistically similar across all endpoints, with clinical remission at 1 year occurring in 64.9% and 61.0% of patients receiving ustekinumab and adalimumab, respectively (P = .417).

“Both treatments demonstrated rapid onset of action and robust endoscopy results,” Dr. Sands noted during his presentation; he reported comparable rates of endoscopic remission, at 28.5% and 30.7% for ustekinumab and adalimumab, respectively (P = .631).

Among secondary endpoints, ustekinumab demonstrated some superiority, with greater maintenance of clinical response at week 52 among patients with response at week 16 (88.6% vs. 78.0%; P = .016), greater reduction in liquid/soft stools in prior 7 days from baseline to week 52 (–19.9 vs. –16.2; P = .004), and greater reduction in sum number of liquid/soft stools and abdominal pain scores in prior 7 days from baseline to week 52 (–29.6 vs. –25.1; P = .013).

Safety metrics were similar between groups, and consistent with previous experience. Although the adalimumab group had a higher rate of discontinuation due to adverse events, this trend was not statistically significant (11.3% vs. 6.3%; P value not provided).
 

Don’t ignore discontinuation rates

Jordan E. Axelrad, MD, assistant professor of medicine at NYU and a clinician at the Inflammatory Bowel Disease Center at NYU Langone Health, New York, commended the SEAVUE trial for its head-to-head design, which is a first for biologics in Crohn’s disease.

“With newer drugs, there’s a critical need for head-to-head studies for us to understand where to position a lot of these agents,” he said in an interview. “[T]his was a good undifferentiated group to understand what’s the first biologic you should use in a patient with moderate-to-severe Crohn’s disease. The primary, major take-home is that [ustekinumab and adalimumab] are similarly effective.”

When asked about the slight superiority in minor secondary endpoints associated with ustekinumab, Dr. Axelrad suggested that rates of discontinuation deserve more attention.

“For me, maybe the major focus would be on the number of patients who stopped treatment,” Dr. Axelrad said, noting a higher rate of discontinuation in the adalimumab group. “Although that was just numerical, that to me is actually more important than [the minor secondary endpoints].” He also highlighted the lower injection burden associated with ustekinumab, which is given every 8 weeks, compared with every 2 weeks for adalimumab.

Ultimately, however, Dr. Axelrad suggested that treatment sequencing is unlikely to depend on these finer points and will instead come down to finances, especially with adalimumab biosimilars on the horizon, which may prove the most cost-effective option.

“A lot of the decision-making of where to position [ustekinumab in Crohn’s disease] is going to come down to the payer,” Dr. Axelrad said. “If there was a clear signal, providers such as myself would have a better leg to stand on, like we saw with VARSITY, where vedolizumab was clearly superior to adalimumab on multiple endpoints. We didn’t see that sort of robust signal here.”

The SEAVUE trial was supported by Janssen Scientific Affairs. Dr. Sands disclosed relationships with Janssen, AbbVie, Takeda, and others. Dr. Axelrad disclosed previous consulting fees from Janssen and research support from BioFire.

Microbiome therapeutic offers durable protection against C. difficile recurrence

SER-109, an oral microbiome therapeutic, safely protects against Clostridioides difficile recurrence for up to 24 weeks, according to a multicenter phase 3 trial presented at this year’s Digestive Disease Week® (DDW). Three days of treatment with purified Firmicutes spores reduced the risk of recurrence by 54%, suggesting a sustained, clinically meaningful response.

“Antibiotics targeted against C. difficile bacteria are necessary but insufficient to achieve a durable clinical response because they have no effect on C. difficile spores that germinate within a disrupted microbiome,” the investigators reported at the meeting.

“The manufacturing processes for SER-109 are designed to inactivate potential pathogens, while enriching for beneficial Firmicutes spores, which play a central role in inhibiting the cycle of C. difficile,” said lead author Louis Y. Korman, MD, a gastroenterologist in Washington.
 

Extended data from ECOSPOR-III

The ECOSPOR-III trial involved 182 patients with at least three episodes of C. difficile infection in the previous 12 months. Patients underwent 10-21 days of antibiotic therapy with fidaxomicin or vancomycin to resolve symptoms before being randomized in a 1:1 ratio to receive either SER-109 (four capsules daily for 3 days) or placebo, with stratification by specific antibiotic and patient age (threshold of 65 years).

The primary objectives were safety and efficacy at 8 weeks. These results, which were previously reported at ACG 2020, showed a 68% relative risk reduction in the SER-109 group, and favorable safety data. The findings presented at DDW added to those earlier ones by providing safety and efficacy data extending to week 24. At this time point, patients treated with SER-109 had a 54% relative risk reduction in C. difficile recurrence. Recurrence rates were 21.3% and 47.3% for the treatment and placebo groups, respectively (P < .001).
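As a back-of-the-envelope check on how the headline figure follows from those recurrence rates (the trial’s reported 54% presumably reflects its prespecified analysis, so a crude calculation only approximates it):

$$\mathrm{RRR} = 1 - \frac{p_{\text{SER-109}}}{p_{\text{placebo}}} = 1 - \frac{0.213}{0.473} \approx 0.55$$

that is, roughly 55%, consistent with the reported 54% relative risk reduction.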

Patients 65 years and older benefited the most from SER-109 therapy, based on a relative risk reduction of 56% (P < .001), versus a 49% relative risk reduction (lacking statistical significance) for patients younger than 65 years (P = .093). The specific antibiotic therapy patients received also appeared to impact outcomes. Patients treated with fidaxomicin had a 73% relative risk reduction (P = .009), compared with 48% for vancomycin (P = .006). Safety profiles were similar between study arms.

“By enriching for Firmicutes spores, SER-109 achieves high efficacy, while mitigating risk of transmitting infectious agents and represents a major paradigm shift in the clinical management of patients with recurrent C. difficile infection,” the investigators concluded, noting that “an open-label study for patients with recurrent C. difficile infection is currently enrolling.”
 

Microbiome restoration therapies

According to Sahil Khanna, MBBS, professor of medicine at Mayo Clinic, Rochester, Minn., these findings “advance the field” because they show a sustained response. “We know that microbiome restoration therapies help restore colonization resistance,” Dr. Khanna said in an interview, noting that they offer benefits comparable to fecal microbiota transplantation (FMT) without the downsides.

“The trouble with FMT is that it’s heterogeneous – everybody does it differently … and also it’s an invasive procedure,” Dr. Khanna said. He noted that FMT may transmit infectious agents between donors and patients, which isn’t an issue with purified products such as SER-109.

Several other standardized microbiota restoration products are under development, Dr. Khanna said, including an enema form (RBX2660) in phase 3 testing, and two other capsules (CP101 and VE303) in phase 2 trials. “The hope would be that one or more of these products would be approved for clinical use in the near future and would probably replace the vast majority of FMT [procedures] that we do clinically,” Dr. Khanna said. “That’s where the field is headed.”

The investigators reported no conflicts of interest. Dr. Khanna disclosed research support from Finch, Rebiotix/Ferring, Vedanta, and Seres.

Intervention reduces PPI use without worsening acid-related diseases

Article Type
Changed
Fri, 05/28/2021 - 12:32

Proton pump inhibitor (PPI) use can safely be reduced by deprescribing efforts coupled with patient and clinician education, according to a retrospective study involving more than 4 million veterans.

After 1 year, the intervention was associated with a significant reduction in PPI use without worsening of acid-related diseases, reported lead author Jacob E. Kurlander, MD, of the University of Michigan, Ann Arbor, and the VA Ann Arbor Healthcare System’s Center for Clinical Management Research.

“There’s increasing interest in interventions to reduce PPI use,” Dr. Kurlander said during his virtual presentation at the annual Digestive Disease Week® (DDW). “Many of the interventions have come in the form of patient and provider education, like the Choosing Wisely campaign put out by the American Board of Internal Medicine. However, in rigorous studies, few interventions have actually proven effective, and many of these studies lack data on clinical outcomes, so it’s difficult to ascertain the real clinical benefits, or even harms.”

In an effort to address this gap, the investigators conducted a retrospective, difference-in-differences study spanning 10 years, from 2009 to 2019. The 1-year intervention, implemented in August 2013, included refill restrictions for PPIs without a documented indication for long-term use, voiding of PPI prescriptions not filled within 6 months, a quick-order option for H2-receptor antagonists, reports to identify high-dose PPI prescribing, and patient and clinician education.

The intervention group consisted of 192,607-250,349 veterans in Veteran Integrated Service Network 17, whereas the control group consisted of 3,775,978-4,360,908 veterans in other service networks (ranges in population size are due to variations across 6-month intervals of analysis). For each 6-month interval, patients were included if they had at least two primary care visits within the past 2 years, and excluded if they received primary care at three other sites that joined the intervention site after initial implementation.

The investigators analyzed three main outcomes: proportion of veterans dispensed a PPI prescription from the VA at any dose; incidence proportion of hospitalization for upper GI diseases, including upper GI bleeding other than from esophageal varices or angiodysplasia, as well as nonbleeding acid peptic disease; and rates of primary care visits, gastroenterology visits, and esophagogastroduodenoscopies (EGDs).

The analysis was divided into a preimplementation period lasting approximately 5 years and a postimplementation period of similar duration. In the postimplementation period, the intervention group had a 5.9% relative reduction in PPI prescriptions, compared with the control group (P < .001). During the same period, the intervention site had no significant increase in hospitalizations for upper GI diseases or in rates of primary care visits, GI clinic visits, or EGDs.
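
A difference-in-differences estimate of this kind compares the pre-to-post change at the intervention site against the same change at control sites over the same period. The sketch below illustrates the arithmetic with hypothetical dispensing proportions chosen to reproduce a roughly 5.9% relative reduction; the study's actual underlying rates are not given in this report.

```python
# Hypothetical dispensing proportions chosen only to illustrate the
# arithmetic; the abstract reports the 5.9% relative reduction but not
# the underlying pre/post rates at the intervention and control sites.
pre  = {"intervention": 0.255, "control": 0.250}
post = {"intervention": 0.235, "control": 0.245}

# Difference-in-differences: the intervention site's pre-to-post change,
# net of the change seen at control sites over the same period.
did = (post["intervention"] - pre["intervention"]) - (
    post["control"] - pre["control"])
relative = did / pre["intervention"]
print(f"DiD = {did:+.3f} ({relative:+.1%} vs. intervention baseline)")
```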

In a subgroup analysis of patients coprescribed PPIs during time at high risk for upper GI bleeding (that is, when they possessed at least two high-risk medications, such as warfarin), there was a 4.6% relative reduction in time with PPI gastroprotection in the intervention group, compared with the control group (P = .003). In a second sensitivity analysis, hospitalization for upper GI diseases in high-risk patients at least 65 years of age was not significantly different between groups.

“[This] multicomponent PPI deprescribing program led to sustained reductions in PPI use,” Dr. Kurlander concluded. “However, this blunt intervention also reduced appropriate use of PPIs for gastroprotection, raising some concerns about clinical quality of care, but this did not appear to cause any measurable clinical harm in terms of hospitalizations for upper GI diseases.”

Debate around ‘unnecessary PPI use’

According to Philip O. Katz, MD, professor of medicine and director of motility laboratories at Weill Cornell Medicine, New York, the study “makes an attempt to do what others have tried in different ways, which is to develop a mechanism to help reduce or discontinue proton pump inhibitors when people believe they’re not indicated.”

Yet this latter element – appropriate indication – drives an ongoing debate.

“This is a very controversial area,” Dr. Katz said in an interview. “The concept of using the lowest effective dose of medication needed for a symptom or a disease is not new, but the push to reducing or eliminating ‘unnecessary PPI use’ is one that I believe should be carefully discussed, and that we have a clear understanding of what constitutes unnecessary use. And quite honestly, I’m willing to state that I don’t believe that’s been well defined.”

Dr. Katz, who recently coauthored an article about PPIs, suggested that more prospective research is needed to identify which patients need PPIs and which don’t.

“What we really need are more studies that look at who really needs [PPIs] long term,” Dr. Katz said, “as opposed to doing it ad hoc.”

The study was funded by the U.S. Department of Veterans Affairs and the National Institute of Diabetes and Digestive and Kidney Diseases. The investigators reported no conflicts of interest. Dr. Katz is a consultant for Phathom Pharma.

Pandemic colonoscopy restrictions may lead to worse CRC outcomes

Article Type
Changed
Wed, 05/26/2021 - 12:01

For veterans, changes in colonoscopy screening caused by the COVID-19 pandemic may have increased risks of delayed colorectal cancer (CRC) diagnosis and could lead to worse CRC outcomes, based on data from more than 33,000 patients in the Veterans Health Administration.

After COVID-19 screening policies were implemented, a significantly lower rate of veterans with red-flag signs or symptoms for CRC underwent colonoscopy, lead author Joshua Demb, PhD, a cancer epidemiologist at the University of California, San Diego, reported at the annual Digestive Disease Week® (DDW).

“As a result of the COVID-19 pandemic, the Veterans Health Administration enacted risk mitigation and management strategies in March 2020, including postponement of nearly all colonoscopies,” the investigators reported. “Notably, this included veterans with red flag signs or symptoms for CRC, among whom delays in workup could increase risk for later-stage and fatal CRC, if present.”

To measure the effects of this policy change, Dr. Demb and colleagues performed a cohort study involving 33,804 veterans with red-flag signs or symptoms for CRC, including hematochezia, iron deficiency anemia, or abnormal guaiac fecal occult blood test or fecal immunochemical test (FIT). Veterans were divided into two cohorts based on date of first red flag diagnosis: either before the COVID-19 policy was implemented (April to October 2019; n = 19,472) or after (April to October 2020; n = 14,332), with an intervening 6-month washout period.
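
A minimal sketch of the cohort assignment described above, assuming the April-October windows run from the first of April through the end of October; the exact boundary dates and the handling of the washout interval are assumptions, not details from the study.

```python
from datetime import date

def assign_cohort(first_red_flag: date):
    """Place a veteran in the pre- or post-policy cohort based on the
    date of the first red-flag sign or symptom. The first-of-April
    through end-of-October boundaries are assumptions; the article
    gives only 'April to October' windows with a washout between them."""
    if date(2019, 4, 1) <= first_red_flag <= date(2019, 10, 31):
        return "pre-policy"
    if date(2020, 4, 1) <= first_red_flag <= date(2020, 10, 31):
        return "post-policy"
    return None  # falls in the washout or outside both windows: excluded

print(assign_cohort(date(2019, 6, 15)))   # pre-policy
print(assign_cohort(date(2020, 1, 10)))   # None (washout period)
```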

Primary outcomes were proportion completing colonoscopy and time to colonoscopy completion. Multivariable logistic regression incorporated a number of demographic and medical covariates, including race/ethnicity, sex, age, number of red-flag signs/symptoms, first red-flag sign/symptom, and others.

Before the COVID-19 policy change, 44% of individuals with red-flag signs or symptoms received a colonoscopy, compared with 32% after the policy was introduced (P < .01). Adjusted models showed that veterans in the COVID policy group were 42% less likely to receive a diagnostic colonoscopy than those in the prepolicy group (odds ratio, 0.58; 95% confidence interval, 0.55-0.61). Although colonoscopy was more likely before the pandemic, postpolicy colonoscopies were performed sooner, with a median time to procedure of 41 days versus 65 days before the pandemic (P < .01). Similar differences in screening rates between pre- and postpandemic groups were observed across all types of red-flag signs and symptoms.
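
For readers converting between the two summaries above: “42% less likely” is 1 minus the adjusted odds ratio, and because colonoscopy completion was common in both periods (44% and 32%), the odds ratio is not the same as the relative risk. The back-of-envelope check below uses the crude rates; the published 0.58 comes from the adjusted model, so the crude figures are only expected to land nearby.

```python
# "42% less likely" is 1 minus the adjusted odds ratio (1 - 0.58).
# Because colonoscopy completion was common (44% and 32%), the odds
# ratio overstates the relative risk; the crude figures below use the
# unadjusted rates, so they only approximate the adjusted OR of 0.58.
pre_rate, post_rate = 0.44, 0.32

odds_pre = pre_rate / (1 - pre_rate)     # ~0.79
odds_post = post_rate / (1 - post_rate)  # ~0.47
crude_or = odds_post / odds_pre          # ~0.60
crude_rr = post_rate / pre_rate          # ~0.73, i.e., a 27% reduction
print(f"crude OR = {crude_or:.2f}, crude RR = {crude_rr:.2f}")
```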

“Lower colonoscopy uptake was observed among individuals with red-flag signs/symptoms for CRC post- versus preimplementation of COVID-19 policies, suggesting increased future risk for delayed CRC diagnosis and adverse CRC outcomes,” the investigators concluded.

Prioritization may be needed to overcome backlog of colonoscopies

Jill Tinmouth, MD, PhD, lead scientist for ColonCancerCheck, Ontario’s organized colorectal cancer screening program, and a gastroenterologist and scientist at Sunnybrook Health Sciences Centre, Toronto, shared similar concerns about delayed diagnoses.

“We might expect these cancers to present ... at a more advanced stage, and that, as a result, the outcomes from these cancers could be worse,” Dr. Tinmouth said in an interview.

She also noted the change in colonoscopy timing.

“A particularly interesting finding was that, when a colonoscopy occurred, the time to colonoscopy was shorter during the COVID era than in the pre-COVID era,” Dr. Tinmouth said. “The authors suggested that this might be as a result of Veterans Health Administration policies implemented as a result of the pandemic that led to prioritization of more urgent procedures.”

According to Dr. Tinmouth, similar prioritization may be needed to catch up with the backlog of colonoscopies created by pandemic-related policy changes. In a recent study comparing two backlog management techniques, Dr. Tinmouth and colleagues concluded that redirecting low-yield colonoscopies to FIT without increasing hospital colonoscopy capacity could reduce time to recovery by more than half.

Even so, screening programs may be facing a long road to recovery.

“Recovery of the colonoscopy backlog is going to be a challenge that will take a while – maybe even years – to resolve,” Dr. Tinmouth said. “Jurisdictions/institutions that have a strong centralized intake or triage will likely be most successful in resolving the backlog quickly as they will be able to prioritize the most urgent cases, such as persons with an abnormal FIT or with symptoms, and to redirect persons scheduled for a ‘low-yield’ colonoscopy to have a FIT instead.” Ontario defines low-yield colonoscopies as primary screening for average-risk individuals and follow-up colonoscopies for patients with low-risk adenomas at baseline.

When asked about strategies to address future pandemics, Dr. Tinmouth said, “I think that two key learnings for me from this [pandemic] are: one, not to let our guard down, and to remain vigilant and prepared – in terms of monitoring, supply chain, equipment, etc. ... and two, to create a nimble and agile health system so that we are able to assess the challenges that the next pandemic brings and address them as quickly as possible.”

The investigators and Dr. Tinmouth reported no conflicts of interest.

Meeting/Event
Publications
Topics
Sections
Meeting/Event
Meeting/Event

For veterans, changes in colonoscopy screening caused by the COVID-19 pandemic may have increased risks of delayed colorectal cancer (CRC) diagnosis and could lead to worse CRC outcomes, based on data from more than 33,000 patients in the Veterans Health Administration.

After COVID-19 screening policies were implemented, a significantly lower rate of veterans with red-flag signs or symptoms for CRC underwent colonoscopy, lead author Joshua Demb, PhD, a cancer epidemiologist at the University of California, San Diego, reported at the annual Digestive Disease Week® (DDW).

“As a result of the COVID-19 pandemic, the Veterans Health Administration enacted risk mitigation and management strategies in March 2020, including postponement of nearly all colonoscopies,” the investigators reported. “Notably, this included veterans with red flag signs or symptoms for CRC, among whom delays in workup could increase risk for later-stage and fatal CRC, if present.”

To measure the effects of this policy change, Dr. Demb and colleagues performed a cohort study involving 33,804 veterans with red-flag signs or symptoms for CRC, including hematochezia, iron deficiency anemia, or abnormal guaiac fecal occult blood test or fecal immunochemical test (FIT). Veterans were divided into two cohorts based on date of first red flag diagnosis: either before the COVID-19 policy was implemented (April to October 2019; n = 19,472) or after (April to October 2020; n = 14,332), with an intervening 6-month washout period.

Primary outcomes were proportion completing colonoscopy and time to colonoscopy completion. Multivariable logistic regression incorporated a number of demographic and medical covariates, including race/ethnicity, sex, age, number of red-flag signs/symptoms, first red-flag sign/symptom, and others.

Before the COVID-19 policy change, 44% of individuals with red-flag signs or symptoms received a colonoscopy, compared with 32% after the policy was introduced (P < .01). Adjusted models showed that veterans in the COVID policy group were 42% less likely to receive a diagnostic colonoscopy than those in the prepolicy group (odds ratio, 0.58; 95% confidence interval, 0.55-0.61). While these findings showed greater likelihood of receiving a screening before the pandemic, postpolicy colonoscopies were conducted sooner, with a median time to procedure of 41 days, compared with 65 days before the pandemic (P < .01). Similar differences in screening rates between pre- and postpandemic groups were observed across all types of red flag signs and symptoms.

“Lower colonoscopy uptake was observed among individuals with red-flag signs/symptoms for CRC post- versus preimplementation of COVID-19 policies, suggesting increased future risk for delayed CRC diagnosis and adverse CRC outcomes,” the investigators concluded.

Prioritization may be needed to overcome backlog of colonoscopies

Jill Tinmouth, MD, PhD, lead scientist for ColonCancerCheck, Ontario’s organized colorectal cancer screening program, and a gastroenterologist and scientist at Sunnybrook Health Sciences Centre, Toronto, shared similar concerns about delayed diagnoses.

“We might expect these cancers to present ... at a more advanced stage, and that, as a result, the outcomes from these cancers could be worse,” Dr. Tinmouth said in an interview.

She also noted the change in colonoscopy timing.

“A particularly interesting finding was that, when a colonoscopy occurred, the time to colonoscopy was shorter during the COVID era than in the pre-COVID era,” Dr. Tinmouth said. “The authors suggested that this might be as a result of Veterans Health Administration policies implemented as a result of the pandemic that led to prioritization of more urgent procedures.”

According to Dr. Tinmouth, similar prioritization may be needed to catch up with the backlog of colonoscopies created by pandemic-related policy changes. In a recent study comparing two backlog management techniques, Dr. Tinmouth and colleagues concluded that redirecting low-yield colonoscopies to FIT without increasing hospital colonoscopy capacity could reduce time to recovery by more than half.

Even so, screening programs may be facing a long road to recovery.

“Recovery of the colonoscopy backlog is going to be a challenge that will take a while – maybe even years – to resolve,” Dr. Tinmouth said. “Jurisdictions/institutions that have a strong centralized intake or triage will likely be most successful in resolving the backlog quickly as they will be able to prioritize the most urgent cases, such as persons with an abnormal FIT or with symptoms, and to redirect persons scheduled for a ‘low-yield’ colonoscopy to have a FIT instead.” Ontario defines low-yield colonoscopies as primary screening for average-risk individuals and follow-up colonoscopies for patients with low-risk adenomas at baseline.

When asked about strategies to address future pandemics, Dr. Tinmouth said, “I think that two key learnings for me from this [pandemic] are: one, not to let our guard down, and to remain vigilant and prepared – in terms of monitoring, supply chain, equipment, etc.] ... and two to create a nimble and agile health system so that we are able to assess the challenges that the next pandemic brings and address them as quickly as possible.”The investigators and Dr. Tinmouth reported no conflicts of interest.

For veterans, changes in colonoscopy screening caused by the COVID-19 pandemic may have increased risks of delayed colorectal cancer (CRC) diagnosis and could lead to worse CRC outcomes, based on data from more than 33,000 patients in the Veterans Health Administration.

After COVID-19 screening policies were implemented, a significantly lower rate of veterans with red-flag signs or symptoms for CRC underwent colonoscopy, lead author Joshua Demb, PhD, a cancer epidemiologist at the University of California, San Diego, reported at the annual Digestive Disease Week® (DDW).

“As a result of the COVID-19 pandemic, the Veterans Health Administration enacted risk mitigation and management strategies in March 2020, including postponement of nearly all colonoscopies,” the investigators reported. “Notably, this included veterans with red flag signs or symptoms for CRC, among whom delays in workup could increase risk for later-stage and fatal CRC, if present.”

To measure the effects of this policy change, Dr. Demb and colleagues performed a cohort study involving 33,804 veterans with red-flag signs or symptoms for CRC, including hematochezia, iron deficiency anemia, or abnormal guaiac fecal occult blood test or fecal immunochemical test (FIT). Veterans were divided into two cohorts based on date of first red flag diagnosis: either before the COVID-19 policy was implemented (April to October 2019; n = 19,472) or after (April to October 2020; n = 14,332), with an intervening 6-month washout period.

Primary outcomes were proportion completing colonoscopy and time to colonoscopy completion. Multivariable logistic regression incorporated a number of demographic and medical covariates, including race/ethnicity, sex, age, number of red-flag signs/symptoms, first red-flag sign/symptom, and others.

Before the COVID-19 policy change, 44% of individuals with red-flag signs or symptoms received a colonoscopy, compared with 32% after the policy was introduced (P < .01). Adjusted models showed that veterans in the COVID policy group were 42% less likely to receive a diagnostic colonoscopy than those in the prepolicy group (odds ratio, 0.58; 95% confidence interval, 0.55-0.61). While these findings showed greater likelihood of receiving a screening before the pandemic, postpolicy colonoscopies were conducted sooner, with a median time to procedure of 41 days, compared with 65 days before the pandemic (P < .01). Similar differences in screening rates between pre- and postpandemic groups were observed across all types of red flag signs and symptoms.

“Lower colonoscopy uptake was observed among individuals with red-flag signs/symptoms for CRC post- versus preimplementation of COVID-19 policies, suggesting increased future risk for delayed CRC diagnosis and adverse CRC outcomes,” the investigators concluded.

Prioritization may be needed to overcome backlog of colonoscopies

Jill Tinmouth, MD, PhD, lead scientist for ColonCancerCheck, Ontario’s organized colorectal cancer screening program, and a gastroenterologist and scientist at Sunnybrook Health Sciences Centre, Toronto, shared similar concerns about delayed diagnoses.

“We might expect these cancers to present ... at a more advanced stage, and that, as a result, the outcomes from these cancers could be worse,” Dr. Tinmouth said in an interview.

She also noted the change in colonoscopy timing.

“A particularly interesting finding was that, when a colonoscopy occurred, the time to colonoscopy was shorter during the COVID era than in the pre-COVID era,” Dr. Tinmouth said. “The authors suggested that this might be as a result of Veterans Health Administration policies implemented as a result of the pandemic that led to prioritization of more urgent procedures.”

According to Dr. Tinmouth, similar prioritization may be needed to catch up with the backlog of colonoscopies created by pandemic-related policy changes. In a recent study comparing two backlog management techniques, Dr. Tinmouth and colleagues concluded that redirecting low-yield colonoscopies to FIT without increasing hospital colonoscopy capacity could reduce time to recovery by more than half.

Even so, screening programs may be facing a long road to recovery.

“Recovery of the colonoscopy backlog is going to be a challenge that will take a while – maybe even years – to resolve,” Dr. Tinmouth said. “Jurisdictions/institutions that have a strong centralized intake or triage will likely be most successful in resolving the backlog quickly as they will be able to prioritize the most urgent cases, such as persons with an abnormal FIT or with symptoms, and to redirect persons scheduled for a ‘low-yield’ colonoscopy to have a FIT instead.” Ontario defines low-yield colonoscopies as primary screening for average-risk individuals and follow-up colonoscopies for patients with low-risk adenomas at baseline.

When asked about strategies to address future pandemics, Dr. Tinmouth said, “I think that two key learnings for me from this [pandemic] are: one, not to let our guard down, and to remain vigilant and prepared – in terms of monitoring, supply chain, equipment, etc.] ... and two to create a nimble and agile health system so that we are able to assess the challenges that the next pandemic brings and address them as quickly as possible.”The investigators and Dr. Tinmouth reported no conflicts of interest.

Publications
Publications
Topics
Article Type
Sections
Article Source

FROM DDW 2021

Disallow All Ads
Content Gating
No Gating (article Unlocked/Free)
Alternative CME
Disqus Comments
Default
Use ProPublica
Hide sidebar & use full width
render the right sidebar.
Conference Recap Checkbox
Not Conference Recap
Clinical Edge
Display the Slideshow in this Article
Medscape Article
Display survey writer
Reuters content
Disable Inline Native ads
WebMD Article

Pandemic colonoscopy restrictions may lead to worse CRC outcomes

Article Type
Changed
Thu, 09/09/2021 - 16:19

 

For veterans, changes in colonoscopy screening caused by the COVID-19 pandemic may have increased risks of delayed colorectal cancer (CRC) diagnosis and could lead to worse CRC outcomes, based on data from more than 33,000 patients in the Veterans Health Administration.

After COVID-19 screening policies were implemented, a significantly lower rate of veterans with red-flag signs or symptoms for CRC underwent colonoscopy, lead author Joshua Demb, PhD, a cancer epidemiologist at the University of California, San Diego, reported at the annual Digestive Disease Week® (DDW).

“As a result of the COVID-19 pandemic, the Veterans Health Administration enacted risk mitigation and management strategies in March 2020, including postponement of nearly all colonoscopies,” the investigators reported. “Notably, this included veterans with red flag signs or symptoms for CRC, among whom delays in workup could increase risk for later-stage and fatal CRC, if present.”

To measure the effects of this policy change, Dr. Demb and colleagues performed a cohort study involving 33,804 veterans with red-flag signs or symptoms for CRC, including hematochezia, iron deficiency anemia, or abnormal guaiac fecal occult blood test or fecal immunochemical test (FIT). Veterans were divided into two cohorts based on date of first red flag diagnosis: either before the COVID-19 policy was implemented (April to October 2019; n = 19,472) or after (April to October 2020; n = 14,332), with an intervening 6-month washout period.

Primary outcomes were proportion completing colonoscopy and time to colonoscopy completion. Multivariable logistic regression incorporated a number of demographic and medical covariates, including race/ethnicity, sex, age, number of red-flag signs/symptoms, first red-flag sign/symptom, and others.

Before the COVID-19 policy change, 44% of individuals with red-flag signs or symptoms received a colonoscopy, compared with 32% after the policy was introduced (P < .01). Adjusted models showed that veterans in the COVID policy group were 42% less likely to receive a diagnostic colonoscopy than those in the prepolicy group (odds ratio, 0.58; 95% confidence interval, 0.55-0.61). While these findings showed greater likelihood of receiving a screening before the pandemic, postpolicy colonoscopies were conducted sooner, with a median time to procedure of 41 days, compared with 65 days before the pandemic (P < .01). Similar differences in screening rates between pre- and postpandemic groups were observed across all types of red flag signs and symptoms.

“Lower colonoscopy uptake was observed among individuals with red-flag signs/symptoms for CRC post- versus preimplementation of COVID-19 policies, suggesting increased future risk for delayed CRC diagnosis and adverse CRC outcomes,” the investigators concluded.

Prioritization may be needed to overcome backlog of colonoscopies

Jill Tinmouth, MD, PhD, lead scientist for ColonCancerCheck, Ontario’s organized colorectal cancer screening program, and a gastroenterologist and scientist at Sunnybrook Health Sciences Centre, Toronto, shared similar concerns about delayed diagnoses.

Dr. Jill Tinmouth

“We might expect these cancers to present ... at a more advanced stage, and that, as a result, the outcomes from these cancers could be worse,” Dr. Tinmouth said in an interview.

She also noted the change in colonoscopy timing.

“A particularly interesting finding was that, when a colonoscopy occurred, the time to colonoscopy was shorter during the COVID era than in the pre-COVID era,” Dr. Tinmouth said. “The authors suggested that this might be as a result of Veterans Health Administration policies implemented as a result of the pandemic that led to prioritization of more urgent procedures.”

According to Dr. Tinmouth, similar prioritization may be needed to catch up with the backlog of colonoscopies created by pandemic-related policy changes. In a recent study comparing two backlog management techniques, Dr. Tinmouth and colleagues concluded that redirecting low-yield colonoscopies to FIT without increasing hospital colonoscopy capacity could reduce time to recovery by more than half.

Even so, screening programs may be facing a long road to recovery.

“Recovery of the colonoscopy backlog is going to be a challenge that will take a while – maybe even years – to resolve,” Dr. Tinmouth said. “Jurisdictions/institutions that have a strong centralized intake or triage will likely be most successful in resolving the backlog quickly as they will be able to prioritize the most urgent cases, such as persons with an abnormal FIT or with symptoms, and to redirect persons scheduled for a ‘low-yield’ colonoscopy to have a FIT instead.” Ontario defines low-yield colonoscopies as primary screening for average-risk individuals and follow-up colonoscopies for patients with low-risk adenomas at baseline.

When asked about strategies to address future pandemics, Dr. Tinmouth said, “I think that two key learnings for me from this [pandemic] are: one, not to let our guard down, and to remain vigilant and prepared – in terms of monitoring, supply chain, equipment, etc.] ... and two to create a nimble and agile health system so that we are able to assess the challenges that the next pandemic brings and address them as quickly as possible.”The investigators and Dr. Tinmouth reported no conflicts of interest.

Meeting/Event
Publications
Topics
Sections
Meeting/Event
Meeting/Event

 

For veterans, changes in colonoscopy screening caused by the COVID-19 pandemic may have increased risks of delayed colorectal cancer (CRC) diagnosis and could lead to worse CRC outcomes, based on data from more than 33,000 patients in the Veterans Health Administration.

After COVID-19 screening policies were implemented, a significantly lower rate of veterans with red-flag signs or symptoms for CRC underwent colonoscopy, lead author Joshua Demb, PhD, a cancer epidemiologist at the University of California, San Diego, reported at the annual Digestive Disease Week® (DDW).

“As a result of the COVID-19 pandemic, the Veterans Health Administration enacted risk mitigation and management strategies in March 2020, including postponement of nearly all colonoscopies,” the investigators reported. “Notably, this included veterans with red flag signs or symptoms for CRC, among whom delays in workup could increase risk for later-stage and fatal CRC, if present.”

To measure the effects of this policy change, Dr. Demb and colleagues performed a cohort study involving 33,804 veterans with red-flag signs or symptoms for CRC, including hematochezia, iron deficiency anemia, or abnormal guaiac fecal occult blood test or fecal immunochemical test (FIT). Veterans were divided into two cohorts based on date of first red flag diagnosis: either before the COVID-19 policy was implemented (April to October 2019; n = 19,472) or after (April to October 2020; n = 14,332), with an intervening 6-month washout period.

Primary outcomes were proportion completing colonoscopy and time to colonoscopy completion. Multivariable logistic regression incorporated a number of demographic and medical covariates, including race/ethnicity, sex, age, number of red-flag signs/symptoms, first red-flag sign/symptom, and others.

Before the COVID-19 policy change, 44% of individuals with red-flag signs or symptoms received a colonoscopy, compared with 32% after the policy was introduced (P < .01). Adjusted models showed that veterans in the COVID policy group were 42% less likely to receive a diagnostic colonoscopy than those in the prepolicy group (odds ratio, 0.58; 95% confidence interval, 0.55-0.61). While these findings showed greater likelihood of receiving a screening before the pandemic, postpolicy colonoscopies were conducted sooner, with a median time to procedure of 41 days, compared with 65 days before the pandemic (P < .01). Similar differences in screening rates between pre- and postpandemic groups were observed across all types of red flag signs and symptoms.

“Lower colonoscopy uptake was observed among individuals with red-flag signs/symptoms for CRC post- versus preimplementation of COVID-19 policies, suggesting increased future risk for delayed CRC diagnosis and adverse CRC outcomes,” the investigators concluded.

Prioritization may be needed to overcome backlog of colonoscopies

Jill Tinmouth, MD, PhD, lead scientist for ColonCancerCheck, Ontario’s organized colorectal cancer screening program, and a gastroenterologist and scientist at Sunnybrook Health Sciences Centre, Toronto, shared similar concerns about delayed diagnoses.

Dr. Jill Tinmouth

“We might expect these cancers to present ... at a more advanced stage, and that, as a result, the outcomes from these cancers could be worse,” Dr. Tinmouth said in an interview.

She also noted the change in colonoscopy timing.

“A particularly interesting finding was that, when a colonoscopy occurred, the time to colonoscopy was shorter during the COVID era than in the pre-COVID era,” Dr. Tinmouth said. “The authors suggested that this might be as a result of Veterans Health Administration policies implemented as a result of the pandemic that led to prioritization of more urgent procedures.”

According to Dr. Tinmouth, similar prioritization may be needed to catch up with the backlog of colonoscopies created by pandemic-related policy changes. In a recent study comparing two backlog management techniques, Dr. Tinmouth and colleagues concluded that redirecting low-yield colonoscopies to FIT without increasing hospital colonoscopy capacity could reduce time to recovery by more than half.

Even so, screening programs may be facing a long road to recovery.

“Recovery of the colonoscopy backlog is going to be a challenge that will take a while – maybe even years – to resolve,” Dr. Tinmouth said. “Jurisdictions/institutions that have a strong centralized intake or triage will likely be most successful in resolving the backlog quickly as they will be able to prioritize the most urgent cases, such as persons with an abnormal FIT or with symptoms, and to redirect persons scheduled for a ‘low-yield’ colonoscopy to have a FIT instead.” Ontario defines low-yield colonoscopies as primary screening for average-risk individuals and follow-up colonoscopies for patients with low-risk adenomas at baseline.

When asked about strategies to address future pandemics, Dr. Tinmouth said, “I think that two key learnings for me from this [pandemic] are: one, not to let our guard down, and to remain vigilant and prepared – in terms of monitoring, supply chain, equipment, etc.] ... and two to create a nimble and agile health system so that we are able to assess the challenges that the next pandemic brings and address them as quickly as possible.”The investigators and Dr. Tinmouth reported no conflicts of interest.

 

For veterans, changes in colonoscopy screening caused by the COVID-19 pandemic may have increased risks of delayed colorectal cancer (CRC) diagnosis and could lead to worse CRC outcomes, based on data from more than 33,000 patients in the Veterans Health Administration.

After COVID-19 screening policies were implemented, a significantly lower rate of veterans with red-flag signs or symptoms for CRC underwent colonoscopy, lead author Joshua Demb, PhD, a cancer epidemiologist at the University of California, San Diego, reported at the annual Digestive Disease Week® (DDW).

“As a result of the COVID-19 pandemic, the Veterans Health Administration enacted risk mitigation and management strategies in March 2020, including postponement of nearly all colonoscopies,” the investigators reported. “Notably, this included veterans with red flag signs or symptoms for CRC, among whom delays in workup could increase risk for later-stage and fatal CRC, if present.”

To measure the effects of this policy change, Dr. Demb and colleagues performed a cohort study involving 33,804 veterans with red-flag signs or symptoms for CRC, including hematochezia, iron deficiency anemia, or abnormal guaiac fecal occult blood test or fecal immunochemical test (FIT). Veterans were divided into two cohorts based on date of first red flag diagnosis: either before the COVID-19 policy was implemented (April to October 2019; n = 19,472) or after (April to October 2020; n = 14,332), with an intervening 6-month washout period.

Primary outcomes were proportion completing colonoscopy and time to colonoscopy completion. Multivariable logistic regression incorporated a number of demographic and medical covariates, including race/ethnicity, sex, age, number of red-flag signs/symptoms, first red-flag sign/symptom, and others.

Before the COVID-19 policy change, 44% of individuals with red-flag signs or symptoms received a colonoscopy, compared with 32% after the policy was introduced (P < .01). Adjusted models showed that veterans in the COVID policy group were 42% less likely to receive a diagnostic colonoscopy than those in the prepolicy group (odds ratio, 0.58; 95% confidence interval, 0.55-0.61). While these findings showed greater likelihood of receiving a screening before the pandemic, postpolicy colonoscopies were conducted sooner, with a median time to procedure of 41 days, compared with 65 days before the pandemic (P < .01). Similar differences in screening rates between pre- and postpandemic groups were observed across all types of red flag signs and symptoms.

“Lower colonoscopy uptake was observed among individuals with red-flag signs/symptoms for CRC post- versus preimplementation of COVID-19 policies, suggesting increased future risk for delayed CRC diagnosis and adverse CRC outcomes,” the investigators concluded.

Prioritization may be needed to overcome backlog of colonoscopies

Jill Tinmouth, MD, PhD, lead scientist for ColonCancerCheck, Ontario’s organized colorectal cancer screening program, and a gastroenterologist and scientist at Sunnybrook Health Sciences Centre, Toronto, shared similar concerns about delayed diagnoses.

Dr. Jill Tinmouth

“We might expect these cancers to present ... at a more advanced stage, and that, as a result, the outcomes from these cancers could be worse,” Dr. Tinmouth said in an interview.

She also noted the change in colonoscopy timing.

“A particularly interesting finding was that, when a colonoscopy occurred, the time to colonoscopy was shorter during the COVID era than in the pre-COVID era,” Dr. Tinmouth said. “The authors suggested that this might be as a result of Veterans Health Administration policies implemented as a result of the pandemic that led to prioritization of more urgent procedures.”

According to Dr. Tinmouth, similar prioritization may be needed to catch up with the backlog of colonoscopies created by pandemic-related policy changes. In a recent study comparing two backlog management techniques, Dr. Tinmouth and colleagues concluded that redirecting low-yield colonoscopies to FIT without increasing hospital colonoscopy capacity could reduce time to recovery by more than half.
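The intuition behind that result can be shown with a toy backlog model, sketched below; every number in it is a hypothetical placeholder, not a figure from Dr. Tinmouth's study.

```python
# Toy model: recovery time = backlog / spare capacity. Redirecting a
# share of low-yield referrals to FIT frees capacity without adding
# procedure slots. All numbers are hypothetical.
def months_to_clear(backlog, monthly_capacity, monthly_demand, fit_share=0.0):
    spare = monthly_capacity - monthly_demand * (1 - fit_share)
    return float("inf") if spare <= 0 else backlog / spare

print(months_to_clear(10_000, 4_000, 3_500))                 # 20.0 months
print(months_to_clear(10_000, 4_000, 3_500, fit_share=0.3))  # ≈ 6.5 months
```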

Even so, screening programs may be facing a long road to recovery.

“Recovery of the colonoscopy backlog is going to be a challenge that will take a while – maybe even years – to resolve,” Dr. Tinmouth said. “Jurisdictions/institutions that have a strong centralized intake or triage will likely be most successful in resolving the backlog quickly as they will be able to prioritize the most urgent cases, such as persons with an abnormal FIT or with symptoms, and to redirect persons scheduled for a ‘low-yield’ colonoscopy to have a FIT instead.” Ontario defines low-yield colonoscopies as primary screening for average-risk individuals and follow-up colonoscopies for patients with low-risk adenomas at baseline.

When asked about strategies to address future pandemics, Dr. Tinmouth said, “I think that two key learnings for me from this [pandemic] are: one, not to let our guard down, and to remain vigilant and prepared – in terms of monitoring, supply chain, equipment, etc. ... and two, to create a nimble and agile health system so that we are able to assess the challenges that the next pandemic brings and address them as quickly as possible.”

The investigators and Dr. Tinmouth reported no conflicts of interest.

FROM DDW 2021

Adversity accelerates aging at early ages, now measurable in real time

Article Type
Changed
Tue, 05/25/2021 - 10:40

 

Adversity in early life – whether preterm birth or socioeconomic disadvantage in childhood – accelerates aging, according to two recent studies, but underlying mechanisms remain unclear, and methods of investigation continue to evolve.

While one study used an established epigenetic clock to measure biological age among adults with extremely low birth weight, the other showcased a relatively new tool to measure pace of biological aging in disadvantaged children, suggesting that the metric may one day serve as a real-time measure of interventional efficacy.

These findings build upon previous studies that have demonstrated a correlation between biological age, also known as methylation age, and an increased risk of health problems later in life, according to Daniel A. Notterman, MD, professor of molecular biology at Princeton (N.J.) University.

“Finding that a person’s methylation age is greater than their chronological age has been taken as evidence of increased ‘biological age’ and perhaps a tendency to greater future morbidity,” Dr. Notterman wrote in a Pediatrics editorial. “Indeed, methylation age is advanced in association with a number of childhood and midlife adversities as well as morbidities such as atherosclerosis, cancer, and obesity.”
 

Extremely low birth weight associated with faster aging in men

For some individuals, accelerated biological aging begins at birth, or even in utero, according to Ryan J. Van Lieshout, MD, PhD, Canada Research Chair in the Perinatal Programming of Mental Disorders and the Albert Einstein/Irving Zucker Chair in Neuroscience at McMaster University, Hamilton, Ont., and colleagues.

The investigators conducted a study involving 45 extremely low birth weight (ELBW) survivors and 49 individuals born at normal birth weight. All participants were drawn from a longitudinal study conducted between 1977 and 1982 that assessed advances in neonatal intensive care. Controls were recruited at 8 years of age and matched with ELBW survivors based on family socioeconomic status, sex, and age. Follow-up continued through adulthood, allowing the present study to compare data from ages 8, 30, and 35.

Using samples of buccal epithelial cells, the investigators measured biological age with the Horvath epigenetic clock, the most commonly used tool of its kind, which measures cytosine-5 methylation at 353 cytosine-phosphate-guanine sites. Results were adjusted for a variety of covariates, such as smoking status, body mass index, number of chronic health conditions, and others.
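Conceptually, clocks of this type reduce to a penalized linear model over CpG beta values followed by a calibration transform. The sketch below shows that general recipe with placeholder weights; it does not use the published Horvath coefficients.

```python
# General recipe of a Horvath-style clock, with placeholder weights.
import numpy as np

def predict_methylation_age(betas, weights, intercept, adult_age=20):
    """betas: methylation fractions (0-1) at the clock's CpG sites."""
    m = intercept + float(np.dot(weights, betas))
    # Inverse of the log-linear age transform used in Horvath-style clocks.
    if m < 0:
        return (1 + adult_age) * np.exp(m) - 1
    return (1 + adult_age) * m + adult_age

rng = np.random.default_rng(0)
betas = rng.uniform(0, 1, size=353)      # one sample, 353 clock CpGs
weights = rng.normal(0, 0.05, size=353)  # placeholder, not Horvath's values
print(predict_methylation_age(betas, weights, intercept=0.4))
```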

Between groups, ELBW survivors trended toward older biological age, compared with adults born at normal birth weight (29.0 vs. 27.9 years), a difference that was not statistically significant. Further analysis, however, showed a significant sex-based difference between groups: Male survivors of ELBW, in adulthood, were almost 5 years biologically older than men born at normal birth weight (31.4 vs. 26.9 years; P = .01).

“[W]e provide preliminary evidence of a new link between ELBW and accelerated biological aging among men,” the investigators concluded.

In an accompanying editorial, Pam Factor-Litvak, PhD, vice chair of epidemiology at Columbia University, New York, wrote, “The findings are intriguing and open many questions for further study.”

Dr. Factor-Litvak noted that it remains unclear whether differences in biological aging were present at birth.

“[D]ifferences would provide evidence that accelerated aging begins during the in utero period, perhaps because of maternal undernutrition, stress, or another exposure,” Dr. Factor-Litvak wrote. “[R]eductions in chronic stress levels, which may begin for neonates with ELBW in utero and in the first hours of life, may provide an opportunity for interventions,” she added.

According to Calvin J. Hobel, MD, professor of pediatrics at Cedars-Sinai and professor of obstetrics and gynecology at University of California, Los Angeles, who has been studying preterm birth for more than 40 years, interventions may need to begin even earlier.

Dr. Calvin J. Hobel


“The only way to prevent preterm birth is to do it before women get pregnant,” Dr. Hobel said in an interview. “The reason for preterm birth and poor fetal growth is the fact that the mother has early cardiovascular disease – unrecognized.”

Compared with women who give birth to full-term infants, women who give birth to preterm infants typically have increased blood pressure, Dr. Hobel said. Although these elevations in blood pressure are generally asymptomatic and not high enough to be classified as hypertensive, they impact umbilical artery vascular resistance starting at 28 weeks of gestation.

“In utero, [preterm infants] are programmed for increased vascular resistance and increased risk of cardiovascular disease,” Dr. Hobel said.

Regarding the effects of ELBW in men versus women, Dr. Hobel suggested that dissimilar neuroendocrine systems between sexes may protect females from adverse outcomes, although exact mechanisms remain elusive.
 

 

 

Measuring the impact of socioeconomic status on biological aging, now in real time

A second study, by Laurel Raffington, PhD, of the University of Texas at Austin, and colleagues, evaluated the relationship between socioeconomic disadvantage in childhood and pace of biological aging.

To do so, they used the DunedinPoAm DNA methylation algorithm, a relatively new tool that was developed by analyzing changes in organ system integrity over time among adults with the same chronological age.

“Whereas epigenetic clocks quantify the amount of aging that has already occurred up to the time of measurement, DunedinPoAm quantifies how fast an individual is aging,” Dr. Raffington and colleagues wrote. “In other words, whereas epigenetic clocks tell you what time it is, pace-of-aging measures tell you how fast the clock is ticking.”
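In code, the two outputs differ only in interpretation and scaling. The sketch below, with placeholder weights rather than the published DunedinPoAm coefficients, returns a rate scaled so that 1.0 means one year of physiological change per chronological year:

```python
# Pace-of-aging as a linear score over CpG beta values; placeholder
# weights, not the published DunedinPoAm model.
import numpy as np

def pace_of_aging(betas, weights, intercept=1.0):
    return intercept + float(np.dot(weights, betas))

rng = np.random.default_rng(1)
betas = rng.uniform(0, 1, size=46)      # the measure draws on dozens of CpGs
weights = rng.normal(0, 0.02, size=46)  # placeholder weights
print(f"{pace_of_aging(betas, weights):.2f} biological years per calendar year")
```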

The investigators measured pace of aging in 600 children and adolescents (8-18 years of age) from the Texas Twin Project, “an ongoing longitudinal study that includes the collection of salivary samples.” The final dataset included 457 participants who identified as White, 77 who identified as Latinx, and 61 who identified as both White and Latinx.

The investigators evaluated pace of aging compared with family-level and neighborhood-level socioeconomic status, and tested for confounding by tobacco exposure, BMI, and pubertal development.

This analysis revealed that children experiencing socioeconomic disadvantage were aging more quickly than their peers, in terms of both family-level and neighborhood-level inequity (both levels, r = 0.18; P = .001).

Children who identified as Latinx aged faster than did those who identified as White only or White and Latinx, “consistent with higher levels of disadvantage in this group,” the investigators wrote. “Thus, our findings are consistent with observations that racial and/or ethnic socioeconomic disparities are an important contributor to racial and/or ethnic disparities in health.”

Higher BMI, greater tobacco exposure, and more advanced pubertal development were also associated with more rapid aging. After adjustment for these covariates, however, the significant correlation between socioeconomic disadvantage and rapid aging remained, the investigators noted.
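The adjustment step can be pictured as comparing the disadvantage coefficient before and after the covariates enter the model. The sketch below is illustrative only, with a hypothetical file and column names, and it ignores the twin-pair clustering a real analysis would model:

```python
# Illustrative covariate adjustment; names are hypothetical, and
# twin-pair clustering is ignored for simplicity.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("twin_project_methylation.csv")  # hypothetical file

unadjusted = smf.ols("pace_of_aging ~ family_ses", data=df).fit()
adjusted = smf.ols(
    "pace_of_aging ~ family_ses + bmi + tobacco_exposure + pubertal_stage",
    data=df,
).fit()
# If the SES coefficient survives adjustment, the association is not
# explained away by these covariates.
print(unadjusted.params["family_ses"], adjusted.params["family_ses"])
```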

“Our results suggest that salivary DNA methylation measures of pace of aging may provide a surrogate or intermediate endpoint for understanding the health impacts of [childhood] interventions,” the investigators concluded. “Such applications may prove particularly useful for evaluating the effectiveness of health-promoting interventions in at-risk groups.”

Still, more work is needed to understand exactly how socioeconomic disadvantage is associated with accelerated aging.

“Ultimately, not only longitudinal repeated-measures studies but also natural experiment studies and randomized controlled trials of social programs are needed to establish causal effects of social disadvantage on DunedinPoAm-measured pace of aging and to establish DunedinPoAm as a mediator of the process through which childhood disadvantage leads to aging-related health conditions,” the investigators wrote.

In his editorial, Dr. Notterman emphasized this point.

“[I]t is worth remembering that associations with either methylation age or pace of aging and health or longevity may represent the effect of an exposure on both the measure and the outcome of interest rather than a causal pathway that runs from the exposure (low socioeconomic status, adversity) to health outcome (i.e., cancer, vascular disease),” he wrote.

Paul Chung, MD, professor and chair of health systems science at Kaiser Permanente Bernard J. Tyson School of Medicine, Pasadena, Calif., and adjunct professor at the University of California, Los Angeles, called the findings “preliminary,” but noted that confirmation through further research could “fill in some really important gaps.

“Right now, to some degree, we’re at a little bit of an impasse,” Dr. Chung said.

Adverse childhood experiences are “associated very strongly” with mental and physical health issues, Dr. Chung said, “but we don’t know exactly why, and because of that, it’s really hard to come up with social policy solutions that aren’t anything but extremely sort of blunt-ended. We just say, ‘Well, I guess you gotta fix everything.’ And it’s a hard place to be, I think, in the field.”

Although the present study doesn’t resolve this issue, Dr. Chung suggested that the findings “really open the door to a lot of really exciting research that could have a lot of impacts on practice and policy.”

“Sometimes the only way to get people to pay attention enough to generate the level of excitement that would allow you to even do these sorts of studies ... is to generate some initial exploratory data that makes people perk up their ears, and makes people go, ‘Hey, wow, maybe we should be looking into this.’ ”

The study by Dr. Raffington and colleagues was funded by the National Institutes of Health and the Jacobs Foundation, with additional support from the German Research Foundation, Russell Sage Foundation Biology and Social Science Grant, the Canadian Institute for Advanced Research Child and Brain Development Network, and others. The study by Dr. Van Lieshout and colleagues was supported by the Canadian Institutes of Health Research. Dr. Factor-Litvak and Dr. Notterman reported funding from the National Institutes of Health. All of the investigators and interviewees reported no conflicts of interest.


FROM PEDIATRICS

Weighing the pros and cons of disposable duodenoscopes

Article Type
Changed
Wed, 05/19/2021 - 14:27

Disposable duodenoscopes have one irrefutable advantage over their reusable counterparts: They definitively solve the problem of scope-related multidrug-resistant organism (MDRO) infections. Yet they also come with trade-offs, such as increased cost and medical waste, that have triggered pushback from skeptical endoscopists. How endoscopists weigh these competing concerns will ultimately determine the uptake of these devices going forward, according to Andrew S. Ross, MD, medical director for strategic growth at Virginia Mason Medical Center, Seattle.

“What would you pay to not have to deal with the scope infection issue at all?” Dr. Ross asked during a virtual presentation at the 2021 AGA Tech Summit sponsored by the AGA Center for GI Innovation and Technology. “I think that x-factor is going to depend [on] who you’re talking to and how much they really believe in [duodenoscope-related infection] as an issue.”

Dr. Ross explained that some endoscopists doubt the clinical relevance of duodenoscope-related MDRO infections, possibly because of a lack of direct experience.

“There still is a prevailing sentiment among some endoscopists that duodenoscope infection is really not a problem,” Dr. Ross said. “Or [they may say,]: ‘We haven’t had that issue here in our medical center, so therefore it is not a problem.’ ”

In fact, the exact magnitude of the problem remains unknown.

“In the end, we have an unquantifiable risk to patients wherever [reusable duodenoscopes] are used,” Dr. Ross said.
 

Just how common are scope-related MDRO infections?

According to V. Raman Muthusamy, MD, AGAF, immediate former chair of the AGA Center for GI Innovation and Technology, and director of endoscopy at the University of California, Los Angeles Health System, scope-related MDRO infections are “relatively uncommon,” but they do occur.

Dr. V. Raman Muthusamy

MDRO infections are generally linked with contaminated endoscopes, but duodenoscopes are the most common culprit because they pose a unique risk.

“Traditionally, when outbreaks have occurred [with nonduodenoscopes], it has usually been due to a breach in the reprocessing protocol,” Dr. Muthusamy said in an interview. “But with duodenoscopes, we’ve found that that does not appear to be necessary, and that in many cases there are no identified breaches, and yet there are still outbreaks.”

Dr. Muthusamy, the first endoscopist to test a disposable duodenoscope in a human patient, noted that it’s challenging to definitively prove infection from a reusable scope. Citing an Executive Summary from the Food and Drug Administration, he said, “We know it’s happened 300-400 times over the past decade or so,” with infection rates peaking in 2014-2016 and steadily declining since then.

Approximately 5% of reprocessed duodenoscopes harbor pathogenic bacteria, according to Dr. Muthusamy, but the rate of infection is significantly lower.
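The gap between contamination and infection follows from multiplying conditional probabilities, as the toy calculation below shows. Only the 5% figure comes from the discussion above; the two conditional probabilities are purely hypothetical.

```python
# Toy calculation: contamination is necessary but not sufficient for
# infection. Only p_contaminated (5%) is from the article.
p_contaminated = 0.05
p_transmission_given_contaminated = 0.10  # hypothetical
p_illness_given_transmission = 0.20       # hypothetical

p_infection = (p_contaminated
               * p_transmission_given_contaminated
               * p_illness_given_transmission)
print(f"≈ {p_infection:.4f} per procedure")  # 0.0010, roughly 1 in 1,000
```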

“[The use of a contaminated duodenoscope] doesn’t mean a patient will actually get sick ... but it does mean the potential exists, obviously,” he said. “It just shows that these devices are hard to clean and a fraction of people have the potential of becoming ill. It’s our goal to improve on those numbers, and really try to eliminate the risk of this problem, as best we can.”
 

 

 

Infection isn’t the only concern

There are several potential ways to tackle the issue of scope-related infections, Dr. Ross said during his presentation, including designing devices that are easier to clean and optimizing the cleaning process; however, the only definitive solution is to eliminate cleaning altogether.

This is where disposable duodenoscopes come in.

At present, there are two such FDA-approved devices, the aScope Duodeno from Ambu and the Exalt Model D from Boston Scientific, both of which Dr. Ross characterized as being “in their infancy.”

Studies testing the Exalt Model D suggest that performance compares favorably with reusable duodenoscopes.

“The scope works in a benchtop model, it works in a lab, and it seems to be functional in expert hands,” Dr. Ross said. “With inexperienced users, we also see that this device works, albeit with a rate of crossover that may approach up to 10%. So, a functional, disposable scope has been produced.”

Despite availability, several pain points may slow adoption, Dr. Ross said, including reluctance to use new technology, skepticism about the clinical impact of scope-related infections, environmental concerns of increased medical waste, and increased cost.

On this latter topic, Dr. Ross pointed out that the true cost of a reusable scope goes beyond the purchase or lease price to include repair costs, reprocessing costs, and, potentially, the cost of litigation from scope-related infection.
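One way to frame that comparison is a simple per-procedure cost model, sketched below. Every dollar figure is a hypothetical placeholder; the point is which terms belong in the comparison, not the numbers themselves.

```python
# Per-procedure cost comparison; all dollar figures are hypothetical.
def reusable_cost_per_case(purchase, lifetime_cases, repair_per_case,
                           reprocess_per_case, litigation_reserve_per_case):
    amortized = purchase / lifetime_cases
    return (amortized + repair_per_case + reprocess_per_case
            + litigation_reserve_per_case)

reusable = reusable_cost_per_case(
    purchase=40_000, lifetime_cases=1_500, repair_per_case=60,
    reprocess_per_case=100, litigation_reserve_per_case=25,
)
print(f"reusable ≈ ${reusable:.0f}/case vs. a hypothetical $2,500 single-use price")
```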

“If you have an outbreak in your medical center, you can rest assured that you will have some litigation exposure,” Dr. Ross said.
 

Fitting disposable duodenoscopes into routine practice

Currently, both FDA-approved disposable duodenoscopes are covered by outpatient pass-through reimbursement for Medicare, and in October, both will be covered on an inpatient basis, according to Dr. Ross.

“I think the big question regarding pass-through reimbursement is what happens when the codes get revalued,” he said. “How long will the additional reimbursement stay in place?”

For now, Dr. Ross suggested that endoscopists reach for disposable duodenoscopes in unique scenarios, such as weekend or night procedures, to avoid calling in a scope-reprocessing technician; or in operating room cases when the scope enters a sterile field. Disposable scopes should also be considered for patients with known MDROs, he added, and conversely, for patients who are immunocompromised or critically ill and “can least afford a scope-related infection.”

Ultimately, the role of disposable duodenoscopes may be decided by the patients themselves, Dr. Ross concluded.

“Certainly, patients know about this – they may come in and demand the use of a single-use scope in certain situations,” Dr. Ross said. “We have to remember when we’re bringing any new technology into the marketplace that while it’s important to understand the input and perspectives of multiple stakeholders, the single-most important stakeholder at the end of the day are our patients.”

Dr. Ross disclosed a relationship with Boston Scientific. Dr. Muthusamy disclosed a relationship with Boston Scientific and Medivators.


FROM THE 2021 AGA TECH SUMMIT

Admit or send home for GI bleeding? AI may help you decide

Article Type
Changed
Wed, 05/19/2021 - 11:15

GI Genius recently became the first Food and Drug Administration–approved device to use artificial intelligence (AI) for endoscopy. Soon, similar technology may give gastroenterologists an edge before they even walk into the procedure room.

Dr. Dennis Shung

AI can provide highly accurate risk scores for patients with suspected upper GI bleeding and recommend discharge or hospitalization, according to Dennis Shung, MD, MHS, a clinical instructor at Yale University, New Haven, Conn. Such a capability could provide substantial benefit.

“Acute gastrointestinal bleeding is the most common gastrointestinal diagnosis requiring hospitalization. It costs around $19.2 billion per year,” Dr. Shung said, citing a study from Gastroenterology. He made these remarks during a virtual presentation at the 2021 AGA Tech Summit sponsored by the AGA Center for GI Innovation and Technology.

Emergency department visits for upper GI bleeding increased 17% from 2006 to 2014, Dr. Shung added, suggesting a rising trend.
 

The trouble with using risk scores

A variety of conventional risk scores are presently available to help manage these patients. Generally, they use a composite outcome of hemostatic intervention, transfusion, or death to determine which patients should be hospitalized (high risk) and which patients can go home (low risk). Although these models can offer high sensitivity, they remain underutilized.

“[Clinical risk scores] are cumbersome, it’s difficult to calculate them, [and] you may not remember to do that in your busy workflow,” Dr. Shung said.

He pointed out that low implementation may also stem from poorly defined clinical responsibilities.

“[Observing] providers caring for patients with GI bleeding showed that there was a culture of not taking ownership,” he said. “Emergency department physicians thought that it was the gastroenterologists who needed to [perform risk scoring]. Gastroenterologists thought it was the ED [physicians’ responsibility].”

To overcome these pitfalls, Dr. Shung and colleagues are developing AI that automates risk analysis for upper GI bleeding by integrating the process into the clinical workflow. Like GI Genius, their strategy relies upon machine learning, which is a type of AI that can improve automatically without being explicitly programmed.

Their most recent study (Sci Rep. 2021 Apr 23;11[1]:8827) involved a machine learning model that could predict transfusion in patients admitted for acute GI bleeding. The model was developed and internally validated in a cohort of 2,524 patients, then shown to outperform conventional regression-based models when externally validated in 1,526 patients similarly admitted at large urban hospitals.
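The study's actual features and architecture are not reproduced here, but the general recipe, training a machine learning classifier on a development cohort and comparing it against a regression baseline on an external one, can be sketched in a few lines of Python. Everything below (the random data, the cohort sizes reused for shape only, the choice of gradient boosting) is a placeholder, not the authors' code.

    # Illustrative sketch only: the study's real model, features, and data differ.
    import numpy as np
    from sklearn.ensemble import GradientBoostingClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(0)

    # Placeholder cohorts: rows = admissions, columns = vitals/labs at presentation.
    X_dev, y_dev = rng.normal(size=(2524, 20)), rng.integers(0, 2, 2524)
    X_ext, y_ext = rng.normal(size=(1526, 20)), rng.integers(0, 2, 1526)

    # Machine learning model vs. a conventional regression-based baseline.
    ml_model = GradientBoostingClassifier().fit(X_dev, y_dev)
    baseline = LogisticRegression(max_iter=1000).fit(X_dev, y_dev)

    # External validation: compare discrimination (AUROC) on the held-out cohort.
    for name, model in [("gradient boosting", ml_model), ("logistic regression", baseline)]:
        auc = roc_auc_score(y_ext, model.predict_proba(X_ext)[:, 1])
        print(f"{name}: external AUROC = {auc:.2f}")

External validation on patients from different hospitals, rather than a random split of the same cohort, is what makes the comparison meaningful for deployment.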
 

Google Maps for GI bleeding

“The future, as I envision it, is a Google Maps for GI bleeding,” Dr. Shung said, referring to how the popular web-mapping product analyzes real-time data, such as weather and traffic patterns, to provide the best route and an estimated time of arrival. “With the electronic health record, we have the ability to personalize care by basically using data obtained during the clinical encounter to generate risk assessment in real time.”

In other words, machine learning software reads a patient’s electronic health record, runs relevant data through an algorithm, and produces both a risk score and a clinical recommendation. In the case of suspected upper GI bleeding, the clinician is advised to either discharge for outpatient endoscopy or hospitalize for inpatient evaluation.
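The final decision layer of such a system can be conceptually simple: a predicted probability compared against a sensitivity-oriented cutoff. The sketch below is hypothetical, with an invented cutoff, invented helper names, and an assumed fitted scikit-learn-style classifier.

    from dataclasses import dataclass

    @dataclass
    class TriageResult:
        risk_score: float      # predicted probability of needing intervention
        recommendation: str

    # Hypothetical cutoff; a deployed system would use a validated,
    # sensitivity-oriented threshold so true high-risk patients are rarely sent home.
    LOW_RISK_CUTOFF = 0.02

    def triage(ehr_features: dict, model) -> TriageResult:
        """Score one suspected upper-GI-bleed patient from EHR-derived features.

        The feature order in the dict must match the order used to train `model`.
        """
        p = model.predict_proba([list(ehr_features.values())])[0, 1]
        if p < LOW_RISK_CUTOFF:
            return TriageResult(p, "discharge for outpatient endoscopy")
        return TriageResult(p, "hospitalize for inpatient evaluation")

The asymmetry of the threshold reflects the clinical stakes: missing a high-risk bleeder is far costlier than admitting a low-risk patient.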

Because the quality and consistency of data in EHRs can vary, the most advanced form of machine learning – deep learning – is needed to make this a clinical reality. Deep learning builds complex representations out of simpler ones; in this scenario, that means learning which clinical data are relevant and which are just noise. Taking this a step further, deep learning can actually “draw conclusions” from what’s missing.

“There are huge challenges in [irregular data] that need to be overcome,” Dr. Shung said in an interview. “But I see it as an opportunity. When you see things that are irregularly sampled, when you see things are missing – they mean something. They mean that a human has decided that that is not the way we should do things because this patient doesn’t need it. And I think there is a lot of value in learning how to model those things.”
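Dr. Shung did not describe a specific method, but one standard technique for letting a model learn from informative missingness is to pair every feature with an indicator recording whether it was measured at all, as in this illustrative pandas sketch.

    import numpy as np
    import pandas as pd

    # Toy EHR extract: NaN means the lab was never ordered, which is itself a signal.
    labs = pd.DataFrame({
        "hemoglobin": [8.2, np.nan, 11.5],
        "lactate":    [np.nan, np.nan, 3.1],
    })

    # Indicator columns preserve the "was this measured?" signal before imputation.
    mask = labs.isna().astype(int).add_suffix("_missing")
    features = pd.concat([labs.fillna(labs.median()), mask], axis=1)
    print(features)

Because the indicator columns survive imputation, a downstream model can learn, for example, that an unordered lactate often marks a patient a clinician had already judged to be stable.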

The road to clinical implementation

With further research and validation, deep learning models for gastroenterology are likely to play a role in clinical decision-making, according to Dr. Shung. But to reach the clinic floor, developers will need to overcome more fundamental obstacles. “The main thing that’s really barring [AI risk modeling] from being used is the reimbursement issue,” he said, referring to uncertainty about how payers will cover associated costs.

Dr. Sushovan Guha

In an interview, Sushovan Guha, MD, PhD, moderator of the virtual session and codirector of the center for interventional gastroenterology at UTHealth (iGUT) in Houston, pointed out another financial unknown: liability.

“What happens if there is an error?” he asked. “It’s done by the computers, but who is at fault?”

In addition to these challenges, some clinicians may need to be persuaded before they are willing to trust an algorithm with a patient’s life.

“We have to have community physicians convinced about the importance of using these tools to further improve their clinical practice,” Dr. Guha said. To this end, he added, “It’s time for us to accept and adapt, and make our decision-making process much more efficient.”

The investigators disclosed no relevant conflicts of interest.


FROM THE 2021 AGA TECH SUMMIT
