Optimizing Likelihood of Treatment for Postpartum Depression: Assessment of Barriers to Care

I have written in my first two columns of 2024 about how poorly understood the obstacles are that keep women from accessing perinatal mental healthcare. This is despite the almost uniform adoption of screening for postpartum depression (PPD) in the United States over the last 10-15 years, the approval and off-label use of effective pharmacologic and nonpharmacologic treatments for PPD, and the growing number of perinatal access programs in states and hospitals across the country.

I want to revisit this topic because I believe it is extremely important that we come to a better understanding of the obstacles postpartum patients experience so we can flatten the curve with respect to the perinatal treatment cascade. It turns out that screening is easy, but getting patients who screen positive with significant depressive symptoms into care is an entirely distinct outcome.

Recently, a group of investigators examined the barriers to identifying and treating women for PPD. In a meta-analysis that included 32 reviews, the researchers analyzed the barriers women face when they seek help, access care, and engage in treatment for mental health issues while pregnant or in the postpartum period. The researchers found women have a wide variety of barriers to seeking and accessing care related to societal, political, organizational, interpersonal, healthcare professional, and individual factors at every level of the care pathway. In total, the researchers categorized barriers into six overarching themes and 62 sub-themes, and I want to highlight a few of the biggest contributors below.

In the meta-analysis, a major contributor to the decision to consult a healthcare professional was a lack of understanding of what constituted a perinatal mental illness, which led women to ignore or minimize their symptoms. Others said that the cost of travel or of arranging childcare prevented them from making an appointment with a provider. Some women reported that their healthcare professionals’ normalization of their symptoms was a barrier in the early stages of the care pathway; others were unsure what role a healthcare professional might play in involving social services and removing a child from their care, or feared being judged as a bad mom.

One of the major societal factors identified in the study is the stigma associated with PPD. Unfortunately, an extraordinary stigma still surrounds PPD despite efforts from a large number of stakeholders, including the scientific community, advocacy groups, and celebrities who have publicly described their experiences with the illness. For so many postpartum patients, it remains impossible to let go of the stigma, shame, humiliation, and isolation that accompany the suffering of PPD.

Another factor identified in the study as being an obstacle to care was a lack of a network to help postpartum patients navigate the shifting roles associated with new parenthood, which is magnified if a patient has developed major depressive disorder. This is why a strong social support network is critical to help women navigate the novelty of being a new mom. We were aware of this as a field nearly 30 years ago when Michael W. O’Hara, PhD, published a paper in the Archives of General Psychiatry noting that social support was an important predictor for risk of PPD.

Both when we talk with patients in clinic and when we interviewed subjects for our upcoming documentary More Than Blue, which will be completed in the fall of 2024, women in the postpartum period have cited navigating our current healthcare system as one of the greatest obstacles to getting care. Handing a new mom suffering from PPD a book of potential providers, without someone to help her navigate that referral system, is really asking her to climb a very tall mountain. Additionally, moms living in rural areas likely don’t have the access to perinatal mental health services that women in more urban areas do.

It is increasingly clear that the problem is not a lack of effective treatments. As I’ve mentioned in previous columns, the last 15 years have given us a much greater understanding of the effectiveness of antidepressants, as well as of nonpharmacologic psychotherapies for women who may not want to be on a medicine. We now have very effective psychotherapies, and there is excitement about other new treatments that may have a role in treating postpartum depression, including neurosteroids, ketamine or esketamine, psychedelics, and neuromodulation such as transcranial magnetic stimulation. In short, there is no dearth of well-studied and newer effective treatments that, as we move toward precision reproductive psychiatry, may be useful in tailoring care for individual patients.

If we are looking to understand the anatomy of the perinatal treatment cascade, systematically evaluating these barriers may finally show us how to build the bridge to postpartum wellness for women who are suffering. While what’s on the horizon is very exciting, we have yet to address the barriers that prevent women from accessing this expanding array of treatment options. That is, in fact, the challenge to patients, their families, advocacy groups, political organizations, and society in general. Bridging that gap is a burden we all share as we try to mitigate the suffering associated with such an exquisitely treatable illness while access to treatment still feels beyond the reach of so many postpartum persons around us.

As we continue our research on new treatments, we should keep in mind that they will be of no value unless we understand how to facilitate access to them for the greatest number of patients. This endeavor highlights the importance of health services research and implementation science, and it underscores the need to partner early and often with colleagues if we are to truly achieve this goal.

Dr. Cohen is the director of the Ammon-Pinizzotto Center for Women’s Mental Health at Massachusetts General Hospital (MGH) in Boston, which provides information resources and conducts clinical care and research in reproductive mental health. He has been a consultant to manufacturers of psychiatric medications. Full disclosure information for Dr. Cohen is available at womensmentalhealth.org. Email Dr. Cohen at obnews@mdedge.com.

HIIT May Best Moderate Exercise for Poststroke Fitness

Repeated 1-minute bursts of high-intensity interval training (HIIT) are more effective than conventional moderate, continuous exercise for improving aerobic fitness after stroke, according to a multicenter randomized controlled trial.

“We hoped that we would see improvements in cardiovascular fitness after HIIT and anticipated that these improvements would be greater than in the moderate-intensity group, but we were pleasantly surprised by the degree of improvement we observed,” said Ada Tang, PT, PhD, associate professor of health sciences at McMaster University in Hamilton, Ontario, Canada. “The improvements seen in the HIIT group were twofold higher than in the other group.”

The results were published in Stroke.

Clinically Meaningful

Researchers compared the effects of 12 weeks of short-interval HIIT with those of moderate-intensity continuous training (MICT) on peak oxygen uptake (VO2peak), cardiovascular risk factors, and mobility outcomes after stroke.

They randomly assigned participants to 3 days per week of either HIIT or traditional moderate-intensity exercise sessions for 12 weeks. Participants’ mean age was 65 years, and 39% were women. They enrolled a mean of 1.8 years after sustaining a mild stroke.

A total of 42 participants were randomized to HIIT and 40 to MICT. There were no significant differences between the groups at baseline, and both groups exercised on adaptive recumbent steppers, which are suitable for stroke survivors with varying abilities.

The short-interval HIIT protocol involved 10 1-minute intervals of high-intensity exercise, interspersed with nine 1-minute low-intensity intervals, for a total of 19 minutes. HIIT intervals targeted 80% heart rate reserve (HRR) and progressed by 10% every 4 weeks up to 100% HRR. The low-intensity intervals targeted 30% HRR.

The traditional MICT protocol for stroke rehabilitation targeted 40% HRR for 20 minutes and progressed by 10% HRR and 5 minutes every 4 weeks, up to 60% HRR for 30 minutes.
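
For readers less familiar with heart rate reserve, these percentage targets are commonly converted into beats-per-minute prescriptions with the Karvonen method, in which the target equals the resting heart rate plus the chosen fraction of the difference between maximal and resting heart rate. The trial report does not spell out this arithmetic, so the short sketch below is illustrative only, and the resting and maximal heart rates it uses are assumed example values, not trial data.

```python
def karvonen_target_hr(hr_rest: float, hr_max: float, intensity: float) -> float:
    """Target heart rate (bpm) at a given fraction of heart rate reserve (HRR).

    Karvonen method: target = HR_rest + intensity * (HR_max - HR_rest).
    """
    return hr_rest + intensity * (hr_max - hr_rest)

# Illustrative values only (not from the trial): resting HR 70 bpm, maximal HR 160 bpm.
hr_rest, hr_max = 70, 160

# HIIT work intervals start at 80% HRR (progressing toward 100%), with recovery intervals
# at 30% HRR; MICT starts at 40% HRR and progresses to 60% HRR, per the protocols above.
for label, intensity in [
    ("HIIT work interval (80% HRR)", 0.80),
    ("HIIT recovery interval (30% HRR)", 0.30),
    ("MICT starting target (40% HRR)", 0.40),
    ("MICT final target (60% HRR)", 0.60),
]:
    print(f"{label}: {karvonen_target_hr(hr_rest, hr_max, intensity):.0f} bpm")
```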

The HIIT group’s cardiorespiratory fitness levels (VO2peak) improved twice as much as those of the MICT group: 3.5 mL of oxygen consumed in 1 minute per kg of body weight (mL/kg/min) compared with 1.8 mL/kg/min.

Of note, changes in VO2peak from baseline remained above the clinically important threshold of 1.0 mL/kg/min at 8-week follow-up in the HIIT group (1.71 mL/kg/min) but not in the MICT group (0.67 mL/kg/min).

Both groups increased their 6-minute walk test distances by 8.8 m at 12 weeks and by 18.5 m at 20 weeks. No between-group differences were found for cardiovascular risk or mobility outcomes, and no adverse events occurred in either group.

On average, the HIIT group spent 36% of total training time exercising at intensities above 80% HRR throughout the intervention, while the MICT group spent 42% of time at intensities of 40%-59% HRR.

The study was limited by a small sample size of high-functioning individuals who sustained a mild stroke. Enrollment was halted for 2 years due to the COVID-19 lockdowns, limiting the study’s statistical power.

Nevertheless, the authors concluded, “Given that a lack of time is a significant barrier to the implementation of aerobic exercise in stroke clinical practice, our findings suggest that short-interval HIIT may be an effective alternative to traditional MICT for improving VO2peak after stroke, with potential clinically meaningful benefits sustained in the short-term.”

“Our findings show that a short HIIT protocol is possible in people with stroke, which is exciting to see,” said Tang. “But there are different factors that clinicians should consider before recommending this training for their patients, such as their health status and their physical status. Stroke rehabilitation specialists, including stroke physical therapists, can advise on how to proceed to ensure the safety and effectiveness of HIIT.”

Selected Patients May Benefit

“Broad implementation of this intervention may be premature without further research,” said Ryan Glatt, CPT, senior brain health coach and director of the FitBrain Program at Pacific Neuroscience Institute in Santa Monica, California. “The study focused on relatively high-functioning stroke survivors, which raises questions about the applicability of the results to those with more severe impairments.” Mr. Glatt did not participate in the research.

“Additional studies are needed to confirm whether these findings are applicable to more diverse and severely affected populations and to assess the long-term sustainability of the benefits observed,” he said. “Also, the lack of significant improvements in other critical outcomes, such as mobility, suggests limitations in the broader application of HIIT for stroke rehabilitation.”

“While HIIT shows potential, it should be approached with caution,” Mr. Glatt continued. “It may benefit select patients, but replacing traditional exercise protocols with HIIT should not be done in all cases. More robust evidence and careful consideration of individual patient needs are essential.”

This study was funded by an operating grant from the Canadian Institutes of Health Research. Dr. Tang reported grants from the Canadian Institutes of Health Research, the Physiotherapy Foundation of Canada, and the Heart and Stroke Foundation of Canada. Mr. Glatt declared no relevant financial relationships.

A version of this article appeared on Medscape.com.

Is Your Patient Too Old for a Colonoscopy?

Colonoscopy remains the gold standard method for detecting colorectal cancer (CRC) and removing precancerous polyps.

The recommended age for CRC screening in the United States spans 45-75 years, with the benefits of colonoscopy diminishing considerably after this point.

Older adults are much more likely to experience complications before, during, and after a colonoscopy. Bowel preps can cause dehydration or electrolyte problems in some, while bleeding and bowel perforation can occur perioperatively, and pulmonary or cardiovascular complications may arise postoperatively.

These risks often outweigh the benefits of catching a precancerous lesion or early-stage cancer, especially given the low rates of advanced neoplasia and CRC detected from screening and surveillance after age 75. Yet the research overall suggests that more than half of older individuals continue to receive screening and surveillance colonoscopies outside the recommended screening window.

So is there a point in time when a person is too old to receive a colonoscopy? The answer is not always clear-cut, but life expectancy should be a key consideration.

“Taking the most extreme example, if you have 6 months to live, finding early-stage cancer is not going to help you,” Michael Rothberg, MD, vice chair for research at Cleveland Clinic’s Medical Institute and director of the Center for Value-Based Care Research, told Medscape Medical News.

For those with more time, the benefits of continued screening and surveillance may outweigh the risks, but when that balance shifts from helpful to not helpful remains inexact, Dr. Rothberg noted.

What’s Recommended?

In May 2021, the US Preventive Services Task Force (USPSTF) lowered the CRC screening threshold to age 45, recommending all adults aged between 45 and 75 years receive screening.

For those aged between 76 and 85 years, the USPSTF upheld its 2016 recommendation of selective screening, noting that the “net benefit of screening all persons in this age group is small” and should be determined on an individual basis. The USPSTF, however, did not provide recommendations on surveillance colonoscopies among those with previously identified polyps.

In November 2023, the American Gastroenterological Association (AGA) issued a clinical practice update that provided advice on risk stratification for CRC screening and post-polypectomy surveillance. For adults older than 75 years specifically, the AGA recommended that the decision to continue CRC screening or perform post-polypectomy surveillance be based on risks, benefits, comorbidities, and screening history and decided on a case-by-case basis.

For instance, previously unscreened patients without comorbidities could benefit from screening beyond age 75 — up to age 80 for men and 90 for women — while those who have had regular colonoscopies, per recommended guidelines, but severe comorbidities that may limit life expectancy could stop sooner, even by age 65.

Although an individualized approach leaves room for variation, it’s essential to consider life expectancy and the time it takes for a polyp to progress to CRC, as well as the risks associated with the procedure itself. Certain older adults are “less likely to live long enough to benefit from surveillance colonoscopy, due to competing, non-CRC mortality risks,” and clinicians should discuss these risks with their patients, the experts explained.

When to Stop Screening Colonoscopies

Research shows that screening colonoscopies continue well after the recommended stop age.

A 2023 JAMA Internal Medicine study found, for instance, that a large proportion of screening colonoscopies among 7067 patients aged 75 years or older were performed in those with a life expectancy of less than 10 years. Overall, 30% of patients aged between 76 and 80 years with a limited life expectancy had a colonoscopy. That percentage increased to 71% for those aged 81-85 years and to 100% for those older than 85 years.

But the benefits of screening were minimal. Overall, colonoscopies detected advanced neoplasia in 5.4% of patients aged 76-80 years, 6.2% of those aged 81-85 years, and 9.5% of those older than 85 years. Only 15 patients (0.2%) had CRC detected via colonoscopy, five of whom underwent cancer treatment. Of those five, four had a life expectancy ≥ 10 years, and one had a life expectancy < 10 years.

At the same time, adverse events requiring hospitalization within 10 days of colonoscopy were common (13.58 per 1000), and the risk for hospitalization increased with age.

“For all kinds of screening, we’re not that comfortable in America with the idea that people are eventually going to die, but as you get older, the potential benefits for screening decrease,” study author Dr. Rothberg told this news organization.

In general, life expectancy provides a good predictor of whether people should continue screening or receive treatment following a CRC diagnosis.

Patients aged 76-80 years in good health, for instance, could benefit from screening and, potentially, treatment, Dr. Rothberg said. And “if doctors don’t feel comfortable or confident about predicting life expectancy, taking comorbid illnesses into account can be helpful, especially for that age range.”

Weighing Surveillance Benefits

Surveillance colonoscopy is often recommended post-polypectomy to reduce the risk for CRC. But even in this higher-risk population, those older than 75 years may not benefit.

Recent evidence indicates that those with a history of one or two adenomas less than 1 cm in size have only a slightly (1.3-fold) increased risk for incident CRC — and no significantly increased risk for fatal CRC.

Another recent study found that detecting CRC at surveillance colonoscopy was rare among older adults. In surveillance colonoscopies performed among 9601 individuals aged 70-85 years with prior adenomas, 12% had advanced neoplasia detected, and only 0.3% had CRC detected.

Similar rates of advanced polyps (7.8%) or CRC (0.2%) were reported in another recent analysis of more than 9800 adults older than 65 years receiving surveillance colonoscopies.

Despite the low rates of polyp and CRC detection, nearly 90% of patients with recommendation information available received advice to return for a future colonoscopy. Even among patients with no polyps or only small ones, almost 60% of those with a life expectancy of less than 5 years were told to return.

Although someone with prior adenomas has a higher risk for CRC, that doesn’t tell the whole story for an individual patient, Samir Gupta, MD, professor of gastroenterology at the University of California San Diego, and co-lead of the Cancer Control Program at Moores Cancer Center, told this news organization. For older adults, it’s vital to consider the competing risks and how much time it might take for CRC to develop.

At Digestive Disease Week in May, Dr. Gupta presented new research that looked at cumulative risk among patients aged 75 years and older with prior precancerous polyps vs prior normal colonoscopies. Although those with prior adenomas had a higher risk for CRC overall, their cumulative CRC risk was low — about 0.3% at 5 years and 0.8% at 10 years. Cumulative CRC deaths were even lower — 0.2% at 5 years and 0.7% at 10 years — while the risk of dying from something other than CRC was 20% at 5 years and 40% at 10 years.

“What this means to me is that patients who are 75 and older should think really carefully about whether they want to do surveillance,” said Dr. Gupta, who coauthored the AGA’s clinical practice update. “Someone who is very healthy and doesn’t have obvious medical problems can look at that risk for developing colon cancer and the risk of dying and make a decision about whether there’s enough concern to go ahead with surveillance.”

Those with competing health priorities, on the other hand, should likely concentrate on those instead, he said, and feel reassured that even if they choose not to do surveillance, they’re probably not doing themselves any harm.

“The bottom line is that referring older adults or frail adults for surveillance colonoscopy shouldn’t be a rubber stamp or check-the-box action,” Dr. Gupta said. “We need to think about it carefully and give ourselves — as clinicians and patients — the room to decide that it may not need to take high priority.”

What to Tell Patients

Overall, older adults who have had prior colonoscopies, no or low-risk polyps, and low CRC risk will likely face greater risks from the procedure than benefits.

“The more invasive the screening test, the more dangerous it could be,” Dr. Rothberg noted.

Many patients, however, are open to stopping and often trust their primary care provider in the decision-making process, said Audrey Calderwood, MD, director of the Comprehensive Gastroenterology Center at Dartmouth Hitchcock Medical Center. “But the systems we have in place don’t optimally support that decision-making at the time it matters most.”

For example, at a prior colonoscopy, a gastroenterologist may recommend surveillance again in 5-7 years. But in the interim, the patient could have started new medications or developed comorbidities and other health issues. Rather than defer to the gastroenterologist’s recommendations from years ago, clinicians and patients can reassess the pros and cons of screening or surveillance based on current circumstances, Dr. Calderwood said.

“There should be lines of communication and systems of support to allow primary care providers to decide whether it is still needed,” she said.

While some may be ready to stop, other patients are going to continue to want and ask about CRC screening or surveillance, Dr. Rothberg said.

In these instances, communication style matters.

“You don’t want to tell a patient that they’re not going to be screened because they’re not going to live long enough to benefit,” Dr. Rothberg said.

However, steering people toward less invasive tests or telling them it’s important to give other health problems priority may be more sensitive ways to communicate that it’s time to ramp down or halt screening.

“Sometimes when you say you’re going to stop cancer screening, older adults misperceive that you’re giving up on them,” Dr. Gupta said. “We spend 30-40 years driving home the message that prevention and screening are important, and then it feels like we’re taking it away, so we need to find the best way to discuss it and make the choice that’s comfortable for them.”

Dr. Rothberg, Dr. Gupta, and Dr. Calderwood disclosed no relevant conflicts of interest.

A version of this article first appeared on Medscape.com.

Publications
Topics
Sections

 

Colonoscopy remains the gold standard method for detecting colorectal cancer (CRC) and removing precancerous polyps.

The recommended age for CRC screening in the United States spans 45-75 years, with the benefits of colonoscopy diminishing considerably after this point.

Older adults are much more likely to experience complications before, during, and after a colonoscopy. Bowel preps can cause dehydration or electrolyte problems in some, while bleeding and bowel perforation can occur perioperatively, and pulmonary or cardiovascular complications may arise postoperatively.

These risks often outweigh the benefits of catching a precancerous lesion or early-stage cancer, especially given the low rates of advanced neoplasia and CRC detected from screening and surveillance after age 75. Yet the research overall suggests that more than half of older individuals continue to receive screening and surveillance colonoscopies outside the recommended screening window.

So is there a point in time when a person is too old to receive a colonoscopy? The answer is not always clear-cut, but life expectancy should be a key consideration.

“Taking the most extreme example, if you have 6 months to live, finding early-stage cancer is not going to help you,” Michael Rothberg, MD, vice chair for research at Cleveland Clinic’s Medical Institute and director of the Center for Value-Based Care Research, told Medscape Medical News.

For those with more time, the benefits of continued screening and surveillance may outweigh the risks, but when that balance shifts from helpful to not helpful remains inexact, Dr. Rothberg noted.

What’s Recommended?

In May 2021, the US Preventive Services Task Force (USPSTF) lowered the CRC screening threshold to age 45, recommending all adults aged between 45 and 75 years receive screening.

For those aged between 76 and 85 years, the USPSTF upheld its 2016 recommendation of selective screening, noting that the “net benefit of screening all persons in this age group is small” and should be determined on an individual basis. The USPSTF, however, did not provide recommendations on surveillance colonoscopies among those with previously identified polyps.

In November 2023, the American Gastroenterological Association (AGA) issued a clinical practice update that provided advice on risk stratification for CRC screening and post-polypectomy surveillance. For adults older than 75 years specifically, the AGA recommended that the decision to continue CRC screening or perform post-polypectomy surveillance be based on risks, benefits, comorbidities, and screening history and decided on a case-by-case basis.

For instance, previously unscreened patients without comorbidities could benefit from screening beyond age 75 — up to age 80 for men and 90 for women — while those who have had regular colonoscopies, per recommended guidelines, but severe comorbidities that may limit life expectancy could stop sooner, even by age 65.

Although an individualized approach leaves room for variation, it’s essential to consider life expectancy and the time it takes for a polyp to progress to CRC, as well as the risks associated with the procedure itself. Certain older adults are “less likely to live long enough to benefit from surveillance colonoscopy, due to competing, non-CRC mortality risks,” and clinicians should discuss these risks with their patients, the experts explained.
 

When to Stop Screening Colonoscopies

Research shows that screening colonoscopies continue well after the recommended stop age.

A 2023 JAMA Internal Medicine study found, for instance, that a large proportion of screening colonoscopies occurred among the 7067 patients who were 75 years and older with a life expectancy < 10 years. Overall, 30% of patients aged between 76 and 80 years with a limited life expectancy had a colonoscopy. That percentage increased to 71% for those aged 81-85 years and to 100% for those older than 85 years.

But the benefits of screening were minimal. Overall, colonoscopies detected advanced neoplasia in 5.4% of patients aged 76-80 years, 6.2% of those aged 81-85 years, and 9.5% of those older than 85 years. Only 15 patients (0.2%) had CRC detected via colonoscopy, five of whom underwent cancer treatment. Of those five, four had a life expectancy ≥ 10 years, and one had a life expectancy < 10 years.

At the same time, adverse events requiring hospitalization were common 10 days post-colonoscopy (13.58 per 1000), and the risk for hospitalization increased with age.

“For all kinds of screening, we’re not that comfortable in America with the idea that people are eventually going to die, but as you get older, the potential benefits for screening decrease,” study author Dr. Rothberg told this news organization.

In general, life expectancy provides a good predictor of whether people should continue screening or receive treatment following a CRC diagnosis.

Patients aged 76-80 years in good health, for instance, could benefit from screening and, potentially, treatment, Dr. Rothberg said. And “if doctors don’t feel comfortable or confident about predicting life expectancy, taking comorbid illnesses into account can be helpful, especially for that age range.”
 

Weighing Surveillance Benefits

Surveillance colonoscopy is often recommended post-polypectomy to reduce the risk for CRC. But even in this higher-risk population, those older than 75 years may not benefit.

Recent evidence indicates that those with a history of one or two adenomas less than 1 cm in size have only a slightly (1.3-fold) increased risk for incident CRC — and no significant increased risk for fatal CRC.


Colonoscopy remains the gold standard method for detecting colorectal cancer (CRC) and removing precancerous polyps.

The recommended age for CRC screening in the United States spans 45-75 years, with the benefits of colonoscopy diminishing considerably after this point.

Older adults are much more likely to experience complications before, during, and after a colonoscopy. Bowel preparation can cause dehydration or electrolyte problems in some patients, bleeding and bowel perforation can occur during the procedure, and pulmonary or cardiovascular complications may arise afterward.

These risks often outweigh the benefits of catching a precancerous lesion or early-stage cancer, especially given the low rates of advanced neoplasia and CRC detected from screening and surveillance after age 75. Yet the research overall suggests that more than half of older individuals continue to receive screening and surveillance colonoscopies outside the recommended screening window.

So is there a point in time when a person is too old to receive a colonoscopy? The answer is not always clear-cut, but life expectancy should be a key consideration.

“Taking the most extreme example, if you have 6 months to live, finding early-stage cancer is not going to help you,” Michael Rothberg, MD, vice chair for research at Cleveland Clinic’s Medical Institute and director of the Center for Value-Based Care Research, told Medscape Medical News.

For those with more time, the benefits of continued screening and surveillance may outweigh the risks, but when that balance shifts from helpful to not helpful remains inexact, Dr. Rothberg noted.

What’s Recommended?

In May 2021, the US Preventive Services Task Force (USPSTF) lowered the CRC screening threshold to age 45, recommending that all adults aged 45-75 years receive screening.

For those aged between 76 and 85 years, the USPSTF upheld its 2016 recommendation of selective screening, noting that the “net benefit of screening all persons in this age group is small” and should be determined on an individual basis. The USPSTF, however, did not provide recommendations on surveillance colonoscopies among those with previously identified polyps.

In November 2023, the American Gastroenterological Association (AGA) issued a clinical practice update that provided advice on risk stratification for CRC screening and post-polypectomy surveillance. For adults older than 75 years specifically, the AGA recommended that the decision to continue CRC screening or perform post-polypectomy surveillance be based on risks, benefits, comorbidities, and screening history and decided on a case-by-case basis.

For instance, previously unscreened patients without comorbidities could benefit from screening beyond age 75 — up to age 80 for men and 90 for women — while those who have had regular colonoscopies, per recommended guidelines, but severe comorbidities that may limit life expectancy could stop sooner, even by age 65.

Although an individualized approach leaves room for variation, it’s essential to consider life expectancy and the time it takes for a polyp to progress to CRC, as well as the risks associated with the procedure itself. Certain older adults are “less likely to live long enough to benefit from surveillance colonoscopy, due to competing, non-CRC mortality risks,” and clinicians should discuss these risks with their patients, the experts explained.
 

When to Stop Screening Colonoscopies

Research shows that screening colonoscopies continue well after the recommended stop age.

A 2023 JAMA Internal Medicine study of 7067 patients aged 75 years or older found, for instance, that a large proportion of screening colonoscopies were performed in patients with a life expectancy of less than 10 years. Overall, 30% of patients aged 76-80 years with a limited life expectancy had a colonoscopy. That percentage increased to 71% for those aged 81-85 years and to 100% for those older than 85 years.

But the benefits of screening were minimal. Overall, colonoscopies detected advanced neoplasia in 5.4% of patients aged 76-80 years, 6.2% of those aged 81-85 years, and 9.5% of those older than 85 years. Only 15 patients (0.2%) had CRC detected via colonoscopy, five of whom underwent cancer treatment. Of those five, four had a life expectancy ≥ 10 years, and one had a life expectancy < 10 years.

At the same time, adverse events requiring hospitalization within 10 days of colonoscopy were common (13.58 per 1000), and the risk for hospitalization increased with age.

“For all kinds of screening, we’re not that comfortable in America with the idea that people are eventually going to die, but as you get older, the potential benefits for screening decrease,” study author Dr. Rothberg told this news organization.

In general, life expectancy provides a good predictor of whether people should continue screening or receive treatment following a CRC diagnosis.

Patients aged 76-80 years in good health, for instance, could benefit from screening and, potentially, treatment, Dr. Rothberg said. And “if doctors don’t feel comfortable or confident about predicting life expectancy, taking comorbid illnesses into account can be helpful, especially for that age range.”
 

Weighing Surveillance Benefits

Surveillance colonoscopy is often recommended post-polypectomy to reduce the risk for CRC. But even in this higher-risk population, those older than 75 years may not benefit.

Recent evidence indicates that those with a history of one or two adenomas less than 1 cm in size have only a slightly (1.3-fold) increased risk for incident CRC — and no significantly increased risk for fatal CRC.

Another recent study found that detecting CRC at surveillance colonoscopy was rare among older adults. In surveillance colonoscopies performed among 9601 individuals aged 70-85 years with prior adenomas, 12% had advanced neoplasia detected, and only 0.3% had CRC detected.

Similar rates of advanced polyps (7.8%) or CRC (0.2%) were reported in another recent analysis of more than 9800 adults older than 65 years receiving surveillance colonoscopies.

Despite the low rates of polyp and CRC detection, nearly 90% of patients with recommendation information available were advised to return for a future colonoscopy. Even among patients with no polyps or only small ones, almost 60% of those with a life expectancy of less than 5 years were told to return.

Although someone with prior adenomas has a higher risk for CRC, that doesn’t tell the whole story for an individual patient, Samir Gupta, MD, professor of gastroenterology at the University of California San Diego, and co-lead of the Cancer Control Program at Moores Cancer Center, told this news organization. For older adults, it’s vital to consider the competing risks and how much time it might take for CRC to develop.

At Digestive Disease Week in May, Dr. Gupta presented new research that looked at cumulative risk among patients aged 75 years and older with prior precancerous polyps vs prior normal colonoscopies. Although those with prior adenomas had a higher risk for CRC overall, their cumulative CRC risk was low — about 0.3% at 5 years and 0.8% at 10 years. Cumulative CRC deaths were even lower — 0.2% at 5 years and 0.7% at 10 years — while the risk of dying from something other than CRC was 20% at 5 years and 40% at 10 years.
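
To put those competing risks side by side, here is a minimal sketch (Python) that simply restates the cumulative-risk figures quoted above and computes how much more likely a non-CRC death is than a CRC death at each horizon. The numbers are taken from the presentation as reported here, not from an independent dataset, and the snippet is illustrative only, not a clinical decision tool.

```python
# Cumulative risks quoted above for adults >= 75 with prior precancerous polyps.
# Illustrative only; values are as reported in the article.
risks = {
    5:  {"crc_death": 0.002, "non_crc_death": 0.20},
    10: {"crc_death": 0.007, "non_crc_death": 0.40},
}

for years, r in risks.items():
    ratio = r["non_crc_death"] / r["crc_death"]
    print(f"At {years} years: CRC death {r['crc_death']:.1%}, "
          f"death from other causes {r['non_crc_death']:.1%} "
          f"(about {ratio:.0f} times more likely)")
```

Framed this way, competing mortality dominates the CRC-specific risk at both horizons, which is the point behind weighing surveillance carefully in this age group.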

“What this means to me is that patients who are 75 and older should think really carefully about whether they want to do surveillance,” said Dr. Gupta, who coauthored the AGA’s clinical practice update. “Someone who is very healthy and doesn’t have obvious medical problems can look at that risk for developing colon cancer and the risk of dying and make a decision about whether there’s enough concern to go ahead with surveillance.”

Those with competing health priorities, on the other hand, should likely concentrate on those instead, he said, and can feel reassured that even if they choose not to do surveillance, they are probably not doing themselves any harm.

“The bottom line is that referring older adults or frail adults for surveillance colonoscopy shouldn’t be a rubber stamp or check-the-box action,” Dr. Gupta said. “We need to think about it carefully and give ourselves — as clinicians and patients — the room to decide that it may not need to take high priority.”
 

 

 

What to Tell Patients

Overall, older adults who have had prior colonoscopies, no or low-risk polyps, and low CRC risk will likely face greater risks from the procedure than benefits.

“The more invasive the screening test, the more dangerous it could be,” Dr. Rothberg noted.

Many patients, however, are open to stopping and often trust their primary care provider in the decision-making process, said Audrey Calderwood, MD, director of the Comprehensive Gastroenterology Center at Dartmouth Hitchcock Medical Center. “But the systems we have in place don’t optimally support that decision-making at the time it matters most.”

For example, at a prior colonoscopy, a gastroenterologist may recommend surveillance again in 5-7 years. But in the interim, the patient could have new medications or develop comorbidities and other health issues. Rather than defer to the gastroenterologist’s recommendations from years ago, clinicians and patients can reassess the pros and cons of screening or surveillance based on current circumstances, Dr. Calderwood said.

“There should be lines of communication and systems of support to allow primary care providers to decide whether it is still needed,” she said.

While some may be ready to stop, other patients are going to continue to want and ask about CRC screening or surveillance, Dr. Rothberg said.

In these instances, communication style matters.

“You don’t want to tell a patient that they’re not going to be screened because they’re not going to live long enough to benefit,” Dr. Rothberg said.

However, steering people toward less invasive tests or telling them it’s important to give other health problems priority may be more sensitive ways to communicate that it’s time to ramp down or halt screening.

“Sometimes when you say you’re going to stop cancer screening, older adults misperceive that you’re giving up on them,” Dr. Gupta said. “We spend 30-40 years driving home the message that prevention and screening are important, and then it feels like we’re taking it away, so we need to find the best way to discuss it and make the choice that’s comfortable for them.”

Dr. Rothberg, Dr. Gupta, and Dr. Calderwood disclosed no relevant conflicts of interest.
 

A version of this article first appeared on Medscape.com.


TYK2 Inhibitor Effective for Psoriasis in Phase 2 Study


 

TOPLINE:

Zasocitinib, a tyrosine kinase 2 (TYK2) inhibitor, at oral doses of ≥ 5 mg led to greater skin clearance than placebo over 12 weeks in a phase 2b study.
 

METHODOLOGY:

  • Researchers performed a phase 2b, randomized, double-blind trial to assess the efficacy, safety, and tolerability of different doses of zasocitinib in adults with moderate to severe psoriasis (mean age, 47 years; 32% women) at 47 centers in the United States and eight centers in Canada. Most (83%) were White, 7% were Black, and 8% were Asian.
  • A total of 287 patients were randomly assigned to receive one of the four oral doses of zasocitinib (2 mg, 5 mg, 15 mg, or 30 mg, once daily) or a matched placebo for 12 weeks, followed by a 4-week safety monitoring period.
  • The primary outcome was the proportion of patients achieving a ≥ 75% improvement in the Psoriasis Area and Severity Index score (PASI 75) from baseline at week 12 (a brief sketch of how this endpoint is computed appears after this list).
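
For readers unfamiliar with the endpoint, the minimal sketch below shows how a PASI 75 (and PASI 90) response is determined from a baseline and a week-12 score; the example scores are hypothetical and are not drawn from the trial data.

```python
def pasi_response(baseline: float, week12: float) -> dict:
    """Percent improvement in PASI from baseline, and whether PASI 75/90 is met."""
    if baseline <= 0:
        raise ValueError("baseline PASI must be positive")
    improvement = (baseline - week12) / baseline * 100
    return {"improvement_pct": round(improvement, 1),
            "PASI75": improvement >= 75,
            "PASI90": improvement >= 90}

# Hypothetical patient: baseline PASI 20, week-12 PASI 4 -> 80% improvement,
# meeting PASI 75 but not PASI 90.
print(pasi_response(20, 4))
```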

TAKEAWAY:

  • At week 12, PASI 75 was achieved by 18%, 44%, 68%, and 67% of patients receiving zasocitinib at doses of 2 mg, 5 mg, 15 mg, and 30 mg, respectively, vs 6% of patients receiving placebo.
  • PASI 90 was achieved in 8%, 21%, 45%, and 46% of patients receiving zasocitinib at 2 mg, 5 mg, 15 mg, and 30 mg, respectively, and in no patients in the placebo group.
  • At week 12, 10%, 27%, 49%, and 52% of patients receiving zasocitinib at 2 mg, 5 mg, 15 mg, and 30 mg, respectively, had no or mild disease (a score of 0 or 1) according to the Physician Global Assessment tool vs 4% in the placebo group.
  • Treatment-emergent adverse events occurred in 53%-62% of patients in the zasocitinib groups compared with 44% in the placebo group. The most common were COVID-19, acne/acneiform dermatitis, and diarrhea. There were no reports of major adverse cardiovascular events, thromboembolic events, or opportunistic infections.

IN PRACTICE:

“Zasocitinib, an advanced, potent, and highly selective oral TYK2 inhibitor bioengineered to optimize target coverage and functional selectivity, achieved biologic-level efficacy with complete skin clearance observed after only a 12-week treatment period in up to one third of patients, with a low incidence of known tolerability issues and absence of serious toxic effects that are characteristic of [Janus kinase] 1-3 inhibition,” the authors wrote.

SOURCE:

The study was led by April W. Armstrong, MD, MPH, University of California, Los Angeles, and was published online on August 21, 2024, in JAMA Dermatology.
 

LIMITATIONS:

The study was limited by a relatively small sample size and a short duration. In addition, the inclusion of predominantly White patients may limit the generalizability of findings to a diverse population.
 

DISCLOSURES:

The study was funded by Nimbus Discovery, which includes Nimbus Therapeutics and Nimbus Lakshmi. Dr. Armstrong’s disclosures included receiving grants and/or personal fees from various pharmaceutical companies, including Nimbus Therapeutics and Nimbus. Three authors were employees of and reported holding equity, stocks, or shares in Nimbus. Several authors had disclosures related to pharmaceutical companies, including Nimbus.

This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article appeared on Medscape.com.


The Battle Against Recurrent UTIs in Welsh Women


 

TOPLINE:

The prevalence of recurrent urinary tract infections (rUTIs) and the use of antibiotics for prevention are substantial among women in Wales, particularly among those over the age of 57 years. A high level of resistance to two recommended antibiotics was observed, suggesting that more frequent urine cultures could better guide antibiotic selection for treatment and prophylaxis.

METHODOLOGY:

  • The researchers conducted a retrospective cross-sectional study using a large databank of patients in Wales to describe the characteristics and urine profiles of women with rUTIs between 2010 and 2022.
  • They created two cohorts: one of 92,213 women (median age, 60 years) who experienced rUTIs, defined as two or more acute episodes within 6 months or three or more acute episodes within 12 months.
  • The other cohort comprised 26,862 women (median age, 71 years) who were prescribed prophylactic antibiotics, defined as receiving three or more consecutive prescriptions of the same UTI-specific antibiotic (trimethoprim, nitrofurantoin, or cefalexin) with intervals of 21-56 days between prescriptions (both definitions are encoded in the sketch after this list).
  • Urine culture results in the 12 months before a rUTI diagnosis and 18 months before prophylactic antibiotic initiation and all urine culture results within 7 days of an acute UTI were analyzed to assess antibiotic resistance patterns.
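
The two case definitions above are operational enough to encode directly; the sketch below does so with invented field names and example data (nothing here is taken from the Welsh databank, and the 6-month window is approximated as 182 days).

```python
from datetime import date, timedelta

def is_recurrent_uti(episodes: list[date]) -> bool:
    """Two or more acute episodes within 6 months, or three or more within 12 months."""
    days = sorted(episodes)
    for i, start in enumerate(days):
        in_6m = sum(1 for d in days[i:] if (d - start) <= timedelta(days=182))
        in_12m = sum(1 for d in days[i:] if (d - start) <= timedelta(days=365))
        if in_6m >= 2 or in_12m >= 3:
            return True
    return False

def is_prophylaxis(prescriptions: list[tuple[str, date]]) -> bool:
    """Three or more consecutive prescriptions of the same UTI-specific antibiotic,
    with 21-56 days between prescriptions."""
    rx = sorted(prescriptions, key=lambda p: p[1])
    run = 1
    for (drug_a, day_a), (drug_b, day_b) in zip(rx, rx[1:]):
        if drug_a == drug_b and 21 <= (day_b - day_a).days <= 56:
            run += 1
            if run >= 3:
                return True
        else:
            run = 1
    return False

# Hypothetical example: three trimethoprim prescriptions about 28 days apart.
rx = [("trimethoprim", date(2022, 1, 1)),
      ("trimethoprim", date(2022, 1, 29)),
      ("trimethoprim", date(2022, 2, 26))]
print(is_prophylaxis(rx))  # True
```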

TAKEAWAY:

  • Overall, 6% of women had rUTIs and 1.7% were prescribed prophylactic antibiotics, with both proportions increasing sharply after age 57.
  • Nearly half of the women (49%) who were prescribed a prophylactic antibiotic qualified as having rUTIs in the 18 months before initiation.
  • This study showed that 80.8% of women with rUTIs had a urine culture result documented in the 12 months preceding the diagnosis.
  • More than half (64%) of the women taking prophylactic antibiotics had a urine culture result documented before starting treatment, and 18% of those prescribed trimethoprim had resistance to the antibiotic.

IN PRACTICE:

“More frequent urine cultures in the workup of rUTI diagnosis and prophylactic antibiotic initiation could better inform antibiotic choice,” the authors wrote.

SOURCE:

The study was led by Leigh Sanyaolu, BSc (Hons), MRCS, MRCGP, PGDip, a general practitioner from the Division of Population Medicine and PRIME Centre Wales at Cardiff University in Cardiff, and was published online in the British Journal of General Practice.

LIMITATIONS:

The study’s reliance on electronic health records may have led to coding errors and missing data. The diagnosis of UTIs may have been difficult in older women with increased frailty as they can have fewer specific symptoms and asymptomatic bacteriuria, which can be misdiagnosed as a UTI.

DISCLOSURES:

This work was supported by Health and Care Research Wales. The authors declared no conflicts of interest.

This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article first appeared on Medscape.com.


Gabapentin: The Hope, the Harm, the Myth, the Reality


Since gabapentin was approved by the US Food and Drug Administration (FDA) for treatment of partial-onset seizures and postherpetic neuralgia, it has been used in many different ways, including for many off-label indications, and it has been the subject of several recent safety warnings.

Early Problems

After FDA approval in 1993 (for partial seizures), gabapentin was promoted by its maker (Parke-Davis) for off-label indications, especially for pain. There was no FDA approval for that indication, and the studies the company had done were deemed to have been manipulated in a subsequent lawsuit.1 Gabapentin nevertheless became the go-to nonopioid medication for treatment of pain, despite underwhelming evidence.
 

Studies on Neuropathy

In the largest trial of gabapentin for diabetic peripheral neuropathy, Rauck and colleagues found no significant difference in pain relief between gabapentin and placebo.2 A Cochrane review of gabapentin for neuropathic pain concluded that about 30%-40% of patients taking gabapentin for diabetic neuropathy achieved meaningful pain relief, with a number needed to treat (NNT) of 6.6.3 The review also concluded that for postherpetic neuralgia (an FDA-approved indication), 78% of patients had moderate to substantial benefit with gabapentin (NNT, 4.8 for moderate benefit).
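
Because the column leans on NNT figures, a quick reminder of the arithmetic may help: the NNT is the reciprocal of the absolute difference in responder rates. The minimal sketch below uses illustrative rates chosen only to land near the Cochrane figure; they are not the actual trial arms.

```python
def number_needed_to_treat(p_treatment: float, p_control: float) -> float:
    """NNT = 1 / absolute risk reduction (difference in responder rates)."""
    arr = p_treatment - p_control
    if arr <= 0:
        raise ValueError("no benefit over control; NNT is undefined")
    return 1 / arr

# Illustrative only: a 35% responder rate vs 20% on placebo is an absolute
# difference of 15 percentage points, i.e., an NNT of about 6.7, in the same
# range as the Cochrane estimate of 6.6 quoted above.
print(round(number_needed_to_treat(0.35, 0.20), 1))  # 6.7
```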


Side Effects of Gabapentin

From the Cochrane review, the most common side effects were dizziness (19%), somnolence (14%), gait disturbance (14%), and peripheral edema (7%). The number needed to harm for gabapentin was 7.5. Two side effects listed here that are often overlooked, and that I want to highlight, are peripheral edema and gait disturbance; I have seen both fairly frequently over the years. A side effect not captured in the Cochrane review is weight gain, which was reported in a meta-analysis of drugs associated with weight change.4

New Warnings

In December 2019, the FDA released a warning on the potential for serious respiratory problems with gabapentin and pregabalin in patients with certain risk factors: opioid use or use of other drugs that depress the central nervous system, COPD, and other severe lung diseases.5 Rahman and colleagues found that compared with nonuse, gabapentinoid use was associated with increased risk for severe COPD exacerbation (hazard ratio, 1.39; 95% confidence interval, 1.29-1.50).6

Off-Label Uses

Primary care professionals frequently use gabapentin for two off-label indications that are incorporated into practice guidelines. Ryan et al. studied gabapentin in patients with refractory, unexplained chronic cough.7 In a randomized, placebo-controlled trial, gabapentin improved cough-specific quality of life compared with placebo (P = .004; NNT 3.58). Use of gabapentin for treatment of unexplained, refractory cough has been included in several chronic cough practice guidelines.8,9

Gabapentin has been studied for the treatment of restless legs syndrome and has been recommended as an option to treat moderate to severe restless legs syndrome in the American Academy of Sleep Medicine Guidelines.10

Pearl of the Month:

Gabapentin is used widely for many different pain syndromes. The best evidence is for postherpetic neuralgia and diabetic neuropathy. Be aware of the side effects and risks of use in patients with pulmonary disease and who are taking CNS-depressant medications.

Dr. Paauw is professor of medicine in the division of general internal medicine at the University of Washington, Seattle, and he serves as third-year medical student clerkship director at the University of Washington. He is a member of the editorial advisory board of Internal Medicine News. Dr. Paauw has no conflicts to disclose. Contact him at imnews@mdedge.com.

References

1. Landefeld CS, Steinman MA. The Neurontin legacy: marketing through misinformation and manipulation. N Engl J Med. 2009;360(2):103-6.

2. Rauck R et al. A randomized, controlled trial of gabapentin enacarbil in subjects with neuropathic pain associated with diabetic peripheral neuropathy. Pain Pract. 2013;13(6):485-96.

3. Wiffen PJ et al. Gabapentin for chronic neuropathic pain in adults. Cochrane Database Syst Rev. 2017;6(6):CD007938.

4. Domecq JP et al. Clinical review: Drugs commonly associated with weight change: a systematic review and meta-analysis. J Clin Endocrinol Metab. 2015 Feb;100(2):363-70.

5. 12-19-2019 FDA Drug Safety Communication. FDA warns about serious breathing problems with seizure and nerve pain medicines gabapentin (Neurontin, Gralise, Horizant) and pregabalin (Lyrica, Lyrica CR).

6. Rahman AA et al. Gabapentinoids and risk for severe exacerbation in chronic obstructive pulmonary disease: A population-based cohort study. Ann Intern Med. 2024 Feb;177(2):144-54.

7. Ryan NM et al. Gabapentin for refractory chronic cough: a randomised, double-blind, placebo-controlled trial. Lancet 2012;380(9853):1583-9.

8. Gibson P et al. Treatment of unexplained chronic cough: CHEST guideline and expert panel report. Chest. 2016 Jan;149(1):27-44.

9. De Vincentis A et al. Chronic cough in adults: recommendations from an Italian intersociety consensus. Aging Clin Exp Res 2022;34:1529.

10. Aurora RN et al. The treatment of restless legs syndrome and periodic limb movement disorder in adults — an update for 2012: Practice parameters with an evidence-based systematic review and meta-analyses: An American Academy of Sleep Medicine Clinical Practice Guideline. Sleep 2012;35:1039.


Just A Single Night of Poor Sleep May Change Serum Proteins


A single night of sleep deprivation had a significant impact on human blood serum, based on new data from an analysis of nearly 500 proteins. Compromised sleep has demonstrated negative effects on cardiovascular, immune, and neuronal systems, and previous studies have shown human serum proteome changes after a simulation of night shift work, wrote Alvhild Alette Bjørkum, MD, of Western Norway University of Applied Sciences, Bergen, and colleagues.

In a pilot study published in Sleep Advances, the researchers recruited eight healthy adult women aged 22-57 years with no history of neurologic or psychiatric problems to participate in a study of the effect of compromised sleep on protein profiles, with implications for effects on cells, tissues, and organ systems. The participants served as their own controls, and blood samples were taken after 6 hours of sleep at night and again after 6 hours of sleep deprivation the following night.

The researchers identified and analyzed 494 proteins using mass spectrometry. Of these, 66 were differentially expressed after 6 hours of sleep deprivation. The top enriched biologic processes among these significantly changed proteins were protein activation cascade, platelet degranulation, blood coagulation, and hemostasis.
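
In outline, that kind of analysis is a paired comparison per protein followed by a multiple-testing correction. The sketch below shows one generic way to do it (paired t-tests with a Benjamini-Hochberg adjustment) on synthetic data with the study's dimensions; it is offered as an illustration of the approach, not as the authors' actual pipeline.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_subjects, n_proteins = 8, 494  # mirrors the study's dimensions

# Synthetic log-intensities: rows are subjects, columns are proteins.
after_sleep = rng.normal(10.0, 1.0, size=(n_subjects, n_proteins))
after_deprivation = after_sleep + rng.normal(0.0, 0.3, size=(n_subjects, n_proteins))

# Paired t-test per protein (each subject is her own control).
t_stat, p_vals = stats.ttest_rel(after_deprivation, after_sleep, axis=0)

# Benjamini-Hochberg false discovery rate adjustment.
order = np.argsort(p_vals)
ranks = np.arange(1, n_proteins + 1)
adjusted_sorted = np.minimum.accumulate(
    (p_vals[order] * n_proteins / ranks)[::-1])[::-1]
significant = np.zeros(n_proteins, dtype=bool)
significant[order] = adjusted_sorted < 0.05

print(f"{int(significant.sum())} of {n_proteins} proteins flagged (synthetic data)")
```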

Further analysis using gene ontology showed changes in response to sleep deprivation in biologic process, molecular function, and immune system process categories, including specific associations related to wound healing, cholesterol transport, high-density lipoprotein particle receptor binding, and granulocyte chemotaxis.

The findings were limited by several factors including the small sample size, inclusion only of adult females, and the use of data from only 1 night of sleep deprivation, the researchers noted. However, the results support previous studies showing a negative impact of sleep deprivation on biologic functions, they said.

“Our study was able to reveal another set of human serum proteins that were altered by sleep deprivation and could connect similar biological processes to sleep deprivation that have been identified before with slightly different methods,” the researchers concluded. The study findings add to the knowledge base for the protein profiling of sleep deprivation, which may inform the development of tools to manage lack of sleep and mistimed sleep, particularly in shift workers.
 

Too Soon for Clinical Implications

“The adverse impact of poor sleep across many organ systems is gaining recognition, but the mechanisms underlying sleep-related pathology are not well understood,” Evan L. Brittain, MD, of Vanderbilt University, Nashville, Tennessee, said in an interview. “Studies like this begin to shed light on the mechanisms by which poor or reduced sleep affects specific bodily functions,” added Dr. Brittain, who was not involved in the study.

“The effects of other acute physiologic stressors, such as exercise, on the circulating proteome are well described. In that regard, it is not surprising that a brief episode of sleep deprivation would lead to detectable changes in the circulation,” Dr. Brittain said.

However, the specific changes reported in this study are difficult to interpret because of methodological and analytical concerns, particularly the small sample size, lack of an external validation cohort, and absence of appropriate statistical adjustments in the results, Dr. Brittain noted. These limitations prevent consideration of clinical implications without further study.

The study received no outside funding. Neither the researchers nor Dr. Brittain disclosed any conflicts of interest.

A version of this article first appeared on Medscape.com.

Publications
Topics
Sections

A single night of sleep deprivation had a significant impact on human blood serum, based on new data from an analysis of nearly 500 proteins. Compromised sleep has demonstrated negative effects on cardiovascular, immune, and neuronal systems, and previous studies have shown human serum proteome changes after a simulation of night shift work, wrote Alvhild Alette Bjørkum, MD, of Western Norway University of Applied Sciences, Bergen, and colleagues.

In a pilot study published in Sleep Advances, the researchers recruited eight healthy adult women aged 22-57 years with no history of neurologic or psychiatric problems to participate in a study of the effect of compromised sleep on protein profiles, with implications for effects on cells, tissues, and organ systems. Each of the participants served as their own controls, and blood samples were taken after 6 hours of sleep at night, and again after 6 hours of sleep deprivation the following night.

The researchers identified analyzed 494 proteins using mass spectrometry. Of these, 66 were differentially expressed after 6 hours of sleep deprivation. The top enriched biologic processes of these significantly changed proteins were protein activation cascade, platelet degranulation, blood coagulation, and hemostasis.

Further analysis using gene ontology showed changes in response to sleep deprivation in biologic process, molecular function, and immune system process categories, including specific associations related to wound healing, cholesterol transport, high-density lipoprotein particle receptor binding, and granulocyte chemotaxis.

The findings were limited by several factors including the small sample size, inclusion only of adult females, and the use of data from only 1 night of sleep deprivation, the researchers noted. However, the results support previous studies showing a negative impact of sleep deprivation on biologic functions, they said.

“Our study was able to reveal another set of human serum proteins that were altered by sleep deprivation and could connect similar biological processes to sleep deprivation that have been identified before with slightly different methods,” the researchers concluded. The study findings add to the knowledge base for the protein profiling of sleep deprivation, which may inform the development of tools to manage lack of sleep and mistimed sleep, particularly in shift workers.
 

Too Soon for Clinical Implications

“The adverse impact of poor sleep across many organ systems is gaining recognition, but the mechanisms underlying sleep-related pathology are not well understood,” Evan L. Brittain, MD, of Vanderbilt University, Nashville, Tennessee, said in an interview. “Studies like this begin to shed light on the mechanisms by which poor or reduced sleep affects specific bodily functions,” added Dr. Brittain, who was not involved in the study.

“The effects of other acute physiologic stressor such as exercise on the circulating proteome are well described. In that regard, it is not surprising that a brief episode of sleep deprivation would lead to detectable changes in the circulation,” Dr. Brittain said.

However, the specific changes reported in this study are difficult to interpret because of methodological and analytical concerns, particularly the small sample size, lack of an external validation cohort, and absence of appropriate statistical adjustments in the results, Dr. Brittain noted. These limitations prevent consideration of clinical implications without further study.

The study received no outside funding. Neither the researchers nor Dr. Brittain disclosed any conflicts of interest.

A version of this article first appeared on Medscape.com.


Wildfire Pollution May Increase Asthma Hospitalizations

Article Type
Changed
Tue, 08/27/2024 - 10:34

Higher levels of air pollution from wildfires were associated with significant spikes in hospitalizations for asthma and a slight increase in hospitalizations for COPD in surrounding areas, based on data from approximately 80,000 individuals.

Short-term increases in fine particulate matter (PM2.5) resulting from wildfire smoke are becoming a greater global problem and have been associated with poor asthma and COPD outcomes, wrote Benjamin D. Horne, PhD, of the Intermountain Medical Center Heart Institute, Salt Lake City, Utah, and colleagues. However, the effect of short-term increases in PM2.5 on hospitalizations for asthma and COPD has not been well studied, they noted.

“Our primary reason for studying the association of air pollution in the summer/fall wildfire season separately from the winter is that the drought conditions in the western United States from 2012-2022 resulted in more wildfires and increasingly large wildfires across the west,” Dr. Horne said in an interview. “In part, this provided a chance to measure an increase of fine particulate matter (PM2.5) air pollution from wildfires and also to track what happened to their health when people were exposed to the PM2.5 from wildfire,” he said. 

During 2020-2022, the PM2.5 produced during the wildfire season exceeded the PM2.5 levels measured in the winter for the first time, Dr. Horne said. In the part of Utah where the study was conducted, PM2.5 increases in winter because of a combination of concentrated PM2.5 from cars and industry and a weather phenomenon known as a temperature inversion, he said. 

A temperature inversion occurs when mountain topography traps pollutants near ground level, where people are exposed, but only during cold, snowy weather, Dr. Horne said.

“Past studies in the region were conducted with the assumption that the winter inversion was the primary source of pollution-related health risks, and public and healthcare guidance for health was based on avoiding winter air pollution,” Dr. Horne noted. However, “it may be that the smoke from wildfires requires people to also anticipate how to avoid exposure to PM2.5 during the summer,” he said. 

In a study published in CHEST Pulmonary, the researchers reviewed data from 63,976 patients hospitalized with asthma and 18,514 hospitalized with COPD between January 1999 and March 2022 who lived in an area of Utah in which PM2.5 and ozone are measured by the Environmental Protection Agency. The average age of the asthma patients was 22.6 years; 51.0% were women, 16.0% had hypertension, and 10.1% had a history of smoking. The average age of the COPD patients was 63.5 years; 50.3% were women, 69.1% had hypertension, and 42.3% had a history of smoking.

In a regression analysis, the risk for asthma hospitalization was significantly associated with days of increased PM2.5 during wildfire season, and the association was similar to that seen during the winter inversion season (when cold air traps air pollutants), with odds ratios (ORs) of 1.057 and 1.023, respectively, for every 10 µg per m3 of particulate matter.

Although the risk for asthma hospitalization decreased after a week, a rebound occurred during wildfire season after a 4-week lag, with an OR of 1.098 for every 10 µg per m3 of particulate matter, the researchers wrote. A review of all months showed a significant association between a concurrent day increase in PM2.5 and asthma hospitalization (OR, 1.020 per every 10 µg per m3 of particulate matter, P = .0006).
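
Because both ORs are reported per 10 µg per m3, readers sometimes want to know what they imply for larger short-term spikes. The snippet below is a minimal sketch of that conversion under the log-linear dose-response assumed by a standard logistic regression; the PM2.5 increments chosen are hypothetical, not values from the study.

```python
# Minimal sketch, not from the study: scaling odds ratios reported per
# 10 µg/m3 of PM2.5 to larger hypothetical increases, assuming the
# log-linear dose-response of a standard logistic regression model.
def scale_odds_ratio(or_per_10: float, delta_pm25: float) -> float:
    """Return the implied OR for an arbitrary PM2.5 increase in µg/m3."""
    return or_per_10 ** (delta_pm25 / 10.0)

reported = {"wildfire season": 1.057, "winter inversion": 1.023}
for season, or_per_10 in reported.items():
    for delta in (10, 20, 30):  # hypothetical short-term PM2.5 rises, µg/m3
        print(f"{season}: +{delta} µg/m3 -> OR {scale_odds_ratio(or_per_10, delta):.3f}")
```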

By contrast, PM2.5 increases had only a weak association with hospitalizations for COPD during either wildfire season or winter inversion season, and ozone was not associated with increased risks for patients with asthma or COPD. 

The findings were limited by several factors including the observational design, potential for confounding, and relatively homogeneous study population, the researchers noted.

However, “these findings suggest that people should be aware of the risks from wildfire-generated PM2.5 during the summer and fall, including following best practices for people with asthma such as anticipating symptoms in warm months, carrying medications during summer activities, and expecting to stay indoors to avoid smoke exposure when wildfires have polluted the outdoor air,” Dr. Horne told this news organization.

In the current study, Dr. Horne and colleagues expected to see increases in the risk for asthma and COPD during summer wildfire season. “What was surprising was that the size of the risk of needing care of asthma appeared to occur just as rapidly after the PM2.5 became elevated during wildfire events as it did in the winter,” said Dr. Horne. “Further, the risk in the summer appeared to be greater than during the winter. Increases in hospitalization for asthma occurred on the same day and throughout the first week after a rise in air pollution in summer and early fall, and especially in children that risk remained increased for up to a month after the rise in air pollution,” he said. 

Clinicians should be aware of environmental sources of respiratory decline caused by wildfire smoke that may prompt patients to seek care during wildfire events, said Dr. Horne. The general population, he said, should recognize the smell of smoke during warm months as an alert calling for greater caution about spending time outdoors. “Short-term PM2.5 elevations may affect respiratory health and have other effects such as on heart health,” Dr. Horne said. “In general, people should avoid outdoor exercise when air pollution is elevated, since the amount of air that is breathed in during exercise is substantially increased,” he added.

“Further research is needed regarding the mechanisms of effect from PM2.5 on health risk, including effects on respiratory and cardiovascular health,” said Dr. Horne. “This includes evaluating what biomarkers in the blood are changed by air pollution such as inflammatory factors, determining whether some medications may block or reduce the adverse effects of air pollution, and examining whether masks or indoor air purifiers have a meaningful benefit in protecting health during short-term air pollution elevations,” he said.
 

 

 

Data Reveal Respiratory Impact of Wildfires

“Fine particle air pollution has been linked to poor respiratory health outcomes, but relatively little is known about the specific impact of wildfire particulate pollution on patients living in urban population centers,” Alexander S. Rabin, MD, of the University of Michigan, Ann Arbor, said in an interview. 

“Although it is known that wildfire risk is increasing throughout the western United States, the increase in the number of days per month with elevated fine particulate matter from 1999 to 2022 was striking,” said Dr. Rabin, who was not involved in the current study. “Over the same period, there was a decrease in the number of high fine particulate matter air pollution days related to the wintertime temperature inversion phenomenon when air pollutants are trapped in Utah’s valleys,” he said. “These data underscore the increased risk of wildfire-related air pollution relative to ‘traditional’ sources of air pollution from industrial and transportation sources,” he added.

Although the adverse effects of exposure to wildfire smoke and inversion season pollution on asthma were not unexpected, the degree of the effect size of wildfire smoke relative to inversion season was surprising, said Dr. Rabin.

“Why the wildfire smoke seems to have a worse impact on asthma outcomes could not be determined from this study, but there may be something inherently more dangerous about the cocktail of pollutants released when large wildfires burn uncontrolled,” he said. “I was surprised by the lack of association between wildfire smoke and adverse COPD outcomes; whether this relates to physiological differences or variations in healthcare-seeking behaviors between patients with asthma vs COPD is unknown,” he added. 

The current study underscores the harmful effects of fine particulate pollution from wildfire smoke on health, and the increased risk for hospitalization for those with asthma even in urban environments far from the source of the fire, Dr. Rabin said.

However, limitations include the use of estimates of fine particulate pollution taken from monitoring stations that were an average of 14 km from the participants’ primary residences, and air quality measurements may not have accurately reflected exposure, Dr. Rabin noted. “Additionally, the population studied was not reflective of the US population, with approximately 80% of study participants described as non-Hispanic white,” he said. “Patients of color may have increased vulnerability to adverse outcomes from air pollution and therefore additional study is needed in these populations,” Dr. Rabin added.

The study was supported in part by the AIRHEALTH program project and by internal institutional funds. Dr. Horne disclosed serving on the advisory board of Opsis Health, previously consulting for Pfizer regarding risk scores and serving as site principal investigator of a grant funded by the Task Force for Global Health and a grant from the Patient-Centered Outcomes Research Institute and the NIH-funded RECOVER initiative. Dr. Rabin had no financial conflicts to disclose.
 

A version of this article first appeared on Medscape.com.


No Surprises Act: Private Equity Scores Big in Arbitrations

Article Type
Changed
Tue, 08/27/2024 - 09:40

Four organizations owned by private equity firms — including two provider groups — dominated the No Surprises Act’s disputed bill arbitration process in its first year, filing about 70% of 657,040 cases against insurers in 2023, a new report finds. 

The findings, recently published in Health Affairs, suggest that private equity–owned organizations are forcefully challenging insurers about payments for certain kinds of out-of-network care. 

Their fighting stance has paid off: The percentage of resolved arbitration cases won by providers jumped from 72% in the first quarter of 2023 to 85% in the last quarter, and they were awarded a median of more than 300% of the contracted in-network rates for the services in question.

With many more out-of-network bills disputed by providers than expected, “the system is not working exactly the way it was anticipated when this law was written,” lead author Jack Hoadley, PhD, a research professor emeritus at Georgetown University’s McCourt School of Public Policy, Washington, DC, told this news organization.

And, he said, the public and the federal government may end up paying a price. 

Congress passed the No Surprises Act in 2020 and then-President Donald Trump signed it. The landmark bill, which went into effect in 2022, was designed to protect patients from unexpected and often exorbitant “surprise” bills after they received some kinds of out-of-network care. 

Now, many types of providers are forbidden from billing patients beyond normal in-network costs. In these cases, health plans and out-of-network providers — who don’t have mutual agreements — must wrangle over payment amounts, which are intended to not exceed inflation-adjusted 2019 median levels. 

An arbitration process kicks in when a provider and a health plan fail to agree about how much the plan will pay for a service. Then, a third-party arbitrator is called in to make a binding ruling. The process is controversial, and a flurry of lawsuits from providers has challenged it.

The new report, which updates an earlier analysis, examines data about disputed cases from all of 2023.

Of the 657,040 new cases filed in 2023, about 70% came from four private equity-funded organizations: Team Health, SCP Health, Radiology Partners, and Envision, which each provide physician services.

About half of the 2023 cases were from just four states: Texas, Florida, Tennessee, and Georgia. The report says the four organizations are especially active in those states. In contrast, Connecticut, Maryland, Massachusetts, and Washington state each had just 1500 or fewer cases filed last year. 

Health plans challenged a third of cases as ineligible, and 22% of all resolved cases were deemed ineligible.

Providers won 80% of resolved challenges in 2023, although it’s not clear how much money they reaped. Still, it’s clear that “in the vast majority of the cases, insurers have to pay larger amounts to the provider,” Dr. Hoadley said.

Radiologists made a median of at least 500% of the in-network rate in their cases. Surgeons and neurologists made even more money — a median of at least 800% of the in-network rate. Overall, providers made 322%-350% of in-network rates, depending on the quarter.
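
As a back-of-the-envelope illustration of what that range means for a single claim, the short sketch below applies the reported multiples to a hypothetical contracted rate; the dollar figure is invented for illustration and does not come from the report.

```python
# Illustrative arithmetic only: applying the reported 322%-350% median range
# to a hypothetical contracted in-network rate (the $250 figure is made up).
in_network_rate = 250.00  # hypothetical contracted rate, in dollars
for multiple in (3.22, 3.50):
    award = in_network_rate * multiple
    print(f"{multiple:.0%} of ${in_network_rate:,.2f} -> ${award:,.2f}")
```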

Dr. Hoadley cautioned that only a small percentage of medical payments are disputed. In those cases, “the amount that the insurer offers is accepted, and that’s the end of the story.”

Why are the providers often reaping much more than typical payments for in-network services? It’s “really hard to know,” Dr. Hoadley said. But one factor, he said, may be the fact that providers are able to offer evidence challenging the amounts that insurers say they paid previously: “Hey, when we were in network, we were paid this much.”

It’s not clear whether the dispute-and-arbitration system will cost insurers — and patients — more in the long run. The Congressional Budget Office actually thought the No Surprises Act might lower the growth of premiums slightly and save the federal government money, Dr. Hoadley said, but that could potentially not happen. The flood of litigation also contributes to uncertainty, he said. 

Alan Sager, PhD, professor of Health Law, Policy, and Management at Boston University School of Public Health, told this news organization that premiums are bound to rise as insurers react to higher costs. He also expects that providers will question the value of being in-network. “If you’re out-of-network and can obtain much higher payments, why would any doctor or hospital remain in-network, especially since they don’t lose out on patient volume?”

Why are provider groups owned by private equity firms so aggressive at challenging health plans? Loren Adler, a fellow and associate director of the Brookings Institution’s Center on Health Policy, told this news organization that these companies play large roles in fields affected by the No Surprises Act. These include emergency medicine, radiology, and anesthesiology, said Mr. Adler, who’s also studied the No Surprises Act’s dispute/arbitration system.

Mr. Adler added that larger companies “are better suited to deal with technical complexities of this process and spend the sort of upfront money to go through it.”

In the big picture, Mr. Adler said, the new study “raises [the] question of whether Congress at some point wants to try to basically bring prices from the arbitration process back in line with average in-network prices.”

The study was funded by the Commonwealth Fund and Arnold Ventures. Dr. Hoadley, Dr. Sager, and Mr. Adler had no disclosures.

A version of this article first appeared on Medscape.com.


The New Formula for Stronger, Longer-Lasting Vaccines

Article Type
Changed
Tue, 08/27/2024 - 09:36

Vaccines work pretty well. But with a little help, they could work better.

Stanford researchers have developed a new vaccine helper that combines two kinds of adjuvants, ingredients that improve a vaccine’s efficacy, in a novel, customizable system.

In lab tests, the experimental additive improved the effectiveness of COVID-19 and HIV vaccine candidates, though it could be adapted to stimulate immune responses to a variety of pathogens, the researchers said. It could also be used one day to fine-tune vaccines for vulnerable groups like young children, older adults, and those with compromised immune systems.

“Current vaccines are not perfect,” said lead study author Ben Ou, a PhD candidate and researcher in the lab of Eric Appel, PhD, an associate professor of materials science and engineering, at Stanford University in California. “Many fail to generate long-lasting immunity or immunity against closely related strains [such as] flu or COVID vaccines. One way to improve them is to design more potent vaccine adjuvants.”

The study marks an advance in an area of growing scientific interest: Combining different adjuvants to enhance the immune-stimulating effect.

The Stanford scientists developed sphere-shaped nanoparticles, like tiny round cages, made of saponins, immune-stimulating molecules common in adjuvant development. To these nanoparticles, they attached Toll-like receptor (TLR) agonists, molecules that have become a focus in vaccine research because they stimulate a variety of immune responses.

Dr. Ou and the team tested the new adjuvant platform in COVID and HIV vaccines, comparing it to vaccines containing alum, a widely used adjuvant. (Alum is not used in COVID vaccines available in the United States.)

The nanoparticle-adjuvanted vaccines triggered stronger, longer-lasting effects. 

Notably, the combination of the new adjuvant system with a SARS-CoV-2 virus vaccine was effective in mice against the original SARS-CoV-2 virus and against Delta, Omicron, and other variants that emerged in the months and years after the initial outbreak. 

“Since our nanoparticle adjuvant platform is more potent than traditional/clinical vaccine adjuvants,” Dr. Ou said, “we expected mice to produce broadly neutralizing antibodies and better breadth responses.”
 

100 Years of Adjuvants

The first vaccine adjuvants were aluminum salts mixed into shots against pertussis, diphtheria, and tetanus in the 1920s. Today, alum is still used in many vaccines, including shots for diphtheria, tetanus, and pertussis; hepatitis A and B; human papillomavirus; and pneumococcal disease.

But since the 1990s, new adjuvants have come on the scene. Saponin-based compounds, harvested from the soapbark tree, are used in the Novavax COVID-19 Vaccine, Adjuvanted; a synthetic DNA adjuvant is used in the Heplisav-B vaccine against hepatitis B; and oil-in-water adjuvants containing squalene are used in the Fluad and Fluad Quadrivalent influenza vaccines. Other vaccines, including those for chickenpox, cholera, measles, mumps, and rubella, as well as the mRNA-based COVID vaccines from Pfizer-BioNTech and Moderna, don’t contain adjuvants.

TLR agonists have recently become research hotspots in vaccine science. 

“TLR agonists activate the innate immune system, putting it on a heightened alert state that can result in a higher antibody production and longer-lasting protection,” said David Burkhart, PhD, a research professor in biomedical and pharmaceutical sciences at the University of Montana in Missoula. He is also the chief operating officer of Inimmune, a biotech company developing vaccines and immunotherapies.

Dr. Burkhart studies TLR agonists in vaccines and other applications. “Different combinations activate different parts of the immune system,” he said. “TLR4 might activate the army, while TLR7 might activate the air force. You might need both in one vaccine.”

TLR agonists have also shown promise against Alzheimer’s disease, allergies, cancer, and even addiction. Inimmune’s experimental immunotherapy using TLR agonists for advanced solid tumors has just entered human trials, and the company is looking at a TLR agonist therapy for allergic rhinitis.
 

 

 

Combining Forces

In the new study, researchers tested five different combinations of TLR agonists hooked to the saponin nanoparticle framework. Each elicited a slightly different response from the immune cells. 

“Our immune systems generate different downstream immune responses based on which TLRs are activated,” Dr. Ou said.

Ultimately, the advance could spur the development of vaccines tuned for stronger immune protection.

“We need different immune responses to fight different types of pathogens,” Dr. Ou said. “Depending on what specific virus/disease the vaccine is formulated for, activation of one specific TLR may confer better protection than another TLR.”

According to Dr. Burkhart, combining a saponin with a TLR agonist has found success before.

Biopharma company GSK (formerly GlaxoSmithKline) used the combination in its AS01 adjuvant, found in the Shingrix vaccine against herpes zoster. The live-attenuated yellow fever vaccine, given to more than 600 million people around the world and considered one of the most powerful vaccines ever developed, engages several TLRs.

The Stanford paper, Dr. Burkhart said, “is a nice demonstration of the enhanced efficacy [that] adjuvants can provide to vaccines by exploiting the synergy different adjuvants and TLR agonists can provide when used in combination.”
 

Tailoring Vaccines

The customizable aspect of TLR agonists is important too, Dr. Burkhart said. 

“The human immune system changes dramatically from birth to childhood into adulthood into older maturity,” he said. “It’s not a one-size-fits-all. Vaccines need to be tailored to these populations for maximum effectiveness and safety. TLRAs [TLR agonists] are a highly valuable tool in the vaccine toolbox. I think it’s inevitable we’ll have more in the future.”

That’s what the Stanford researchers hope for.

They noted in the study that the nanoparticle platform could easily be used to test different TLR agonist adjuvant combinations in vaccines.

But human studies are still a ways off. Tests in larger animals would likely come next, Dr. Ou said. 

“We now have a single nanoparticle adjuvant platform with formulations containing different TLRs,” Dr. Ou said. “Scientists can pick which specific formulation is the most suitable for their needs.”

A version of this article first appeared on Medscape.com.

FROM SCIENCE ADVANCES
