Ketamine may be a viable alternative to ECT for severe depression


Electroconvulsive therapy (ECT) is the standard treatment for treatment-resistant depression, but results of a new randomized, head-to-head trial suggest intravenous ketamine is at least as effective and has fewer side effects.

“The take-home message right now is that if somebody is being referred for ECT, the treating clinician should think of offering ketamine first,” study investigator Amit Anand, MD, professor of psychiatry, Harvard Medical School, Boston, said in an interview.

The study was published online in the New England Journal of Medicine.
 

‘Preferred treatment’

More than one-third of cases of depression are treatment resistant, said Dr. Anand, who is also director of Psychiatry Translational Clinical Trials at Mass General Brigham. He noted that ECT has been the “gold standard for treating severe depression for over 80 years.”


He added that although ECT is very effective and is fast acting, “it requires anesthesia, can be socially stigmatizing, and is associated with memory problems following the treatment.”

An anesthetic agent, ketamine has been shown to have rapid antidepressant effects and does not cause memory loss or carry the stigma associated with ECT, he added. For these reasons, the investigators examined whether it may be a viable alternative to ECT.

Until now, however, no large, head-to-head trial had compared ECT with intravenous ketamine. A recent meta-analysis showed that ECT was superior to ketamine for major depression, but the total number of patients included in the analysis was small, Dr. Anand said.

In addition, most of the participants in that analysis were drawn from a single center, and only about 95 patients were enrolled in each arm, some of whom had features of psychosis. “ECT is very effective for depression associated with psychotic features, which may be one reason ECT had a better response in that trial,” said Dr. Anand.

The investigators compared ECT to ketamine in a larger sample that excluded patients with psychosis. They randomly assigned 403 patients at five clinical sites in a 1:1 ratio to receive either ketamine or ECT (n = 200 and 203, respectively; 53% and 49.3% women, respectively; aged 45.6 ± 14.8 and 47.1 ± 14.1 years, respectively).

Patients were required to have had an unsatisfactory response to two or more adequate trials of antidepressant treatment.

Prior to initiation of the assigned treatment, 38 patients withdrew, leaving 195 in the ketamine group and 170 in the ECT group.

Treatment was administered over a 3-week period, during which patients received either ECT three times per week or ketamine (0.5 mg/kg of body weight) twice per week.
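
For readers unfamiliar with weight-based dosing, the short sketch below (an illustrative calculation, not part of the study protocol) shows how the per-infusion dose follows from the 0.5 mg/kg figure:

```python
def ketamine_dose_mg(weight_kg: float, dose_per_kg: float = 0.5) -> float:
    """Per-infusion IV ketamine dose at the trial's 0.5 mg/kg of body weight."""
    return weight_kg * dose_per_kg

# Example: an 80-kg patient would receive a 40-mg infusion.
print(ketamine_dose_mg(80))  # 40.0
```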

The primary outcome was treatment response, defined as a decrease of 50% or more from baseline on the 16-item Quick Inventory of Depressive Symptomatology–Self-Report (QIDS-SR-16). Secondary outcomes included scores on memory tests and patient-reported quality of life.
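
As an illustration, the snippet below (a minimal sketch; the function name and inputs are hypothetical, not drawn from the study's analysis code) encodes that response criterion:

```python
def is_responder(baseline: float, end_of_treatment: float) -> bool:
    """True if the QIDS-SR-16 score fell by at least 50% from baseline."""
    if baseline <= 0:
        raise ValueError("baseline QIDS-SR-16 score must be positive")
    return (baseline - end_of_treatment) / baseline >= 0.5

# Example: a drop from 18 to 8 points (a 56% decrease) counts as a response.
print(is_responder(18, 8))  # True
```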

Patients who had a response were followed for 6 months after the initial treatment phase.
 

More research needed

Following the 3-week treatment period, 55.4% of patients who received ketamine and 41.2% of patients who underwent ECT responded to treatment, a difference of 14.2 percentage points (95% confidence interval, 3.9-24.2; P < .001) – a finding that met the prespecified noninferiority criterion.
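
To make the arithmetic concrete, the sketch below recomputes the response-rate difference and an approximate Wald 95% CI from the reported percentages; the responder counts are back-calculated approximations, and the noninferiority margin shown is an assumed placeholder, not the trial's prespecified value:

```python
import math

resp_k, n_k = 108, 195   # ~55.4% responders with ketamine (approximate count)
resp_e, n_e = 70, 170    # ~41.2% responders with ECT (approximate count)

p_k, p_e = resp_k / n_k, resp_e / n_e
diff = p_k - p_e                                   # ~0.142, i.e., 14.2 points
se = math.sqrt(p_k * (1 - p_k) / n_k + p_e * (1 - p_e) / n_e)
lo, hi = diff - 1.96 * se, diff + 1.96 * se        # ~ (0.04, 0.24)

margin = -0.10  # assumed margin for illustration only
print(f"difference = {diff:.3f}, 95% CI ({lo:.3f}, {hi:.3f})")
print("noninferior" if lo > margin else "noninferiority not shown")
```

Because the entire confidence interval lies above any plausible negative margin (indeed, above zero), ketamine meets the noninferiority criterion.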

ECT was associated with decreased memory recall after the 3 weeks of treatment, with a mean (standard deviation) decrease in the T-score for delayed recall on the Hopkins Verbal Learning Test–Revised of –0.9 (1.1) in the ketamine group vs. –9.7 (1.2) in the ECT group (difference, –1.8 points [–2.8 to –0.8]).

Remission, determined on the basis of QIDS-SR-16 score, occurred in 32% of the ketamine group and in 20% of the ECT group. Similar findings were seen on the Montgomery-Åsberg Depression Rating Scale.

Both groups showed significant improvements in quality of life, with changes of 12.3 and 12.9 points, respectively, on the 16-item Quality of Life Scale.

“ECT was associated with musculoskeletal adverse events, whereas ketamine was associated with dissociation,” the investigators note.

During the 6-month follow-up period, relapse rates (defined as a QIDS-SR-16 score > 11) differed between the groups. At 1 month, the rates were 19.0% for those receiving ketamine and 35.4% for those receiving ECT. At 3 months, the rates were 25.0% and 50.9%, respectively; at 6 months, the rates were 34.5% and 56.3%, respectively.

ECT has been shown to be effective in older adults, in patients with MDD and psychosis, and in inpatient and research settings. Future studies are needed to determine the comparative effectiveness of ketamine in these populations and settings, the authors note.
 

Not life-changing

In a comment, Dan Iosifescu, MD, professor of psychiatry, NYU Langone Health, New York, called it an “extraordinarily important and clinically relevant study, large, well-designed, and well-conducted.”

Dr. Iosifescu, director of the clinical research division, Nathan Kline Institute, Orangeburg, N.Y., who was not involved with the study, noted that the study wasn’t powered to determine whether one treatment was superior to the other, but rather it assessed noninferiority.

“The main point of this study is that the two treatments are largely equivalent, although numerically, ketamine was slightly associated with more beneficial outcomes and fewer cognitive side effects,” he said.

The findings suggest “that people who have no contraindications and are candidates for both ketamine and ECT – which is the vast majority of people with treatment-resistant depression – should consider getting ketamine first because it is somewhat easier in terms of side effects and logistics and consider ECT afterwards if the ketamine doesn’t work.”

In an accompanying editorial, Robert Freedman, MD, clinical professor, University of Colorado at Denver, Aurora, noted that although “3 weeks of lightened mood is undoubtedly a gift ... the results of this current trial suggests that the 3-week treatment was not life-changing,” since effects had largely worn off by 6 months in both groups.

Longer-term treatment with ketamine “increases the likelihood of both drug dependence and cognitive adverse effects, including dissociation, paranoia, and other psychotic symptoms,” Dr. Freedman said.

He recommends that informed consent documents be used to caution patients and clinicians considering ketamine “that temporary relief may come with longer-term costs.”

The study was supported by a grant from PCORI to Dr. Anand. Dr. Freedman has disclosed no relevant financial relationships. In the past 2 years, Dr. Iosifescu has been a consultant for Axsome, Allergan, Biogen, Clexio, Jazz, Neumora, Relmada, and Sage. He has also received a research grant from Otsuka.

A version of this article first appeared on Medscape.com.


Internet use a modifiable dementia risk factor in older adults?


Self-reported, regular Internet use, but not overuse, in older adults is linked to a lower dementia risk, new research suggests.

Investigators followed more than 18,000 older individuals and found that regular Internet use was associated with about a 50% reduction in dementia risk, compared with their counterparts who did not use the Internet regularly.

They also found that longer duration of regular Internet use was associated with a reduced risk of dementia, although excessive daily Internet usage appeared to adversely affect dementia risk.

“Online engagement can develop and maintain cognitive reserve – resiliency against physiological damage to the brain – and increased cognitive reserve can, in turn, compensate for brain aging and reduce the risk of dementia,” study investigator Gawon Cho, a doctoral candidate at New York University School of Global Public Health, said in an interview.

The study was published online in the Journal of the American Geriatrics Society.
 

Unexamined benefits

Prior research has shown that older adult Internet users have “better overall cognitive performance, verbal reasoning, and memory,” compared with nonusers, the authors note.

However, because this body of research consists of cross-sectional analyses and longitudinal studies with brief follow-up periods, the long-term cognitive benefits of Internet usage remain “unexamined.”

In addition, despite “extensive evidence of a disproportionately high burden of dementia in people of color, individuals without higher education, and adults who experienced other socioeconomic hardships, little is known about whether the Internet has exacerbated population-level disparities in cognitive health,” the investigators add.

Another question concerns whether excessive Internet usage may actually be detrimental to neurocognitive outcomes. However, “existing evidence on the adverse effects of Internet usage is concentrated in younger populations whose brains are still undergoing maturation.”

Ms. Cho said the motivation for the study was the lack of longitudinal studies on this topic, especially those with sufficient follow-up periods. In addition, she said, there is insufficient evidence about how changes in Internet usage in older age are associated with prospective dementia risk.

For the study, investigators turned to participants in the Health and Retirement Study, an ongoing longitudinal survey of a nationally representative sample of U.S.-based older adults (aged ≥ 50 years).

All participants (n = 18,154; 47.36% male; median age, 55.17 years) were dementia-free, community-dwelling older adults who completed a 2002 baseline cognitive assessment and were asked about Internet usage every 2 years thereafter.

Participants were followed from 2002 to 2018 for a maximum of 17.1 years (median, 7.9 years), which is the longest follow-up period to date. Of the total sample, 64.76% were regular Internet users.

The study’s primary outcome was incident dementia, based on performance on the Modified Telephone Interview for Cognitive Status (TICS-M), which was administered every 2 years.

The exposure examined in the study was cumulative Internet usage in late adulthood, defined as “the number of biennial waves where participants used the Internet regularly during the first three waves.”
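
A minimal pandas sketch of that exposure definition (column names are assumptions, not the study's variables) might look like this:

```python
import pandas as pd

# One row per participant per biennial wave; regular_use is 1 if the
# participant reported regular Internet use in that wave.
waves = pd.DataFrame({
    "participant": [1, 1, 1, 2, 2, 2],
    "wave":        [1, 2, 3, 1, 2, 3],
    "regular_use": [1, 1, 0, 1, 1, 1],
})

# Cumulative usage: number of the first three waves with regular use (0-3).
cumulative = (
    waves[waves["wave"] <= 3]
    .groupby("participant")["regular_use"]
    .sum()
    .rename("waves_of_regular_use")
)
print(cumulative)  # participant 1 -> 2 waves, participant 2 -> 3 waves
```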

In addition, participants were asked how many hours they spent using the Internet during the past week for activities other than viewing television shows or movies.

The researchers also investigated whether the link between Internet usage and dementia risk varied by educational attainment, race-ethnicity, sex, and generational cohort.

Covariates included baseline TICS-M score, health, age, household income, marital status, and region of residence.

U-shaped curve

More than half of the sample (52.96%) showed no changes in Internet use from baseline during the study period, while one-fifth (20.54%) did show changes in use.

Investigators found a robust link between Internet usage and lower dementia risk (cause-specific hazard ratio, 0.57 [95% CI, 0.46-0.71]) – a finding that remained even after adjusting for self-selection into baseline usage (csHR, 0.54 [0.41-0.72]) and signs of cognitive decline at baseline (csHR, 0.62 [0.46-0.85]).
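
A cause-specific hazard ratio of this kind can be estimated with a Cox model in which competing events (for example, death before dementia) are treated as censored. The sketch below uses the lifelines library on toy data; the column names and values are illustrative, not the study's:

```python
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "years":        [4.0, 7.9, 10.2, 6.1, 3.5, 9.0, 5.5, 8.2],
    "dementia":     [1,   0,   0,    1,   0,   1,   0,   1],  # 1 = incident dementia
    "regular_user": [1,   1,   0,    0,   1,   0,   1,   0],  # regular Internet use
})

# Competing events would simply carry dementia = 0 at their event time.
cph = CoxPHFitter()
cph.fit(df, duration_col="years", event_col="dementia")
print(cph.hazard_ratios_["regular_user"])  # a csHR analogous to those reported
```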

Each additional wave of regular Internet usage was associated with a 21% decrease in the risk of dementia (95% CI, 13%-29%); that is, longer periods of regular usage were associated with reduced dementia risk (csHR, 0.80 [95% CI, 0.68-0.95]).

“The difference in risk between regular and nonregular users did not vary by educational attainment, race-ethnicity, sex, and generation,” the investigators note.

A U-shaped association was found between daily hours of online engagement and dementia risk, wherein the lowest risk was observed in those with 0.1-2 hours of usage (compared with 0 hours of usage). The risk increased in a “monotonic fashion” after 2 hours, with 6.1-8 hours of usage showing the highest risk.

This finding was not considered statistically significant, but the “consistent U-shaped trend offers a preliminary suggestion that excessive online engagement may have adverse cognitive effects on older adults,” the investigators note.

“Among older adults, regular Internet users may experience a lower risk of dementia compared to nonregular users, and longer periods of regular Internet usage in late adulthood may help reduce the risks of subsequent dementia incidence,” said Ms. Cho. “Nonetheless, using the Internet excessively daily may negatively affect the risk of dementia in older adults.”
 

Bidirectional relationship?

Commenting for this article, Claire Sexton, DPhil, Alzheimer’s Association senior director of scientific programs and outreach, noted that some risk factors for Alzheimer’s or other dementias can’t be changed, while others are modifiable, “either at a personal or a population level.”

She called the current research “important” because it “identifies a potentially modifiable factor that may influence dementia risk.”

However, cautioned Dr. Sexton, who was not involved with the study, the findings cannot establish cause and effect. In fact, the relationship may be bidirectional.

“It may be that regular Internet usage is associated with increased cognitive stimulation, and in turn reduced risk of dementia; or it may be that individuals with lower risk of dementia are more likely to engage in regular Internet usage,” she said. Thus, “interventional studies are able to shed more light on causation.”

The Health and Retirement Study is sponsored by the National Institute on Aging and is conducted by the University of Michigan, Ann Arbor. Ms. Cho, her coauthors, and Dr. Sexton have disclosed no relevant financial relationships.

A version of this article originally appeared on Medscape.com.


Disrupted gut microbiome a key driver of major depression?


Major depressive disorder (MDD) is linked to disruptions in energy and lipid metabolism, possibly caused by the interplay of the gut microbiome and blood metabolome, new research suggests.

Investigators found that MDD had specific metabolic “signatures” consisting of 124 metabolites that spanned energy and lipid pathways, with some involving the tricarboxylic acid cycle in particular. These changes in metabolites were consistent with differences in composition of several gut microbiota.

The researchers found that fatty acids and intermediate and very large lipoproteins changed in association with the depressive disease process. However, high-density lipoproteins and metabolites in the tricarboxylic acid cycle did not.

“As we wait to establish causal influences through clinical trials, clinicians should advise patients suffering from mood disorders to modify their diet by increasing the intake of fresh fruits, vegetables, and whole grains, as these provide the required fuel/fiber to the gut microbiota for their enrichment, and more short-chain fatty acids are produced for the optimal functioning of the body,” study investigator Najaf Amin, PhD, DSc, senior researcher, Nuffield Department of Population Health, Oxford University, England, told this news organization.

“At the same time, patients should be advised to minimize the intake of sugars and processed foods, which are known to have an inverse impact on the gut microbiome and are associated with higher inflammation,” she said.

The study was published online in JAMA Psychiatry.
 

MDD poorly understood

Although most antidepressants target the monoamine pathway, “evidence is increasing for a more complex interplay of multiple pathways involving a wide range of metabolic alterations spanning energy and lipid metabolism,” the authors wrote.

Previous research using the Nightingale proton nuclear magnetic resonance (NMR) metabolomics platform showed a “shift” toward decreased levels of high-density lipoproteins (HDLs) and increased levels of very low-density lipoproteins (VLDLs) and triglycerides among patients with depression.

The gut microbiome, which is primarily modulated by diet, “has been shown to be a major determinant of circulating lipids, specifically triglycerides and HDLs, and to regulate mitochondrial function,” the investigators noted. Patients with MDD are known to have disruptions in the gut microbiome.

The investigators therefore asked whether the gut microbiome may “explain part of the shift in VLDL and HDL levels observed in patients with depression and if the metabolic signatures of the disease based on Nightingale metabolites can be used as a tool to infer the association between gut microbiome and depression.”

Dr. Amin called depression “one of the most poorly understood diseases, as underlying mechanisms remain elusive.”

Large-scale genetic studies “have shown that the contribution of genetics to depression is modest,” she continued. On the other hand, initial animal studies suggest the gut microbiome “may potentially have a causal influence on depression.”

Several studies have evaluated the influence of gut microbiome on depression, “but, due to small sample sizes and inadequate control for confounding factors, most of their findings were not reproducible.”

Harnessing the power of the UK Biobank, the investigators studied 58,257 individuals who were between the ages of 37 and 73 years at recruitment. They used data on NMR spectroscopy–based plasma metabolites in depression. Individuals who didn’t report depression at baseline served as controls.

Logistic regression analysis was used to test the association of metabolite levels with depression in four models, each with an increasing number of covariates.
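
The sketch below illustrates that modeling strategy on synthetic data with statsmodels; the variable names and covariate sets are assumptions for illustration, not the study's actual specification:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "metabolite": rng.normal(size=n),
    "age":        rng.uniform(37, 73, size=n),
    "sex":        rng.integers(0, 2, size=n),
    "bmi":        rng.normal(27, 4, size=n),
})
# Synthetic outcome loosely tied to the metabolite level.
logit = -1.0 + 0.4 * df["metabolite"]
df["depressed"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

def fit_model(covariates):
    """Logistic regression of depression on one metabolite plus covariates."""
    X = sm.add_constant(df[["metabolite"] + covariates])
    return sm.Logit(df["depressed"], X).fit(disp=False)

for name, covs in {"model 1": [], "model 2": ["age", "sex"],
                   "model 3": ["age", "sex", "bmi"]}.items():
    res = fit_model(covs)
    print(name, f"OR = {np.exp(res.params['metabolite']):.2f},",
          f"p = {res.pvalues['metabolite']:.3g}")
```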

To identify patterns of correlation in the “metabolic signatures of MDD and the human gut biome,” they regressed the metabolic signatures of MDD on the metabolic signatures of the gut microbiota and then regressed the metabolic signature of gut microbiota on the metabolic signatures of MDD.

Bidirectional 2-sample Mendelian randomization was used to ascertain the direction of the association observed between metabolites and MDD.
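
For a flavor of how a two-sample MR estimate is computed, here is a compact sketch of the standard inverse-variance-weighted (IVW) estimator; the SNP effect sizes are toy numbers, and the paper's exact MR methods may differ:

```python
import numpy as np

def ivw_mr(beta_exposure, beta_outcome, se_outcome):
    """Inverse-variance-weighted MR: a precision-weighted average of the
    per-SNP ratio estimates beta_outcome / beta_exposure."""
    bx, by, se = map(np.asarray, (beta_exposure, beta_outcome, se_outcome))
    w = bx**2 / se**2
    estimate = np.sum(w * by / bx) / np.sum(w)
    return estimate, 1 / np.sqrt(np.sum(w))

# Toy instruments: three SNPs' effects on a metabolite and on MDD.
est, se = ivw_mr([0.10, 0.08, 0.12], [0.020, 0.015, 0.030], [0.010, 0.012, 0.010])
print(f"IVW estimate = {est:.3f} (SE {se:.3f})")
# Running the same estimator in the reverse direction (MDD as the exposure)
# gives the "bidirectional" assessment described above.
```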

Individuals with lifetime and recurrent MDD were compared with controls (6,811 vs. 51,446 and 4,370 vs. 62,508, respectively).

Participants with lifetime MDD were significantly younger (median [IQR] age, 56 [49-62] years vs. 58 [51-64] years) and were more likely to be female in comparison with controls (54% vs. 35%).

‘Novel findings’

In the fully adjusted analysis, metabolic signatures of MDD were found to consist of 124 metabolites that spanned energy and lipid metabolism pathways.

The investigators noted that these “novel findings” included 49 metabolites encompassing those involved in the tricarboxylic acid cycle – citrate and pyruvate.

The findings revealed that fatty acids and intermediate and very large lipoproteins changed in association with the disease process. On the other hand, HDLs and the metabolites in the tricarboxylic acid cycle did not.

“We observed that the genera Sellimonas, Eggerthella, Hungatella, and Lachnoclostridium were more abundant, while genera Ruminococcaceae ... Coprococcus, Lachnospiraceae ... Eubacterium ventriosum, Subdoligranulum, and family Ruminococcaceae were depleted in the guts of individuals with more symptoms of depression,” said Dr. Amin. “Of these, genus Eggerthella showed statistical evidence of being involved in the causal pathway.”

These microbes are involved in the synthesis of important neurotransmitters, such as gamma aminobutyric acid, butyrate, glutamate, and serotonin, she noted.

Butyrate produced by the gut can cross the blood-brain barrier, enter the brain, and affect transcriptional and translational activity or be used by the cells for generating energy, she added. “So basically, butyrate can influence depression through several routes – i.e., via immune regulation, genomic transcript/translation, and/or affecting energy metabolism.”
 

No causality

Commenting on the study, Emeran Mayer, MD, distinguished research professor of medicine, G. Oppenheimer Center for Neurobiology of Stress and Resilience and UCLA Brain Gut Microbiome Center, called it the “largest, most comprehensive and best validated association study to date providing further evidence for an association between gut microbial taxa, previously identified in patients with MDD, blood metabolites (generated by host and by microbes) and questionnaire data.”

However, “despite its strengths, the study does not allow [us] to identify a causal role of the microbiome alterations in the observed microbial and metabolic changes (fatty acids, Krebs cycle components),” cautioned Dr. Mayer, who was not involved with the study.

Moreover, “causality of gut microbial changes on the behavioral phenotype of depression cannot be inferred,” he concluded.

Metabolomics data were provided by the Alzheimer’s Disease Metabolomics Consortium. The study was funded wholly or in part by grants from the National Institute on Aging and the Foundation for the National Institutes of Health. It was further supported by a grant from ZonMW Memorabel. Dr. Amin reports no relevant financial relationships. The other authors’ disclosures are listed in the original article. Dr. Mayer is a scientific advisory board member of Danone, Axial Therapeutics, Viome, Amare, Mahana Therapeutics, Pendulum, Bloom Biosciences, and APC Microbiome Ireland.
 

A version of this article originally appeared on Medscape.com.

Publications
Topics
Sections

Major depressive disorder (MDD) is linked to disruptions in energy and lipid metabolism, possibly caused by the interplay of the gut microbiome and blood metabolome, new research suggests.

Investigators found that MDD had specific metabolic “signatures” consisting of 124 metabolites that spanned energy and lipid pathways, with some involving the tricarboxylic acid cycle in particular. These changes in metabolites were consistent with differences in composition of several gut microbiota.

The researchers found that fatty acids and intermediate and very large lipoproteins changed in association with the depressive disease process. However, high-density lipoproteins and metabolites in the tricarboxylic acid cycle did not.

“As we wait to establish causal influences through clinical trials, clinicians should advise patients suffering from mood disorders to modify their diet by increasing the intake of fresh fruits, vegetables, and whole grains, as these provide the required fuel/fiber to the gut microbiota for their enrichment, and more short-chain fatty acids are produced for the optimal functioning of the body,” study investigator Najaf Amin, PhD, DSc, senior researcher, Nuffield Department of Population Health, Oxford University, England, told this news organization.

“At the same time, patients should be advised to minimize the intake of sugars and processed foods, which are known to have an inverse impact on the gut microbiome and are associated with higher inflammation,” she said.

The study was published online in JAMA Psychiatry.
 

MDD poorly understood

Although most antidepressants target the monoamine pathway, “evidence is increasing for a more complex interplay of multiple pathways involving a wide range of metabolic alterations spanning energy and lipid metabolism,” the authors wrote.

Previous research using the Nightingale proton nuclear magnetic resonance (NMR) metabolomics platform showed a “shift” toward decreased levels of high-density lipoproteins (HDLs) and increased levels of very low-density lipoproteins (VLDLs) and triglycerides among patients with depression.

The gut microbiome, which is primarily modulated by diet, “has been shown to be a major determinant of circulating lipids, specifically triglycerides and HDLs, and to regulate mitochondrial function,” the investigators noted. Patients with MDD are known to have disruptions in the gut microbiome.

The gut microbiome may “explain part of the shift in VLDL and HDL levels observed in patients with depression and if the metabolic signatures of the disease based on Nightingale metabolites can be used as a tool to infer the association between gut microbiome and depression.”

Dr. Amin called depression “one of the most poorly understood diseases, as underlying mechanisms remain elusive.”

Large-scale genetic studies “have shown that the contribution of genetics to depression is modest,” she continued. On the other hand, initial animal studies suggest the gut microbiome “may potentially have a causal influence on depression.”

Several studies have evaluated the influence of gut microbiome on depression, “but, due to small sample sizes and inadequate control for confounding factors, most of their findings were not reproducible.”

Harnessing the power of the UK Biobank, the investigators studied 58,257 individuals who were between the ages of 37 and 73 years at recruitment. They used data on NMR spectroscopy–based plasma metabolites in depression. Individuals who didn’t report depression at baseline served as controls.

Logistic regression analysis was used to test the association of metabolite levels with depression in four models, each with an increasing number of covariates.

To identify patterns of correlation in the “metabolic signatures of MDD and the human gut biome,” they regressed the metabolic signatures of MDD on the metabolic signatures of the gut microbiota and then regressed the metabolic signature of gut microbiota on the metabolic signatures of MDD.

Bidirectional 2-sample Mendelian randomization was used to ascertain the direction of the association observed between metabolites and MDD.

Individuals with lifetime and recurrent MDD were compared with controls (6,811 vs. 51,446 and 4,370 vs. 62,508, respectively).

Participants with lifetime MDD were significantly younger (median [IQR] age, 56 [49-62] years vs. 58 [51-64] years) and were more likely to be female in comparison with controls (54% vs. 35%).
 

 

 

‘Novel findings’

In the fully adjusted analysis, metabolic signatures of MDD were found to consist of 124 metabolites that spanned energy and lipid metabolism pathways.

The investigators noted that these “novel findings” included 49 metabolites encompassing those involved in the tricarboxylic acid cycle – citrate and pyruvate.

The findings revealed that fatty acids and intermediate and VLDL changed in association with the disease process. On the other hand, HDL and the metabolites in the tricarboxylic acid cycle did not.

“We observed that the genera Sellimonas, Eggerthella, Hungatella, and Lachnoclostridium were more abundant, while genera Ruminococcaceae ... Coprococcus, Lachnospiraceae ... Eubacterium ventriosum, Subdoligranulum, and family Ruminococcaceae were depleted in the guts of individuals with more symptoms of depression,” said Dr. Amin. “Of these, genus Eggerthella showed statistical evidence of being involved in the causal pathway.”

These microbes are involved in the synthesis of important neurotransmitters, such as gamma aminobutyric acid, butyrate, glutamate, and serotonin, she noted.

Butyrate produced by the gut can cross the blood-brain barrier, enter the brain, and affect transcriptional and translational activity or be used by the cells for generating energy, she added. “So basically, butyrate can influence depression through several routes – i.e., via immune regulation, genomic transcript/translation, and/or affecting energy metabolism.”
 

No causality

Commenting on the study, Emeran Mayer, MD, distinguished research professor of medicine, G. Oppenheimer Center for Neurobiology of Stress and Resilience and UCLA Brain Gut Microbiome Center, called it the “largest, most comprehensive and best validated association study to date providing further evidence for an association between gut microbial taxa, previously identified in patients with MDD, blood metabolites (generated by host and by microbes) and questionnaire data.”

However, “despite its strengths, the study does not allow [us] to identify a causal role of the microbiome alterations in the observed microbial and metabolic changes (fatty acids, Krebs cycle components),” cautioned Dr. Mayer, who was not involved with the study.

Moreover, “causality of gut microbial changes on the behavioral phenotype of depression cannot be inferred,” he concluded.

Metabolomics data were provided by the Alzheimer’s Disease Metabolomics Consortium. The study was funded wholly or in part by grants from the National Institute on Aging and the Foundation for the National Institutes of Health. It was further supported by a grant from ZonMW Memorabel. Dr. Amin reports no relevant financial relationships. The other authors’ disclosures are listed in the original article. Dr. Mayer is a scientific advisory board member of Danone, Axial Therapeutics, Viome, Amare, Mahana Therapeutics, Pendulum, Bloom Biosciences, and APC Microbiome Ireland.

A version of this article originally appeared on Medscape.com.

Walnuts linked to improved attention, psychological maturity in teens

Article Type
Changed
Fri, 04/28/2023 - 00:44

Walnuts have been associated with better cognitive development and psychological maturation in teens, new research shows. Adolescents who consumed walnuts for at least 100 days showed improved sustained attention and fluid intelligence, as well as a reduction in symptoms of attention-deficit/hyperactivity disorder (ADHD), compared with matched controls who did not consume the nuts. However, there were no statistically significant changes between the groups in other parameters, such as working memory and executive function.

Clinicians should advise adolescents “to eat a handful of walnuts three times a week for the rest of their lives. They may have a healthier brain with better cognitive function,” said senior investigator Jordi Julvez, PhD, group leader at the Institute of Health Research Pere Virgili, Barcelona, and associated researcher at the Barcelona Institute for Global Health.

The study was published online in eClinicalMedicine.

Rich source of omega-3s

Adolescence is “a period of refinement of brain connectivity and complex behaviors,” the investigators noted.  

Previous research suggests polyunsaturated fatty acids (PUFAs) are key to central nervous system architecture and function during periods of neural development, with three specific PUFAs playing an “essential developmental role.”

Two omega-3 fatty acids – docosahexaenoic acid (DHA) and eicosapentaenoic acid (EPA) – are PUFAs that must be obtained through diet, mainly from seafood. Walnuts are “among the richest sources” of plant-derived omega-3 fatty acids, particularly alpha-linolenic acid (ALA), a precursor for the longer-chain EPA and DHA.

ALA independently “has positive effects on brain function and plasticity,” the authors wrote. In addition, walnut constituents – particularly polyphenols and other bioactive compounds – “may act synergistically with ALA to foster brain health.”

Earlier small studies have found positive associations between walnut consumption and cognitive function in children, adolescents, and young adults, but to date, no randomized controlled trial has focused on the effect of walnut consumption on adolescent neuropsychological function.

The researchers studied 771 healthy adolescents (aged 11-16 years, mean age 14) drawn from 12 Spanish high schools. Participants were instructed to follow healthy eating recommendations and were randomly assigned 1:1 to the intervention (n = 386) or the control group (n = 385).

At baseline and after 6 months, they completed neuropsychological tests and behavioral rating scales. The Attention Network Test assessed attention, and the N-back test was used to assess working memory. The Tests of Primary Mental Abilities assessed fluid intelligence. Risky decision-making was tested using the Roulettes Task.

Fruit and nuts

Participants also completed the Strengths and Difficulties Questionnaire, which provided a total score of problem behavior. Teachers filled out the ADHD DSM-IV form to provide additional information about ADHD behaviors.

The intervention group received 30 grams/day of raw California walnut kernels to incorporate into their daily diet. It is estimated that these walnuts contain about 9 g of ALA per 100 g.
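
At that dose, a full day’s serving would supply roughly 2.7 g of ALA (30 g × 9 g/100 g).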

All participants received a seasonal fruit calendar and were asked to eat at least one piece of seasonal fruit daily.

Parents reported their child’s daily walnut consumption, with adherence defined as 100 or more days of eating walnuts during the 6-month period.

All main analyses were based on an intention-to-treat method (participants were analyzed according to their original group assignment, regardless of their adherence to the intervention).

The researchers also conducted a secondary per-protocol analysis, comparing the intervention and control groups to estimate the effect if all participants had adhered to their assigned intervention. They censored data for participants who reported eating walnuts for less than 100 days during the 6-month trial period.
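
The two analyses differ only in which participants are compared; a minimal sketch with simulated data (the column names and values are illustrative, not from the trial):

    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(2)
    n = 771
    df = pd.DataFrame({
        "arm": rng.choice(["walnut", "control"], n),     # assigned group
        "days_walnuts": rng.integers(0, 181, n),         # reported intake days
        "attention_change": rng.normal(0, 20, n),        # 6-month change score
    })

    # Intention-to-treat: compare the groups exactly as randomized
    itt = df.groupby("arm")["attention_change"].mean()

    # Per-protocol: censor intervention participants with < 100 days' adherence
    pp = df[(df["arm"] == "control") | (df["days_walnuts"] >= 100)]
    pp_means = pp.groupby("arm")["attention_change"].mean()
    print(itt, pp_means, sep="\n")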

Secondary outcomes included changes in height, weight, waist circumference, and BMI, as well as red blood cell (RBC) proportions of omega-3 fatty acids (DHA, EPA, and ALA) at baseline and after 6 months.

Adherence counts

Most participants had “medium” or “high” levels of adherence to the Mediterranean diet, with “no meaningful differences” at baseline between the intervention and control groups in lifestyle characteristics or mean scores in all primary endpoints.

In the ITT analysis, there were no statistically significant differences in primary outcomes between the groups following the intervention. As for secondary outcomes, RBC ALA significantly increased in the walnut group but not the control group (coefficient, 0.04%; 95% confidence interval, 0.03%-0.06%; P < .0001).

However, there were differences in primary outcomes between the groups in the per-protocol analysis: The adherence-adjusted effect on improvement in attention score was −11.26 ms (95% CI, −19.92 to −2.60; P = .011) for the intervention versus the control group.

The per-protocol analysis showed other differences: an improvement in fluid intelligence score (1.78; 95% CI, 0.90-2.67; P < .0001) and a reduction in ADHD symptom score (−2.18; 95% CI, −3.70 to −0.67; P = .0050).

“Overall, no significant differences were found in the intervention group in relation to the control group,” Dr. Julvez said in a news release. “But if the adherence factor is considered, then positive results are observed, since participants who most closely followed the guidelines – in terms of the recommended dose of walnuts and the number of days of consumption – did show improvements in the neuropsychological functions evaluated.”

Adolescence “is a time of great biological changes. Hormonal transformation occurs, which in turn is responsible for stimulating the synaptic growth of the frontal lobe,” he continued, adding that this brain region “enables neuropsychological maturation of more complex emotional and cognitive functions.”

“Neurons that are well nourished with these types of fatty acids will be able to grow and form new, stronger synapses,” he said.

Food as medicine

Uma Naidoo, MD, director of nutritional and lifestyle psychiatry at Massachusetts General Hospital, Boston, “commends” the researchers for conducting an RCT with a “robust” sample size and said she is “excited to see research like this furthering functional nutrition for mental health,” as she believes that “food is medicine.”

Dr. Naidoo, a professional chef, nutritional biologist, and author of the book “This Is Your Brain on Food,” said the findings “align” with her own approach to nutritional psychiatry and are also “in line” with her clinical practice.

However, although these results are “promising,” more research is needed across more diverse populations to “make sure these results are truly generalizable,” said Dr. Naidoo, a faculty member at Harvard Medical School, Boston, who was not involved with the study.

She “envisions a future where the research is so advanced that we can ‘dose’ these healthy whole foods for specific psychiatric symptoms and conditions.”

This study was supported by Instituto de Salud Carlos III (co-funded by European Union Regional Development Fund “A way to make Europe”). The California Walnut Commission has given support by supplying the walnuts for free for the Walnuts Smart Snack Dietary Intervention Trial. Dr. Julvez holds a Miguel Servet-II contract awarded by the Instituto de Salud Carlos III (co-funded by European Union Social Fund). The other authors’ disclosures are listed in the original article. Dr. Naidoo reports no relevant financial relationships.

A version of this article first appeared on Medscape.com.

Predicting BPD vs. bipolar treatment response: New imaging data

Article Type
Changed
Tue, 04/25/2023 - 17:03

A new study identifies specific brain regions involved in treatment response in bipolar disorder (BD) and borderline personality disorder (BPD), potentially paving the way for more targeted treatment.

In a meta-analysis of 34 studies that used neuroimaging to investigate changes in brain activation following psychotherapy and pharmacotherapy for BD and BPD, investigators found that most brain regions showing abnormal activation in both conditions improved after treatment. In particular, changes in brain activity after psychotherapy were found primarily in the frontal areas, whereas pharmacotherapy largely altered the limbic areas.

“This study can help clinicians with clinical prediction of treatment efficacy between BD and BPD and clarify the neural mechanism of treatment for these two diseases,” senior investigator Xiaoming Li, PhD, professor, department of medical psychology, Anhui Medical University, Hefei, China, told this news organization.

“It may also contribute to the identification of more accurate neuroimaging biomarkers for treatment of the two disorders and to the finding of more effective therapy,” Dr. Li said.

The study was published online in the Journal of Clinical Psychiatry.

Blurred boundary

Dr. Li called BD and BPD “difficult to diagnose and differentiate,” noting that the comorbidity rate is “very high.” Underestimating the boundary between BD and BPD “increases the risk of improper or harmful drug exposure,” since mood-stabilizing drugs are “considered to be the key therapeutic intervention for BD, while psychotherapy is the key treatment for BPD.”

The “blurred boundary between BD and BPD is one of the reasons it is important to study the relationship between these two diseases,” the authors said.

Previous studies comparing the relationship between BD and BPD “did not explore the similarities and differences in brain mechanisms between these two disorders after treatment,” they pointed out.

Patients with BD have a different disease course and response to therapy compared with patients with BPD. “Misdiagnosis may result in the patients receiving ineffective treatment, so it is particularly important to explore the neural mechanisms of the treatment of these two diseases,” Dr. Li said.

To investigate, the researchers used activation likelihood estimation (ALE) – a technique that examines coordinates of neuroimaging data gleaned from published studies – after searching several databases from inception until June 2021.

This approach was used to “evaluate the similarities and differences in the activation of different brain regions in patients with BD and BPD after treatment with psychotherapy and drug therapy.”
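
Conceptually, ALE treats each reported peak as the center of a 3-D Gaussian probability distribution and takes, voxel by voxel, the probabilistic union of the per-study maps. A toy one-dimensional sketch of that core computation follows; the grid, coordinates, and smoothing width are invented for illustration, and real ALE operates on 3-D brain volumes with empirically derived kernels:

    import numpy as np

    grid = np.linspace(0, 100, 101)      # toy 1-D "brain", in mm
    sigma = 8.0                          # smoothing width (illustrative)

    # Peak coordinates reported by three hypothetical experiments
    experiments = [np.array([30.0, 55.0]), np.array([32.0]), np.array([70.0])]

    def modeled_activation(foci):
        # Per-experiment map: max Gaussian probability over that study's foci
        g = np.exp(-(grid[:, None] - foci[None, :]) ** 2 / (2 * sigma**2))
        return g.max(axis=1)

    ma_maps = [modeled_activation(f) for f in experiments]

    # ALE score: probabilistic union of the modeled-activation maps
    ale = 1.0 - np.prod([1.0 - ma for ma in ma_maps], axis=0)
    print(grid[ale.argmax()], ale.max())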

Studies were required to include patients with a clinical diagnosis of BD or BPD; neuroimaging with functional MRI; coordinates of the peak activations in the stereotactic space of the Montreal Neurological Institute or Talairach; treatment (pharmacologic or psychological) for patients with BD or BPD; and results of changes in brain activation after treatment, relative to a before-treatment condition.

Of 1,592 records, 34 studies (n = 912 subjects) met inclusion criteria and were used to extract the activation coordinates. The researchers extracted a total of 186 activity-increase points and 90 activity-decrease points. After pooling these foci, they found 12 clusters of increased activation and 2 clusters of decreased activation.

Of the studies, 23 focused on BD and 11 on BPD; 14 used psychotherapy, 18 used drug therapy, and 2 used a combination of both approaches.

Normalizing activation levels

Both treatments were associated with convergent activity increases and decreases in several brain regions: the anterior cingulate cortex, medial frontal gyrus, inferior frontal gyrus, cingulate gyrus, parahippocampal gyrus, and the posterior cingulate cortex.

The researchers then examined studies based on treatment method – psychotherapy or pharmacotherapy – and the effect on the two disorders.

“After psychotherapy, the frontal lobe and temporal lobe were the primary brain regions in which activation changed, indicating a top-down effect of this therapy type, while after drug therapy, the limbic area was the region in which activation changed, indicating a ‘bottom-up’ effect,” said Dr. Li.

Dr. Li cited previous research pointing to functional and structural abnormalities in both disorders – especially in the default mode network (DMN) and frontolimbic network.

In particular, alterations in the amygdala and the parahippocampal gyrus are reported more frequently in BPD than in BD, whereas dysfunctional frontolimbic brain regions seem to underlie the emotional dysfunction in BPD. Several studies have also associated the impulsivity of BD with dysfunctions in the interplay of cortical-limbic circuits.

Dr. Li said the study findings suggest “that treatment may change these brain activation levels by acting on the abnormal brain circuit, such as the DMN and the frontolimbic network so as to ‘normalize’ its activity and improve symptoms.”

Specifically, brain regions with abnormally increased activation “showed decreased activation after treatment, and brain regions with abnormally decreased activation showed increased activation after treatment.”

Discrete, overlapping mechanisms

Commenting on the study, Roger S. McIntyre, MD, professor of psychiatry and pharmacology, University of Toronto, and head of the Mood Disorders Psychopharmacology Unit, said the study “provides additional support for the underlying neurobiological signature of bipolar disorder and a commonly encountered co-occurring condition – borderline personality disorder – having both discrete yet overlapping mechanisms.”

He found it interesting that “medications have a different principal target than psychosocial interventions, which has both academic and clinical implications.

“The academic implication is that we have reasons to believe that we will be in a position to parse the neurobiology of bipolar disorder or borderline personality disorder when we take an approach that isolates specific domains of psychopathology, which is what they [the authors] appear to be doing,” said Dr. McIntyre, who wasn’t associated with this research.  

In addition, “from the clinical perspective, this provides a rationale for why we should be integrating pharmacotherapy with psychotherapy in people who have comorbid conditions like borderline personality disorder, which affects 20% of people living with bipolar disorder and 60% to 70% have borderline traits,” he added.

The research was supported by the Anhui Natural Science Foundation and Grants for Scientific Research from Anhui Medical University. Dr. Li and coauthors declared no relevant financial relationships. Dr. McIntyre has received research grant support from CIHR/GACD/National Natural Science Foundation of China and the Milken Institute; speaker/consultation fees from Lundbeck, Janssen, Alkermes, Neumora Therapeutics, Boehringer Ingelheim, Sage, Biogen, Mitsubishi Tanabe, Purdue, Pfizer, Otsuka, Takeda, Neurocrine, Sunovion, Bausch Health, Axsome, Novo Nordisk, Kris, Sanofi, Eisai, Intra-Cellular, NewBridge Pharmaceuticals, Viatris, AbbVie, and Atai Life Sciences. Dr. McIntyre is CEO of Braxia Scientific Corp.

A version of this article first appeared on Medscape.com.

Physical exercise tied to a reduction in suicide attempts

Article Type
Changed
Mon, 04/24/2023 - 14:26

Physical exercise is associated with a reduction in suicide attempts, new research suggests.

A meta-analysis of 17 randomized controlled trials (RCTs), which included more than 1,000 participants with mental or physical illnesses, showed there was a significant reduction in suicide attempts in participants randomly assigned to receive exercise interventions, compared with inactive controls. However, there were no differences between the exercise and the control groups in suicidal ideation or mortality.

On the other hand, there was also no significant difference in dropout rates between those randomly assigned to exercise versus inactive controls, suggesting that people with mental or physical impairments are able to adhere to exercise regimens.

“A common misconception is that patients, particularly those suffering from mental or physical illness, are not willing or motivated enough to participate in an exercise [regimen], and this has led to primary care providers underprescribing exercise to those with mental or physical illness,” lead author Nicholas Fabiano, MD, a resident in the department of psychiatry at the University of Ottawa, told this news organization.

As a result of the study findings, “we recommend that providers do not have apprehension about prescribing exercise to patients with physical or mental illness. Exercise may be an effective way to reduce suicidal behaviors” in these patients, he said.

The study was published online in the Journal of Affective Disorders.

Physical, mental health strongly linked

Existing literature has “demonstrated a protective effect of physical activity on suicidal ideation in the general population,” but to date there have been no systematic reviews or meta-analyses investigating its impact on suicide-related outcomes in patients with physical or mental illness, the authors write.

“Those with mental or physical illness are at increased risk of suicide, compared to the general population,” Dr. Fabiano commented.

“We often split up ‘mental health’ and ‘physical health’ in medicine; however, I believe that the two are more on a continuum and a holistic term, such as ‘health,’ should be used instead,” he added.

He noted that mental and physical health are “inexorably intertwined” and those with physical illness are more prone to developing mental illness, whereas those with mental illness are more likely to suffer from a variety of other medical conditions. “Therefore, when treating those with mental illness, it is also imperative that we bolster one’s physical health through easily accessible activities such as exercise,” he said.

The goal of the study was to determine whether individuals with “any mental, physical, clinical, or subclinical condition” might benefit from exercise, particularly in relation to suicide-related outcomes. The investigators searched multiple databases from inception to June 2022 to identify RCTs investigating exercise and suicidal ideation in participants with physical or mental conditions.

Of 673 studies, 17 met the inclusion criteria (total of 1,021 participants). Participants’ mean age was 42.7 years, 82% were female, and 54% were randomly assigned to an exercise intervention.

Most studies (82%) focused on clinical rather than subclinical outcomes, and depression was the most commonly included condition (59%). Aerobic exercise was the most common form of exercise in the active study groups, followed by mind-body exercise and strength training (53%, 17.6%, and 17.6%, respectively). The mean follow-up time was 10 weeks.

Reduced impulsivity

Comparing exercise participants with all control and inactive control participants, the researchers found no statistically significant difference in postintervention suicidal ideation (standardized mean difference, –1.09; 95% confidence interval, –3.08 to 0.90; P = .20, k = 5).
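
For context, a standardized mean difference expresses a between-group difference in units of standard deviation so that trials using different suicidal-ideation scales can be pooled on a common scale. A minimal sketch of the usual Cohen’s d form is below; the meta-analysis may have used a bias-corrected variant such as Hedges’ g.

```latex
% Standardized mean difference (Cohen's d form) for a single trial:
% the difference in group means divided by the pooled standard deviation.
\[
\mathrm{SMD} = \frac{\bar{x}_{\mathrm{exercise}} - \bar{x}_{\mathrm{control}}}{s_p},
\qquad
s_p = \sqrt{\frac{(n_1 - 1)s_1^2 + (n_2 - 1)s_2^2}{n_1 + n_2 - 2}}
\]
```

A 95% confidence interval that spans zero, as here (–3.08 to 0.90), is equivalent to the pooled estimate failing to reach significance at the .05 level.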

Similarly, there was no significant difference (P = .60) in suicidal ideation incidence for subgroup analyses that stratified data among participants with depression, sickle cell disease, and suicidality.

All-cause discontinuation did not significantly differ between participants randomly assigned to exercise interventions and all controls (odds ratio, 0.85; 95% CI, 0.38-1.94; P = .86, k = 12) or inactive controls (OR, 0.81; 95% CI, 0.25-2.68; P = .70). Nor did it differ between participants randomized to exercise and active controls (OR, 0.94; 95% CI, 0.38-2.32; P = .79, k = 3).

Likewise, there were nonsignificant differences between participants who underwent aerobic exercise and strength training (P = .20).

Differences between exercise and control participants in the subgroups with depression and stress were likewise nonsignificant (P = .46).

There was a significant reduction in suicide attempts in individuals who participated in exercise interventions versus inactive controls (OR, 0.23; 95% CI, 0.09-0.67; P = .04, k = 2). On the other hand, there was no significant difference in mortality (P = .70).
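
To illustrate how a pooled odds ratio of this kind is commonly derived, the sketch below pools two studies (k = 2) with an inverse-variance fixed-effect model on the log-odds scale. The 2x2 counts are hypothetical placeholders, not data from the included trials, and the article does not specify the authors’ exact model.

```python
# Minimal sketch: inverse-variance pooling of odds ratios from two studies.
# The 2x2 counts below are hypothetical, NOT data from this meta-analysis.
import math

def log_or_and_var(a, b, c, d):
    """Log odds ratio and its variance for one study.
    a/b = attempts/no attempts (exercise); c/d = attempts/no attempts (control).
    Adds 0.5 to each cell (Haldane correction) to guard against zero counts."""
    a, b, c, d = (x + 0.5 for x in (a, b, c, d))
    return math.log((a * d) / (b * c)), 1 / a + 1 / b + 1 / c + 1 / d

studies = [(1, 49, 5, 45), (2, 98, 8, 92)]  # hypothetical counts
weights, weighted_effects = [], []
for counts in studies:
    effect, variance = log_or_and_var(*counts)
    weights.append(1 / variance)
    weighted_effects.append(effect / variance)

pooled = sum(weighted_effects) / sum(weights)  # pooled log odds ratio
se = math.sqrt(1 / sum(weights))               # standard error of the pool
print(f"pooled OR = {math.exp(pooled):.2f} "
      f"(95% CI, {math.exp(pooled - 1.96 * se):.2f}"
      f"-{math.exp(pooled + 1.96 * se):.2f})")
```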

Most of the studies (82%) were “at high risk of bias,” the authors note. In addition, the analysis was limited because the included studies were “few, underpowered, and heterogeneous.”

Dr. Fabiano hypothesized that the lack of effect on suicidal ideation or mortality is “likely due to the limited sample size.” As additional RCTs are conducted, he expects to see decreases in both suicidal ideation and suicide attempts.

The findings may “be explained by the ideation-to-action framework, which suggests that the development of suicidal ideation and the progression to suicide attempts are distinct processes with different influential factors,” he said.

Increased levels of exercise have been “shown to reduce emotional impulsivity and, as it has been shown that most suicide attempts are characterized by impulsivity and low lethality, we hypothesize that regular exercise serves as a protective factor against suicide attempts,” he said.

Not useful?

Commenting on the study, Fabien Legrand, PhD, a lecturer in clinical psychology, University of Reims Champagne-Ardenne, Reims, France, said that the impact of physical activity is of “particular interest” to him because it is closely linked to his research activity, where he has “been exploring the antidepressant effects of exercise for more than 15 years.”

A small pilot study conducted by Dr. Legrand and colleagues found rigorous physical activity to be helpful in reducing hopelessness in psychiatric patients, compared with controls. “This result is of particular relevance for suicidal patients, since it has long been documented that hopelessness is one of the main triggers of suicide ideation and suicide attempts,” he said.

Initially, Dr. Legrand “warmly welcomed” the current review and meta-analysis on exercise and suicide. However, he felt that the paper fell short of its intended goal. “After a thorough reading of the paper, I don’t think that the information provided can be used in any way,” he stated.

“The paper’s title – ‘Effects of Physical Exercise on Suicidal Ideation and Behavior’ – does not do justice to its content, since 9 of the included 17 RCTs did not measure changes in suicidal ideation and/or suicidal behavior following participation in an exercise program,” noted Dr. Legrand, who was not involved with authorship or the current analysis.

The study was funded by the University of Ottawa department of psychiatry. Dr. Fabiano declares no relevant financial relationships. The other authors’ disclosures are listed in the original article. Dr. Legrand declares no relevant financial relationships.

A version of this article first appeared on Medscape.com.

FROM THE JOURNAL OF AFFECTIVE DISORDERS


Phototherapy a safe, effective, inexpensive new option for dementia?

Article Type
Changed
Thu, 04/13/2023 - 14:51

Phototherapy is a safe, effective, noninvasive, and inexpensive way of boosting cognition for patients with dementia, new research suggests. It may be “one of the most promising interventions for improving core symptoms” of the disease.

A new meta-analysis shows that patients with dementia who received phototherapy experienced significant cognitive improvement, compared with those who received usual treatment. However, there were no differences between study groups in terms of improved depression, agitation, or sleep problems.

“Our meta-analysis indicates that phototherapy improved cognitive function in patients with dementia. ... This suggests that phototherapy may be one of the most promising non-pharmacological interventions for improving core symptoms of dementia,” wrote the investigators, led by Xinlian Lu, Peking University, Beijing.

The study was published online in Brain and Behavior.

A new treatment option?

“As drug treatment for dementia has limitations such as medical contraindications, limited efficacy, and adverse effects, nonpharmacological therapy has been increasingly regarded as a critical part of comprehensive dementia care,” the investigators noted.

Phototherapy, which utilizes full-spectrum bright light (usually > 600 lux) or wavelength-specific light (for example, blue-enriched or blue-green), is a “promising nonpharmacological therapy” that is noninvasive, inexpensive, and safe.

Most studies of phototherapy have focused on sleep. Findings have shown “high heterogeneity” among the interventions and the populations studied, and results have been “inconsistent.” In addition, the effects of phototherapy on cognitive function and the behavioral and psychological symptoms of dementia (BPSD) “still need to be clarified.”

In the systematic review and meta-analysis, the investigators examined the effects of phototherapy on cognitive function, BPSD, and sleep in older adults with dementia.

They searched several databases for randomized controlled trials that investigated phototherapy interventions for elderly patients. The primary outcome was cognitive function, which was assessed via the Mini-Mental State Examination (MMSE).

Secondary outcomes included BPSD – agitation, anxiety, irritability, depression, and sleep disturbances – as assessed by the Cornell Scale for Depression in Dementia (CSDD), the Cohen-Mansfield Agitation Inventory (CMAI), and the Neuropsychiatric Inventory (NPI), as well as sleep measures, including total sleep time (TST), sleep efficiency (SE), and sleep disorders, as assessed by the Sleep Disorder Inventory (SDI).

To be included in the analysis, individual studies had to focus on elderly adults who had some form of dementia. In addition, a group receiving a phototherapy intervention had to be compared with a nonintervention group, and the study had to specify one of the above-defined outcomes.

The review included phototherapy interventions of all forms, frequencies, and durations, including use of bright light, LED light, and blue or blue-green light.
 
Regulating circadian rhythm

Twelve studies met the researchers’ criteria. They included a total of 766 patients with dementia – 426 in the intervention group and 340 in the control group. The mean ages ranged from 73.73 to 85.9 years, and there was a greater number of female than male participants.

Of the studies, seven employed routine daily light in the control group, while the others used either dim light (≤ 50 lux) or devices without light.

The researchers found “significant positive intervention effects” for global cognitive function. Improvements in postintervention MMSE scores differed significantly between the experimental groups and control groups (mean difference, 2.68; 95% confidence interval, 1.38-3.98; I2 = 0%).
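
The I2 statistic reported alongside the pooled estimate quantifies heterogeneity: the proportion of between-study variability beyond what sampling error alone would produce. For reference, a standard formulation (Higgins’ I2 derived from Cochran’s Q) is shown below.

```latex
% Cochran's Q and Higgins' I^2 across k studies, with inverse-variance
% weights w_i and study estimates \hat{\theta}_i. I^2 = 0% indicates no
% observed heterogeneity beyond sampling error.
\[
Q = \sum_{i=1}^{k} w_i \left( \hat{\theta}_i - \hat{\theta}_{\mathrm{pooled}} \right)^2,
\qquad
I^2 = \max\!\left( 0,\ \frac{Q - (k - 1)}{Q} \right) \times 100\%
\]
```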

No significant differences were found in the effects of intervention on depression symptoms, as evidenced in CSDD scores (MD, −0.70; 95% CI, −3.10 to 1.70; I2 = 81%).

Among patients with higher CMAI scores, which indicate more severe agitation behaviors, there was a “trend of decreasing CMAI scores” after phototherapy (MD, −3.12; 95% CI, −8.05 to 1.82; I2 = 0%). No significant difference in NPI scores was observed between the two groups.

Similarly, no significant difference was found between the two groups in TST, SE, or SDI scores.

Adverse effects were infrequent and were not severe. Two of the 426 patients in the intervention group experienced mild ocular irritation, and one experienced slight transient redness of the forehead.

Light “may compensate for the reduction in the visual sensory input of patients with dementia and stimulate specific neurons in the suprachiasmatic nucleus of the hypothalamus to regulate circadian rhythm,” the researchers suggested.

“As circadian rhythms are involved in optimal brain function, light supplementation may act on the synchronizing/phase-shifting effects of circadian rhythms to improve cognitive function,” they added.

They note that the light box is the “most commonly used device in phototherapy.” Light boxes provide full-spectrum bright light, usually greater than 2,500 lux. Sessions typically last 30 minutes, delivered in the daytime, and treatment courses run 4-8 weeks.

The investigators cautioned that the light box should be placed 60 cm away from the patient or above the patient’s eye level. They said that a ceiling-mounted light is a “good choice” for providing whole-day phototherapy, since such lights do not interfere with the patient’s daily routine, reduce the demand on staff, and contribute to better adherence.
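
As a rough illustration of why placement distance matters: if the light box is treated as a point source (a simplification, since light boxes are extended panels and the true falloff at close range is gentler), illuminance falls with the square of distance. The sketch below is hypothetical arithmetic using the 2,500-lux figure cited above, not a dosing recommendation.

```python
# Rough point-source approximation of illuminance falloff with distance.
# Real light boxes are extended sources, so this overstates the falloff
# at close range; the numbers are illustrative only.
def illuminance_at(distance_cm: float,
                   ref_lux: float = 2500.0,
                   ref_distance_cm: float = 60.0) -> float:
    """Approximate lux at distance_cm, given ref_lux measured at
    ref_distance_cm, under an inverse-square assumption."""
    return ref_lux * (ref_distance_cm / distance_cm) ** 2

for d_cm in (60, 90, 120):
    print(f"{d_cm} cm: ~{illuminance_at(d_cm):,.0f} lux")
# Moving the box from 60 cm to 120 cm cuts the approximate dose to ~625 lux.
```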

Phototherapy helmets and glasses are also available. These portable devices “allow for better control of light intensity and are ergonomic without interfering with patients’ normal activities.”

The researchers noted that “further well-designed studies are needed to explore the most effective clinical implementation conditions, including device type, duration, frequency, and time.”

Easy to use

Mariana Figueiro, PhD, professor and director of the Light and Health Research Center, department of population health medicine, Icahn School of Medicine at Mount Sinai, New York, said light is the “major stimulus for the circadian system, and a robust light-dark pattern daily (which can be given by light therapy during the day) improves sleep and behavior and reduces depression and agitation.”

Dr. Figueiro, who was not involved with the current study, noted that patients with dementia “have sleep issues, which can further affect their cognition; improvement in sleep leads to improvement in cognition,” and this may be an underlying mechanism associated with these results.

The clinical significance of the study “is that this is a nonpharmacological intervention and can be easily applied in the homes or controlled facilities, and it can be used with any other medication,” she pointed out.

“More importantly, sleep medications have negative side effects, so the use of nonpharmacological interventions improving sleep and cognition is great for clinical practice,” she added.

However, she took issue with the finding that phototherapy was not effective for depression and agitation, noting that there were “too few studies to say for sure that light therapy is ineffective at improving these outcomes.”

The research received no external funding. The authors and Dr. Figueiro disclosed no relevant financial relationships.

A version of this article first appeared on Medscape.com.

FROM BRAIN AND BEHAVIOR


New insight into the growing problem of gaming disorder

Article Type
Changed
Fri, 04/07/2023 - 14:16

Three studies provide new insight into the growing problem of gaming disorder (GD), including the condition’s genesis, effective treatments, and the need for a greater focus on recovery.

A team of international researchers led by Orsolya Király, PhD, of the Institute of Psychology, Eötvös Loránd University, Budapest, reviewed the characteristics and etiology of GD. They concluded that its genesis arises from the interaction of environmental factors, game-specific factors, and individual factors, including personality traits, comorbid psychopathology, and genetic predisposition.

“The development of GD is a complex process and we identified three major factors involved,” study coauthor Mark Griffiths, PhD, distinguished professor of behavioral addiction and director of the international gaming research unit, psychology department, Nottingham (England) Trent University, said in an interview. Because of this complexity, “prevention and intervention in GD require multiprofessional action.”

The review was published in Comprehensive Psychiatry.

In a second paper, published online in Frontiers in Psychiatry, Chinese investigators reviewing randomized controlled trials (RCTs) presented “compelling evidence” to support four effective interventions for GD: group counseling, acceptance and cognitive restructuring intervention program (ACRIP), short-term cognitive-behavioral therapy (CBT), and craving behavioral intervention (CBI).

A third paper, published online in the Journal of Behavioral Addictions, in which researchers analyzed close to 50 studies of GD, found that the concept of “recovery” is rarely mentioned in GD research. Lead author Belle Gavriel-Fried, PhD, senior professor, Bob Shapell School of Social Work, Tel Aviv University, said in an interview that recovery is a “holistic concept that taps into many aspects of life.”

Understanding the “differences in the impact and availability” of negative and positive human resources and their effect on recovery “can help clinicians to customize treatment,” she said.

Complex interplay

GD is garnering increasing attention in the clinical community, especially since 2019, when the World Health Organization included it in the ICD-11.

“Although for most individuals, gaming is a recreational activity or even a passion, a small group of gamers experiences negative symptoms which impact their mental and physical health and cause functional impairment,” wrote Dr. Király and colleagues.

Dr. Griffiths explained that his team wanted to provide an “up-to-date primer – a ‘one-stop shop’ – on all things etiologic concerning gaming disorder for academics and practitioners” as well as others, such as health policy makers, teachers, and individuals in the gaming industry.

The researchers identified three factors that increase the risk of developing GD, the first being gaming-related factors, which make video games “addictive in a way that vulnerable individuals may develop GD.”

For example, GD is more prevalent among online versus offline game players, possibly because online multiplayer games “provide safe environments in which players can fulfill their social needs while remaining invisible and anonymous.”

Game genre also matters, with massively multiplayer online role-playing games, first-person/third-person shooter games, real-time strategy games, and multiplayer online battle arena games most implicated in problematic gaming. Moreover, the “monetization techniques” of certain games also increase their addictive potential.

The researchers point to individual factors that increase the risk of developing GD, including male sex and younger age, personality traits like impulsivity and sensation-seeking, and comorbidities including ADHD, anxiety, and depression.

Poor self-esteem and lack of social competencies make gaming “an easy and efficient way to compensate for these deficiencies, which in turn, heightens the risk for developing GD,” they add. Neurobiological processes and genetic predisposition also play a role.

Lastly, the authors mentioned environmental factors, including family and peer-group issues, problems at work or school, and cultural factors.

“The take-home messages are that problematic gaming has had a long history of empirical research; that the psychiatric community now views GD as a legitimate mental health issue; and that the reasons for GD are complex, with many different factors involved in the acquisition, development, and maintenance of GD,” said Dr. Griffiths.

Beneficial behavioral therapies

Yuzhou Chen and colleagues, Southwest University, Chongqing, China, conducted a systematic review of RCTs investigating interventions for treating GD. Despite the “large number of intervention approaches developed over the past decade, as yet, there are no authoritative guidelines for what makes an effective GD intervention,” they wrote.

Few studies have focused specifically on GD; most have instead examined internet addiction and GD together, yet interventions developed for internet addiction may not apply to GD. In addition, few studies have utilized an RCT design. The researchers therefore set out to review studies that used an RCT design to investigate interventions for GD.

They searched six databases to identify RCTs that tested GD interventions from the inception of each database until the end of 2021. To be included, participants had to be diagnosed with GD and receive either a “complete and systematic intervention” or be in a comparator control group receiving no intervention or placebo.

Seven studies met the inclusion criteria (n = 332 participants). The studies tested five interventions:

  • Group counseling with three different themes (interpersonal interaction, acceptance and commitment, cognition and behavior)
  • CBI, which addresses cravings
  • Transcranial direct current stimulation (tDCS)
  • ACRIP with the main objectives of reducing GD symptoms and improving psychological well-being
  • Short-term CBT, which addresses maladaptive cognitions

The mean duration of the interventions ranged from 3 to 15 weeks.

The primary outcome was GD severity, with secondary outcomes including depression, anxiety, cognition, game time, self-esteem, self-compassion, shyness, impulsivity, and psychological well-being.

Group counseling, CBI, ACRIP, and short-term CBT interventions had “a significant effect on decreasing the severity of GD,” while tDCS had “no significant effect.”

Behavioral therapy “exerts its effect on the behavioral mechanism of GD; for example, by reducing the association between game-related stimuli and the game player’s response to them,” the authors suggested.


Recovery vs. pathology

Recovery “traditionally represents the transition from trauma and illness to health,” Dr. Gavriel-Fried and colleagues noted.

Two paradigms of recovery are “deficit based” and “strength based.” The first assesses recovery in terms of abstinence, sobriety, and symptom reduction; and the second focuses on “growth, rather than a reduction in pathology.”

But although recovery is “embedded within mental health addiction policies and practice,” the concept has received “scant attention” in GD research.

The researchers therefore aimed to “map and summarize the state of the art on recovery from GD,” defining “recovery” as the “ability to handle conflicting feelings and emotions without external mediation.”

They conducted a scoping review of all literature regarding GD or internet GD published before February 2022 (47 studies, 2,924 participants with GD; mean age range, 13-26 years).

Most studies (n = 32) consisted of exclusively male subjects. Only 10 included both sexes, and female participants were in the minority.

Most studies (n = 42) did not address the concept of recovery, although all studies did report significant improvements in gaming-related pathology. Typical terminology used to describe changes in participants’ GD were “reduction” and/or “decrease” in symptom severity.

Although 18 studies mentioned the word “recovery,” only 5 actually discussed issues related to the notion of recovery, and only 5 used the term “abstinence.”

In addition, only 13 studies examined positive components of life in patients with GD, such as increased psychological well-being, life satisfaction, quality of life, improved emotional state, relational skills, and executive control, as well as improved self-care, hygiene, sleep, and interest in school studies.

“As a person and researcher who believes that words shape the way we perceive things, I think we should use the word ‘recovery’ rather than ‘pathology’ much more in research, therapy, and policy,” said Dr. Gavriel-Fried.

She noted that, because GD is a “relatively new behavioral addictive disorder, theories are still being developed and definitions of the symptoms are still being fine-tuned.”

“The field as a whole will benefit from future theoretical work that will lead to practical solutions for treating GD and ways to identify the risk factors,” Dr. Gavriel-Fried said.

Filling a research gap

In a comment, David Greenfield, MD, founder and medical director of the Connecticut-based Center for Internet and Technology Addiction, noted that 3 decades ago, there was almost no research into this area.

“The fact that we have these reviews and studies is good because all of the research adds to the science providing more data about an area we still don’t know that much about, where research is still in its infancy,” said Dr. Greenfield, who was not involved with the present study.

“Although we have definitions, there’s no complete agreement about the definitions of GD, and we do not yet have a unified approach,” continued Dr. Greenfield, who wrote the books Overcoming Internet Addiction for Dummies and Virtual Addiction.

He suggested that “recovery” is rarely used as a concept in GD research perhaps because there’s a “bifurcation in the field of addiction medicine in which behavioral addictions are not seen as equivalent to substance addictions,” and, particularly with GD, the principles of “recovery” have not yet matured.

“Recovery means meaningful life away from the screen, not just abstinence from the screen,” said Dr. Greenfield.

The study by Mr. Chen and colleagues was supported by grants from the National Social Science Foundation of China, the Chongqing Research Program of Basic Research and Frontier Technology, and the Fundamental Research Funds for the Central Universities. Dr. Griffiths has reported receiving research funding from Norsk Tipping (the gambling operator owned by the Norwegian government). The study by Dr. Király and colleagues received support from the Hungarian National Research, Development and Innovation Office and from the János Bolyai Research Scholarship of the Hungarian Academy of Sciences to individual investigators. Dr. Gavriel-Fried has reported receiving grants from the Israel National Insurance Institute and the Committee for Independent Studies of the Israel Lottery. Dr. Greenfield reported no relevant financial relationships.

A version of this article first appeared on Medscape.com.


Heart rate, cardiac phase influence perception of time

People’s perception of time is subjective and based not only on their emotional state but also on heartbeat and heart rate (HR), two new studies suggest.

Researchers studied young adults with an electrocardiogram (ECG), measuring electrical activity at millisecond resolution while participants listened to tones that varied in duration. Participants were asked to report whether certain tones were longer or shorter, in relation to others.

The researchers found that the momentary perception of time was not continuous but rather expanded or contracted with each heartbeat. When the heartbeat preceding a tone was shorter, participants regarded the tone as longer in duration; but when the preceding heartbeat was longer, the participants experienced the tone as shorter.

“Our findings suggest that there is a unique role that cardiac dynamics play in the momentary experience of time,” lead author Saeedah Sadeghi, MSc, a doctoral candidate in the department of psychology at Cornell University, Ithaca, N.Y., said in an interview.

The study was published online in Psychophysiology.

In a second study, published in the journal Current Biology, a separate team of researchers asked participants to judge whether a brief event – the presentation of a tone or an image – was shorter or longer than a reference duration. ECG was used to track systole and diastole when participants were presented with these events.

The researchers found that the durations were underestimated during systole and overestimated during diastole, suggesting that time seemed to “speed up” or “slow down,” based on cardiac contraction and relaxation. When participants rated the events as more arousing, their perceived durations contracted, even during diastole.

“In our new paper, we show that our heart shapes the perceived duration of events, so time passes quicker when the heart contracts but slower when the heart relaxes,” lead author Irena Arslanova, PhD, postdoctoral researcher in cognitive neuroscience, Royal Holloway University of London, told this news organization.
 

Temporal ‘wrinkles’

“Subjective time is malleable,” observed Ms. Sadeghi and colleagues in their report. “Rather than being a uniform dimension, perceived duration has ‘wrinkles,’ with certain intervals appearing to dilate or contract relative to objective time” – a phenomenon sometimes referred to as “distortion.”

“We have known that people aren’t always consistent in how they perceive time, and objective duration doesn’t always explain subjective perception of time,” Ms. Sadeghi said.

Although the potential role of the heart in the experience of time has been hypothesized, research into the heart-time connection has been limited, with previous studies focusing primarily on average cardiac measures over longer time scales of seconds to minutes.

The current study sought to investigate “the beat-by-beat fluctuations of the heart period on the experience of brief moments in time” because, compared with longer time scales, subsecond temporal perception “has different underlying mechanisms” and a subsecond stimulus can be a “small fraction of a heartbeat.”

To home in on this small fraction, the researchers studied 45 participants (aged 18-21), who listened to 210 tones ranging in duration from 80 ms (short) to 188 ms (long). The tones were linearly spaced at 18-ms increments (80, 98, 116, 134, 152, 170, 188).

Participants were asked to categorize each tone as “short” or “long.” All tones were randomly assigned to be synchronized either with the systolic or diastolic phase of the cardiac cycle (50% each). The tones were triggered by participants’ heartbeats.
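
To make the stimulus-timing logic concrete, here is a minimal Python sketch of how tone onsets might be labeled as systolic or diastolic relative to detected ECG R-peaks. It is illustrative only: the R-peak times, tone onsets, and fixed 300-ms systole window are hypothetical simplifications, not the study’s actual triggering procedure.

```python
import numpy as np

# Hypothetical R-peak times (seconds) detected from an ECG trace,
# and hypothetical tone onsets triggered off individual heartbeats.
r_peaks = np.array([0.00, 0.82, 1.61, 2.45, 3.27])
tone_onsets = np.array([0.10, 1.95, 2.60])

def label_cardiac_phase(onset, r_peaks, systole_window=0.3):
    """Label a stimulus onset as 'systole' if it falls within a fixed
    window after the preceding R-peak, otherwise 'diastole'. The
    300-ms window is an illustrative simplification, not the study's
    actual criterion."""
    preceding = r_peaks[r_peaks <= onset]
    if preceding.size == 0:
        return None  # onset precedes the first detected beat
    delay = onset - preceding[-1]
    return "systole" if delay < systole_window else "diastole"

for t in tone_onsets:
    print(f"tone at {t:.2f} s -> {label_cardiac_phase(t, r_peaks)}")
```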

In addition, participants engaged in a heartbeat-counting activity, in which they were asked not to touch their pulse but to count their heartbeats by tuning in to their bodily sensations at intervals of 25, 35, and 45 seconds.

‘Classical’ response

“Participants exhibited an increased heart period after tone onset, which returned to baseline following an average canonical bell shape,” the authors reported.

The researchers performed regression analyses to determine how, on average, the heart rate before the tone was related to perceived duration or how the amount of change after the tone was related to perceived duration.

They found that when the heart rate was higher before the tone, participants tended to be more accurate in their time perception. When the heartbeat preceding a tone was shorter, participants experienced the tone as longer; conversely, when the heartbeat was longer, they experienced the duration of the identical sound as shorter.
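
To give a rough sense of this kind of beat-by-beat analysis, the sketch below fits a logistic regression predicting ‘long’ judgments from tone duration and the preceding interbeat interval. The data are simulated, with the reported direction of effect (shorter preceding heartbeat, longer-seeming tone) built in; the study’s actual models were more involved.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Simulated data for illustration only; not the study's data or model.
rng = np.random.default_rng(0)
n_trials = 210
ibi = rng.normal(0.85, 0.10, n_trials)  # preceding interbeat interval (s), hypothetical
tone_ms = rng.choice(np.arange(80, 189, 18), n_trials)  # the 7 tone durations

# Build in the reported direction of effect: a shorter preceding
# heartbeat (smaller ibi) biases judgments toward "long."
logit = 0.05 * (tone_ms - 134) - 4.0 * (ibi - 0.85)
judged_long = rng.random(n_trials) < 1.0 / (1.0 + np.exp(-logit))

X = np.column_stack([tone_ms, ibi])
model = LogisticRegression(max_iter=1000).fit(X, judged_long)
# The ibi coefficient is expected to come out negative here because
# that is how the simulated responses were generated.
print("coefficients [tone_ms, ibi]:", model.coef_[0])
```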

When participants focused their attention on the sounds, their orienting responses changed their heart rate and, in turn, their temporal perception.

“The orienting response is classical,” Ms. Sadeghi said. “When you attend to something unpredictable or novel, the act of orienting attention decreases the HR.”

She explained that the heartbeats are “noise to the brain.” When people need to perceive external events, “a decrease in HR facilitates the intake of things from outside and facilitates sensory intake.”

A lower HR “makes it easier for the person to take in the tone and perceive it, so it feels as though they perceive more of the tone and the duration seems longer – similarly, when the HR decreases.”

It is unknown whether this is a causal relationship, she cautioned, “but it seems as though the decrease in HR somehow makes it easier to ‘get’ more of the tone, which then appears to have longer duration.”
 

Bidirectional relationship

“We know that experienced time can be distorted,” said Dr. Arslanova. “Time flies by when we’re busy or having fun but drags on when we’re bored or waiting for something, yet we still don’t know how the brain gives rise to such elastic experience of time.”

The brain controls the heart in response to the information the heart provides about the state of the body, she noted, “but we have begun to see more research showing that the heart–brain relationship is bidirectional.”

This means that the heart plays a role in shaping “how we process information and experience emotions.” In this analysis, Dr. Arslanova and colleagues “wanted to study whether the heart also shapes the experience of time.”

To do so, they conducted two experiments.

In the first, participants (n = 28) were presented with brief events during systole or during diastole. The events took the form of an emotionally neutral visual shape or auditory tone, shown for durations of 200 to 400 ms.

Participants were asked whether these events were of longer or shorter duration, compared with a reference duration.

The researchers found a significant main effect of cardiac phase (F(1,27) = 8.1, P = .01), with stimuli presented at diastole regarded, on average, as 7 ms longer than those presented at systole.

They also found a significant main effect of modality (F(1,27) = 5.7, P = .02), with tones judged, on average, as 13 ms longer than visual stimuli.

“This means that time ‘sped up’ during the heart’s contraction and ‘slowed down’ during the heart’s relaxation,” Dr. Arslanova said.
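
As a rough illustration of this kind of phase contrast, the following sketch tallies the proportion of ‘long’ judgments separately for trials presented at systole and at diastole, using simulated responses in which the reported direction of bias is built in. It is a toy example under invented numbers, not a reconstruction of the published analysis.

```python
import numpy as np

# Simulated trial data for illustration only (not the study's data):
# each trial gets a cardiac-phase label and a "judged longer than the
# reference duration" response.
rng = np.random.default_rng(1)
n_trials = 400
phase = rng.choice(["systole", "diastole"], size=n_trials)

# Build in the reported direction: durations underestimated at systole,
# so "long" responses are less frequent there (rates are hypothetical).
p_long = np.where(phase == "systole", 0.45, 0.55)
judged_long = rng.random(n_trials) < p_long

for ph in ("systole", "diastole"):
    rate = judged_long[phase == ph].mean()
    print(f"{ph}: proportion judged 'long' = {rate:.2f}")
```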

The effect of cardiac phase on duration perception was independent of changes in HR, the authors noted.

In the second experiment, participants performed a similar task, but this time, it involved images of faces containing emotional expressions. The researchers again observed a similar pattern of time appearing to speed up during systole and slow down during diastole, with stimuli presented at diastole regarded, on average, as 9 ms longer than those presented at systole.

These opposing effects of systole and diastole on time perception were present only for low and average arousal ratings (b = 14.4 [SE 3.2], P < .001 and b = 9.2 [SE 2.3], P < .001, respectively). However, this effect disappeared when arousal ratings increased (b = 4.1 [SE 3.2], P = .21).

“Interestingly, when participants rated the events as more arousing, their perceived durations contracted, even during the heart’s relaxation,” Dr. Arslanova observed. “This means that in a nonaroused state, the two cardiac phases pull the experienced duration in opposite directions – time contracts, then expands.”

The findings “also predict that increasing HR would speed up passing time, making events seem shorter, because there will be a stronger influence from the heart’s contractions,” she said.

She described the relationship between time perception and emotion as complex, noting that the findings are important because they show “that the way we experience time cannot be examined in isolation from our body.”

Converging evidence

Martin Wiener, PhD, assistant professor, George Mason University, Fairfax, Va., said both papers “provide converging evidence on the role of the heart in our perception of time.”

Together, “the results share that our sense of time – that is, our incoming sensory perception of the present ‘moment’ – is adjusted or ‘gated’ by both our HR and cardiac phase,” said Dr. Wiener, executive director of the Timing Research Forum.

The studies “provide a link between the body and the brain, in terms of our perception, and that we cannot study one without the context of the other,” said Dr. Wiener, who was not involved with the current study.

“All of this opens up a new avenue of research, and so it is very exciting to see,” Dr. Wiener stated.

No source of funding was listed for the study by Ms. Sadeghi and coauthors. They declared no relevant financial relationships.

Dr. Arslanova and coauthors declared no competing interests. Senior author Manos Tsakiris, PhD, receives funding from a European Research Council Consolidator Grant. Dr. Wiener declared no relevant financial relationships.
 

A version of this article first appeared on Medscape.com.


What’s driving the "world’s fastest-growing brain disease"?

A common chemical that is used in correction fluid, paint removers, gun cleaners, aerosol cleaning products, and dry cleaning may be the key culprit behind the dramatic increase in Parkinson’s disease (PD), researchers say.

An international team of researchers reviewed previous research and cited data that suggest the chemical trichloroethylene (TCE) is associated with as much as a 500% increased risk for PD.

Lead investigator Ray Dorsey, MD, professor of neurology, University of Rochester, N.Y., called PD “the world’s fastest-growing brain disease,” and told this news organization that it “may be largely preventable.”

“Countless people have died over generations from cancer and other disease linked to TCE [and] Parkinson’s may be the latest,” he said. “Banning these chemicals, containing contaminated sites, and protecting homes, schools, and buildings at risk may all create a world where Parkinson’s is increasingly rare, not common.”

The paper was published online in the Journal of Parkinson’s Disease.
 

Invisible, ubiquitous

TCE was first synthesized in a lab in 1864, with commercial production beginning in 1920, the researchers noted.

“Because of its unique properties, TCE has had countless industrial, commercial, military, and medical applications,” including producing refrigerants, cleaning electronics, and degreasing engine parts.

In addition, it’s been used in dry cleaning, although a similar chemical (perchloroethylene [PCE]) is currently more widely used for that purpose. Nevertheless, the authors noted, in anaerobic conditions, perchloroethylene often transforms into TCE “and their toxicity may be similar.”

Consumer products in which TCE is found include typewriter correction fluid, paint removers, gun cleaners, and aerosol cleaning products. Up until the 1970s, it was used to decaffeinate coffee.

TCE exposure isn’t confined to those who work with it. It also pollutes outdoor air, taints groundwater, and contaminates indoor air. It’s present in a substantial amount of groundwater in the United States and it “evaporates from underlying soil and groundwater and enters homes, workplaces, or schools, often undetected,” the researchers noted.

“Exposure can come via occupation or the environment and is often largely unknown at the time it occurs,” Dr. Dorsey said.

He noted that the rapid increase in PD incidence cannot be explained by genetic factors alone, which affect only about 15% of patients with PD, nor can it be explained by aging alone. “Certain pesticides ... are likely causes but would not explain the high prevalence of PD in urban areas, as is the case in the U.S.” Rather, “other factors” are involved, and “TCE is likely one such factor.”

Yet, “despite widespread contamination and increasing industrial, commercial, and military use, clinical investigations of TCE and PD have been limited.”

To fill this knowledge gap, Dr. Dorsey and his coauthors of the book “Ending Parkinson’s Disease: A Prescription for Action” took a deep dive into studies on the potential association between TCE and PD and presented seven cases to illustrate the link.

“Like many genetic mutations (e.g., Parkin) and other environmental toxicants ... TCE damages the energy-producing parts of cells, i.e., the mitochondria,” said Dr. Dorsey.

TCE and PCE “likely mediate their toxicity through a common metabolite.” Because both are lipophilic, they “readily distribute in the brain and body tissues and appear to cause mitochondrial dysfunction at high doses,” the researchers hypothesized.

Dopaminergic neurons are particularly sensitive to mitochondrial neurotoxicants, so this might “partially explain the link to PD.”

Animal studies have shown that TCE “caused selective loss of dopaminergic neurons.” Moreover, PD-related neuropathology was found in the substantia nigra of rodents exposed to TCE over time. In addition, studies dating back to 1960 have shown an association between TCE and parkinsonism.

The authors describe TCE as “ubiquitous” in the 1970s, with 10 million Americans working with the chemical or other organic solvents daily. The review details an extensive list of industries and occupations in which TCE exposure continues to occur.

People working with TCE might inhale it or touch it, but “millions more encounter the chemical unknowingly through outdoor air, contaminated groundwater, and indoor air pollution.”

They noted that TCE contaminates up to one-third of U.S. drinking water, has polluted the groundwater in more than 20 countries on five continents, and is found in half of the 1,300 most toxic “Superfund” sites that are “part of a federal clean-up program, including 15 in California’s Silicon Valley, where TCE was used to clean electronics.”

Although the U.S. military has stopped using TCE, numerous sites remain contaminated, including Marine Corps Base Camp Lejeune in North Carolina, where TCE and PCE were found in drinking water at 280 times the recommended safety standards.

The researchers highlighted seven cases of individuals who developed PD after likely exposure to TCE, including NBA basketball player Brian Grant, who developed symptoms of PD in 2006 at the age of 34.

Mr. Grant and his family had lived at Camp Lejeune when he was a child, during which time he drank, bathed, and swam in contaminated water, “unaware of its toxicity.” His father died of esophageal cancer, “which is linked to TCE,” the authors wrote. Mr. Grant has created a foundation to inspire and support patients with PD.

All of the individuals either grew up in or spent time in areas where they were extensively exposed to TCE, PCE, or other chemicals, or had occupational exposure.

The authors acknowledged that the role of TCE in PD, as illustrated by the cases, is “far from definitive.” For example, exposure to TCE is often combined with exposure to other toxins, or with unmeasured genetic risk factors.

They highlighted the need for more research and called for cleaning and containing contaminated sites, monitoring TCE levels, communicating risks publicly, and banning TCE.

Recall bias?

Commenting for this news organization, Rebecca Gilbert, MD, PhD, chief scientific officer, American Parkinson Disease Association (APDA), noted that the authors “are very frank about the limitations of this approach [illustrative cases] as proof of causation between PD and TCE exposure.”

Another limitation is that TCE exposure is very common, “as argued in the paper.” But “most people with exposure do not develop PD,” Dr. Gilbert pointed out. “By probing the TCE exposure of those who already have PD, there is a danger of recall bias.”

Dr. Gilbert, associate professor of neurology at NYU Langone Health, who was not involved with the study, acknowledged that the authors “present their work as hypothesis and clearly state that more work is needed to understand the connection between TCE and PD.”

In the meantime, however, there are “well-established health risks of TCE exposure, including development of various cancers,” she said. Therefore, the authors’ goals appear to be educating the public about known health risks, working to clean up known sites of contamination, and advocating a ban on future use of TCE.

These goals “do not need to wait for [proof of] firm causation between TCE and PD,” she stated.

Dr. Dorsey reported he has received honoraria for speaking at the American Academy of Neurology and at multiple other societies and foundations and has received compensation for consulting services from pharmaceutical companies, foundations, medical education companies, and medical publications; he owns stock in several companies. The other authors’ disclosures can be found in the original paper. Dr. Gilbert is employed by the American Parkinson Disease Association and Bellevue Hospital Center in New York City.
 

A version of this article first appeared on Medscape.com.
