Alcohol dependence drug the next antianxiety med?
Disulfiram, a drug long used to treat alcohol dependence, may also relieve anxiety without the sedating side effects of benzodiazepines, early research suggests.
Japanese researchers, headed by Akiyoshi Saitoh, PhD, professor in the department of pharmacy, Tokyo University of Science, compared the reactions of mice that received a classic anxiolytic agent (diazepam) to those that received disulfiram while performing a maze task and found comparable reductions in anxiety in both groups of mice.
Moreover, unlike diazepam, disulfiram caused no sedation, amnesia, or impairments in coordination.
“These results indicate that disulfiram can be used safely by elderly patients suffering from anxiety and insomnia and has the potential to become a breakthrough psychotropic drug,” Dr. Saitoh said in a press release.
The study was published online in Frontiers in Pharmacology.
Inhibitory function
Disulfiram inhibits the enzyme aldehyde dehydrogenase (ALDH), which is responsible for alcohol metabolism. Recent research suggests that disulfiram may have broader inhibitory functions.
In particular, it inhibits the cytoplasmic protein FROUNT, preventing it from interacting with two chemokine receptors (CCR2 and CCR5) that are involved in cellular signaling pathways and, in rodents, are associated with regulating behaviors, including anxiety, the authors write.
“Although the functions of FROUNT-chemokines signaling in the immune system are well documented, the potential role of CNS-expressed FROUNT chemokine–related molecules as neuromodulators remains largely unknown,” they write.
The researchers had been conducting preclinical research on the secondary pharmacologic properties of disulfiram and “coincidentally discovered” its “anxiolytic-like effects.” They investigated these effects further because currently used anxiolytics – i.e., benzodiazepines – have unwanted side effects.
The researchers utilized an elevated plus-maze (EPM) test to investigate the effects of disulfiram in mice. The EPM apparatus consists of four arms set in a cross pattern that are connected to a central square. Two of the arms are protected by vertical boundaries, while the other two have unprotected edges. Typically, anxious mice prefer to spend time in the closed arms. The mice also underwent other tests of coordination and the ability to navigate a Y-maze.
Some mice received disulfiram, others received a benzodiazepine, and others received only an inert “vehicle,” which served as a control.
Disulfiram “significantly and dose-dependently” increased the time spent in the open arms of the EPM, compared with the vehicle-treated group, at 30 minutes after administration (F [3, 30] = 16.64; P < .0001), suggesting less anxiety. The finding was confirmed by a Bonferroni analysis that showed a significant effect of disulfiram, compared with the vehicle-treated group, at the two higher doses (20 mg/kg: t = 0.9894; P > .05; 40 mg/kg: t = 3.863; P < .01; 80 mg/kg: t = 6.417; P < .001).
A Student’s t-test analysis showed that diazepam likewise had a significant effect, compared to the vehicle (t = 5.038; P < .001).
Disulfiram also “significantly and dose-dependently” increased the percentage of open-arm entries (F [3, 30] = 14.24; P < .0001). The Bonferroni analysis showed this effect only at the highest dose (20 mg/kg: t = 0.3999; P > .05; 40 mg/kg: t = 2.693; P > .05; 80 mg/kg: t = 5.864; P < .001).
Diazepam similarly showed a significant effect, compared to the vehicle condition (t = 3.733; P < .005).
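The statistical approach described above — an omnibus one-way ANOVA across the vehicle and three disulfiram dose groups, followed by Bonferroni-corrected pairwise comparisons against vehicle — can be sketched as follows. The data here are simulated for illustration only; the group sizes are assumptions (the paper’s F [3, 30] implies roughly 34 mice across four groups) and none of these numbers are the study’s measurements.

```python
# Sketch of the analysis described in the text: one-way ANOVA across four
# groups, then pairwise t-tests vs. vehicle with Bonferroni correction.
# All values below are simulated, NOT data from the study.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated "% time in open arms" per group (assumed group sizes)
groups = {
    "vehicle": rng.normal(10, 5, 9),
    "dsf_20":  rng.normal(12, 5, 8),
    "dsf_40":  rng.normal(22, 5, 8),
    "dsf_80":  rng.normal(30, 5, 9),
}

# Omnibus one-way ANOVA across all four groups
f_stat, p_anova = stats.f_oneway(*groups.values())
print(f"ANOVA: F = {f_stat:.2f}, p = {p_anova:.4g}")

# Pairwise comparisons vs. vehicle, Bonferroni-adjusted for 3 comparisons
n_comparisons = 3
for dose in ("dsf_20", "dsf_40", "dsf_80"):
    t, p = stats.ttest_ind(groups[dose], groups["vehicle"])
    p_adj = min(1.0, p * n_comparisons)  # Bonferroni adjustment
    print(f"{dose} vs vehicle: t = {t:.3f}, adjusted p = {p_adj:.4g}")
```

The Bonferroni adjustment simply multiplies each raw p-value by the number of comparisons (capped at 1.0), which is why an effect that looks nominally significant can fail the corrected threshold, as at the 20 mg/kg dose.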
In particular, the 40 mg/kg dose of disulfiram significantly increased the percentage of time spent in the open arms at 15, 30, and 60 minutes after administration, with the peak effect occurring at 30 minutes.
The researchers examined the effect of cyanamide, another ALDH inhibitor, on the anxiety behaviors of mice and found no effect on the number of open-arm entries or percentage of time the mice spent in the open arm, compared with the vehicle condition.
In contrast to diazepam, disulfiram had no effect on the mice’s spontaneous locomotor activity, time spent on the rotarod, or performance on the Y-maze test, “suggesting that there were no apparent sedative effects at the dosages used.” Moreover, unlike diazepam, disulfiram did not increase the number of falls the mice experienced on the rotarod.
Glutamate levels in the prelimbic-prefrontal cortex (PL-PFC) “play an important role in the development of anxiety-like behavior in mice,” the authors state. Disulfiram “significantly and completely attenuated increased extracellular glutamate levels in the PL-PFC during stress exposure” on the EPM.
“We propose that DSF inhibits FROUNT protein and the chemokine signaling pathways under its influence, which may suppress presynaptic glutamatergic transmission in the brain,” said Dr. Saitoh. “This, in turn, attenuates the levels of glutamate in the brain, reducing overall anxiety.”
Humanity’s most common affliction
Commenting for this news organization, Roger McIntyre, MD, professor of psychiatry and pharmacology, University of Toronto, and head of the mood disorders psychopharmacology unit, noted that there is a “renewed interest in psychiatry in excitatory and inhibitory balance – for example, ketamine represents a treatment that facilitates excitatory activity, while neurosteroids are candidate medicines now for inhibitory activity.”
Dr. McIntyre, who is the chairman and executive director of the Brain and Cognition Discovery Foundation, Toronto, and was not involved with the study, said it is believed “that the excitatory-inhibitory balance may be relevant to brain health and disease.”
Dr. McIntyre also pointed out that the study “highlights not only the repurposing of a well-known medicine but also exploit[s] the potential brain therapeutic effects of immune targets that indirectly affect inhibitory systems, resulting in potentially a safer treatment for anxiety – the most common affliction of humanity.”
Also commenting for this article, Wilfrid Noel Raby, MD, PhD, a psychiatrist in private practice in Teaneck, N.J., called disulfiram “grossly underused for alcohol use disorders and even more so when people use alcohol and cocaine.”
Dr. Raby, who was not involved with the study, has found that patients withdrawing from cocaine, cannabis, or stimulants “can respond very well to disulfiram [not only] in terms of their cravings but also in terms of mood stabilization and anxiolysis.”
He has also found that for patients with bipolar disorder or attention-deficit/hyperactivity disorder with depression, disulfiram and low-dose lithium “can provide anxiolysis and mood stabilization, especially if a rapid effect is required, usually within a week.”
However, Dr. Raby cautioned that “it is probably not advisable to maintain patients on disulfiram for periods long[er] than 3 months consecutively because there is a risk of neuropathy and hepatopathology that are not common but are seen often enough.” He usually interrupts treatment for a month and then resumes if necessary.
The research was partially supported by the Tsukuba Clinical Research and Development Organization from the Japan Agency for Medical Research and Development. The authors and Dr. Raby have disclosed no relevant financial relationships. Dr. McIntyre reports receiving research grant support from CIHR/GACD/National Natural Science Foundation of China; speaker/consultation fees from Lundbeck, Janssen, Alkermes, Mitsubishi Tanabe, Purdue, Pfizer, Otsuka, Takeda, Neurocrine, Sunovion, Bausch Health, Axsome, Novo Nordisk, Kris, Sanofi, Eisai, Intra-Cellular, NewBridge Pharmaceuticals, AbbVie, and Atai Life Sciences. Dr. McIntyre is CEO of Braxia Scientific.
A version of this article first appeared on Medscape.com.
Aged black garlic supplement may help lower BP
After 6 weeks, consumption of aged black garlic (ABG) extract with a high concentration of S-allyl-L-cysteine (SAC) was associated with a nearly 6-mm Hg reduction in diastolic blood pressure (DBP) in men. Other cardiovascular disease (CVD) risk factors were not significantly affected.
“The observed reduction in DBP by ABG extract was similar to the effects of dietary approaches, including the effects of the Dietary Approaches to Stop Hypertension (DASH) diet on BP,” say Rosa M. Valls, PhD, Universitat Rovira i Virgili, Reus, Spain, and colleagues.
“The potential beneficial effects of ABG may contribute to obtaining an optimal DBP” but were “better observed in men and in nonoptimal DBP populations,” they write in the study, published in Nutrients.
Pure SAC and aged garlic preparations have shown beneficial effects on multiple targets in in vitro and in vivo tests. However, previous studies in humans have focused not on ABG but on other types of aged garlic, in patients with some type of CVD risk factor, and have suffered from methodologic or design weaknesses, the authors note.
To address this gap, Dr. Valls and colleagues randomly assigned 67 individuals with moderate hypercholesterolemia (defined as LDL levels of at least 115 mg/dL) to receive one ABG tablet (250 mg ABG extract/1.25 mg SAC) or placebo daily for 6 weeks. Following a 3-week washout, the groups were reversed and the new intervention continued for another 6 weeks.
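The two-period crossover design described above — each participant receives both ABG and placebo for 6 weeks, separated by a 3-week washout — can be sketched as a simple allocation schedule. The participant IDs, shuffling, and sequence split here are illustrative assumptions; the trial’s actual randomization method is not reported.

```python
# Sketch of a 2-period crossover schedule: half the participants get ABG
# first (sequence AB), the rest get placebo first (sequence BA), with a
# 3-week washout between periods. Illustrative only, not the trial's method.
import random

def crossover_schedule(participant_ids, seed=42):
    """Assign each participant to sequence AB (ABG first) or BA (placebo first)."""
    rng = random.Random(seed)
    ids = list(participant_ids)
    rng.shuffle(ids)                       # hypothetical randomization
    half = len(ids) // 2
    schedule = {}
    for pid in ids[:half]:
        schedule[pid] = ("ABG 6wk", "washout 3wk", "placebo 6wk")
    for pid in ids[half:]:
        schedule[pid] = ("placebo 6wk", "washout 3wk", "ABG 6wk")
    return schedule

sched = crossover_schedule(range(1, 68))   # 67 participants, as in the study
print(len(sched), "participants scheduled")
```

Because every participant serves as their own control, a crossover design of this kind needs fewer participants than a parallel-group trial to detect the same effect, at the cost of requiring a washout long enough to avoid carryover.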
Participants received dietary recommendations regarding CVD risk factors and had their dietary habits assessed through a 3-day food record at baseline and after 6 weeks during both treatments.
Individuals receiving lipid-lowering treatment or antihypertensives were excluded, as were those with a body mass index of 35 kg/m² or higher, those with a fasting blood glucose of at least 126 mg/dL, and active smokers.
There were no differences in baseline characteristics between the two groups. The mean systolic and diastolic pressures at baseline were 124/75 mm Hg in the ABG group and 121/74 mm Hg in the placebo group. Their mean age was 53 years.
Adherence to the protocol was “high” at 96.5% in both groups, and no adverse effects were reported.
Reduced risk of death from stroke, ischemic heart disease
Although no significant differences between ABG and placebo were observed at 3 weeks, the decline in DBP after consumption of the ABG extract became significant at 6 weeks (mean change, –3.7 mm Hg vs. –0.10 mm Hg; P = .007).
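Because each participant is measured under both conditions in a crossover, a comparison like the one above is typically analyzed as a paired test of within-participant changes. The sketch below simulates DBP changes centered on the reported means; the spread, seed, and all individual values are assumptions, not the trial’s data.

```python
# Sketch of the paired comparison implied by the crossover design: each
# participant has a 6-week DBP change under ABG and under placebo, compared
# with a paired t-test. All values are simulated for illustration.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 67  # number of participants, as in the study

# Simulated 6-week DBP changes (mm Hg), centered on the reported means
dbp_change_abg = rng.normal(-3.7, 6.0, n)
dbp_change_placebo = rng.normal(-0.1, 6.0, n)

t, p = stats.ttest_rel(dbp_change_abg, dbp_change_placebo)
print(f"mean change ABG = {dbp_change_abg.mean():.2f} mm Hg, "
      f"placebo = {dbp_change_placebo.mean():.2f} mm Hg, p = {p:.3g}")
```

A paired test gains power by differencing out each participant’s baseline variability, which is the main statistical advantage of the crossover design.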
When stratified by sex and categories of DBP, the mean change in DBP after 6 weeks of ABG consumption was particularly prominent in men and in those with a baseline DBP of at least 75 mm Hg.
The 6-week change in systolic blood pressure with ABG and placebo was 1.32 mm Hg and 2.84 mm Hg, respectively (P = .694).
At week 6, total cholesterol levels showed a “quadratic decreasing trend” after ABG treatment (P = .047), but no other significant differences between groups were observed for lipid profile, apolipoproteins, or other outcomes of interest, including serum insulin, waist circumference, and body mass index.
The authors note that although systolic BP elevation “has a greater effect on outcomes, both systolic and diastolic hypertension independently influence the risk of adverse cardiovascular events, regardless of the definition of hypertension” and that the risk of death from ischemic heart disease and stroke doubles with every 10 mm Hg increase in DBP in people between the ages of 40 and 89 years.
“Thus, reducing DBP by 5 mm Hg results in a 40% lower risk of death from stroke and a 30% lower risk of death from ischemic heart disease or other vascular death,” they state.
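The epidemiologic arithmetic quoted above rests on a roughly log-linear relation between DBP and vascular mortality. Under a strict "risk doubles per 10 mm Hg" model, a 5 mm Hg reduction lowers risk by about 29%; the ~40% (stroke) and ~30% (ischemic heart disease) figures the authors cite come from cohort data in which the slope differs by outcome. A quick check of the model, for illustration only:

```python
# Illustrative arithmetic for the log-linear DBP/mortality relation cited in
# the text: if risk doubles for every 10 mm Hg increase in DBP, a change of
# d mm Hg multiplies risk by 2**(d/10). Not a result from the garlic trial.
def relative_risk(delta_dbp_mmhg: float, doubling_per_mmhg: float = 10.0) -> float:
    """Risk multiplier for a DBP change, assuming doubling per 10 mm Hg."""
    return 2.0 ** (delta_dbp_mmhg / doubling_per_mmhg)

# A 5 mm Hg reduction under a pure doubling model
reduction = 1.0 - relative_risk(-5.0)
print(f"5 mm Hg lower DBP -> {reduction:.0%} lower risk under this model")
```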
Small study
Commenting for this news organization, Linda Van Horn, PhD, RDN, professor and chief of the department of preventive medicine’s nutrition division, Northwestern University, Chicago, said that for many years, garlic has been “reported to be an adjunct to the benefits of a healthy eating pattern, with inconclusive results.”
She noted that ABG is “literally aged for many months to years, and the resulting concentrate is found higher in many organosulfur compounds and phytochemicals that suggest enhanced response.”
Dr. Van Horn, a member of the American Heart Association’s Nutrition Committee, who was not involved with the study, continued: “The data suggest that ABG that is much more highly concentrated than fresh or processed garlic might be helpful in lowering BP in certain subgroups, in this case men with higher BP.”
However, she cautioned, “these results are limited in a small study, and ... potential other issues, such as sodium, potassium, or other nutrients known to be associated with blood pressure, were not reported, thereby raising questions about the exclusivity of the ABG over other accompanying dietary factors.”
The study was funded by the Center for the Development of Industrial Technology of the Spanish Ministry of Science and Innovation. Two authors are employees of Pharmactive Biotech Products, SL (Madrid), which manufactured the ABG product, but neither played a role in any result or conclusion. The other authors and Dr. Van Horn report no relevant financial relationships.
A version of this article first appeared on Medscape.com.
After 6 weeks, consumption of ABG with a high concentration of s-allyl-L-cystine (SAC) was associated with a nearly 6-mm Hg reduction in DBP in men. Other cardiovascular disease (CVD) risk factors were not significantly affected.
“The observed reduction in DBP by ABG extract was similar to the effects of dietary approaches, including the effects of the Dietary Approaches to Stop Hypertension(DASH) diet on BP,” say Rosa M. Valls, PhD, Universitat Rovira i Virgili, Reus, Spain, and colleagues.
“The potential beneficial effects of ABG may contribute to obtaining an optimal DBP” but were “better observed in men and in nonoptimal DBP populations,” they write in the study, published in Nutrients.
Pure SAC and aged garlics have shown healthy effects on multiple targets in in vitro and in vivo tests. However, previous studies in humans have not focused on ABG but rather on other types of aged garlic in patients with some type of CVD risk factor and suffered from methodologic or design weaknesses, the authors note.
To address this gap, Dr. Valls and colleagues randomly assigned 67 individuals with moderate hypercholesterolemia (defined as LDL levels of at least 115 mg/dL) to receive one ABG tablet (250 mg ABG extract/1.25 mg SAC) or placebo daily for 6 weeks. Following a 3-week washout, the groups were reversed and the new intervention continued for another 6 weeks.
Participants received dietary recommendations regarding CVD risk factors and had their dietary habits assessed through a 3-day food record at baseline and after 6 weeks during both treatments.
Individuals receiving lipid-lowering treatment or antihypertensives were excluded, as were those with a body mass index of 35 kg/m2 or higher, those with a fasting blood glucose of at least 126 mg/dL, or active smokers.
After 6 weeks, consumption of ABG with a high concentration of S-allyl-L-cysteine (SAC) was associated with a nearly 6-mm Hg reduction in DBP in men. Other cardiovascular disease (CVD) risk factors were not significantly affected.
“The observed reduction in DBP by ABG extract was similar to the effects of dietary approaches, including the effects of the Dietary Approaches to Stop Hypertension (DASH) diet on BP,” say Rosa M. Valls, PhD, Universitat Rovira i Virgili, Reus, Spain, and colleagues.
“The potential beneficial effects of ABG may contribute to obtaining an optimal DBP” but were “better observed in men and in nonoptimal DBP populations,” they write in the study, published in Nutrients.
Pure SAC and aged garlic preparations have shown beneficial effects on multiple targets in in vitro and in vivo tests. However, previous human studies focused not on ABG but on other types of aged garlic, enrolled patients with various CVD risk factors, and suffered from methodologic or design weaknesses, the authors note.
To address this gap, Dr. Valls and colleagues randomly assigned 67 individuals with moderate hypercholesterolemia (defined as LDL levels of at least 115 mg/dL) to receive one ABG tablet (250 mg ABG extract/1.25 mg SAC) or placebo daily for 6 weeks. Following a 3-week washout, the groups were reversed and the new intervention continued for another 6 weeks.
Participants received dietary recommendations regarding CVD risk factors and had their dietary habits assessed through a 3-day food record at baseline and after 6 weeks during both treatments.
Individuals receiving lipid-lowering treatment or antihypertensives were excluded, as were those with a body mass index of 35 kg/m2 or higher, those with a fasting blood glucose of at least 126 mg/dL, and active smokers.
There were no differences in baseline characteristics between the two groups. The mean systolic and diastolic pressures at baseline were 124/75 mm Hg in the ABG group and 121/74 mm Hg in the placebo group. Their mean age was 53 years.
Adherence to the protocol was “high” at 96.5% in both groups, and no adverse effects were reported.
Reduced risk of death from stroke, ischemic heart disease
Although no significant differences between ABG and placebo were observed at 3 weeks, the decline in DBP after consumption of the ABG extract became significant at 6 weeks (mean change, –3.7 mm Hg vs. –0.10 mm Hg; P = .007).
When stratified by sex and categories of DBP, the mean change in DBP after 6 weeks of ABG consumption was particularly prominent in men and in those with a baseline DBP of at least 75 mm Hg.
The 6-week change in systolic blood pressure with ABG and placebo was 1.32 mm Hg and 2.84 mm Hg, respectively (P = .694).
At week 6, total cholesterol levels showed a “quadratic decreasing trend” after ABG treatment (P = .047), but no other significant differences between groups were observed for lipid profile, apolipoproteins, or other outcomes of interest, including serum insulin, waist circumference, and body mass index.
The authors note that although systolic BP elevation “has a greater effect on outcomes, both systolic and diastolic hypertension independently influence the risk of adverse cardiovascular events, regardless of the definition of hypertension” and that the risk of death from ischemic heart disease and stroke doubles with every 10 mm Hg increase in DBP in people between the ages of 40 and 89 years.
“Thus, reducing DBP by 5 mm Hg results in a 40% lower risk of death from stroke and a 30% lower risk of death from ischemic heart disease or other vascular death,” they state.
Small study
Commenting for this news organization, Linda Van Horn, PhD, RDN, professor and chief of the department of preventive medicine’s nutrition division, Northwestern University, Chicago, said that for many years, garlic has been “reported to be an adjunct to the benefits of a healthy eating pattern, with inconclusive results.”
She noted that ABG is “literally aged for many months to years, and the resulting concentrate is found higher in many organosulfur compounds and phytochemicals that suggest enhanced response.”
Dr. Van Horn, a member of the American Heart Association’s Nutrition Committee, who was not involved with the study, continued: “The data suggest that ABG that is much more highly concentrated than fresh or processed garlic might be helpful in lowering BP in certain subgroups, in this case men with higher BP.”
However, she cautioned, “these results are limited in a small study, and ... potential other issues, such as sodium, potassium, or other nutrients known to be associated with blood pressure, were not reported, thereby raising questions about the exclusivity of the ABG over other accompanying dietary factors.”
The study was funded by the Center for the Development of Industrial Technology of the Spanish Ministry of Science and Innovation. Two authors are employees of Pharmactive Biotech Products, SL (Madrid), which manufactured the ABG product, but neither played a role in any result or conclusion. The other authors and Dr. Van Horn report no relevant financial relationships.
A version of this article first appeared on Medscape.com.
FROM NUTRIENTS
New insight into how psychedelics work
What causes the dramatic alterations in subjective awareness experienced during a psychedelic “trip”? A new study maps anatomical changes in specific neurotransmitter systems and brain regions that may be responsible for these effects.
Investigators gathered more than 6,800 accounts from individuals who had taken one of 27 different psychedelic compounds. Using a machine learning strategy, they extracted commonly used words from these testimonials, linking them with 40 different neurotransmitter subtypes that had likely induced these experiences.
The investigators then linked these subjective experiences with specific brain regions where the receptor combinations are most commonly found and, using gene transcription probes, created a 3D whole-brain map of the brain receptors and the subjective experiences linked to them.
“Hallucinogenic drugs may very well turn out to be the next big thing to improve clinical care of major mental health conditions,” senior author Danilo Bzdok, MD, PhD, associate professor, McGill University, Montreal, said in a press release.
“Our study provides a first step, a proof of principle, that we may be able to build machine-learning systems in the future that can accurately predict which neurotransmitter receptor combinations need to be stimulated to induce a specific state of conscious experience in a given person,” said Dr. Bzdok, who is also the Canada CIFAR AI Chair at Mila-Quebec Artificial Intelligence Institute.
The study was published online in Science Advances.
‘Unique window’
Psychedelic drugs “show promise” as treatments for various psychiatric disorders, but subjective alterations of reality are “highly variable across individuals” and this “poses a key challenge as we venture to bring hallucinogenic substances into medical practice,” the investigators note.
Although the 5-HT2A receptor has been regarded as a “putative essential mechanism” of hallucinogenic experiences, it is unclear whether the experiential differences are explained by functional selectivity at the 5-HT2A receptor itself or “orchestrated by the vast array of neurotransmitter receptor subclasses on which these drugs act,” they add.
Lead author Galen Ballentine, MD, psychiatry resident, SUNY Downstate Medical Center, Brooklyn, told this news organization that he was “personally eager to find novel ways to identify the neurobiological underpinnings of different states of conscious awareness.”
Psychedelics, he said, offer a “unique window into a vast array of unusual states of consciousness and are particularly useful because they can point toward underlying mechanistic processes that are initiated in specific areas of receptor expression.”
The investigators wanted to understand “how these drugs work in order to help guide their use in clinical practice,” Dr. Ballentine said.
To explore the issue, they undertook the “largest investigation to date into the neuroscience of psychedelic drug experiences,” Dr. Ballentine said. “While most studies are limited to a single drug on a handful of subjects, this project integrates thousands of experiences induced by dozens of different hallucinogenic compounds, viewing them through the prism of 40 receptor subtypes.”
Unique neurotransmitter fingerprint
The researchers analyzed 6,850 experience reports of people who had taken 1 of 27 psychedelic compounds. The reports were drawn from a database hosted by the Erowid Center, an organization that collects first-hand accounts of experiences elicited by psychoactive drugs.
The researchers constructed a “bag-of-words” encoding of the text descriptions in each testimonial. Using linguistic calculation methods, they derived a final vocabulary of 14,410 words that they analyzed for descriptive experiential terms.
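A bag-of-words encoding simply counts how often each vocabulary word appears in each testimonial, discarding word order. The following minimal sketch illustrates the idea on two invented stand-in testimonials (the corpus, vocabulary, and preprocessing here are hypothetical, not the study's actual 14,410-word pipeline):

```python
from collections import Counter
import re

# Hypothetical mini-corpus standing in for Erowid testimonials.
testimonials = [
    "colors melted and my sense of self dissolved into light",
    "heard music as colors, time dissolved, self and world merged",
]

def bag_of_words(text, vocabulary):
    """Count how often each vocabulary word appears in one testimonial."""
    counts = Counter(re.findall(r"[a-z']+", text.lower()))
    return [counts[w] for w in vocabulary]

# In practice the vocabulary is derived from the whole corpus.
vocab = sorted({w for t in testimonials
                  for w in re.findall(r"[a-z']+", t.lower())})
matrix = [bag_of_words(t, vocab) for t in testimonials]
```

Each row of `matrix` is one testimonial's word-count vector; experiential terms like "dissolved" then become numeric features that can be related to drug properties.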
To shed light on the spatial distribution of the receptors through which these compounds modulate neuronal activity during subjective “trips,” they compared normalized measurements of the drugs’ relative binding strengths at 40 sites:
- 5-HT (5-HT2A, 5-HT2C, 5-HT2B, 5-HT1A, 5-HT1B, 5-HT1D, 5-HT1E, 5-HT5A, 5-HT6, 5-HT7)
- Dopamine (D1, D2, D3, D4, D5)
- Adrenergic (α-1A, α-1B, α-2A, α-2B, α-2C, β-1, β-2)
- Serotonin transporter (SERT)
- Dopamine transporter (DAT)
- Norepinephrine transporter (NET)
- Imidazoline-1 receptor (I1)
- Sigma receptors (σ-1, σ-2)
- δ-opioid receptor (DOR)
- κ-opioid receptor (KOR)
- μ-opioid receptor (MOR)
- Muscarinic receptors (M1, M2, M3, M4, M5)
- Histamine receptors (H1, H2)
- Calcium ion channel (Ca2+)
- NMDA glutamate receptor
To map receptor-experience factors to regional levels of receptor gene transcription, they utilized human gene expression data drawn from the Allen Human Brain Atlas, as well as the Schaefer-Yeo brain atlas.
Via a machine-learning algorithm, they dissected the “phenomenologically rich anecdotes” into a ranking of constituent brain-behavior factors, each of which was characterized by a “unique neurotransmitter fingerprint of action and a unique experiential context” and ultimately created a dimensional map of these neurotransmitter systems.
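The paper's exact decomposition is not reproduced here, but one standard way to extract paired factors from a word-count matrix and a per-report receptor-affinity matrix is a singular value decomposition of their cross-covariance. The sketch below uses random toy data (all matrices and dimensions are hypothetical) purely to show the shape of such a factorization:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins: rows are testimonials (all values hypothetical).
n_reports, n_words, n_receptors = 200, 50, 40
words = rng.poisson(1.0, (n_reports, n_words)).astype(float)  # bag-of-words counts
affinity = rng.random((n_reports, n_receptors))  # receptor profile of the drug taken

# Center both matrices, form the word-by-receptor cross-covariance,
# and factorize it with an SVD.
W = words - words.mean(axis=0)
R = affinity - affinity.mean(axis=0)
cross = W.T @ R / (n_reports - 1)                 # shape (n_words, n_receptors)
U, s, Vt = np.linalg.svd(cross, full_matrices=False)

# Factor k pairs a word loading (U[:, k]) with a receptor
# "fingerprint" (Vt[k]); singular values rank the factors.
top_words = U[:, 0]
top_receptors = Vt[0]
```

Each factor thus couples an experiential vocabulary pattern with a receptor-binding pattern, which is the general structure the authors describe, whatever their specific algorithm.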
Data-driven framework
Cortex-wide distribution of receptor-experience factors was found in both deep and shallow anatomical brain regions. Regions involved in genetic factor expressions were also wide-ranging, spanning from higher association cortices to unimodal sensory cortices.
The dominant factor “elucidated mystical experience in general and the dissolution of self-world boundaries (ego dissolution) in particular,” the authors report, while the second- and third-most explanatory factors “evoked auditory, visual, and emotional themes of mental expansion.”
Ego dissolution was found to be most associated with the 5-HT2A receptor, as well as other serotonin receptors (5-HT2C, 5-HT1A, 5-HT2B), the adrenergic receptors α-2A and β-2, and the D2 receptor.
Alterations in sensory perception were associated with expression of the 5-HT2A receptor in the visual cortex, while modulation of the salience network by dopamine and opioid receptors was implicated in the experience of transcending space, time, and the structure of self. Auditory hallucinations were linked to a weighted blend of receptors expressed throughout the auditory cortex.
“This data-driven framework identifies patterns that undergird diverse psychedelic experiences such as mystical bliss, existential terror, and complex hallucinations,” Dr. Ballentine commented.
“Simultaneously subjective and neurobiological, these patterns align with the leading hypothesis that psychedelics temporarily diminish top-down control of the most evolutionarily advanced regions of the brain, while at the same time amplifying bottom-up sensory processing from primary sensory cortices,” he added.
Forging a new path
Scott Aaronson, MD, chief science officer, Institute for Advanced Diagnostics and Therapeutics and director of the Centre of Excellence at Sheppard Pratt, Towson, Md., said, “As we try to get our arms around understanding the implications of a psychedelic exposure, forward-thinking researchers like Dr. Bzdok et al. are offering interesting ways to capture and understand the experience.”
Dr. Aaronson, an adjunct professor at the University of Maryland School of Medicine who was not involved with the study, continued: “Using the rapidly developing field of natural language processing (NLP), which looks at how language is used for a deeper understanding of human experiences, and combining it with effects of psychedelic compounds on neuronal pathways and neurochemical receptor sites, the authors are forging a new path for further inquiry.”
In an accompanying editorial, Daniel Barron, MD, PhD, medical director, Interventional Pain Psychiatry Program, Brigham and Women’s Hospital, Boston, and Richard Friedman, MD, professor of clinical psychiatry, Weill Cornell Medical College, New York, call the work “impressive” and “clever.”
“Psychedelics paired with new applications of computational tools might help bypass the imprecision of psychiatric diagnosis and connect measures of behavior to specific physiologic targets,” they write.
The research was supported by the Brain Canada Foundation, through the Canada Brain Research Fund, a grant from the NIH, and the Canadian Institutes of Health Research. Dr. Bzdok was also supported by the Healthy Brains Healthy Lives initiative (Canada First Research Excellence Fund) and the CIFAR Artificial Intelligence Chairs program (Canada Institute for Advanced Research), as well as a Research Award and a Teaching Award from Google. The other authors’ disclosures are listed on the original paper. No disclosures were listed for Dr. Barron and Dr. Friedman. Dr. Aaronson’s research is supported by Compass Pathways.
A version of this article first appeared on Medscape.com.
What causes the dramatic alterations in subjective awareness experienced during a psychedelic “trip?” A new study maps anatomical changes in specific neurotransmitter systems and brain regions that may be responsible for these effects.
Investigators gathered more than 6,800 accounts from individuals who had taken one of 27 different psychedelic compounds. Using a machine learning strategy, they extracted commonly used words from these testimonials, linking them with 40 different neurotransmitter subtypes that had likely induced these experiences.
The investigators then linked these subjective experiences with specific brain regions where the receptor combinations are most commonly found and, using gene transcription probes, created a 3D whole-brain map of the brain receptors and the subjective experiences linked to them.
“Hallucinogenic drugs may very well turn out to be the next big thing to improve clinical care of major mental health conditions,” senior author Danilo Bzdok, MD, PhD, associate professor, McGill University, Montreal, said in a press release.
“Our study provides a first step, a proof of principle, that we may be able to build machine-learning systems in the future that can accurately predict which neurotransmitter receptor combinations need to be stimulated to induce a specific state of conscious experience in a given person,” said Dr. Bzdok, who is also the Canada CIFAR AI Chair at Mila-Quebec Artificial Intelligence Institute.
The study was published online in Science Advances.
‘Unique window’
Psychedelic drugs “show promise” as treatments for various psychiatric disorders, but subjective alterations of reality are “highly variable across individuals” and this “poses a key challenge as we venture to bring hallucinogenic substances into medical practice,” the investigators note.
Although the 5-HT2A receptor has been regarded as a “putative essential mechanism” of hallucinogenic experiences, it is unclear whether the experiential differences are explained by functional selectivity at the 5-HT2A receptor itself or “orchestrated by the vast array of neurotransmitter receptor subclasses on which these drugs act,” they add.
Lead author Galen Ballentine, MD, psychiatry resident, SUNY Downstate Medical Center, Brooklyn, told this news organization that he was “personally eager to find novel ways to identify the neurobiological underpinnings of different states of conscious awareness.”
Psychedelics, he said, offer a “unique window into a vast array of unusual states of consciousness and are particularly useful because they can point toward underlying mechanistic processes that are initiated in specific areas of receptor expression.”
The investigators wanted to understand “how these drugs work in order to help guide their use in clinical practice,” Dr. Ballentine said.
To explore the issue, they undertook the “largest investigation to date into the neuroscience of psychedelic drug experiences,” Dr. Ballentine said. “While most studies are limited to a single drug on a handful of subjects, this project integrates thousands of experiences induced by dozens of different hallucinogenic compounds, viewing them through the prism of 40 receptor subtypes.”
Unique neurotransmitter fingerprint
The researchers analyzed 6,850 experience reports of people who had taken 1 of 27 psychedelic compounds. The reports were drawn from a database hosted by the Erowid Center, an organization that collects first-hand accounts of experiences elicited by psychoactive drugs.
The researchers constructed a “bag-of-words” encoding of the text descriptions in each testimonial. Using linguistic calculation methods, they derived a final vocabulary of 14,410 words that they analyzed for descriptive experiential terms.
To shed light on the spatial distribution of these compounds that modulate neuronal activity during subjective “trips,” they compared normalized measurements of their relative binding strengths in 40 sites.
- 5-HT (5-HT2A, 5-HT2C, 5-HT2B, 5-HT1A, 5-HT1B, 5-HT1D, 5-HT1E, 5-HT5A, 5-HT6, 5-HT7)
- Dopamine (D1, D2, D3, D4, D5)
- Adrenergic (a-1A, a-1B, a-2A, a-2B, a-2C, b-1, b-2)
- Serotonin transporter (SERT)
- Dopamine transporter (DAT)
- Norepinephrine transporter (NET)
- Imidazoline-1 receptor (I1)
- Sigma receptors (s-1, s-2)
- d-opioid receptor (DOR)
- k-opioid receptor (KOR)
- m-opioid receptor (MOR)
- Muscarinic receptors (M1, M2, M3, M4, M5)
- Histamine receptors (H1, H2)
- Calcium ion channel (CA+)
- NMDA glutamate receptor
To map receptor-experience factors to regional levels of receptor gene transcription, they utilized human gene expression data drawn from the Allen Human Brain Atlas, as well as the Shafer-Yeo brain atlas.
Via a machine-learning algorithm, they dissected the “phenomenologically rich anecdotes” into a ranking of constituent brain-behavior factors, each of which was characterized by a “unique neurotransmitter fingerprint of action and a unique experiential context” and ultimately created a dimensional map of these neurotransmitter systems.
Data-driven framework
Cortex-wide distribution of receptor-experience factors was found in both deep and shallow anatomical brain regions. Regions involved in genetic factor expressions were also wide-ranging, spanning from higher association cortices to unimodal sensory cortices.
The dominant factor “elucidated mystical experience in general and the dissolution of self-world boundaries (ego dissolution) in particular,” the authors report, while the second- and third-most explanatory factors “evoked auditory, visual, and emotional themes of mental expansion.”
Ego dissolution was found to be most associated with the 5-HT2A receptor, as well as other serotonin receptors (5-HT2C, 5-HT1A, 5-HT2B), adrenergic receptors a-2A and b-2, and the D2 receptor.
Alterations in sensory perception were associated with expression of the 5-HT2A receptor in the visual cortex, while modulation of the salience network by dopamine and opioid receptors were implicated in the experience transcendence of space, time, and the structure of self. Auditory hallucinations were linked to a weighted blend of receptors expressed throughout the auditory cortex.
“This data-driven framework identifies patterns that undergird diverse psychedelic experiences such as mystical bliss, existential terror, and complex hallucinations,” Dr. Ballentine commented.
“Simultaneously subjective and neurobiological, these patterns align with the leading hypothesis that psychedelics temporarily diminish top-down control of the most evolutionarily advanced regions of the brain, while at the same time amplifying bottom-up sensory processing from primary sensory cortices,” he added.
Forging a new path
Scott Aaronson, MD, chief science officer, Institute for Advanced Diagnostics and Therapeutics and director of the Centre of Excellence at Sheppard Pratt, Towson, Md., said, “As we try to get our arms around understanding the implications of a psychedelic exposure, forward-thinking researchers like Dr. Bzdok et al. are offering interesting ways to capture and understand the experience.”
Dr. Aaronson, an adjunct professor at the University of Maryland School of Medicine who was not involved with the study, continued: “Using the rapidly developing field of natural language processing (NLP), which looks at how language is used for a deeper understanding of human experiences, and combining it with effects of psychedelic compounds on neuronal pathways and neurochemical receptor sites, the authors are forging a new path for further inquiry.”
In an accompanying editorial, Daniel Barron, MD, PhD, medical director, Interventional Pain Psychiatry Program, Brigham and Women’s Hospital, Boston, and Richard Friedman, MD, professor of clinical psychiatry, Weill Cornell Medical College, New York, call the work “impressive” and “clever.”
“Psychedelics paired with new applications of computational tools might help bypass the imprecision of psychiatric diagnosis and connect measures of behavior to specific physiologic targets,” they write.
The research was supported by the Brain Canada Foundation, through the Canada Brain Research Fund, a grant from the NIH, and the Canadian Institutes of Health Research. Dr. Bzdok was also supported by the Healthy Brains Healthy Lives initiative (Canada First Research Excellence Fund) and the CIFAR Artificial Intelligence Chairs program (Canadian Institute for Advanced Research), as well as a Research Award and a Teaching Award from Google. The other authors’ disclosures are listed in the original paper. No disclosures were listed for Dr. Barron and Dr. Friedman. Dr. Aaronson’s research is supported by Compass Pathways.
A version of this article first appeared on Medscape.com.
‘Pre-death grief’ is a real, but overlooked, syndrome
When an individual develops a terminal illness, those closest to them often start to grieve long before the person dies. Although a common syndrome, it often goes unrecognized and unaddressed.
A new review proposes a way of defining this specific type of grief in the hope that better, more precise descriptive categories will inform therapeutic interventions to help those facing a life-changing loss.
“Research examining grief experienced by family members prior to an individual’s death from a life-limiting illness revealed wide variation in the terminology used and characterization of such grief across studies,” lead author Jonathan Singer, PhD, visiting assistant professor of clinical psychology, Texas Tech University, Lubbock, told this news organization.
“We proposed the overarching term ‘pre-death grief,’ with two separate constructs under pre-death grief – anticipatory grief [AG] and illness-related grief [IRG],” he said. “These definitions provide the field with uniform constructs to advance the study of grief before the death of an individual with a life-limiting illness.”
The study was published online Feb. 23 in Palliative Medicine.
‘Typical’ versus ‘impairing’ grief
“Most deaths worldwide are attributed to a chronic or life-limiting illness,” the authors write. The experience of grief before the loss of a family member “has been studied frequently, but there have been conceptualization issues, which is problematic, as it hinders the potential advancement of the field in differentiating typical grief from more impairing grief before the death,” Dr. Singer said. “Further complicating the picture is the sheer number of terms used to describe grief before death.”
Dr. Singer said that when he started conducting research in this field, he “realized someone had to combine the articles that have been published in order to create definitions that will advance the field, so risk and protective factors could be identified and interventions could be tested.”
For the current study, the investigators searched six databases to find research that “evaluated family members’ or friends’ grief related to an individual currently living with a life-limiting illness.” They excluded studies that evaluated grief after death.
Of 9,568 records reviewed, the researchers selected 134 full-text articles that met inclusion criteria. Most studies (57.46%) were quantitative; 23.88% were qualitative, and 17.91% used mixed methods. Most studies were retrospective, although 14.93% were prospective, and 3% included both prospective and retrospective analyses.
Most participants reported that the family member/friend was diagnosed either with “late-stage dementia” or “advanced cancer.” The majority (58%) were adult children of the individual with the illness, followed by spouses/partners (28.1%) and other relatives/friends (13.9%) in studies that reported the relationship to the participant and the person with the illness.
Various scales were used in the studies to measure grief, particularly the Marwit-Meuser-Caregiver Grief Inventory (n = 28), the Anticipatory Grief Scale (n = 18), and the Prolonged Grief–12 (n = 13).
A new name
Owing to the large number of articles included in the review, the researchers limited the analysis to those in which a given term was used in ≥ 1 article.
The researchers found 18 different terms used to describe the grief of family members/friends of individuals with life-limiting illness, including AG (used in the most studies; n = 54), pre-death grief (n = 18), grief (n = 12), pre-loss grief (n = 6), caregiver grief (n = 5), and anticipatory mourning (n = 4). These 18 terms were associated with at least 30 different definitions across the various studies.
“Definitions of these terms differed drastically,” and many studies used the term AG without defining it.
Nineteen studies used multiple terms within a single article, and the terms were “used interchangeably, with the same definition applied,” the researchers report.
For example, one study defined AG as “the process associated with grieving the eventual loss of a family member in advance of their inevitable death,” while another defined AG as “a series of losses based on a loved one’s progression of cognitive and physical decline.”
On the basis of this analysis, the researchers chose the term “pre-death grief,” which encompasses IRG and AG.
Dr. Singer explained that IRG is “present-oriented” and involves the “longing and yearning for the family member to be as they were before the illness.” AG is “future oriented” and is defined as “family members’ grief experience while the person with the life-limiting illness is alive but that is focused on feared or anticipated losses that will occur after the person’s death.”
The study was intended “to advance the field and provide the knowledge and definitions in order to create and test an evidence-based intervention,” Dr. Singer said.
He pointed to interventions (for example: behavioral activation, meaning-centered grief therapy) that could be tested to reduce pre-death grief or specific interventions that focus on addressing IRG or AG. “For example, cognitive behavior therapy might be used to challenge worry about life without the person, which would be classified as AG.”
Dr. Singer feels it is “vital” to reduce pre-death grief, insofar as numerous studies have shown that high rates of pre-death grief “result in higher rates of prolonged grief disorder.”
‘Paradoxical reality’
Francesca Falzarano, PhD, a postdoctoral associate in medicine, Weill Cornell Medicine, New York, called the article a “timely piece drawing much-needed attention to an all-too-often overlooked experience lived by those affected by terminal illnesses.”
Dr. Falzarano, who was not involved in the review, said that, from her own experience as both a caregiver and a behavioral scientist conducting research in this area, the concept of pre-death grief is a paradoxical reality – “how do we grieve someone we haven’t lost yet?”
The experience of pre-death grief is “quite distinct from grief after bereavement” because there is no end date. Rather, the person “cycles back and forth between preparing themselves for an impending death while also attending to whatever is happening in the current moment.” It’s also “unique in that both patients and caregivers individually and collectively grieve losses over the course of the illness,” she noted.
“We as researchers absolutely need to focus our attention on achieving consensus on an appropriate definition for pre-death grief that adequately encompasses its complexity and multidimensionality,” she said.
The authors and Dr. Falzarano report no relevant financial relationships.
A version of this article first appeared on Medscape.com.
‘My boss is my son’s age’: Age differences in medical practices
Morton J, MD, a 68-year-old cardiologist based in the Midwest, saw things become dramatically worse when his nine-physician practice was taken over by a large health system.
“Everything changed. My partners and I lost a lot of autonomy. We had a say – but not the final say-so in who we hired as medical assistants or receptionists. We had to change how long we spent with patients and justify procedures or tests – not just to the insurance companies, which is an old story, but to our new employer,” said Dr. J, who asked to remain anonymous.
Worst of all, “I had to report to a kid – a doctor in his 30s, someone young enough to be my son, someone with a fraction of the clinical training and experience I had but who now got to tell me what to do and how to run my practice.”
The “final straw” for Dr. J came when the practice had to change to a new electronic health record (EHR) system. “Learning this new system was like pulling teeth,” he said. His youthful supervisor was “obviously impatient and irritated – his whole attitude and demeanor reflected a sense that he was saddled with a dinosaur.”
After much anguishing and soul-searching, Dr. J decided to retire. “I was already close to retirement age, and I thought it would be nice to spend more time with my grandchildren. Feeling so disrespected was simply the catalyst that brought the decision to a head a couple of years sooner than I had planned.”
Getting through a delicate discussion
This unfortunate situation could have been avoided had the younger supervisor shown more sensitivity, says otolaryngologist Mark Wallace, DO.
Dr. Wallace is speaking from personal experience. Early in his career, he was a younger physician who was forced to discuss a practice management issue with an older physician.
Dr. Wallace was a member of a committee that was responsible for “maximizing the efficiency of good care, while still being aware of cost issues.” When the committee “wanted one of the physicians in the group to change their behavior to improve cost savings, it was my job to discuss that with them.”
Dr. Wallace, who today is a locum tenens physician and a medical practice consultant to Physicians Thrive – an advisory group that helps physicians with financial and practice management problems – recalls feeling uncomfortable about broaching the subject with his supervisee. In this case, the older physician was prescribing brand-name medications, and the committee that appointed Dr. Wallace wanted him to encourage the physician to prescribe a generic medication first and reserve brand-name prescriptions for cases in which the generic was ineffective.
He acknowledges that he thought the generic was equivalent to the branded product in safety and efficacy.
“I always felt this to be a delicate discussion, whatever the age of the physician, because I didn’t like the idea of telling a doctor that they have to change how they practice so as to save money. I would never want anyone to feel they’re providing a lower level of care.”
The fact that this was an older physician – in his 60s – compounded his hesitancy. “Older physicians have a lot more experience than what I had in my 30s,” Dr. Wallace said. “I could talk to them about studies and outcomes and things like that, but a large part of medicine is the experience you gain over time.
“I presented it simply as a cost issue raised by the committee and asked him to consider experimenting with changing his prescribing behavior, while emphasizing that ultimately, it was his decision,” says Dr. Wallace.
The supervisee understood the concern and agreed to the experiment. He ended up prescribing the generic more frequently, although perhaps not as frequently as the committee would have liked.
, says Ted Epperly, MD, a family physician in Boise, Idaho, and president and CEO of Family Medicine Residency of Idaho.
Dr. Wallace said that older physicians, on coming out of training, felt more respected, were better paid, and didn’t have to continually adjust to new regulations and new complicated insurance requirements. Today’s young physicians coming out of training may not find the practice of medicine as enjoyable as their older counterparts did, but they are accustomed to increasingly complex rules and regulations, so it’s less of an adjustment. But many may not feel they want to work 80 hours per week, as their older counterparts did.
Challenges of technology
Technology is one of the most central areas where intergenerational differences play out, says Tracy Clarke, chief human resources officer at Kitsap Mental Health Services, a large nonprofit organization in Bremerton, Wash., that employs roughly 500 individuals. “The younger physicians in our practice are really prepared, already engaged in technology, and used to using technology for documentation, and it is already integrated into the way they do business in general and practice,” she said.
Dr. Epperly noted that Gen X-ers are typically comfortable with digital technology, although not quite as much as the following generation, the millennials, who have grown up with smartphones and computers quite literally at their fingertips from earliest childhood.
Dr. Epperly, now 67, described the experience of having his organization convert to a new EHR system. “Although the younger physicians were not my supervisors, the dynamic that occurred when we were switching to the new system is typical of what might happen in a more formal reporting structure of older ‘supervisee’ and younger supervisor,” he said. In fact, his experience was similar to that of Dr. J.
“Some of the millennials were so quick to learn the new system that they forgot to check in with the older ones about how they were doing, or they were frustrated with our slow pace of learning the new technology,” said Dr. Epperly. “In fact, I was struggling to master it, and so were many others of my generation, and I felt very dumb, slow, and vulnerable, even though I usually regard myself as a pretty bright guy.”
Dr. Epperly encourages younger physicians not to think, “He’s asked me five times how to do this – what’s his problem?” This impatience can be intuited by the older physician, who may take it personally and feel devalued and disrespected.
Joy Engblade, an internal medicine physician and CMO of Northern Inyo Hospital, Bishop, Calif., said that when her institution was transitioning to a new EHR system this past May, she was worried that the older physicians would have the most difficulty.
Ironically, that turned out not to be the case. In fact, the younger physicians struggled more because the older physicians recognized their limitations and “were willing to do whatever we asked them to do. They watched the tutorials about how to use the new EHR. They went to every class that was offered and did all the practice sessions.” By contrast, many of the younger ones thought, “I know how to work an EHR, I’ve been doing it for years, so how hard could it be?” By the time they needed to actually use it, the instructional resources and tutorials were no longer available.
Dr. Epperly’s experience is different. He noted that some older physicians may be embarrassed to acknowledge that they are technologically challenged and may say, “I got it, I understand,” when they are still struggling to master the new technology.
Ms. Clarke notes that the leadership in her organization is younger than many of the physicians who report to them. “For the leadership, the biggest challenge is that many older physicians are set in their ways, and they haven’t really seen a reason to change their practice or ways of doing things.” For example, some still prefer paper charting or making voice recordings of patient visits for other people to transcribe.
Ms. Clarke has some advice for younger leaders: “Really explore what the pain points are of these older physicians. Beyond their saying, ‘because I’ve always done it this way,’ what really is the advantage of, for example, paper charting when using the EHR is more efficient?”
Daniel DeBehnke, MD, is an emergency medicine physician and vice president and chief physician executive for Premier Inc., where he helps hospitals improve quality, safety, and financial performance. Before joining Premier, he was both a practicing physician and CEO of a health system consisting of more than 1,500 physicians.
“Having been on both sides of the spectrum as manager/leader within a physician group, some of whom are senior to me and some of whom are junior, I can tell you that I have never had any issues related to the age gap.” In fact, it is less about age per se and more about “the expertise that you, as a manager, bring to the table in understanding the nuances of the medical practice and for the individual being ‘managed.’ It is about trusting the expertise of the manager.”
Before and after hourly caps
Dr. Engblade regards “generational” issues as less about age and birth year and more about the cap on hours worked during residency.
Morton J, MD, a 68-year-old cardiologist based in the Midwest, saw things become dramatically worse when his nine-physician practice was taken over by a large health system.
“Everything changed. My partners and I lost a lot of autonomy. We had a say – but not the final say-so in who we hired as medical assistants or receptionists. We had to change how long we spent with patients and justify procedures or tests – not just to the insurance companies, which is an old story, but to our new employer,” said Dr. J, who asked to remain anonymous.
Worst of all, “I had to report to a kid – a doctor in his 30s, someone young enough to be my son, someone with a fraction of the clinical training and experience I had but who now got to tell me what to do and how to run my practice.”
The “final straw” for Dr. J came when the practice had to change to a new electronic health record (EHR) system. “Learning this new system was like pulling teeth,” he said. His youthful supervisor was “obviously impatient and irritated – his whole attitude and demeanor reflected a sense that he was saddled with a dinosaur.”
After much anguishing and soul-searching, Dr. J decided to retire. “I was already close to retirement age, and I thought it would be nice to spend more time with my grandchildren. Feeling so disrespected was simply the catalyst that brought the decision to a head a couple of years sooner than I had planned.”
Getting through a delicate discussion
This unfortunate situation could have been avoided had the younger supervisor shown more sensitivity, says otolaryngologist Mark Wallace, DO.
Dr. Wallace is speaking from personal experience. Early in his career, he was a younger physician who was forced to discuss a practice management issue with an older physician.
Dr. Wallace was a member of a committee that was responsible for “maximizing the efficiency of good care, while still being aware of cost issues.” When the committee “wanted one of the physicians in the group to change their behavior to improve cost savings, it was my job to discuss that with them.”
Dr. Wallace, who today is a locum tenens physician and a medical practice consultant to Physicians Thrive – an advisory group that helps physicians with financial and practice management problems – recalls feeling uncomfortable about broaching the subject to his supervisee. In this case, the older physician was prescribing brand-name medications, and the committee that appointed Dr. Wallace wanted him to encourage the physician to prescribe a generic medication first and reserve brand prescriptions only for cases in which the generic was ineffective.
He acknowledges that he thought the generic was equivalent to the branded product in safety and efficacy.
“I always felt this to be a delicate discussion, whatever the age of the physician, because I didn’t like the idea of telling a doctor that they have to change how they practice so as to save money. I would never want anyone to feel they’re providing a lower level of care.”
The fact that this was an older physician – in his 60s – compounded his hesitancy. “Older physicians have a lot more experience than what I had in my 30s,” Dr. Wallace said. “I could talk to them about studies and outcomes and things like that, but a large part of medicine is the experience you gain over time.
“I presented it simply as a cost issue raised by the committee and asked him to consider experimenting with changing his prescribing behavior, while emphasizing that ultimately, it was his decision,” says Dr. Wallace.
The supervisee understood the concern and agreed to the experiment. He ended up prescribing the generic more frequently, although perhaps not as frequently as the committee would have liked.
Generational differences extend beyond individual supervisory relationships, says Ted Epperly, MD, a family physician in Boise, Idaho, and president and CEO of Family Medicine Residency of Idaho.
Dr. Wallace said that older physicians, on coming out of training, felt more respected, were better paid, and didn’t have to continually adjust to new regulations and new complicated insurance requirements. Today’s young physicians coming out of training may not find the practice of medicine as enjoyable as their older counterparts did, but they are accustomed to increasingly complex rules and regulations, so it’s less of an adjustment. But many may not feel they want to work 80 hours per week, as their older counterparts did.
Challenges of technology
Technology is one of the most central areas where intergenerational differences play out, says Tracy Clarke, chief human resources officer at Kitsap Mental Health Services, a large nonprofit organization in Bremerton, Wash., that employs roughly 500 individuals. “The younger physicians in our practice are really prepared, already engaged in technology, and used to using technology for documentation, and it is already integrated into the way they do business in general and practice,” she said.
Dr. Epperly noted that Gen X-ers are typically comfortable with digital technology, although not quite as much as the following generation, the millennials, who have grown up with smartphones and computers quite literally at their fingertips from earliest childhood.
Dr. Epperly, now 67, described the experience of having his organization convert to a new EHR system. “Although the younger physicians were not my supervisors, the dynamic that occurred when we were switching to the new system is typical of what might happen in a more formal reporting structure of older ‘supervisee’ and younger supervisor,” he said. In fact, his experience was similar to that of Dr. J.
“Some of the millennials were so quick to learn the new system that they forgot to check in with the older ones about how they were doing, or they were frustrated with our slow pace of learning the new technology,” said Dr. Epperly. “In fact, I was struggling to master it, and so were many others of my generation, and I felt very dumb, slow, and vulnerable, even though I usually regard myself as a pretty bright guy.”
Dr. Epperly encourages younger physicians not to think, “He’s asked me five times how to do this – what’s his problem?” This impatience can be intuited by the older physician, who may take it personally and feel devalued and disrespected.
Joy Engblade, an internal medicine physician and CMO of Northern Inyo Hospital, Bishop, Calif., said that when her institution was transitioning to a new EHR system this past May, she was worried that the older physicians would have the most difficulty.
Ironically, that turned out not to be the case. In fact, the younger physicians struggled more because the older physicians recognized their limitations and “were willing to do whatever we asked them to do. They watched the tutorials about how to use the new EHR. They went to every class that was offered and did all the practice sessions.” By contrast, many of the younger ones thought, “I know how to work an EHR, I’ve been doing it for years, so how hard could it be?” By the time they needed to actually use it, the instructional resources and tutorials were no longer available.
Dr. Epperly’s experience is different. He noted that some older physicians may be embarrassed to acknowledge that they are technologically challenged and may say, “I got it, I understand,” when they are still struggling to master the new technology.
Ms. Clarke notes that the leadership in her organization is younger than many of the physicians who report to them. “For the leadership, the biggest challenge is that many older physicians are set in their ways, and they haven’t really seen a reason to change their practice or ways of doing things.” For example, some still prefer paper charting or making voice recordings of patient visits for other people to transcribe.
Ms. Clarke has some advice for younger leaders: “Really explore what the pain points are of these older physicians. Beyond their saying, ‘because I’ve always done it this way,’ what really is the advantage of, for example, paper charting when using the EHR is more efficient?”
Daniel DeBehnke, MD, is an emergency medicine physician and vice president and chief physician executive for Premier Inc., where he helps hospitals improve quality, safety, and financial performance. Before joining Premier, he was both a practicing physician and CEO of a health system consisting of more than 1,500 physicians.
“Having been on both sides of the spectrum as manager/leader within a physician group, some of whom are senior to me and some of whom are junior, I can tell you that I have never had any issues related to the age gap.” In fact, it is less about age per se and more about “the expertise that you, as a manager, bring to the table in understanding the nuances of the medical practice and for the individual being ‘managed.’ It is about trusting the expertise of the manager.”
Before and after hourly caps
Dr. Engblade regards “generational” issues as less about age and birth year and more about the cap on hours worked during residency.
Dr. Engblade, who is 45 years old, said she did her internship year with no hourly restrictions. Such restrictions only went into effect during her second year of residency. “This created a paradigm shift in how much people wanted to work and created a consciousness of work-life balance that hadn’t been part of the conversation before,” she said.
When she interviews an older physician, a typical response is, “Of course I’ll be available any time,” whereas younger physicians, who went through residency after hourly restrictions had been established, are more likely to ask how many hours they will be on and how many they’ll be off.
Matt Lambert, MD, an independent emergency medicine physician and CMO of Curation Health, Washington, agreed, noting that differences in the cap on hours during training “can create a bit of an undertow, a tension between younger managers who are better adjusted in terms of work-life balance and older physicians being managed, who have a different work ethic and also might regard their managers as being less trained because they put in fewer hours during training.”
It is also important to be cognizant of differences in style and priorities that each generation brings to the table. Jaciel Keltgen, PhD, assistant professor of business administration, Augustana University, Sioux Falls, S.D., has heard older physicians say, “We did this the hard way, we sacrificed for our organization, and we expect the same values of younger physicians.” The younger ones tend to say, “We need to use all the tools at our disposal, and medicine doesn’t have to be practiced the way it’s always been.”
Dr. Keltgen, whose PhD is in political science and who has studied public administration, said that younger physicians may also question the mores and protocols that older physicians take for granted. For example, when her physician son was beginning his career, he was told by his senior supervisors that although he was “performing beautifully as a physician, he needed to shave more frequently, wear his white coat more often, and introduce himself as ‘Doctor’ rather than by his first name. Although he did wear his white coat more often, he didn’t change how he introduced himself to patients.”
Flexibility and mutual understanding of each generation’s needs, the type, structure, and amount of training they underwent, and the prevailing values will smooth supervisory interactions and optimize outcomes, experts agree.
Every generation’s No. 1 concern
For her dissertation, Dr. Keltgen used a large dataset of physicians and sought to draw a predictive model by generation and gender as to what physicians were seeking in order to be satisfied in their careers. One “overwhelming finding” of her research into generational differences in physicians is that “every single generation and gender is there to promote the health of their patients, and providing excellent care is their No. 1 concern. That is the common focus and the foundation that everyone can build on.”
Dr. J agreed. “Had I felt like a valued collaborator, I might have made a different decision.” He has begun to consider reentering clinical practice, perhaps as locum tenens or on a part-time basis. “I don’t want to feel that I’ve been driven out of a field that I love. I will see if I can find some type of context where my experience will be valued and learn to bring myself up to speed with technology if necessary. I believe I still have much to offer patients, and I would like to find a context to do so.”
A version of this article first appeared on Medscape.com.
‘Alarming’ worldwide decline in mental health
The Mental Health Million project of Sapien Labs issued its second report, published online March 15, encompassing 34 countries and over 220,000 Internet-enabled adults. It found a continued decline in mental health in all age groups and genders, with English-speaking countries having the lowest mental well-being.
The decline was significantly correlated with the stringency of COVID-19 lockdown measures in each country and was directionally correlated to the cases and deaths per million.
The youngest age group (18-24 years) reported the poorest mental well-being; scores improved with each successively older age group.
“Some of our findings, especially regarding mental health in young adults, are alarming,” Tara Thiagarajan, PhD, Sapien Labs founder and chief scientist, told this news organization.
“Our data, which are continually updated in real time, are freely available for nonprofit, noncommercial use and research, and we hope that researchers will get involved in an interdisciplinary way that spans sociology, economics, psychiatry, and other fields,” she said.
Pioneering research
Dr. Thiagarajan and her team pioneered the Mental Health Million project, an ongoing research initiative utilizing a “free and anonymous assessment tool,” the Mental Health Quotient (MHQ), which “encompasses a comprehensive view of our emotional, social, and cognitive function and capability.”
The MHQ consists of 47 “elements of mental well-being,” with scores ranging from –100 to +200. (Negative scores indicate poorer mental well-being.) The MHQ categorizes respondents as “clinical, at-risk, enduring, managing, succeeding, and thriving” and computes scores on the basis of six broad dimensions of mental health: core cognition, complex cognition, mood and outlook, drive and motivation, social self, and mind-body connection.
As reported by this news organization, Sapien Labs’ first Mental Health State of the World report (n = 49,000 adults) was conducted in eight English-speaking countries in 2020. Participants were compared to a smaller sample of people from the same countries polled in 2019.
In this year’s report, “we expanded quite substantially,” Dr. Thiagarajan said. The project added Spanish, French, and Arabic and recruited participants from 34 countries on six continents (n = 223,087) via advertising on Google and Facebook.
Economic prosperity not protective
Across the eight English-speaking countries, there was a decline in mental well-being of 3% from 2020 to 2021, which was smaller than the 8% decline from 2019 to 2020. The percentage of people who were “distressed or struggling” increased from 26% to 30% in 2021.
“Now that a lot of pandemic issue seems to be easing up, I hope we’ll see mental well-being coming back up, but at least it’s a smaller decline than we saw between 2019 and 2020,” said Dr. Thiagarajan.
The decline across countries from 2019 to 2021 was significantly correlated with the stringency of governmental COVID-19-related measures (based on the Oxford COVID-19 Government Response Tracker, 2022; r = .54) and directionally correlated to the cases and deaths per million.
In total, 30% of respondents in English-speaking countries had mental well-being scores in the “distressed” or “struggling” range – higher than the Middle Eastern countries, North Africa, Latin America, and Europe (23%, 23%, 24%, and 18%, respectively).
Only 36% of participants in the English-speaking countries, the Middle East, and North Africa reported “thriving or succeeding,” vs. 45% and 46% in Latin America and Europe, respectively. Venezuela topped the list with an average MHQ of 91, while the United Kingdom and South Africa had the lowest scores, at 46 each.
Mental well-being was slightly higher in males than in females but was dramatically lower in nonbinary/third-gender respondents. In fact, those identifying as nonbinary/third gender had the lowest mental well-being of any group.
Across all countries and languages, higher education was associated with better mental well-being. Employment was also associated with superior mental well-being, compared with being unemployed – particularly in core English-speaking countries.
However, “country indicators of economic prosperity were negatively correlated with mental well-being, particularly for young adults and males, belying the commonly held belief that national economic prosperity translates into greater mental well-being,” said Dr. Thiagarajan.
‘Stark’ contrast
The most dramatic finding was the difference in mental well-being between younger and older adults, which was two- to threefold larger than differences in other dimensions (for example, age, gender, employment). Even the maximum difference between countries overall (15%) was still smaller than the generational gap within any region.
While only 7% (6%-9%) of participants aged ≥65 years had mental well-being scores in the “distressed or struggling” range, 44% (38%-50%) of those aged 18-24 years did – representing a “growing gap between generations that, while present prior to the COVID-19 pandemic, has since been exacerbated,” the authors state.
With every successive decrement in age group, mental well-being “plummeted,” Dr. Thiagarajan said. She noted that research conducted prior to 2010 in several regions of the world showed that young adults typically had the highest well-being. “Our findings stand in stark contrast to these previous patterns.”
The relationship between lockdown stringency and poorer mental health could play a role. “The impact of social isolation may be most strongly felt in younger people,” she said.
Internet a culprit?
“Within almost every region, scores for cognition and drive and motivation were highest while mood and outlook and social self were the lowest,” the authors report.
The aggregate percentage of respondents who reported being “distressed or struggling” in the various MHQ dimensions is shown in the following table.
In particular, English-speaking countries scored lowest on the social self scale.
The sense of social self is “how you see yourself with respect to others, how you relate to others and the ability to form strong, stable relationships and maintain them with other people,” said Dr. Thiagarajan.
Internet use might account for the “massive” difference between the youngest and the oldest generations, she suggested. “Following 2010, mobile phone penetration picked up and rose rapidly. ... Mobile phones took over the world.”
Time spent on the Internet – an estimated 7-10 hours per day – “eats into the time people in older generations used in building the social self. Kids who grow up on the Internet are losing thousands of hours in social interactions, which is challenging their ability to form relationships, how they see themselves, and how they fit into the social fabric,” Dr. Thiagarajan added
Sedentary time
Commenting for this news organization, Bernardo Ng, MD, a member of the American Psychiatric Association’s Council on International Psychiatry and Global Health and medical director of Sun Valley Research Center, Imperial, Calif., called the report “interesting, with an impressive sample size” and an “impressive geographic distribution.”
Dr. Ng, who was not involved in the report, said, “I did not think the impact of Internet use on mental health was as dramatic before looking at this report.
“On the other hand, I have personally been interested in the impact of sedentarism in mental health – not only emotionally but also biologically. Sedentarism, which is directly related to screen use time, produces inflammation that worsens brain function.”
Also commenting, Ken Duckworth, MD, chief medical officer of the National Alliance of Mental Illness, called the survey “extremely well timed and creative, although it looked only at Internet-enabled populations, so one cannot make too many overall pronouncements, because a lot of people don’t have access to the Internet.”
The data regarding young people are particularly powerful. “The idea that young people are having a decrease in their experience of mental health across the world is something I haven’t seen before.”
Dr. Duckworth suggested the reason might “have to do with the impact of the COVID lockdown on normal development that young people go through, while older people don’t struggle with these developmental challenges in the same way.”
A version of this article first appeared on Medscape.com.
The Mental Health Million project of Sapien Labs issued its second report, published online March 15, encompassing 34 countries and over 220,000 Internet-enabled adults. It found a continued decline in mental health in all age groups and genders, with English-speaking countries having the lowest mental well-being.
The decline was significantly correlated with the stringency of COVID-19 lockdown measures in each country and was directionally correlated to the cases and deaths per million.
The youngest age group (18-24 years) reported the poorest mental well-being, with better mental health scores rising in every successively older age group.
“Some of our findings, especially regarding mental health in young adults, are alarming,” Tara Thiagarajan, PhD, Sapien Labs founder and chief scientist, told this news organization.
“Our data, which are continually updated in real time, are freely available for nonprofit, noncommercial use and research, and we hope that researchers will get involved in an interdisciplinary way that spans sociology, economics, psychiatry, and other fields,” she said.
Pioneering research
Dr. Thiagarajan and her team pioneered the Mental Health Million project, an ongoing research initiative utilizing a “free and anonymous assessment tool,” the Mental Health Quotient (MHQ), which “encompasses a comprehensive view of our emotional, social, and cognitive function and capability.”
The MHQ consists of 47 “elements of mental well-being,” with scores ranging from –100 to +200. (Negative scores indicate poorer mental well-being.) The MHQ categorizes respondents as “clinical, at-risk, enduring, managing, succeeding, and thriving” and computes scores on the basis of six broad dimensions of mental health: core cognition, complex cognition, mood and outlook, drive and motivation, social self, and mind-body connection.
As reported by this news organization, Sapien Labs’ first Mental Health State of the World report (n = 49,000 adults) was conducted in eight English-speaking countries in 2020. Participants were compared to a smaller sample of people from the same countries polled in 2019.
In this year’s report, “we expanded quite substantially,” Dr. Thiagarajan said. The project added Spanish, French, and Arabic and recruited participants from 34 countries on six continents (n = 223,087) via advertising on Google and Facebook.
Economic prosperity not protective
Across the eight English-speaking countries, there was a decline in mental well-being of 3% from 2020 to 2021, which was smaller than the 8% decline from 2019 to 2020. The percentage of people who were “distressed or struggling” increased from 26% to 30% in 2021.
“Now that a lot of pandemic issue seems to be easing up, I hope we’ll see mental well-being coming back up, but at least it’s a smaller decline than we saw between 2019 and 2020,” said Dr. Thiagarajan.
The decline across countries from 2019 to 2021 was significantly correlated with the stringency of governmental COVID-19-related measures (based on the Oxford COVID-19 Government Response Tracker, 2022; r = .54) and directionally correlated to the cases and deaths per million.
In total, 30% of respondents in English-speaking countries had mental well-being scores in the “distressed” or “struggling” range – higher than the Middle Eastern countries, North Africa, Latin America, and Europe (23%, 23%, 24%, and 18%, respectively).
Only 36% of participants in the English-speaking countries, the Middle East, and North Africa reported “thriving or succeeding,” vs. 45% and 46% in Latin America and Europe, respectively. Venezuela topped the list with an average MHQ of 91, while the United Kingdom and South Africa had the lowest scores, at 46 each.
Mental well-being was slightly higher in males than in females but was dramatically lower in nonbinary/third-gender respondents. In fact, those identifying as nonbinary/third gender had the lowest mental well-being of any group.
Across all countries and languages, higher education was associated with better mental well-being. Employment was also associated with superior mental well-being, compared with being unemployed – particularly in core English-speaking countries.
However, “country indicators of economic prosperity were negatively correlated with mental well-being, particularly for young adults and males, belying the commonly held belief that national economic prosperity translates into greater mental well-being,” said Dr. Thiagarajan.
‘Stark’ contrast
The most dramatic finding was the difference in mental well-being between younger and older adults, which was two- to threefold larger than differences across other dimensions (for example, gender, employment). Even the maximum difference between countries overall (15%) was still smaller than the generational gap within any region.
While only 7% (6%-9%) of participants aged ≥65 years were “distressed and struggling” with their mental well-being to a “clinical” extent, 44% (38%-50%) of those aged 18-24 years reported mental well-being scores in the “distressed or struggling” range – representing a “growing gap between generations that, while present prior to the COVID-19 pandemic, has since been exacerbated,” the authors state.
With every successive decrement in age group, mental well-being “plummeted,” Dr. Thiagarajan said. She noted that research conducted prior to 2010 in several regions of the world showed that young adults typically had the highest well-being. “Our findings stand in stark contrast to these previous patterns.”
The relationship between lockdown stringency and poorer mental health could play a role. “The impact of social isolation may be most strongly felt in younger people,” she said.
Internet a culprit?
“Within almost every region, scores for cognition and drive and motivation were highest while mood and outlook and social self were the lowest,” the authors report.
The aggregate percentage of respondents who reported being “distressed or struggling” in the various MHQ dimensions is shown in the following table.
In particular, English-speaking countries scored lowest on the social self scale.
The sense of social self is “how you see yourself with respect to others, how you relate to others and the ability to form strong, stable relationships and maintain them with other people,” said Dr. Thiagarajan.
Internet use might account for the “massive” difference between the youngest and the oldest generations, she suggested. “Following 2010, mobile phone penetration picked up and rose rapidly. ... Mobile phones took over the world.”
Time spent on the Internet – an estimated 7-10 hours per day – “eats into the time people in older generations used in building the social self. Kids who grow up on the Internet are losing thousands of hours in social interactions, which is challenging their ability to form relationships, how they see themselves, and how they fit into the social fabric,” Dr. Thiagarajan added.
Sedentary time
Commenting for this news organization, Bernardo Ng, MD, a member of the American Psychiatric Association’s Council on International Psychiatry and Global Health and medical director of Sun Valley Research Center, Imperial, Calif., called the report “interesting, with an impressive sample size” and an “impressive geographic distribution.”
Dr. Ng, who was not involved in the report, said, “I did not think the impact of Internet use on mental health was as dramatic before looking at this report.
“On the other hand, I have personally been interested in the impact of sedentarism in mental health – not only emotionally but also biologically. Sedentarism, which is directly related to screen use time, produces inflammation that worsens brain function.”
Also commenting, Ken Duckworth, MD, chief medical officer of the National Alliance on Mental Illness, called the survey “extremely well timed and creative, although it looked only at Internet-enabled populations, so one cannot make too many overall pronouncements, because a lot of people don’t have access to the Internet.”
The data regarding young people are particularly powerful. “The idea that young people are having a decrease in their experience of mental health across the world is something I haven’t seen before.”
Dr. Duckworth suggested the reason might “have to do with the impact of the COVID lockdown on normal development that young people go through, while older people don’t struggle with these developmental challenges in the same way.”
A version of this article first appeared on Medscape.com.
Mechanical ventilation in children tied to slightly lower IQ
Children who survive an episode of acute respiratory failure that requires invasive mechanical ventilation may be at risk for slightly lower long-term neurocognitive function, new research suggests.
Investigators found lower IQs in children without previous neurocognitive problems who survived pediatric intensive care unit admission for acute respiratory failure, compared with their biological siblings.
Although this magnitude of difference was small on average, more than twice as many patients as siblings had an IQ of ≤85, and children hospitalized at the youngest ages did worse than their siblings.
“Children surviving acute respiratory failure may benefit from routine evaluation of neurocognitive function after hospital discharge and may require serial evaluation to identify deficits that emerge over the course of [the] child’s continued development to facilitate early intervention to prevent disability and optimize school performance,” study investigator R. Scott Watson, MD, MPH, professor of pediatrics, University of Washington, Seattle, told this news organization.
The study was published online March 1 in JAMA.
Unknown long-term effects
“Approximately 23,700 U.S. children undergo invasive mechanical ventilation for acute respiratory failure annually, with unknown long-term effects on neurocognitive function,” the authors write.
“With improvements in pediatric critical care over the past several decades, critical illness–associated mortality has improved dramatically [but] as survivorship has increased, we are starting to learn that many patients and their families suffer from long-term morbidity associated with the illness and its treatment,” said Dr. Watson, who is the associate division chief, pediatric critical care medicine, Seattle Children’s Hospital, Center for Child Health, Behavior, and Development.
Animal studies “have found that some sedative medications commonly used to keep children safe during mechanical ventilation may have detrimental neurologic effects, particularly in the developing brain,” Dr. Watson added.
To gain a better understanding of this potential association, the researchers turned to a subset of participants in the previously conducted Randomized Evaluation of Sedation Titration for Respiratory Failure (RESTORE) trial of pediatric patients receiving mechanical ventilation for acute respiratory failure.
For the current study (RESTORE-Cognition), multiple domains of neurocognitive function were assessed 3-8 years after hospital discharge in trial patients who did not have a history of neurocognitive dysfunction, as well as matched, healthy siblings.
To be included in the study, the children had to be ≤8 years old at trial enrollment, have a Pediatric Cerebral Performance Category (PCPC) score of 1 (normal) prior to PICU admission, and have no worse than moderate neurocognitive dysfunction at PICU discharge.
Siblings of enrolled patients were required to be between 4 and 16 years old at the time of neurocognitive testing, have a PCPC score of 1, have the same biological parents as the patient, and live with the patient.
The primary outcome was IQ, estimated by the age-appropriate Vocabulary and Block Design subtests of the Wechsler Intelligence Scale. Secondary outcomes included attention, processing speed, learning and memory, visuospatial skills, motor skills, language, and executive function. Enough time was allowed after hospitalization “for transient deficits to resolve and longer-lasting neurocognitive sequelae to manifest.”
‘Uncertain’ clinical importance
Of the 121 sibling pairs (67% non-Hispanic White, 47% from families in which one or both parents worked full-time), 116 were included in the primary outcome analysis, and 66-19 were included in analyses of secondary outcomes.
Patients had been in the PICU at a median (interquartile range [IQR]) age of 1.0 (0.2-3.2) years and had received a median of 5.5 (3.1-7.7) days of invasive mechanical ventilation.
The median age at testing for patients and matched siblings was 6.6 (5.4-9.1) and 8.4 (7.0-10.2) years, respectively. Interviews with parents and testing of patients were conducted a median (IQR) of 3.8 (3.2-5.2) and 5.2 (4.3-6.1) years, respectively, after hospitalization.
The most common etiologies of respiratory failure were bronchiolitis/asthma (44%) and pneumonia (37%). Beyond respiratory failure, most patients (72%) also had experienced multiple organ dysfunction syndrome.
Patients had a lower mean estimated IQ, compared with the matched siblings (101.5 vs. 104.3; mean difference, –2.8 [95% confidence interval, –5.4 to –0.2]), and more patients than siblings had an estimated IQ of ≤85 but not of ≤70.
Patients also had significantly lower scores on nonverbal memory, visuospatial skills, and fine motor control (mean differences, –0.9 [–1.6 to –0.3]; –0.9 [–1.8 to –0.1]; and –3.1 [–4.9 to –1.4], respectively), compared with matched siblings. They also had significantly higher scores on processing speed (mean difference, 4.4 [0.2-8.5]). There were no significant differences in the other secondary outcomes.
Differences in scores between patients and siblings varied significantly by age at hospitalization in several tests – for example, Block Design scores in patients were lower than those of siblings for patients hospitalized at <1 year old, versus those hospitalized between ages 4 and 8 years.
“When adjusting for patient age at PICU admission, patient age at testing, sibling age at testing, and duration between hospital discharge and testing, the difference in estimated IQ between patients and siblings remained statistically significantly different,” the authors note.
The investigators point out several limitations, including the fact that “little is known about sibling outcomes after critical illness, nor about whether parenting of siblings or child development differs based on birth order or on relationship between patient critical illness and the birth of siblings. ... If siblings also incur negative effects related to the critical illness, differences between critically ill children and the control siblings would be blunted.”
Despite the statistical significance of the difference between the patients and the matched controls, ultimately, the magnitude of the difference was “small and of uncertain clinical importance,” the authors conclude.
Filling a research gap
Commenting on the findings, Alexandre T. Rotta, MD, professor of pediatrics and chief of the division of pediatric critical care medicine, Duke University Medical Center, Durham, N.C., said the study “addresses an important yet vastly understudied gap: long-term neurocognitive morbidity in children exposed to critical care.”
Dr. Rotta, who is also a coauthor of an accompanying editorial, noted that the fact that the “vast majority of children with an IQ significantly lower than their siblings were under the age of 4 years suggests that the developing immature brain may be particularly susceptible to the effects of critical illness and therapies required to treat it.”
The study “underscores the need to include assessments of long-term morbidity as part of any future trial evaluating interventions in pediatric critical care,” he added.
The study was supported by grants from the Eunice Kennedy Shriver National Institute of Child Health and Human Development for RESTORE-Cognition and by grants for the RESTORE trial from the National Heart, Lung, and Blood Institute and the National Institute of Nursing Research, National Institutes of Health. Dr. Watson and coauthors report no relevant financial relationships. Dr. Rotta has received personal fees from Vapotherm for lecturing and development of educational materials and from Breas US for participation in a scientific advisory board, as well as royalties from Elsevier for editorial work outside the submitted work. His coauthor reports no relevant financial relationships.
A version of this article first appeared on Medscape.com.
Despite the statistical significance of the difference between the patients and the matched controls, ultimately, the magnitude of the difference was “small and of uncertain clinical importance,” the authors conclude.
Filling a research gap
Commenting on the findings, Alexandre T. Rotta, MD, professor of pediatrics and chief of the division of pediatric critical care medicine, Duke University Medical Center, Durham, N.C., said the study “addresses an important yet vastly understudied gap: long-term neurocognitive morbidity in children exposed to critical care.”
Dr. Rotta, who is also a coauthor of an accompanying editorial, noted that the fact that the “vast majority of children with an IQ significantly lower than their siblings were under the age of 4 years suggests that the developing immature brain may be particularly susceptible to the effects of critical illness and therapies required to treat it.”
The study “underscores the need to include assessments of long-term morbidity as part of any future trial evaluating interventions in pediatric critical care,” he added.
The study was supported by grants from the Eunice Kennedy Shriver National Institute of Child Health and Human Development for RESTORE-Cognition and by grants for the RESTORE trial from the National Heart, Lung, and Blood Institute and the National Institute of Nursing Research, National Institutes of Health. Dr. Watson and coauthors report no relevant financial relationships. Dr. Rotta has received personal fees from Vapotherm for lecturing and development of educational materials and from Breas US for participation in a scientific advisory board, as well as royalties from Elsevier for editorial work outside the submitted work. His coauthor reports no relevant financial relationships.
A version of this article first appeared on Medscape.com.
Children who survive an episode of acute respiratory failure that requires invasive mechanical ventilation may be at risk for slightly lower long-term neurocognitive function, new research suggests.
Investigators found lower IQs in children without previous neurocognitive problems who survived pediatric intensive care unit admission for acute respiratory failure, compared with their biological siblings.
Although this magnitude of difference was small on average, more than twice as many patients as siblings had an IQ of ≤85, and children hospitalized at the youngest ages did worse than their siblings.
“Children surviving acute respiratory failure may benefit from routine evaluation of neurocognitive function after hospital discharge and may require serial evaluation to identify deficits that emerge over the course of the child’s continued development to facilitate early intervention to prevent disability and optimize school performance,” study investigator R. Scott Watson, MD, MPH, professor of pediatrics, University of Washington, Seattle, told this news organization.
The study was published online March 1 in JAMA.
Unknown long-term effects
“Approximately 23,700 U.S. children undergo invasive mechanical ventilation for acute respiratory failure annually, with unknown long-term effects on neurocognitive function,” the authors write.
“With improvements in pediatric critical care over the past several decades, critical illness–associated mortality has improved dramatically [but] as survivorship has increased, we are starting to learn that many patients and their families suffer from long-term morbidity associated with the illness and its treatment,” said Dr. Watson, who is the associate division chief, pediatric critical care medicine, Seattle Children’s Hospital, Center for Child Health, Behavior, and Development.
Animal studies “have found that some sedative medications commonly used to keep children safe during mechanical ventilation may have detrimental neurologic effects, particularly in the developing brain,” Dr. Watson added.
To gain a better understanding of this potential association, the researchers turned to a subset of participants in the previously conducted Randomized Evaluation of Sedation Titration for Respiratory Failure (RESTORE) trial of pediatric patients receiving mechanical ventilation for acute respiratory failure.
For the current study (RESTORE-Cognition), multiple domains of neurocognitive function were assessed 3-8 years after hospital discharge in trial patients who did not have a history of neurocognitive dysfunction, as well as matched, healthy siblings.
To be included in the study, the children had to be ≤8 years old at trial enrollment, have a Pediatric Cerebral Performance Category (PCPC) score of 1 (normal) prior to PICU admission, and have no worse than moderate neurocognitive dysfunction at PICU discharge.
Siblings of enrolled patients were required to be between 4 and 16 years old at the time of neurocognitive testing, have a PCPC score of 1, have the same biological parents as the patient, and live with the patient.
The primary outcome was IQ, estimated by the age-appropriate Vocabulary and Block Design subtests of the Wechsler Intelligence Scale. Secondary outcomes included attention, processing speed, learning and memory, visuospatial skills, motor skills, language, and executive function. Enough time was allowed after hospitalization “for transient deficits to resolve and longer-lasting neurocognitive sequelae to manifest.”
‘Uncertain’ clinical importance
Of the 121 sibling pairs (67% non-Hispanic White, 47% from families in which one or both parents worked full-time), 116 were included in the primary outcome analysis, and 66-119 were included in analyses of secondary outcomes.
Patients had been in the PICU at a median (interquartile range [IQR]) age of 1.0 (0.2-3.2) years and had received a median of 5.5 (3.1-7.7) days of invasive mechanical ventilation.
The median age at testing for patients and matched siblings was 6.6 (5.4-9.1) and 8.4 (7.0-10.2) years, respectively. Interviews with parents and testing of patients were conducted a median (IQR) of 3.8 (3.2-5.2) and 5.2 (4.3-6.1) years, respectively, after hospitalization.
The most common etiologies of respiratory failure were bronchiolitis or asthma (44%) and pneumonia (37%). Beyond respiratory failure, most patients (72%) had also experienced multiple organ dysfunction syndrome.
Patients had a lower mean estimated IQ, compared with the matched siblings (101.5 vs. 104.3; mean difference, –2.8 [95% confidence interval, –5.4 to –0.2]), and more patients than siblings had an estimated IQ of ≤85 but not of ≤70.
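As a back-of-the-envelope check (not a calculation from the paper), the reported 95% confidence interval implies the standard error of the IQ difference under a standard normal approximation:

```python
# Implied standard error of the reported IQ difference (-2.8,
# 95% CI -5.4 to -0.2), assuming a symmetric normal-approximation
# interval of the form: mean_diff +/- 1.96 * SE.
lower, upper = -5.4, -0.2

# Half-width of the CI divided by the 1.96 critical value gives SE.
se = (upper - lower) / (2 * 1.96)
print(round(se, 2))  # ≈ 1.33 IQ points
```

With that implied standard error, the -2.8-point difference sits just under two standard errors from zero, which is consistent with the authors’ description of a statistically significant but small effect.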
Patients also had significantly lower scores on nonverbal memory, visuospatial skills, and fine motor control (mean differences, –0.9 [–1.6 to –0.3]; –0.9 [–1.8 to –0.1]; and –3.1 [–4.9 to –1.4], respectively), compared with matched siblings. They also had significantly higher scores on processing speed (mean difference, 4.4 [0.2-8.5]). There were no significant differences in the other secondary outcomes.
Differences in scores between patients and siblings varied significantly by age at hospitalization in several tests – for example, Block Design scores in patients were lower than those of siblings for patients hospitalized at <1 year old, versus those hospitalized between ages 4 and 8 years.
“When adjusting for patient age at PICU admission, patient age at testing, sibling age at testing, and duration between hospital discharge and testing, the difference in estimated IQ between patients and siblings remained statistically significantly different,” the authors note.
The investigators point out several limitations, including the fact that “little is known about sibling outcomes after critical illness, nor about whether parenting of siblings or child development differs based on birth order or on relationship between patient critical illness and the birth of siblings. ... If siblings also incur negative effects related to the critical illness, differences between critically ill children and the control siblings would be blunted.”
Despite the statistical significance of the difference between the patients and the matched controls, ultimately, the magnitude of the difference was “small and of uncertain clinical importance,” the authors conclude.
Filling a research gap
Commenting on the findings, Alexandre T. Rotta, MD, professor of pediatrics and chief of the division of pediatric critical care medicine, Duke University Medical Center, Durham, N.C., said the study “addresses an important yet vastly understudied gap: long-term neurocognitive morbidity in children exposed to critical care.”
Dr. Rotta, who is also a coauthor of an accompanying editorial, noted that the fact that the “vast majority of children with an IQ significantly lower than their siblings were under the age of 4 years suggests that the developing immature brain may be particularly susceptible to the effects of critical illness and therapies required to treat it.”
The study “underscores the need to include assessments of long-term morbidity as part of any future trial evaluating interventions in pediatric critical care,” he added.
The study was supported by grants from the Eunice Kennedy Shriver National Institute of Child Health and Human Development for RESTORE-Cognition and by grants for the RESTORE trial from the National Heart, Lung, and Blood Institute and the National Institute of Nursing Research, National Institutes of Health. Dr. Watson and coauthors report no relevant financial relationships. Dr. Rotta has received personal fees from Vapotherm for lecturing and development of educational materials and from Breas US for participation in a scientific advisory board, as well as royalties from Elsevier for editorial work outside the submitted work. His coauthor reports no relevant financial relationships.
A version of this article first appeared on Medscape.com.
FROM JAMA
Mental illness tied to increased dementia risk
Results of a large, longitudinal, population-based study show that individuals hospitalized for a mental health disorder had a fourfold increased relative risk (RR) for developing dementia, compared with those who were not hospitalized with a mental illness.
In addition, those with dementia plus a mental disorder developed dementia almost 6 years earlier than those without a mental illness.
The findings were consistent among men and women, in patients with early- and late-onset dementia, in those with Alzheimer’s and non-Alzheimer’s dementia, and across all mental health disorders – and remained so after accounting for pre-existing physical illness and socioeconomic factors.
“Dementia is not typically treated until later in life, but our study suggests that we need to be thinking about dementia prevention much earlier in the life course,” study investigator Leah Richmond-Rakerd, PhD, assistant professor, department of psychology, University of Michigan, said in an interview.
“Supporting young people’s mental health could be a window of opportunity to help reduce the burden of dementia in older adults,” she said.
The findings were published online Feb. 16.
Underappreciated risk factor
“Recognition of the outsized influence of dementia on later-life functioning has fueled research into modifiable risk factors and prevention targets,” the investigators write.
Previous research suggests mental disorders may “comprise an underappreciated category of modifiable risk factors.” However, those studies focused primarily on midlife and older individuals, not on capturing mental disorders during young adulthood, which is the time of “peak prevalence,” they add. In addition, most studies have not explored the full range of mental disorders.
Dr. Richmond-Rakerd noted that it is well known that mental health disorders peak in adolescence and young adulthood – and are treatable.
“If the same people who have mental disorders when they are young tend to develop dementia when they are older, that would mean that preventing mental health problems in younger people might reduce or delay the burden of dementia in older people,” she said.
The investigators assessed records from the New Zealand Integrated Data Infrastructure, which is a de-identified register that includes the entire New Zealand population. They also examined information about hospitalizations and diagnoses from records kept by the New Zealand Ministry of Health.
The researchers followed 1,711,386 individuals born between 1928 and 1967 (50.6% men, aged 21 to 60 years at baseline) for 30 years. The population was subdivided into age groups based on birth years: 1928-1937 (14.8%), 1938-1947 (20.85%), 1948-1957 (29.35%), and 1958-1967 (35.1%).
Earlier onset
During the study period, 3.8% of individuals were identified as having a mental disorder, and 2% were identified as having dementia. Similar percentages of men and women had a mental disorder, and similar percentages had dementia.
Dementia was “over-represented” among participants with versus without a mental disorder (6.1% vs. 1.8%). This finding held across all age groups.
Those diagnosed with a mental disorder were also more likely to develop dementia, compared with their peers without a mental disorder (RR, 3.51; 95% confidence interval, 3.39-3.64), which is a larger association than that between physical diseases and dementia (RR, 1.19; 95% CI, 1.16-1.21).
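The reported proportions allow a quick sanity check of the relative risk. The sketch below (not from the paper) computes the crude, unadjusted ratio from the 6.1% vs. 1.8% dementia rates; the published RR of 3.51 comes from the authors’ own model:

```python
# Crude relative risk from the reported dementia proportions:
# 6.1% among those hospitalized with a mental disorder vs.
# 1.8% among those without.
p_exposed = 0.061
p_unexposed = 0.018

crude_rr = p_exposed / p_unexposed
print(round(crude_rr, 2))  # ≈ 3.39, close to the reported RR of 3.51
```

The small gap between the crude ratio (about 3.4) and the reported 3.51 reflects the adjustments in the investigators’ analysis.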
These associations were present in both sexes and in all age groups, although the associations were stronger in more recently born cohorts.
A sixfold higher risk for dementia remained even after adjusting for pre-existing physical illnesses (HR, 6.49; 95% CI, 6.25-6.73); and the elevated risk was evident across different lengths of follow-up from the index mental disorder.
When the researchers focused specifically on individuals diagnosed with dementia, they found that those diagnosed with a mental disorder developed dementia a mean of 5.60 years earlier than those without a mental disorder diagnosis – an association observed across both sexes and all age groups.
“Individuals diagnosed with psychotic, substance use, mood, neurotic, and all other mental disorders and who engaged in self-harm were all more likely than those without a mental disorder to be diagnosed with subsequent dementia, even after accounting for their physical disease histories,” the investigators write.
Although mental disorders were linked with both Alzheimer’s and non-Alzheimer’s dementias, the association was larger for non-Alzheimer’s dementia.
The researchers note that the study has several limitations, including the fact that it was conducted in New Zealand and therefore the results may not be generalizable to other regions. In addition, inpatient hospital records do not capture less severe mental disorder cases treated in the outpatient setting.
Dr. Richmond-Rakerd suggested several potential mechanisms that could account for the link between mental illness and dementia, including poor lifestyle choices and metabolic side effects associated with some psychiatric medications.
“There could also be shared risk factors for both mental disorders and dementia, such as shared genetics, or individuals may experience a lifelong brain vulnerability that shows up as mental health problems earlier in life and shows up as dementia later in life,” she said.
An important risk factor
Commenting for this article, Ken Duckworth, MD, chief medical officer of the National Alliance on Mental Illness, said a major strength of the study was its longitudinal scope and large population size.
He described the study as allowing clinicians to “watch the movie,” as opposed to looking at a “snapshot” of data.
“Although you can learn things from snapshots, a large, comprehensive public health system looking at 30 years of claims – something not possible in the U.S. because of our more fragmented health care system – offers more insight,” said Dr. Duckworth, who was not involved with the research.
The investigators are “painting a picture of a correlation of risk, and to me, that’s the beginning of further inquiry,” he added. “Would preventive efforts targeting dementia, such as exercise and socialization, be helpful? It’s a great study that raises these interesting questions.”
Also commenting in an interview, Claire Sexton, DPhil, director of scientific programs and outreach at the Alzheimer’s Association, said the study “adds a wealth of data to our understanding” of mental disorders as a dementia risk factor.
However, the study was observational, so “the findings cannot imply causation, [and just] because someone has depression, that does not mean they will go on to develop Alzheimer’s,” said Dr. Sexton, who also was not involved with the research.
Still, “these data support the idea that taking care of one’s mental health is incredibly important for overall wellbeing. For providers, it’s important to have mental health evaluation be a part of your patient’s regular checkups,” she added.
Dr. Richmond-Rakerd noted that even if mental health conditions are not a causal risk factor for dementia, “the presence of a mental health problem is still an important indicator of risk. Mental health providers may wish to target other risk factors for dementia that are more common in individuals with mental health conditions, such as social disconnection.”
The study was funded by grants from the National Institute on Aging, the U.K. Medical Research Council, the National Institute of Child Health and Human Development through the Duke Population Research Center, and the National Institute on Aging through the Center for Advancing Sociodemographic and Economic Study of Alzheimer’s Disease and Related Dementias. Dr. Richmond-Rakerd reports no relevant financial relationships. The other investigators’ disclosures are listed in the original article. Dr. Sexton and Dr. Duckworth report no relevant financial relationships.
A version of this article first appeared on Medscape.com.
Results of a large, longitudinal, population-based study show that individuals hospitalized for a mental health disorder had a fourfold increased relative risk (RR) for developing dementia, compared with those who were not hospitalized with a mental illness.
In addition, those with dementia plus a mental disorder developed dementia almost 6 years earlier than those without a mental illness.
The findings were consistent among men and women, in patients with early- and late-onset dementia, in those with Alzheimer’s and non-Alzheimer’s dementia, and across all mental health disorders – and remained so after accounting for pre-existing physical illness and socioeconomic factors.
“Dementia is not typically treated until later in life, but our study suggests that we need to be thinking about dementia prevention much earlier in the life course,” study investigator Leah Richmond-Rakerd, PhD, assistant professor, department of psychology, University of Michigan, said in an interview.
“Supporting young people’s mental health could be a window of opportunity to help reduce the burden of dementia in older adults,” she said.
The findings were published online Feb. 16.
Underappreciated risk factor
“Recognition of the outsized influence of dementia on later-life functioning has fueled research into modifiable risk factors and prevention targets,” the investigators write.
Previous research suggests mental disorders may “comprise an underappreciated category of modifiable risk factors.” However, those studies focused primarily on midlife and older individuals, not on capturing mental disorders during young adulthood, which is the time of “peak prevalence,” they add. In addition, most studies have not explored the full range of mental disorders.
Dr. Richmond-Rakerd noted that it is well known that mental health disorders peak in adolescence and young adulthood – and are treatable.
“If the same people who have mental disorders when they are young tend to develop dementia when they are older, that would mean that preventing mental health problems in younger people might reduce or delay the burden of dementia in older people,” she said.
The investigators assessed records from the New Zealand Integrated Data Infrastructure, which is a de-identified register that includes the entire New Zealand population. They also examined information about hospitalizations and diagnoses from records kept by the New Zealand Ministry of Health.
The researchers followed 1,711,386 individuals born between 1928 and 1967 (50.6% men, aged 21 to 60 years at baseline) for 30 years. The population was subdivided into age groups based on birth years: 1928-1937 (14.8%), 1938-1947 (20.85%), 1948-1957 (29.35%), and 1958-1967 (35.1%).
Earlier onset
During the study period, 3.8% of individuals were identified as having a mental disorder, and 2% were identified as having dementia. Similar percentages of men and women had a mental disorder, and similar percentages had dementia.
Dementia was “over-represented” among participants with versus without a mental disorder (6.1% vs. 1.8%). This finding held across all age groups.
Those diagnosed with a mental disorder were also more likely to develop dementia, compared with their peers without a mental disorder (RR, 3.51; 95% confidence interval, 3.39-3.64), which is a larger association than that between physical diseases and dementia (RR, 1.19; 95% CI, 1.16-1.21).
These associations were present in both sexes and in all age groups, although the associations were stronger in more recently born cohorts.
A sixfold higher risk for dementia remained even after adjusting for pre-existing physical illnesses (HR, 6.49; 95% CI, 6.25-6.73); and the elevated risk was evident across different lengths of follow-up from the index mental disorder.
When the researchers focused specifically on individuals diagnosed with dementia, they found that those diagnosed with a mental disorder developed dementia a mean of 5.60 years earlier than those without a mental disorder diagnosis – an association observed across both sexes and all age groups.
“Individuals diagnosed with psychotic, substance use, mood, neurotic, and all other mental disorders and who engaged in self-harm were all more likely than those without a mental disorder to be diagnosed with subsequent dementia, even after accounting for their physical disease histories,” the investigators write.
Although there was a link between mental disorders in both Alzheimer’s and non-Alzheimer’s dementias, the association was larger in non-Alzheimer’s.
The researchers note that the study has several limitations, including the fact that it was conducted in New Zealand and therefore the results may not be generalizable to other regions. In addition, inpatient hospital records do not capture less severe mental disorder cases treated in the outpatient setting.
Dr. Richmond-Rakerd suggested several potential mechanisms that could account for the link between mental illness and dementia, including poor lifestyle choices and metabolic side effects associated with some psychiatric medications.
“There could also be shared risk factors for both mental disorders and dementia, such as shared genetics, or individuals may experience a lifelong brain vulnerability that shows up as mental health problems earlier in life and shows up as dementia later in life,” she said.
An important risk factor
Commenting for this article, Ken Duckworth, MD, chief medical officer of the National Alliance on Mental Illness, said a major strength of the study was its longitudinal scope and large population size.
He described the study as allowing clinicians to “watch the movie,” as opposed to looking at a “snapshot” of data.
“Although you can learn things from snapshots, a large, comprehensive public health system looking at 30 years of claims – something not possible in the U.S. because of our more fragmented health care system – offers more insight,” said Dr. Duckworth, who was not involved with the research.
The investigators are “painting a picture of a correlation of risk, and to me, that’s the beginning of further inquiry,” he added. “Would preventive efforts targeting dementia, such as exercise and socialization, be helpful? It’s a great study that raises these interesting questions.”
Also commenting in an interview, Claire Sexton, DPhil, director of scientific programs and outreach at the Alzheimer’s Association, said the study “adds a wealth of data to our understanding” of mental disorders as a dementia risk factor.
However, the study was observational, so “the findings cannot imply causation, [and just] because someone has depression, that does not mean they will go on to develop Alzheimer’s,” said Dr. Sexton, who also was not involved with the research.
Still, “these data support the idea that taking care of one’s mental health is incredibly important for overall wellbeing. For providers, it’s important to have mental health evaluation be a part of your patient’s regular checkups,” she added.
Dr. Richmond-Rakerd noted that even if mental health conditions are not a causal risk factor for dementia, “the presence of a mental health problem is still an important indicator of risk. Mental health providers may wish to target other risk factors for dementia that are more common in individuals with mental health conditions, such as social disconnection.”
The study was funded by grants from the National Institute on Aging, the U.K. Medical Research Council, the National Institute of Child Health and Development through the Duke Population Research Center, and the National Institute on Aging through the Center for Advancing Sociodemographic and Economic Study of Alzheimer’s Disease and Related Dementias. Dr. Richmond-Rakerd reports no relevant financial relationships. The other investigators’ disclosures are listed in the original article. Dr. Sexton and Dr. Duckworth report no relevant financial relationships.
A version of this article first appeared on Medscape.com.
Results of a large, longitudinal, population-based study show that individuals hospitalized for a mental health disorder had a fourfold increased relative risk (RR) for developing dementia, compared with those who were not hospitalized with a mental illness.
In addition, those with dementia plus a mental disorder developed dementia almost 6 years earlier than those without a mental illness.
The findings were consistent among men and women, in patients with early- and late-onset dementia, in those with Alzheimer’s and non-Alzheimer’s dementia, and across all mental health disorders – and remained so after accounting for pre-existing physical illness and socioeconomic factors.
“Dementia is not typically treated until later in life, but our study suggests that we need to be thinking about dementia prevention much earlier in the life course,” study investigator Leah Richmond-Rakerd, PhD, assistant professor, department of psychology, University of Michigan, said in an interview.
“Supporting young people’s mental health could be a window of opportunity to help reduce the burden of dementia in older adults,” she said.
The findings were published online Feb. 16.
Underappreciated risk factor
“Recognition of the outsized influence of dementia on later-life functioning has fueled research into modifiable risk factors and prevention targets,” the investigators write.
Previous research suggests mental disorders may “comprise an underappreciated category of modifiable risk factors.” However, those studies focused primarily on midlife and older individuals, not on capturing mental disorders during young adulthood, which is the time of “peak prevalence,” they add. In addition, most studies have not explored the full range of mental disorders.
Dr. Richmond-Rakerd noted that it is well known that mental health disorders peak in adolescence and young adulthood – and are treatable.
“If the same people who have mental disorders when they are young tend to develop dementia when they are older, that would mean that preventing mental health problems in younger people might reduce or delay the burden of dementia in older people,” she said.
The investigators assessed records from the New Zealand Integrated Data Infrastructure, which is a de-identified register that includes the entire New Zealand population. They also examined information about hospitalizations and diagnoses from records kept by the New Zealand Ministry of Health.
The researchers followed 1,711,386 individuals born between 1928 and 1967 (50.6% men, aged 21 to 60 years at baseline) for 30 years. The population was subdivided into age groups based on birth years: 1928-1937 (14.8%), 1938-1947 (20.85%), 1948-1957 (29.35%), and 1958-1967 (35.1%).
Earlier onset
During the study period, 3.8% of individuals were identified as having a mental disorder, and 2% were identified as having dementia. Similar percentages of men and women had a mental disorder, and similar percentages had dementia.
Dementia was “over-represented” among participants with versus without a mental disorder (6.1% vs. 1.8%). This finding held across all age groups.
Those diagnosed with a mental disorder were also more likely to develop dementia, compared with their peers without a mental disorder (RR, 3.51; 95% confidence interval, 3.39-3.64), which is a larger association than that between physical diseases and dementia (RR, 1.19; 95% CI, 1.16-1.21).
These associations were present in both sexes and in all age groups, although the associations were stronger in more recently born cohorts.
A sixfold higher risk for dementia remained even after adjusting for pre-existing physical illnesses (HR, 6.49; 95% CI, 6.25-6.73); and the elevated risk was evident across different lengths of follow-up from the index mental disorder.
When the researchers focused specifically on individuals diagnosed with dementia, they found that those diagnosed with a mental disorder developed dementia a mean of 5.60 years earlier than those without a mental disorder diagnosis – an association observed across both sexes and all age groups.
“Individuals diagnosed with psychotic, substance use, mood, neurotic, and all other mental disorders and who engaged in self-harm were all more likely than those without a mental disorder to be diagnosed with subsequent dementia, even after accounting for their physical disease histories,” the investigators write.
Although mental disorders were linked with both Alzheimer’s and non-Alzheimer’s dementias, the association was larger for non-Alzheimer’s dementia.
The researchers note that the study has several limitations, including the fact that it was conducted in New Zealand and therefore the results may not be generalizable to other regions. In addition, inpatient hospital records do not capture less severe mental disorder cases treated in the outpatient setting.
Dr. Richmond-Rakerd suggested several potential mechanisms that could account for the link between mental illness and dementia, including poor lifestyle choices and metabolic side effects associated with some psychiatric medications.
“There could also be shared risk factors for both mental disorders and dementia, such as shared genetics, or individuals may experience a lifelong brain vulnerability that shows up as mental health problems earlier in life and shows up as dementia later in life,” she said.
An important risk factor
Commenting for this article, Ken Duckworth, MD, chief medical officer of the National Alliance on Mental Illness, said a major strength of the study was its longitudinal scope and large population size.
He described the study as allowing clinicians to “watch the movie,” as opposed to looking at a “snapshot” of data.
“Although you can learn things from snapshots, a large, comprehensive public health system looking at 30 years of claims – something not possible in the U.S. because of our more fragmented health care system – offers more insight,” said Dr. Duckworth, who was not involved with the research.
The investigators are “painting a picture of a correlation of risk, and to me, that’s the beginning of further inquiry,” he added. “Would preventive efforts targeting dementia, such as exercise and socialization, be helpful? It’s a great study that raises these interesting questions.”
Also commenting in an interview, Claire Sexton, DPhil, director of scientific programs and outreach at the Alzheimer’s Association, said the study “adds a wealth of data to our understanding” of mental disorders as a dementia risk factor.
However, the study was observational, so “the findings cannot imply causation, [and just] because someone has depression, that does not mean they will go on to develop Alzheimer’s,” said Dr. Sexton, who also was not involved with the research.
Still, “these data support the idea that taking care of one’s mental health is incredibly important for overall wellbeing. For providers, it’s important to have mental health evaluation be a part of your patient’s regular checkups,” she added.
Dr. Richmond-Rakerd noted that even if mental health conditions are not a causal risk factor for dementia, “the presence of a mental health problem is still an important indicator of risk. Mental health providers may wish to target other risk factors for dementia that are more common in individuals with mental health conditions, such as social disconnection.”
The study was funded by grants from the National Institute on Aging, the U.K. Medical Research Council, the National Institute of Child Health and Development through the Duke Population Research Center, and the National Institute on Aging through the Center for Advancing Sociodemographic and Economic Study of Alzheimer’s Disease and Related Dementias. Dr. Richmond-Rakerd reports no relevant financial relationships. The other investigators’ disclosures are listed in the original article. Dr. Sexton and Dr. Duckworth report no relevant financial relationships.
A version of this article first appeared on Medscape.com.
FROM JAMA PSYCHIATRY
Healthy gut tied to better cognition
Investigators conducted cognitive testing and analyzed stool samples in close to 600 adults and found that beta-diversity, which is a between-person measure of gut microbial community composition, was significantly associated with cognitive scores.
Three specific bacterial genera showed a positive association with performance on at least one cognitive test, while one showed a negative association.
“Data from our study support an association between the gut microbial community and measure of cognitive function – results that are consistent with findings from other human and animal research,” study investigator Katie Meyer, ScD, assistant professor, department of nutrition, UNC Gillings School of Public Health, Chapel Hill, N.C., told this news organization.
“However, it is also important to recognize that we are still learning about how to characterize the role of this dynamic ecological community and delineate mechanistic pathways,” she said.
The study was published online Feb. 8 in JAMA Network Open.
‘Novel’ research
“Communication pathways between gut bacteria and neurologic function (referred to as the ‘gut-brain axis’) have emerged as a novel area of research into potential mechanisms regulating brain health through immunologic, metabolic, and endocrine pathways,” the authors wrote.
A number of studies have “shown associations between gut microbial measures and neurological outcomes, including cognitive function and dementia,” but mechanisms underlying these associations “have not been fully established.”
Animal and small-scale human studies have suggested that reduced microbial diversity is associated with poorer cognition, but such studies have not been conducted in large, diverse, community-based populations.
The researchers therefore examined cross-sectional associations of gut microbial diversity and taxonomic composition with cognitive status in a large group of community-dwelling, sociodemographically diverse Black and White adults living in four metropolitan areas who were participants in the Coronary Artery Risk Development in Young Adults (CARDIA) study.
They hypothesized that microbial diversity would be positively associated with global as well as domain-specific cognitive status and that higher cognitive status would be associated with specific taxonomic groups involved in short-chain fatty acid production.
CARDIA’s year 30 follow-up examination took place during 2015-2016, when the original participants ranged in age from 48 to 60 years. During that examination, participants completed a battery of cognitive assessments, and 615 also provided a stool sample for a microbiome substudy; of these, 597 (mean [SD] age, 55.2 [3.5] years; 44.7% Black; 45.2% White) had both stool DNA available for sequencing and a complete complement of cognitive tests and were included in the current study.
The cognitive tests included the Digit Symbol Substitution Test (DSST); Rey-Auditory Verbal Learning Test (RAVLT); the timed Stroop test; letter fluency and category fluency; and the Montreal Cognitive Assessment (MoCA).
Covariates that might confound associations between microbial and cognitive measures, including body mass index, diabetes, age, sex, race, field center, education, physical activity, current smoking, diet quality, number of medications, and hypertension, were included in the analyses.
The investigators conducted three standard microbial analyses: within-person alpha-diversity; between-person beta-diversity; and individual taxa.
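For readers unfamiliar with these terms, alpha-diversity summarizes the microbial community within one sample, while beta-diversity compares communities between samples. The study's specific metrics are not named here, so as a hedged sketch, two commonly used choices, the Shannon index (alpha) and the Bray-Curtis dissimilarity (beta), can be computed from per-taxon counts like this:

```python
import math

def shannon_alpha(counts):
    """Shannon index: within-sample (alpha) diversity from taxon counts."""
    total = sum(counts)
    props = [c / total for c in counts if c > 0]
    return -sum(p * math.log(p) for p in props)

def bray_curtis(x, y):
    """Bray-Curtis dissimilarity: a between-sample (beta) diversity measure.
    0 means identical communities; 1 means no shared taxa."""
    num = sum(abs(a - b) for a, b in zip(x, y))
    den = sum(a + b for a, b in zip(x, y))
    return num / den

sample_a = [40, 30, 20, 10]  # hypothetical taxon counts, person A
sample_b = [10, 10, 40, 40]  # hypothetical taxon counts, person B
print(shannon_alpha(sample_a))          # higher = more even, more diverse community
print(bray_curtis(sample_a, sample_b))  # between-person dissimilarity
```

A principal coordinates analysis, as used in the study, then ordinates the full matrix of such pairwise dissimilarities.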
Potential pathways
The strongest associations were in the variance tests for beta-diversity, which were significant for all cognition measures in multivariable-adjusted principal coordinates analysis (all P = .001, except for the Stroop, P = .007). However, the association with letter fluency was not significant (P = .07).
After fully adjusting for sociodemographic variables, health behaviors, and clinical covariates, the researchers found that three genera were positively associated with cognitive measures, while one was negatively associated.
“The strongest results from our study were from a multivariate analysis that can be considered a test of the overall community,” said Dr. Meyer.
She pointed to several pathways through which gut microbiota can contribute to brain health.
“We know from mechanistic studies in animal models that the gut microbiota is involved in systemic inflammation, which is a risk factor for brain pathology,” she said.
Moreover, “the gut microbiota is involved in the production of metabolites that may impact the brain, including tryptophan metabolites and short-chain fatty acids, many of which derive from dietary components, which may help explain associations between diet and cognition (e.g., the Mediterranean-style diet can be protective),” she added.
Starting point
Commenting for this news organization, Timothy Dinan, MD, PhD, professor of psychiatry and an investigator, APC Microbiome Institute, University College Cork, Ireland, said, “This is an important study, adding to the growing body of evidence that gut microbes influence brain function.”
Dr. Dinan, who was not involved with the study, continued: “In an impressively large sample, an association between cognition and gut microbiota architecture was demonstrated.”
He cautioned that the study “is limited by the fact that it is cross-sectional, and the relationships are correlational.” Nevertheless, “despite these obvious caveats, the paper undoubtedly advances the field.”
Dr. Meyer agreed, noting that there is “a paucity of biomarkers that can be used to predict cognitive decline and dementia,” but because their study was cross-sectional, “we cannot assess temporality (i.e., whether gut microbiota predicts cognitive decline); but, as a start, we can assess associations.”
She added that “at this point, we know far more about modifiable risk factors that have been shown to be positively associated with cognitive function,” including eating a Mediterranean diet and engaging in physical activity.
“It is possible that protective effects of diet and activity may, in part, operate through the gut microbiota,” Dr. Meyer suggested.
The CARDIA study is supported by the National Heart, Lung, and Blood Institute, the Intramural Research Program of the National Institute on Aging, and the University of North Carolina Nutrition Research Institute. Dr. Meyer and coauthors and Dr. Dinan report no relevant financial relationships.
A version of this article first appeared on Medscape.com.
FROM JAMA NETWORK OPEN
Innovative ‘chatbot’ reduces eating disorder risk
Results of a randomized trial show that at-risk women who interacted with the chatbot showed lower concern about their weight and body shape compared to a wait-list control group.
“Chatbots are widely used in industry and have begun to be used in medical settings, although few studies have examined their effectiveness for mental health issues and none address EDs or ED prevention,” senior investigator C. Barr Taylor, MD, a research faculty member at Palo Alto (Calif.) University, said in a press release.
“We found that the group with access to the chatbot had a greater reduction in weight and shape concerns, both right after using it at 3 months and at the 6-month follow-up. The effects had sustainability over time, and we also found indication that the chatbot may reduce ED onset more so than the control group, where there was a greater incidence of EDs,” Dr. Taylor told this news organization.
The study was published online Dec. 28, 2021, in the International Journal of Eating Disorders.
Deadly disorders
“EDs are a common problem with huge risk factors; and, given how widespread they are, we need scalable tools that can reach a lot of people at low cost, reduce risk factors for developing an ED – which is the second most deadly of all psychiatric illnesses – so prevention is of the utmost importance,” Dr. Taylor said.
The investigators developed a targeted Internet-based preventive program called StudentBodies that utilizes cognitive-behavioral therapy approaches. The program was successful in reducing weight/shape concerns in women at high risk for the onset of an ED, and it reduced ED onset in the highest-risk women.
However, it required trained moderators who spent over 45 minutes with participants. Given the large number of people at risk for an ED who might benefit, the researchers noted that it is unlikely that a human-moderated version would be widely disseminated.
A chatbot may represent a “possible solution to reducing delivery costs” because it mimics aspects of human moderation in simulating conversations, the investigators noted.
“We wanted to take the earlier program we developed into this century and program it for delivery in this new format that would allow for bite-size pieces of information for the chatbot to communicate to the user,” lead author Ellen Fitzsimmons-Craft, PhD, assistant professor of psychiatry, Washington University, St. Louis, told this news organization.
“Our ED prevention online version was more effective when there was guidance from a human moderator who could provide feedback on progress, encourage you to go on, and apply the skills in daily life. But that’s not the most scalable. So we thought that a chatbot, in addition to providing content in this perhaps more engaging format, could also provide some aspect of human moderation, although the person is chatting with a robot,” added Dr. Fitzsimmons-Craft, associate director of the Center for Healthy Weight and Wellness.
Tessa will speak to you now
Participants (n = 700 women; mean [SD] age, 21.08 [3.09] years; 84.6% White; 53.8% heterosexual; 31.08% bisexual) were randomized to an intervention group (n = 352) or a wait-list control group (n = 348). There were no significant differences between groups in age, race, ethnicity, education, or sexual orientation.
The StudentBodies program was adapted for delivery via a chatbot named Tessa “while retaining the core intervention principles” and referred to as “Body Positive.”
It consisted of several components programmed into the chatbot, which initiated each conversation in a predetermined order. Participants were encouraged to engage in two conversations weekly. The program included an introduction and eight sessions as well as a crisis module that provided users with a referral to a crisis hotline in case of emergency. Referral was triggered on the basis of “recognized keywords,” such as “hurting myself.”
The researchers used the Weight Concerns Scale questionnaire to assess weight and shape concerns and the Internalization: Thin/Low Body Fat subscale of the Sociocultural Attitudes Toward Appearance Questionnaire–4 to “assess the cognitive aspect of thin-ideal internalization.”
Secondary outcomes tested the hypothesis that the chatbot would be more likely to reduce clinical outcomes (ED psychopathology, depression, and anxiety) and prevent ED onset, compared to the control condition.
Ready for prime time
At 3- and 6-month follow-up, the intervention group showed significantly greater reductions in weight/shape concerns than the control group (d = -.20, P = .03 and d = -.19, P = .04, respectively), although there were no between-group differences in thin-ideal internalization change.
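The d values here are Cohen's d, a standardized mean difference: the gap between group means divided by the pooled standard deviation. A minimal sketch, using made-up change scores rather than the trial's data:

```python
import math

def cohens_d(group1, group2):
    """Cohen's d: standardized mean difference using the pooled SD."""
    n1, n2 = len(group1), len(group2)
    m1 = sum(group1) / n1
    m2 = sum(group2) / n2
    v1 = sum((x - m1) ** 2 for x in group1) / (n1 - 1)
    v2 = sum((x - m2) ** 2 for x in group2) / (n2 - 1)
    pooled_sd = math.sqrt(((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled_sd

# Hypothetical weight/shape-concern change scores (not the study's data):
intervention = [-1.2, -0.8, -1.0, -1.5, -0.9]
control = [-0.6, -0.4, -0.8, -0.5, -0.7]
print(cohens_d(intervention, control))  # negative = larger drop in intervention arm
```

By common convention, |d| around 0.2 is a small effect, which is consistent with the modest but significant differences reported.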
The chatbot intervention was associated with significantly greater reductions in overall ED psychopathology at 3 months (d = -.29, P = .003) compared to the control condition, but not at 6 months.
Notably, the intervention group had significantly higher odds than the control group of remaining nonclinical for EDs at 3- and 6-month follow-up (odds ratio [OR], 2.37 [95% confidence interval, 1.37-4.11] and OR, 2.13 [95% CI, 1.26-3.59], respectively).
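An odds ratio of this kind comes from a 2x2 table of outcome by study arm. The sketch below uses hypothetical cell counts (not the trial's published data) to show the arithmetic, including the Wald confidence interval computed on the log scale:

```python
import math

def odds_ratio(a, b, c, d):
    """Odds ratio for a 2x2 table with 95% Wald CI.
    a, b: intervention arm, remained nonclinical / developed an ED
    c, d: control arm, remained nonclinical / developed an ED
    """
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

# Hypothetical counts (not the study's data): of roughly 350 per arm,
# suppose 330 vs. 310 participants remained nonclinical for an ED.
or_, lo, hi = odds_ratio(330, 20, 310, 40)
print(f"OR = {or_:.2f} (95% CI, {lo:.2f}-{hi:.2f})")
```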
“We were very excited about the study, and frankly, I was surprised by the effectiveness [of the chatbot intervention] because I didn’t think it would have as much of an impact as it did,” said Dr. Taylor. “Prevention gets short shrift everywhere, and I think we succeeded very well.”
Dr. Fitzsimmons-Craft added that the National Eating Disorders Association (NEDA) has agreed to make the chatbot available on its website for people who screen positive for having an ED or for being at high risk, and so their group is working with their industry partner, a company called X2AI, which developed the chatbot, to make this happen.
“This is definitely the fastest research-to-practice translation I’ve ever seen, where we can so quickly show that it works and make it available to tens of thousands almost immediately.”
Dr. Fitzsimmons-Craft is optimistic that it will be available to launch the week of Feb. 21, which is National Eating Disorders Week.
Innovative, creative research
Commenting on the research, Evelyn Attia, MD, professor of psychiatry, Columbia University Medical Center, and director of the Columbia Center for Eating Disorders at New York–Presbyterian Hospital, New York, described the study as “innovative and creative.”
Dr. Attia, a member of the Research Advisory Council of the NEDA, noted that the structure of the study is “very preliminary” and that the comparison to a wait-list control makes it hard to know whether this is an effective intervention compared with other types of interventions, rather than compared with no intervention.
“But I’m sure that when the researchers are set up and primed to study this more robustly, they will consider a more active control intervention to see whether this preliminary finding holds up,” she said.
Also commenting on the study, Deborah R. Glasofer, PhD, associate professor of clinical medical psychology (in psychiatry), Columbia Center for Eating Disorders, said, “Higher-than-average concern about appearance – body shape, size, or weight – and a tightly held belief that it is ideal to be thin are known risk factors for the development of an eating disorder.
“This study offers an indication that technology can be leveraged to fill a gap and help folks before unhelpful and sometimes misguided thoughts about food, eating, and appearance evolve into a full-blown eating disorder,” said Dr. Glasofer, who was not involved with the study.
The study was supported by the NEDA Feeding Hope Fund, the National Institute of Mental Health, the National Heart, Lung, and Blood Institute, and the Swedish Research Council. The authors and Dr. Glasofer have disclosed no relevant financial relationships. Dr. Attia is on the board and the Research Advisory Council of NEDA.
A version of this article first appeared on Medscape.com.
Tessa will speak to you now
Participants (n = 700 women; mean [SD] age, 21.08 [3.09] years; 84.6% White; 53.8% heterosexual; 31.08% bisexual) were randomized to an intervention group or a wait-list control group (n = 352 and 348, respectively). There were no significant differences between groups in age, race, ethnicity, education, or sexual orientation.
The StudentBodies program, referred to as “Body Positive” in its adapted form, was delivered via a chatbot named Tessa “while retaining the core intervention principles.”
It consisted of several components programmed into the chatbot, which initiated each conversation in a predetermined order. Participants were encouraged to engage in two conversations weekly. The program included an introduction and eight sessions as well as a crisis module that provided users with a referral to a crisis hotline in case of emergency. Referral was triggered on the basis of “recognized keywords,” such as “hurting myself.”
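The keyword trigger behind the crisis module can be pictured with a short sketch. This is illustrative only, not the actual Tessa implementation; the keyword list, referral text, and function name are hypothetical (the study mentions only that phrases such as “hurting myself” were among the recognized keywords):

```python
from typing import Optional

# Hypothetical keyword list; the study cites "hurting myself" as one example.
CRISIS_KEYWORDS = {"hurting myself", "suicide", "self-harm"}

# Hypothetical referral message standing in for the crisis hotline information.
CRISIS_REFERRAL = "Please reach out to a crisis hotline for immediate support."

def check_for_crisis(message: str) -> Optional[str]:
    """Return a crisis referral if the message contains a recognized keyword."""
    text = message.lower()
    if any(keyword in text for keyword in CRISIS_KEYWORDS):
        return CRISIS_REFERRAL
    return None
```

In a deployed program, a trigger like this would interrupt the predetermined session order and surface the hotline referral before delivering any further content.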
The researchers used the Weight Concerns Scale questionnaire to assess weight and shape concerns and the Internalization: Thin/Low Body Fat subscale of the Sociocultural Attitudes Toward Appearance Questionnaire–4 to “assess the cognitive aspect of thin-ideal internalization.”
Secondary outcomes tested the hypothesis that, compared with the control condition, the chatbot would be more likely to reduce ED psychopathology, depression, and anxiety and to prevent ED onset.
Ready for prime time
At 3- and 6-month follow-up, there was significantly greater reduction in the intervention group compared with the control group in weight/shape concerns (d = -.20, P = .03 and d = -.19, P = .04, respectively), although there were no differences in thin-ideal internalization change.
The chatbot intervention was associated with significantly greater reductions in overall ED psychopathology at 3 months (d = -.29, P = .003) compared to the control condition, but not at 6 months.
Notably, the intervention group had significantly higher odds than the control group of remaining nonclinical for EDs at 3- and 6-month follow-up (OR, 2.37 [95% confidence interval, 1.37-4.11] and OR, 2.13 [95% CI, 1.26-3.59], respectively).
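For readers less familiar with the statistic, an odds ratio of this kind compares the odds of remaining nonclinical in each group. The sketch below works through the arithmetic with made-up cell counts; the study's actual counts are not reported here, so these numbers are purely illustrative:

```python
import math

# Hypothetical 2x2 table (illustrative counts only, not from the study).
# Columns: remained nonclinical vs. met clinical criteria for an ED.
nonclinical_tx, clinical_tx = 300, 52    # intervention group
nonclinical_ctl, clinical_ctl = 260, 88  # control group

# Odds ratio: odds of remaining nonclinical, intervention vs. control
odds_ratio = (nonclinical_tx / clinical_tx) / (nonclinical_ctl / clinical_ctl)

# Approximate 95% CI from the standard error of the log odds ratio
se = math.sqrt(1 / nonclinical_tx + 1 / clinical_tx
               + 1 / nonclinical_ctl + 1 / clinical_ctl)
ci_low = math.exp(math.log(odds_ratio) - 1.96 * se)
ci_high = math.exp(math.log(odds_ratio) + 1.96 * se)
```

An odds ratio above 1 with a confidence interval that excludes 1, as reported in the study, indicates reliably higher odds of remaining nonclinical in the intervention group.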
“We were very excited about the study, and frankly, I was surprised by the effectiveness [of the chatbot intervention] because I didn’t think it would have as much of an impact as it did,” said Dr. Taylor. “Prevention gets short shrift everywhere, and I think we succeeded very well.”
Dr. Fitzsimmons-Craft added that the National Eating Disorders Association (NEDA) has agreed to make the chatbot available on its website for people who screen positive for an ED or for being at high risk, and that her group is working with industry partner X2AI, the company that developed the chatbot, to make this happen.
“This is definitely the fastest research-to-practice translation I’ve ever seen, where we can so quickly show that it works and make it available to tens of thousands almost immediately.”
Dr. Fitzsimmons-Craft is optimistic that it will be available to launch the week of Feb. 21, which is National Eating Disorders Awareness Week.
Innovative, creative research
Commenting on the research, Evelyn Attia, MD, professor of psychiatry, Columbia University Medical Center, and director of the Columbia Center for Eating Disorders at New York–Presbyterian Hospital, New York, described the study as “innovative and creative.”
Dr. Attia, a member of the Research Advisory Council of the NEDA, noted that the structure of the study is “very preliminary” and that the wait-list comparison makes it hard to know whether the intervention is effective relative to other types of interventions, rather than merely relative to no intervention.
“But I’m sure that when the researchers are set up and primed to study this more robustly, they will consider a more active control intervention to see whether this preliminary finding holds up,” she said.
Also commenting on the study, Deborah R. Glasofer, PhD, associate professor of clinical medical psychology (in psychiatry), Columbia Center for Eating Disorders, said, “Higher-than-average concern about appearance – body shape, size, or weight – and a tightly held belief that it is ideal to be thin are known risk factors for the development of an eating disorder.
“This study offers an indication that technology can be leveraged to fill a gap and help folks before unhelpful and sometimes misguided thoughts about food, eating, and appearance evolve into a full-blown eating disorder,” said Dr. Glasofer, who was not involved with the study.
The study was supported by the NEDA Feeding Hope Fund, the National Institute of Mental Health, the National Heart, Lung, and Blood Institute, and the Swedish Research Council. The authors and Dr. Glasofer have disclosed no relevant financial relationships. Dr. Attia is on the board and the Research Advisory Council of NEDA.
A version of this article first appeared on Medscape.com.