The truth about the ‘happy hormone’: Why we shouldn’t mess with dopamine
Google the word “dopamine” and you will learn that its nicknames are the “happy hormone” and the “pleasure molecule” and that it is among the most important chemicals in our brains. With The Guardian branding it “the Kim Kardashian of neurotransmitters,” dopamine has become a true pop-science darling – people across the globe have attempted to boost their mood with dopamine fasts and dopamine dressing.
A century ago, however, newly discovered dopamine was seen as an uninspiring chemical, nothing more than a precursor of noradrenaline. It took several stubborn and hardworking scientists to change that view.
Levodopa: An indifferent precursor
When Casimir Funk, PhD, a Polish biochemist and the discoverer of vitamins, first synthesized the dopamine precursor levodopa in 1911, he had no idea how important the molecule would prove to be in pharmacology and neurobiology. Nor did Markus Guggenheim, PhD, a Swiss biochemist, who isolated levodopa in 1913 from the seeds of a broad bean, Vicia faba. Dr. Guggenheim administered 1 g of levodopa to a rabbit, with no apparent negative consequences. He then prepared a larger dose (2.5 g) and tested it on himself. “Ten minutes after taking it, I felt very nauseous, I had to vomit twice,” he wrote in his paper. In the body, levodopa is converted into dopamine, which may act as an emetic – an effect Dr. Guggenheim didn’t understand. He simply abandoned his human study, erroneously concluding, on the basis of his animal research, that levodopa is “pharmacologically fairly indifferent.”
Around the same time, several scientists across Europe successfully synthesized dopamine, but those discoveries were shelved without much fanfare. For the next 3 decades, dopamine and levodopa were pushed into academic obscurity. Just before World War II, a group of German scientists showed that levodopa is metabolized to dopamine in the body, while another German researcher, Hermann Blaschko, MD, discovered that dopamine is an intermediary in the synthesis of noradrenaline. Even these findings, however, were not immediately accepted.
The dopamine story picked up pace in the post-war years with the observation that the hormone was present in various tissues and body fluids, although nowhere as abundantly as in the central nervous system. Intrigued, Dr. Blaschko (who, after escaping Nazi Germany, had changed his name to Hugh and started work at Oxford [England] University) hypothesized that dopamine couldn’t be an unremarkable precursor of noradrenaline – it had to have some physiologic functions of its own. He asked his postdoctoral fellow, Oleh Hornykiewicz, MD, to test a few ideas. Dr. Hornykiewicz soon confirmed that dopamine lowered blood pressure in guinea pigs, proving that dopamine indeed had physiologic activity that was independent of other catecholamines.
Reserpine and rabbit ears
While Dr. Blaschko and Dr. Hornykiewicz were puzzling over dopamine’s physiologic role in the body, across the ocean at the National Heart Institute in Maryland, pharmacologist Bernard Brodie, PhD, and colleagues were laying the groundwork for the discovery of dopamine’s starring role in the brain.
Spoiler alert: Dr. Brodie’s work showed that a new psychiatric drug known as reserpine was capable of fully depleting the brain’s stores of serotonin and – of greatest significance, as it turned out – mimicking the neuromuscular symptoms typical of Parkinson’s disease. The connection to dopamine would be made by new lab colleague Arvid Carlsson, MD, PhD, who would go on to win a Nobel Prize.
Derived from Rauwolfia serpentina (a plant that for centuries has been used in India for the treatment of mental illness, insomnia, and snake bites), reserpine was introduced in the West as a treatment for schizophrenia.
It worked marvels. In 1954, the press lauded the “dramatic” and seemingly “incredible” results in treating “hopelessly insane patients.” Reserpine had a downside, however. Reports soon changed in tone, noting the drug’s severe side effects, including headaches, dizziness, vomiting, and, far more disturbingly, symptoms mimicking Parkinson’s disease, from muscular rigidity to tremors.
Dr. Brodie observed that, when injected with reserpine, animals became completely immobile. Serotonin nearly vanished from their brains, but bizarrely, drugs that spur serotonin production did not reverse the animals’ immobility.
Dr. Carlsson realized that other catecholamines must be involved in reserpine’s side effects, and he began to search for the culprits. He moved back to his native Sweden and ordered a spectrophotofluorimeter. In one of his experiments, Dr. Carlsson injected a pair of rabbits with reserpine, which caused the animals to become catatonic with flattened ears. Within 15 minutes of being injected with levodopa, however, the rabbits were hopping around, ears proudly vertical. “We were just as excited as the rabbits,” Dr. Carlsson later recalled in a 2016 interview. Because there was no noradrenaline in the rabbits’ brains, Dr. Carlsson realized, dopamine depletion must have been directly responsible for reserpine’s motor inhibitory effects.
Skeptics are silenced
In 1960, however, the medical community was not yet ready to accept that dopamine was anything but a boring intermediate between levodopa and noradrenaline. At a prestigious London symposium, Dr. Carlsson and his two colleagues presented their hypothesis that dopamine may be a neurotransmitter, thus implicating it in Parkinson’s disease. They were met with harsh criticism. Some of the experts said levodopa was nothing more than a poison. Dr. Carlsson later recalled facing “a profound and nearly unanimous skepticism regarding our points of view.”
That would soon change. Dr. Hornykiewicz, the biochemist who had earlier discovered dopamine’s BP-lowering effects, tested Dr. Carlsson’s ideas using the postmortem brains of patients with Parkinson’s disease. It appeared Dr. Carlsson was right: Unlike in healthy brains, the striatum of patients with Parkinson’s disease contained almost no dopamine whatsoever. Beginning in 1961, in collaboration with neurologist Walther Birkmayer, MD, Dr. Hornykiewicz injected levodopa into 20 patients with Parkinson’s disease and observed a “miraculous” (albeit temporary) amelioration of rigidity, motionlessness, and speechlessness.
By the late 1960s, levodopa and dopamine were making headlines. A 1969 New York Times article described similarly stunning improvements in patients with Parkinson’s disease who were treated with levodopa. A patient who had arrived at a hospital unable to speak, with hands clenched and a rigid expression, was suddenly able to stride into his doctor’s office and even jog around. “I might say I’m a human being,” he told reporters. Although the treatment was expensive – equivalent to $210 in 2022 – physicians were deluged with requests for “dopa.” To this day, levodopa remains a gold standard in the treatment of Parkinson’s disease.
Still misunderstood
The history of dopamine, however, is not only about Parkinson’s disease; it extends to the treatment of schizophrenia and addiction. When, in the 1940s, a French military surgeon started giving a new antihistamine drug, promethazine, to prevent shock in soldiers undergoing surgery, he noticed a bizarre side effect: The soldiers would become euphoric yet oddly calm at the same time.
After the drug was modified by adding a chlorine atom and renamed chlorpromazine, it fast became a go-to treatment for psychosis. At the time, no one made the connection to dopamine. Contemporary doctors believed that it calmed people by lowering body temperature (common treatments for mental illness back in the day included swaddling patients in cold, wet sheets). Yet just like reserpine, chlorpromazine produced a range of nasty side effects that closely mimicked Parkinson’s disease. This led a Dutch pharmacologist, Jacques van Rossum, to hypothesize that dopamine receptor blockade could explain chlorpromazine’s antipsychotic effects – an idea that remains widely accepted today.
In the 1970s, dopamine was linked with addiction through research on rodents, and this novel idea caught people’s imagination over the following decades. A story on dopamine, “How We Get Addicted,” made the cover of Time in 1997.
Yet as the dopamine/addiction connection became widespread, it also became oversimplified. According to a 2015 article in Nature Reviews Neuroscience, a wave of low-quality research followed – nonreplicated and insufficiently rigorous – leading the authors to conclude that we are “addicted to the dopamine theory of addiction.” Just about every pleasure under the sun was being attributed to dopamine, from eating delicious foods and playing computer games to sex, music, and hot showers. As recent science shows, however, dopamine is not simply about pleasure – it’s about reward prediction, response to stress, memory, learning, and even the functioning of the immune system. Since its first synthesis in the early 20th century, dopamine has often been misunderstood and oversimplified – and it seems the story is repeating itself now.
In one of his final interviews, Dr. Carlsson, who passed away in 2018 at the age of 95, warned about playing around with dopamine and, in particular, prescribing drugs that have an inhibitory action on this neurotransmitter. “Dopamine is involved in everything that happens in our brains – all its important functions,” he said.
We should be careful how we handle such a delicate and still little-known system.
A version of this article first appeared on Medscape.com.
How do patients with chronic urticaria fare during pregnancy?
The rates of preterm births and medical problems of newborns in patients with chronic urticaria (CU) are similar to those of the general population and are not linked to treatment used during pregnancy.
Those are the key findings from an analysis of new data from PREG-CU, an international, multicenter study of the Urticaria Centers of Reference and Excellence (UCARE) network. Results from the first PREG-CU analysis, published in 2021, showed that CU improved in about half of patients during pregnancy. “However, two in five patients reported acute exacerbations of CU especially at the beginning and end of pregnancy,” investigators led by Emek Kocatürk, MD, of the department of dermatology and UCARE at Koç University School of Medicine, Istanbul, wrote in the new study, recently published in the Journal of the European Academy of Dermatology and Venereology.
“In addition, 1 in 10 pregnant CU patients required urticaria emergency care and 1 of 6 had angioedema during pregnancy,” they said. Risk factors for worsening CU during pregnancy, they added, were “mild disease and no angioedema before pregnancy, not taking treatment before pregnancy, chronic inducible urticaria, CU worsening during a previous pregnancy, stress as a driver of exacerbations, and treatment during pregnancy.”
Analysis involved 288 pregnant women
To optimize treatment of CU during pregnancy and to better understand how treatment affects pregnancy outcomes, the researchers analyzed 288 pregnancies in 288 women with CU from 13 countries and 21 centers worldwide. Their mean age at pregnancy was 32.1 years, and their mean duration of CU was 84.9 months. Prior to pregnancy, 35.7% of patients rated the severity of their CU symptoms as mild, 34.2% rated it as moderate, and 29.7% rated it as severe.
The researchers found that during pregnancy, 60% of patients used urticaria medication, including standard-dose second-generation H1-antihistamines (35.1%), first-generation H1-antihistamines (7.6%), high-dose second-generation H1-antihistamines (5.6%), and omalizumab (5.6%). The preterm birth rate was 10.2%, which was similar between patients who did and did not receive treatment during pregnancy (11.6% vs. 8.7%, respectively; P = .59).
On multivariate logistic regression, two predictors for preterm birth emerged: giving birth to twins (a 13.3-fold increased risk; P = .016) and emergency referrals for CU (a 4.3-fold increased risk; P = .016). The cesarean delivery rate was 51.3%, and more than 90% of newborns were healthy at birth. There was no link between any patient or disease characteristics or treatments and medical problems at birth.
In other findings, 78.8% of women with CU breastfed their babies. Of the 58 patients who did not breastfeed, 20.7% indicated severe urticaria/angioedema and/or taking medications as the main reason for not breastfeeding.
“Most CU patients use treatment during pregnancy and such treatments, especially second generation H1 antihistamines, seem to be safe during pregnancy regardless of the trimester,” the researchers concluded. “Outcomes of pregnancy in patients with CU were similar compared to the general population and not linked to treatment used during pregnancy. Notably, emergency referral for CU was an independent risk factor for preterm birth,” and the high cesarean delivery rate was “probably linked to comorbidities associated with the disease,” they added. “Overall, these findings suggest that patients should continue their treatments using an individualized dose to provide optimal symptom control.”
International guidelines
The authors noted that international guidelines for the management of urticaria published in 2022 suggest that modern second-generation H1-antihistamines should be used for pregnant patients, preferably loratadine with a possible extrapolation to desloratadine, cetirizine, or levocetirizine.
“Similarly, in this population, we found that cetirizine and loratadine were the most commonly used antihistamines, followed by levocetirizine and fexofenadine,” Dr. Kocatürk and colleagues wrote.
“Guidelines also suggest that the use of first-generation H1-antihistamines should be avoided given their sedative effects; but if these are to be given, it would be wise to know that use of first-generation H1-antihistamines immediately before parturition could cause respiratory depression and other adverse effects in the neonate,” they added, noting that chlorpheniramine and diphenhydramine are the first-generation H1-antihistamines with the greatest evidence of safety in pregnancy.
They acknowledged certain limitations of the analysis, including its retrospective design and the fact that there were no data on low birth weight, small for gestational age, or miscarriage rates. In addition, disease activity or severity during pregnancy and after birth were not monitored.
Asked to comment on these results, Raj Chovatiya, MD, PhD, who directs the center for eczema and itch in the department of dermatology at Northwestern University, Chicago, noted that despite a higher prevalence of CU among females compared with males, very little is known about how the condition is managed during pregnancy. “This retrospective study shows that most patients continue to utilize CU treatment during pregnancy (primarily second-generation antihistamines), with similar birth outcomes as the general population,” he said. “Interestingly, cesarean rates were higher among mothers with CU, and emergency CU referral was a risk factor for preterm birth. While additional prospective studies are needed, these results suggest that CU patients should be carefully managed, particularly during pregnancy, when treatment should be optimized.”
Dr. Kocatürk reported having received personal fees from Novartis, Ibrahim Etem-Menarini, and Sanofi, outside the submitted work. Many coauthors reported having numerous financial disclosures. Dr. Chovatiya disclosed that he is a consultant to, a speaker for, and/or a member of the advisory board for AbbVie, Arcutis, Arena, Incyte, Pfizer, Regeneron, and Sanofi Genzyme.
FROM JEADV
Roselyn Tso confirmed to head Indian Health Service
It took 609 days, but the US Senate has finally (and unanimously) confirmed President Biden’s choice to head the Indian Health Service (IHS): Roselyn Tso.
President Biden nominated Tso in March 2022, and she was formally sworn in on September 27, 2022. The long-awaited confirmation filled a post that had lacked a permanent director since Michael Weahkee, a Pueblo of Zuni citizen, stepped down in 2021. In the interim, Elizabeth Fowler, of the Comanche Nation, served as acting director.
Tso’s resume includes almost 40 years of professional experience working at all levels of the IHS. Before taking over as IHS director, she led the IHS Navajo area, the largest IHS regional area, managing more than 4000 employees and a budget of nearly $1 billion.
She also brings “decades of lived experience as a member of the Navajo Nation,” she said during a 40-minute hearing before the US Senate Committee on Indian Affairs in May.
The first Navajo Nation citizen to head the IHS (and only the second woman to do so), Tso introduced herself in Navajo as Deeschii’nii (Start of the Red Streak People), born for Hashk’aa hadzohi (Yucca Fruit Strung Out). “This is a historic achievement for all of our Navajo people and tribal nations across the country,” Navajo Nation President Jonathan Nez said. “To have one of our own Navajo members in the highest position with IHS is remarkable.”
Tso spoke of having to “navigate the services provided by the Agency for myself, family, and friends.” Her personal and professional backgrounds, she said, help her understand how patients experience the system and how that can be improved. “The health care provided at IHS is critical for those we serve. I understand this not just because I work there,” she said. “My family relies on IHS. My friends rely on IHS. I rely on the IHS.”
The long gap in confirming a permanent IHS director left Native peoples particularly vulnerable as the COVID-19 pandemic worsened the health problems they already faced, such as diabetes mellitus and cancer. Life expectancy for Native people fell by more than 6 years between 2019 and 2021, to 65 years, compared with the US average of 76 years.
Without a full-time IHS leader, the National Council of Urban Indian Health said in a statement, tribal nations and other Native health care providers struggled to raise and address the issues they were facing amid the pandemic. “Since the resignation of Rear Admiral Weahkee, there have been countless requests from Indian Country calling on Congress and the Administration to nominate a new IHS director to address the growing health disparities experienced by AI/ANs.”
Tso laid out her priorities in her May testimony: creating a more unified health care system that uses the latest technology to develop centralized systems; improving accountability, transparency, and patient safety; and addressing workforce needs and challenges, including recruitment and retention.
Meeting her goals, she noted, would take “strong partnerships and communication with our Tribal partners…. Each tribe has unique needs, and those needs cannot be met if you do not understand them.”
Last year, President Biden asked Congress to significantly increase IHS funding, but his proposal was cut to $400 million. “For years, IHS has been funded at a rate that is far below its level of need, and the results of this historical neglect can be seen in the disparities in health outcomes for AI/AN people,” William Smith, of the Valdez Native Tribe, chairman of the National Indian Health Board (NIHB), wrote to the Senate Committee on Indian Affairs on the topic of the next IHS director. “Perhaps one of the greatest challenges facing the [Indian, tribal and urban] system is the chronic and severe underfunding and budgetary instability for health care and public health services infrastructure and delivery. Since its creation in 1955, IHS has been chronically underfunded, with annual appropriations never exceeding 50% of demonstrated need. This underfunding has contributed to substandard investment in health delivery systems, some of the worst health disparities among any US population and a severe lack of public health infrastructure and services for our people. At the start of the COVID-19 pandemic these vulnerabilities were starkly exposed and while Congress moved decisively to invest into Tribal health and public health, the new Director must work to maintain these one-time investments.”
Stacy Bohlen, NIHB chief executive, told The Oklahoman that tribal leaders will look to Tso to press Congress for more money and to secure mandatory full funding for IHS – in contrast with the current annual appropriations, in which Congress includes IHS in much larger budget bills. “When those bills stall, so does the money tribal clinics need to pay employees and suppliers,” she said, making it hard to recruit and retain employees. “In the Indian Health System,” Bohlen said, “we simply can’t afford that kind of vulnerability.”
Securing advance appropriations and, ultimately, full mandatory funding for IHS, Smith wrote in his letter to the Senate committee, “fulfills the commitment made to our people generations ago and breaks down the systemic healthcare funding inequities the federal government tolerates for Tribes.”
Tso emphasized her intent to “improve the physical, mental, social, and spiritual health and well-being of all American Indians and Alaskan Natives served by the Agency.” Tso “understands the healthcare needs that many first people of this country deal with,” President Nez said. “Her work ethic, value system and approach to problem solving demonstrates the resilience of Indigenous peoples and the commitment to combat the systemic inequities that impact tribal nations.”
It took 609 days, but the US Senate has finally (unanimously) confirmed President Biden’s choice to head the Indian Health Service (IHS: Roselyn Tso.)
President Biden nominated Tso in March 2022, and she was formally sworn in on September 27, 2022. The long-awaited confirmation filled a space that hadn’t had a permanent director since Michael Weahkee, a Pueblo of Zuni citizen, stepped down in 2021. In the interim, Elizabeth Fowler, of the Comanche Nation, served as acting director.
Tso’s resume includes almost 40 years of professional experience working at all levels of the IHS. Before taking over as IHS director, she led the IHS Navajo area, the largest IHS regional area, managing more than 4000 employees and a budget of nearly $1 billion.
It took 609 days, but the US Senate has finally (unanimously) confirmed President Biden’s choice to head the Indian Health Service (IHS): Roselyn Tso.
President Biden nominated Tso in March 2022, and she was formally sworn in on September 27, 2022. The long-awaited confirmation filled a space that hadn’t had a permanent director since Michael Weahkee, a Pueblo of Zuni citizen, stepped down in 2021. In the interim, Elizabeth Fowler, of the Comanche Nation, served as acting director.
Tso’s resume includes almost 40 years of professional experience working at all levels of the IHS. Before taking over as IHS director, she led the IHS Navajo area, the largest IHS regional area, managing more than 4000 employees and a budget of nearly $1 billion.
She also brings “decades of lived experience as a member of the Navajo Nation,” as she told the US Senate Committee on Indian Affairs during a 40-minute hearing in May.
The first Navajo Nation citizen to head the IHS (and only the second woman to do so), Tso introduced herself in Navajo: Deeschii’nii (Start of the Red Streak People) and born for Hashk’aa hadzohi (Yucca Fruit Strung Out). “This is a historic achievement for all of our Navajo people and tribal nations across the country,” Navajo Nation President Jonathan Nez said. “To have one of our own Navajo members in the highest position with IHS is remarkable.”
Tso spoke of having to “navigate the services provided by the Agency for myself, family, and friends.” Her personal and professional backgrounds, she said, help her understand how patients experience the system and how that can be improved. “The health care provided at IHS is critical for those we serve. I understand this not just because I work there,” she said. “My family relies on IHS. My friends rely on IHS. I rely on the IHS.”
The long gap in confirming a permanent IHS director left Native peoples particularly vulnerable as the COVID-19 pandemic worsened the health problems they already faced, such as diabetes mellitus and cancer. Life expectancy for Native people fell by more than 6 years between 2019 and 2021, to 65 years, compared with the US average of 76 years.
Without a full-time IHS leader, the National Council of Urban Indian Health said in a statement, tribal nations and other Native health care providers struggled to raise and address the issues they were facing amid the pandemic. “Since the resignation of Rear Admiral Weahkee, there have been countless requests from Indian Country calling on Congress and the Administration to nominate a new IHS director to address the growing health disparities experienced by AI/ANs.”
Tso laid out her priorities in her May testimony: creating a more unified health care system that uses the latest technology to develop centralized systems; improving accountability, transparency, and patient safety; and addressing workforce needs and challenges, including recruitment and retention.
Meeting her goals, she noted, would take “strong partnerships and communication with our Tribal partners…. Each tribe has unique needs, and those needs cannot be met if you do not understand them.”
Last year, President Joseph R. Biden asked Congress to significantly increase IHS funding, but his proposal was cut to $400 million. “For years, IHS has been funded at a rate that is far below its level of need, and the results of this historical neglect can be seen in the disparities in health outcomes for AI/AN people,” William Smith, Valdez Native Tribe, Chairman of the National Indian Health Board (NIHB), wrote to the Senate Committee on Indian Affairs, on the topic of the next IHS director. “Perhaps one of the greatest challenges facing the [Indian, tribal and urban] system is the chronic and severe underfunding and budgetary instability for health care and public health services infrastructure and delivery. Since its creation in 1955, IHS has been chronically underfunded, with annual appropriations never exceeding 50% of demonstrated need. This underfunding has contributed to substandard investment in health delivery systems, some of the worst health disparities among any US population and a severe lack of public health infrastructure and services for our people. At the start of the COVID-19 pandemic these vulnerabilities were starkly exposed and while Congress moved decisively to invest into Tribal health and public health, the new Director must work to maintain these one-time investments.”
Stacy Bohlen, NIHB chief executive, told The Oklahoman that tribal leaders will look to Tso to press Congress for more money and to secure mandatory full funding for IHS—in contrast with the current annual appropriations, in which Congress includes IHS in much larger budget bills. “When those bills stall, so does the money tribal clinics need to pay employees and suppliers,” making it hard to recruit and retain employees. “In the Indian Health System,” Bohlen said, “we simply can’t afford that kind of vulnerability.”
Securing advance appropriations and, ultimately, full mandatory funding for IHS, Smith wrote in his letter to the Senate committee, “fulfills the commitment made to our people generations ago and breaks down the systemic healthcare funding inequities the federal government tolerates for Tribes.”
Tso emphasized her intent to “improve the physical, mental, social, and spiritual health and well-being of all American Indians and Alaskan Natives served by the Agency.” Tso “understands the healthcare needs that many first people of this country deal with,” President Nez said. “Her work ethic, value system and approach to problem solving demonstrates the resilience of Indigenous peoples and the commitment to combat the systemic inequities that impact tribal nations.”
Antioxidant-rich diet may reduce Helicobacter pylori risk
People who eat a balanced diet with sufficient antioxidants from fruits and vegetables may face a reduced risk for Helicobacter pylori infection, according to a new report.
In particular, patients with an H. pylori infection were more likely to score lower on the Dietary Antioxidant Index (DAI), which was created to consider a diet’s entire antioxidant profile.
“Available evidence indicates that diet has an important role in developing H. pylori infection. Therefore, protective dietary factors are important from a public health point of view,” Farzad Shidfar, a professor of nutrition at the Iran University of Medical Sciences, Tehran, and member of the university’s colorectal research center, and colleagues write.
“While some nutritional research has widely focused on single nutrients or foods in diet-disease relations, the overall diet could be more informative because humans typically consume a combination of nutrients and foods,” they write. “Dietary indices such as DAI are one of the approaches for this purpose.”
The study was published online in BMC Gastroenterology.
Measuring antioxidant intake
Previous research has indicated an inverse association between the DAI and inflammatory diseases, the study authors write, including gastric cancer, colorectal cancer, nonalcoholic fatty liver disease, and obesity. Studies have also indicated that H. pylori infection is related to deficiencies in vitamins A, C, and E, which have antioxidant properties.
In a case-control study, the research team compared the dietary intake of 148 patients with H. pylori infection with that of 302 healthy controls. The patients in the H. pylori–positive group were recruited between June 2021 and November 2021 from the gastroenterology clinic at Rasoul-e-Akram Hospital in Tehran, where they were newly diagnosed with active infection and not yet under treatment.
The researchers calculated the DAI based on dietary intake information from a validated, 168-item food frequency questionnaire used in Iran. The participants were asked about their dietary intake based on the average day, week, month, and year. They also discussed serving sizes of food items, and to increase the accuracy of estimates, interviewers showed household measurements or serving sizes to confirm the measurements with participants.
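For readers curious about the arithmetic, indices like the DAI are conventionally computed by standardizing each antioxidant intake against a reference mean and standard deviation and summing the z-scores. The Python sketch below illustrates that idea only; the component list matches the nutrients reported in this study, but the reference values and names are placeholders, not the authors’ actual figures.

```python
# Minimal sketch of a Dietary Antioxidant Index, assuming the common
# "sum of z-scored intakes" formulation. The reference means/SDs below
# are made-up placeholders, not the values used in the study.
COMPONENTS = ["vitamin_a", "vitamin_c", "vitamin_e", "zinc", "selenium", "manganese"]

def dietary_antioxidant_index(intake: dict, ref_mean: dict, ref_sd: dict) -> float:
    """Sum each component's intake standardized to the reference distribution."""
    return sum((intake[c] - ref_mean[c]) / ref_sd[c] for c in COMPONENTS)

intake   = {"vitamin_a": 950, "vitamin_c": 100, "vitamin_e": 17,
            "zinc": 12, "selenium": 65, "manganese": 2.8}
ref_mean = {"vitamin_a": 800, "vitamin_c": 90,  "vitamin_e": 15,
            "zinc": 11, "selenium": 55, "manganese": 2.3}
ref_sd   = {"vitamin_a": 250, "vitamin_c": 30,  "vitamin_e": 5,
            "zinc": 3,  "selenium": 20, "manganese": 0.8}

# Intakes above the reference on every component yield a positive index.
print(round(dietary_antioxidant_index(intake, ref_mean, ref_sd), 2))
```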
The average age of the study participants was 39 years, and about 60% were women. Compared with the healthy controls, those with H. pylori were significantly older, had a higher body mass index, and smoked more.
Overall, patients with H. pylori had a significantly lower intake of vitamin A, vitamin E, manganese, and selenium. Other differences in dietary intake – for vitamin C and zinc – were not significant.
The average total DAI was significantly higher in the healthy controls, at 7.67, compared with 3.57 in the patients with H. pylori. The risk for infection decreased as the DAI, analyzed as a continuous variable, increased.
After adjusting for several variables, the researchers found that participants with less than the median DAI values had an increased risk of developing an H. pylori infection.
“A balanced diet, especially high consumption of fruits and vegetables, might protect people against the consequences of H. pylori infection,” the study authors write. “On the contrary, a diet full of carbohydrates and sweets is related to a higher H. pylori infection prevalence.”
Why a good diet may help combat infection
The findings are consistent with other studies that have noted a higher intake of fruits and vegetables among healthy people compared with those who have H. pylori infections, the study authors write. Animal studies have also indicated that taking vitamins A, C, and E and selenium can lead to a reduction in H. pylori growth.
“Several biologically plausible reasons may explain why dietary antioxidants might be, either directly or indirectly, a protective factor against H. pylori infection,” the researchers write. “It is well-known that antioxidants, with their free radical scavenging activities, can inhibit the growth of H. pylori.”
H. pylori is urease-positive and can synthesize large amounts of urease, producing ammonia that neutralizes gastric acid and allows the bacterium to colonize the stomach epithelium, the study authors write. Vitamin C inhibits urease activity and stimulates granulocytes, macrophages, lymphocytes, and immunoglobulin production. Other nutrients, such as zinc, may inhibit the urease enzyme and prevent H. pylori adhesion to gastric tissue, they write.
“Dietary elements have previously been shown to dramatically alter pathogenic responses to H. pylori infections,” Richard Peek Jr., MD, professor of medicine and director of gastroenterology at Vanderbilt University Medical Center, Nashville, Tenn., told this news organization.
Dr. Peek, who wasn’t involved with this study, and colleagues found that iron deficiency is linked with altered bile metabolism, which can promote H. pylori–induced gastric carcinogenesis.
“The current study is important, as it suggests that shifting to a diet rich in antioxidants may be beneficial in terms of H. pylori infection,” he said.
At the same time, Dr. Peek expressed caution about generalizing the results across populations.
“Most of the persons enrolled in this study were likely infected with H. pylori as children,” he noted. “Therefore, the inverse role of antioxidant-rich diets and H. pylori infection must be interpreted with caution.”
Future studies should confirm the findings in other groups and determine whether antioxidant-rich diets limit the diseases caused by H. pylori infection, Dr. Peek added.
The study was not funded by any research center, and the authors declared no conflicts of interest. Dr. Peek reported no relevant disclosures.
A version of this article first appeared on Medscape.com.
FROM BMC GASTROENTEROLOGY
COMMENT & CONTROVERSY
Misoprostol: Clinical pharmacology in obstetrics and gynecology
ROBERT L. BARBIERI, MD (JULY 2022)
Outcomes from my practice’s pilot study
In his recent editorial, Dr. Barbieri addressed the important topic of office-based cervical ripening prior to inpatient induction of labor. The cervical factor is of prime importance in decreasing the length of labor and increasing the likelihood of vaginal delivery. Patients with an unfavorable cervix (Bishop score of ≤6) are more likely to experience longer labor, infection, and fetal distress, and may end up with an unwanted cesarean delivery. To prevent these outcomes, numerous approaches (mechanical methods such as the double-balloon catheter and laminaria, and misoprostol, among others) have been discussed.
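As context for that cutoff, the classic Bishop score sums five cervical-exam findings, and a total of 6 or less is generally considered unfavorable. The calculator below is an illustrative sketch using the standard scoring table; it is not drawn from Dr. Petrikovsky’s letter.

```python
# Illustrative-only calculator for the classic (unmodified) Bishop score.
# Cutoffs follow the standard scoring table; defer to clinical references.
def bishop_score(dilation_cm, effacement_pct, station, consistency, position):
    score = 0
    # Cervical dilation (0-3 points): closed, 1-2 cm, 3-4 cm, >=5 cm
    score += 0 if dilation_cm < 1 else 1 if dilation_cm <= 2 else 2 if dilation_cm <= 4 else 3
    # Effacement (0-3 points): 0-30%, 40-50%, 60-70%, >=80%
    score += 0 if effacement_pct <= 30 else 1 if effacement_pct <= 50 else 2 if effacement_pct <= 70 else 3
    # Fetal station from -3 to +2 (0-3 points)
    score += 0 if station <= -3 else 1 if station == -2 else 2 if station in (-1, 0) else 3
    # Cervical consistency (0-2 points)
    score += {"firm": 0, "medium": 1, "soft": 2}[consistency]
    # Cervical position (0-2 points)
    score += {"posterior": 0, "mid": 1, "anterior": 2}[position]
    return score

# A closed, firm, posterior cervix at -3 station scores 0: highly unfavorable.
print(bishop_score(0, 20, -3, "firm", "posterior"))  # -> 0
```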
The inclusion criteria for office-based cervical ripening are low-risk patients, singleton pregnancies between 39 and 40 weeks of gestation, and cephalic presentation. The details of inclusion and exclusion criteria have to be determined by each practice individually. Our practice went a step further. We performed a small pilot study to assess the safety and efficacy of office cervical ripening in low-risk primigravid patients with low Bishop scores who were not scheduled for induction in anticipation of labor. Ten primigravid patients with poor Bishop scores (6 or less) were administered 50 µg misoprostol at 39+ weeks of pregnancy in the office setting. Bishop scores were taken twice per week until delivery. In 7 out of 10 patients, the Bishop score became favorable within a week of treatment, and in 3 patients the Bishop score remained the same. Three out of 10 patients experienced self-limited episodes of uterine contractility, and 2 of the patients went into labor within 3 days of using misoprostol. All patients were delivered within 2 weeks of treatment without an induction: 8 delivered vaginally, and 2 by cesarean delivery.2
Cesarean delivery was done for fetal distress (1 case) and prolonged second stage of labor (1 case). All neonates were born in satisfactory condition with Apgar scores between 7 and 10. Our preliminary results demonstrated marked improvement in cervical ripening judged by the Bishop score in 70% of patients.2
A prospective randomized study should be performed with the following agenda:
- Does late pregnancy medical cervical ripening in low-risk patients affect labor course and cesarean delivery rate?
- What is the optimal dose and route of administration of misoprostol?3,4
References
- Barbieri R. Office-based ambulatory cervical ripening prior to inpatient induction of labor. OBG Manag. 2021;33:9-13.
- Petrikovsky B. Should cervical ripening become routine in primigravid low-risk patients? [In press]. Neonat Int Care. 2022;1:4-6.
- Sharami SH, Milani F, Faraji R. Comparison of 25 µg sublingual and 50 µg intravaginal misoprostol for cervical ripening and labor: a randomized controlled equivalence trial. Arch Med. 2014;10:653-656.
- Barbieri R. Misoprostol: clinical pharmacology in obstetrics and gynecology. OBG Manag. 2022;34(7):8-12.
B. Petrikovsky, MD, PhD
New Hyde Park, New York
Dr. Barbieri responds
I appreciate that Dr. Petrikovsky took time from a busy practice to provide our readers with his very innovative idea. I agree with him that a clinical trial is warranted to test the effects of late pregnancy medical cervical ripening in low-risk patients on labor course and birth outcome. Maybe one of our readers will take on the challenge to complete such a trial! ●
Clinical psychoeconomics: Accounting for money matters in psychiatric assessment and treatment
Despite money’s central role in our psychic lives, many trainees – and some seasoned practitioners – skirt around financial issues. Some clinicians confess that inquiring about patients’ finances feels “too personal.” They fear that asking about money could suggest that the clinician is primarily concerned with getting paid. Some clinicians feel that looking into patients’ finances might be unprofessional, outside one’s scope of practice. But it is not.
Trainees often receive little guidance concerning money matters in patients’ lives and treatments, considerations we have labeled clinical psychoeconomics. Considerable evidence suggests that financial concerns often provoke emotional distress and dysfunctional behaviors, and directly influence patients’ health care decisions. Financial issues also influence how clinicians view and react to patients.
We have recently reviewed (and illustrated through case vignettes) how money matters might impact psychiatric assessment, case formulation, treatment planning, and ongoing psychiatric treatments including psychotherapies.1 Consider how money affects people’s lives: Money helps people meet multiple practical, psychological, and social needs by enabling them to obtain food, clothing, shelter, other material goods, services, discretionary time, and opportunities. And money strongly influences relationships. Regardless of poverty or wealth, thoughts and behaviors connected to acquiring, possessing, and disposing of money – and accompanying feelings such as greed, neediness, envy, pride, shame, guilt, and self-satisfaction – often underlie intrapsychic and interpersonal conflicts.
Individuals constantly engage in numerous simultaneous conscious, preconscious, and unconscious neuro-economic trade-offs that determine goals, efforts, and timing. Many are financially influenced. Money influences how virtually all patients seek, receive, and sustain their mental health care including psychotherapy.
Money problems can be associated with insecurity, impotence, feeling unloved, and lack of freedom or subjugation. Individuals may resent how they’re forced to acquire money, and feel shamed or morally injured by their jobs, financial dependence on other family members, public assistance, or their questionable ways of obtaining money.
Impoverished individuals may face choosing between food, housing, medications, and medical care. Domestically abused individuals may reluctantly remain with their abusers, risking physical harm or death rather than face destitution. Some families tolerate severely disabled individuals at home because they rely on their disability checks and caregiver payments. Suicides may turn on how individuals forecast financial repercussions affecting their families. Desires to avoid debt may lead to treatment avoidance.
Individuals with enough money to get by face daily financially related choices involving competing needs, desires, values, and loyalties. They may experience conflicts concerning spending on necessities vs. indulgences or spending on oneself vs. significant others.
Whereas some wealthy individuals may assume unwarranted airs of superiority and entitlement, others may feel guilty about wealth, or fearful that others like them only for their money. Individuals on the receiving end of wealth may feel emotionally and behaviorally manipulated by their benefactors.
Assessment
Assessments should consider how financial matters have shaped patients’ early psychological development as well as their current lives. How do patients’ emotions, thoughts, and behaviors reflect money matters? What money-related pathologies are evident? What aspects of the patient’s “financial world” seem modifiable?
Financial questions should be posed colloquially. Screeners include: “Where do you live?”, “Who’s in the home?”, “How do you (all) manage financially?”, “What do you all do for a living?”, “How do you make ends meet?”, and “What financial problems are you facing?” Clinicians can quickly learn about patients’ financial self-sufficiency, the individuals for whom they bear financial responsibility, and the people, such as relatives, they rely on for support. If patients avoid answering such questions forthrightly, particularly when financial arrangements are “complicated,” clinicians will want to revisit these issues after establishing a firmer alliance, while continuing to wonder about the meaning of the patient’s reluctance.
When conflicts are explicit, patients, families, and couples are fully aware of them but still have difficulty resolving financial disputes. When conflicts are implicit, money problems may be unacknowledged, avoided, denied, or minimized. Conflicts concerning money are often transmitted transgenerationally.
Psychopathological conditions unequivocally linked to money include compulsive shopping, gambling disorders, miserly hoarding, impulse buying, and spending sprees during hypomanic and manic states. Mounting debts may create progressively insurmountable sources of distress. Money can be weaponized to sadistically create enticement, envy, or deprivation. Some monetarily antisocial individuals compromise interpersonal relationships as well as treatments. Individuals with alcohol/substance use disorders may spend so much on substances that little is left for necessities. Financially needy individuals may engage in morally questionable behaviors they might otherwise shun.
Case formulation and treatment planning
Incorporating money matters into case formulations entails demonstrating how financial concerns influenced maladaptive development and distort current attitudes, perceptions, and behaviors.
Concurrently, clinicians should acknowledge patients’ reality-based fiscal decisions, appreciating cultural and family value differences concerning how money should be acquired and spent. Since money often determines frequency and duration of treatment visits, clinicians are ethically obligated to discuss with patients what they might expect from different medications and psychotherapies, and their comparative costs.
Money matters’ impact on psychotherapies
Money matters often affect transference and countertransference reactions. Some reactions stem from how patients and clinicians compare their own financial situations with those of the other.
To help identify and ameliorate money-related countertransference responses, clinicians can reflect on questions such as: “How comfortable are you with people who are much poorer or richer than you are?” “How comfortable are you with impoverished individuals or with multimillionaires or their children?” And “why?” For trainees, all these reactions should be discussed in supervision.
Conclusions
To summarize, four clinical psychoeconomic issues should be routinely assessed and factored into psychiatric case formulations and treatment plans: how financial issues 1) have impacted patients’ psychological development; 2) impact patients’ current lives; 3) are likely to impact access, type, intensity, and duration of treatment visits; and 4) might provoke money-related transference and countertransference concerns.
In advising patients about treatment options, clinicians should discuss each treatment’s relative effectiveness and estimated costs of care. Patients’ decisions will likely be heavily influenced by financial considerations.
Dr. Yager is based in the department of psychiatry, University of Colorado at Denver, Aurora. Dr. Kay is based in the department of psychiatry, Wright State University, Dayton, Ohio. No external funds were received for this project, and the authors have no conflicts to disclose.
Reference
1. Yager J and Kay J. Money matters in psychiatric assessment, case formulation, treatment planning, and ongoing psychotherapy: Clinical psychoeconomics. J Nerv Ment Dis. 2022 Jun 10. doi: 10.1097/NMD.0000000000001552.
Despite money’s central role in our psychic lives, many trainees – and some seasoned practitioners – skirt around financial issues. Some clinicians confess that inquiring about patients’ finances feels “too personal.” They fear that asking about money could suggest that the clinician is primarily concerned with getting paid. Some clinicians feel that looking into patients’ finances might be unprofessional, outside one’s scope of practice. But it is not.
Trainees often receive little guidance concerning money matters in patients’ lives and treatments, considerations we have labeled clinical psychoeconomics. Considerable evidence suggests that financial concerns often provoke emotional distress and dysfunctional behaviors, and directly influence patient’s health care decisions. Financial issues also influence how clinicians view and react to patients.
We have recently reviewed (and illustrated through case vignettes) how money matters might impact psychiatric assessment, case formulation, treatment planning, and ongoing psychiatric treatments including psychotherapies.1 Consider how money affects people’s lives: Money helps people meet multiple practical, psychological, and social needs by enabling them to obtain food, clothing, shelter, other material goods, services, discretionary time, and opportunities. And money strongly influences relationships. Regardless of poverty or wealth, thoughts and behaviors connected to acquiring, possessing, and disposing of money, and feelings accompanying these processes such as greed, neediness, envy, pride, shame, guilt, and self-satisfaction often underly intrapsychic and interpersonal conflicts.
Individuals constantly engage in numerous simultaneous conscious, preconscious, and unconscious neuro-economic trade-offs that determine goals, efforts, and timing. Many are financially influenced. Money influences how virtually all patients seek, receive, and sustain their mental health care including psychotherapy.
Money problems can be associated with insecurity, impotence, feeling unloved, and lack of freedom or subjugation. Individuals may resent how they’re forced to acquire money, and feel shamed or morally injured by their jobs, financial dependence on other family members, public assistance, or their questionable ways of obtaining money.
Impoverished individuals may face choosing between food, housing, medications, and medical care. Domestically abused individuals may reluctantly remain with their abusers, risking physical harm or death rather than face destitution. Some families tolerate severely disabled individuals at home because they rely on their disability checks and caregiver payments. Suicides may turn on how individuals forecast financial repercussions affecting their families. Desires to avoid debt may lead to treatment avoidance.
Individuals with enough money to get by face daily financially related choices involving competing needs, desires, values, and loyalties. They may experience conflicts concerning spending on necessities vs. indulgences or spending on oneself vs. significant others.
Whereas some wealthy individuals may assume unwarranted airs of superiority and entitlement, others may feel guilty about wealth, or fearful that others like them only for their money. Individuals on the receiving end of wealth may feel emotionally and behaviorally manipulated by their benefactors.
Assessment
Assessments should consider how financial matters have shaped patients’ early psychological development as well their current lives. How do patients’ emotions, thoughts, and behaviors reflect money matters? What money-related pathologies are evident? What aspects of the patient’s “financial world” seem modifiable?
Financial questions should be posed colloquially. Screeners include: “Where do you live?”, “Who’s in the home?”, “How do you (all) manage financially?”, “What do you all do for a living?”, “How do you make ends meet?”, and “What financial problems are you facing?” Clinicians can quickly learn about patients’ financial self-sufficiencies, individuals for whom they bear financial responsibility, and others they rely on for support, for example, relatives. If patients avoid answering such questions forthrightly, particularly when financial arrangements are “complicated,” clinicians will want to revisit these issues later after establishing a firmer alliance but continue to wonder about the meaning of the patient’s reluctance.
When explicit, patients, families, and couples are fully aware of the conflicts but have difficulty resolving financial disputes. When conflicts are implicit, money problems may be unacknowledged, avoided, denied, or minimized. Conflicts concerning money are often transmitted trans-generationally.
Psychopathological conditions unequivocally linked to money include compulsive shopping, gambling disorders, miserly hoarding, impulse buying, and spending sprees during hypomanic and manic states. Mounting debts may create progressively insurmountable sources of distress. Money can be weaponized to sadistically create enticement, envy, or deprivation. Some monetarily antisocial individuals compromise interpersonal relationships as well as treatments. Individuals with alcohol/substance use disorders may spend so much on substances that little is left for necessities. Financially needy individuals may engage in morally questionable behaviors they might otherwise shun.
Case formulation and treatment planning
Incorporating money matters into case formulations entails demonstrating how financial concerns influenced maladaptive development and distort current attitudes, perceptions, and behaviors.
Concurrently, clinicians should acknowledge patients’ reality-based fiscal decisions, appreciating cultural and family value differences concerning how money should be acquired and spent. Since money often determines frequency and duration of treatment visits, clinicians are ethically obligated to discuss with patients what they might expect from different medications and psychotherapies, and their comparative costs.
Money matters’ impact on psychotherapies
Money matters often affect transference and countertransference reactions. Some reactions stem from how patients and clinicians compare their own financial situations with those of the other.
To help identify and ameliorate money-related countertransference responses, clinicians can reflect on questions such as: “How comfortable are you with people who are much poorer or richer than you are?” “How comfortable are you with impoverished individuals or with multimillionaires or their children?” And “why?” For trainees, all these reactions should be discussed in supervision.
Conclusions
To summarize, four clinical psychoeconomic issues should be routinely assessed and factored into psychiatric case formulations and treatment plans: how financial issues 1) have impacted patients’ psychological development; 2) impact patients’ current lives; 3) are likely to impact access, type, intensity, and duration of treatment visits; and 4) might provoke money-related transference and countertransference concerns.
In advising patients about treatment options, clinicians should discuss each treatment’s relative effectiveness and estimated costs of care. Patients’ decisions will likely be heavily influenced by financial considerations.
Dr. Yager is based in the department of psychiatry, University of Colorado at Denver, Aurora. Dr. Kay is based in the department of psychiatry, Wright State University, Dayton, Ohio. No external funds were received for this project, and the authors have no conflicts to disclose.
Reference
1. Yager J and Kay J. Money matters in psychiatric assessment, case formulation, treatment planning, and ongoing psychotherapy: Clinical psychoeconomics. J Nerv Ment Dis. 2022 Jun 10. doi: 10.1097/NMD.0000000000001552.
Despite money’s central role in our psychic lives, many trainees – and some seasoned practitioners – skirt around financial issues. Some clinicians confess that inquiring about patients’ finances feels “too personal.” They fear that asking about money could suggest that the clinician is primarily concerned with getting paid. Some clinicians feel that looking into patients’ finances might be unprofessional, outside one’s scope of practice. But it is not.
Trainees often receive little guidance concerning money matters in patients’ lives and treatments, considerations we have labeled clinical psychoeconomics. Considerable evidence suggests that financial concerns often provoke emotional distress and dysfunctional behaviors, and directly influence patient’s health care decisions. Financial issues also influence how clinicians view and react to patients.
We have recently reviewed (and illustrated through case vignettes) how money matters might impact psychiatric assessment, case formulation, treatment planning, and ongoing psychiatric treatments including psychotherapies.1 Consider how money affects people’s lives: Money helps people meet multiple practical, psychological, and social needs by enabling them to obtain food, clothing, shelter, other material goods, services, discretionary time, and opportunities. And money strongly influences relationships. Regardless of poverty or wealth, thoughts and behaviors connected to acquiring, possessing, and disposing of money, and feelings accompanying these processes such as greed, neediness, envy, pride, shame, guilt, and self-satisfaction often underly intrapsychic and interpersonal conflicts.
Individuals constantly engage in numerous simultaneous conscious, preconscious, and unconscious neuro-economic trade-offs that determine goals, efforts, and timing. Many are financially influenced. Money influences how virtually all patients seek, receive, and sustain their mental health care including psychotherapy.
Money problems can be associated with insecurity, impotence, feeling unloved, and lack of freedom or subjugation. Individuals may resent how they’re forced to acquire money, and feel shamed or morally injured by their jobs, financial dependence on other family members, public assistance, or their questionable ways of obtaining money.
Impoverished individuals may face choosing between food, housing, medications, and medical care. Domestically abused individuals may reluctantly remain with their abusers, risking physical harm or death rather than face destitution. Some families tolerate severely disabled individuals at home because they rely on their disability checks and caregiver payments. Suicides may turn on how individuals forecast financial repercussions affecting their families. Desires to avoid debt may lead to treatment avoidance.
Individuals with enough money to get by face daily financially related choices involving competing needs, desires, values, and loyalties. They may experience conflicts concerning spending on necessities vs. indulgences or spending on oneself vs. significant others.
Whereas some wealthy individuals may assume unwarranted airs of superiority and entitlement, others may feel guilty about wealth, or fearful that others like them only for their money. Individuals on the receiving end of wealth may feel emotionally and behaviorally manipulated by their benefactors.
Assessment
Assessments should consider how financial matters have shaped patients’ early psychological development as well their current lives. How do patients’ emotions, thoughts, and behaviors reflect money matters? What money-related pathologies are evident? What aspects of the patient’s “financial world” seem modifiable?
Financial questions should be posed colloquially. Screeners include: “Where do you live?”, “Who’s in the home?”, “How do you (all) manage financially?”, “What do you all do for a living?”, “How do you make ends meet?”, and “What financial problems are you facing?” Clinicians can quickly learn about patients’ financial self-sufficiencies, individuals for whom they bear financial responsibility, and others they rely on for support, for example, relatives. If patients avoid answering such questions forthrightly, particularly when financial arrangements are “complicated,” clinicians will want to revisit these issues later after establishing a firmer alliance but continue to wonder about the meaning of the patient’s reluctance.
When explicit, patients, families, and couples are fully aware of the conflicts but have difficulty resolving financial disputes. When conflicts are implicit, money problems may be unacknowledged, avoided, denied, or minimized. Conflicts concerning money are often transmitted trans-generationally.
Psychopathological conditions unequivocally linked to money include compulsive shopping, gambling disorders, miserly hoarding, impulse buying, and spending sprees during hypomanic and manic states. Mounting debts may create progressively insurmountable sources of distress. Money can be weaponized to sadistically create enticement, envy, or deprivation. Some monetarily antisocial individuals compromise interpersonal relationships as well as treatments. Individuals with alcohol/substance use disorders may spend so much on substances that little is left for necessities. Financially needy individuals may engage in morally questionable behaviors they might otherwise shun.
Case formulation and treatment planning
Incorporating money matters into case formulations entails demonstrating how financial concerns influenced maladaptive development and distort current attitudes, perceptions, and behaviors.
Concurrently, clinicians should acknowledge patients’ reality-based fiscal decisions, appreciating cultural and family value differences concerning how money should be acquired and spent. Since money often determines frequency and duration of treatment visits, clinicians are ethically obligated to discuss with patients what they might expect from different medications and psychotherapies, and their comparative costs.
Money matters’ impact on psychotherapies
Money matters often affect transference and countertransference reactions. Some reactions stem from how patients and clinicians compare their own financial situations with those of the other.
To help identify and ameliorate money-related countertransference responses, clinicians can reflect on questions such as: “How comfortable are you with people who are much poorer or richer than you are?” “How comfortable are you with impoverished individuals, or with multimillionaires and their children?” and, in each case, “Why?” For trainees, these reactions should be discussed in supervision.
Conclusions
To summarize, four clinical psychoeconomic issues should be routinely assessed and factored into psychiatric case formulations and treatment plans: how financial issues 1) have impacted patients’ psychological development; 2) impact patients’ current lives; 3) are likely to impact access, type, intensity, and duration of treatment visits; and 4) might provoke money-related transference and countertransference concerns.
In advising patients about treatment options, clinicians should discuss each treatment’s relative effectiveness and estimated costs of care. Patients’ decisions will likely be heavily influenced by financial considerations.
Dr. Yager is based in the department of psychiatry, University of Colorado at Denver, Aurora. Dr. Kay is based in the department of psychiatry, Wright State University, Dayton, Ohio. No external funds were received for this project, and the authors have no conflicts to disclose.
Reference
1. Yager J and Kay J. Money matters in psychiatric assessment, case formulation, treatment planning, and ongoing psychotherapy: Clinical psychoeconomics. J Nerv Ment Dis. 2022 Jun 10. doi: 10.1097/NMD.0000000000001552.
Long-term antidepressant use tied to an increase in CVD, mortality risk
The investigators drew on 10-year data from the UK Biobank on over 220,000 adults and compared the risk of developing adverse health outcomes among those taking antidepressants with the risk among those who were not taking antidepressants.
After adjusting for preexisting risk factors, they found that 10-year antidepressant use was associated with a twofold higher risk of CHD, an almost-twofold higher risk of CVD as well as CVD mortality, a higher risk of cerebrovascular disease, and more than double the risk of all-cause mortality.
On the other hand, at 10 years, antidepressant use was associated with a 23% lower risk of developing hypertension and a 32% lower risk of diabetes.
The main culprits were mirtazapine, venlafaxine, duloxetine, and trazodone, although SSRIs were also tied to increased risk.
“Our message for clinicians is that prescribing of antidepressants in the long term may not be harm free [and] we hope that this study will help doctors and patients have more informed conversations when they weigh up the potential risks and benefits of treatments for depression,” study investigator Narinder Bansal, MD, honorary research fellow, Centre for Academic Mental Health and Centre for Academic Primary Care, University of Bristol (England), said in a news release.
“Regardless of whether the drugs are the underlying cause of these problems, our findings emphasize the importance of proactive cardiovascular monitoring and prevention in patients who have depression and are on antidepressants, given that both have been associated with higher risks,” she added.
The study was published online in the British Journal of Psychiatry Open.
Monitoring of CVD risk ‘critical’
Antidepressants are among the most widely prescribed drugs; 70 million prescriptions were dispensed in 2018 alone, representing a doubling of prescriptions for these agents in a decade, the investigators noted. “This striking rise in prescribing is attributed to long-term treatment rather than an increased incidence of depression.”
Most trials that have assessed antidepressant efficacy have been “poorly suited to examining adverse outcomes.” One reason for this is that many of the trials are short-term studies. Since depression is “strongly associated” with CVD risk factors, “careful assessment of the long-term cardiometabolic effects of antidepressant treatment is critical.”
Moreover, information about “a wide range of prospectively measured confounders ... is needed to provide robust estimates of the risks associated with long-term antidepressant use,” the authors noted.
The researchers examined the association between antidepressant use and four cardiometabolic morbidity outcomes – diabetes, hypertension, cerebrovascular disease, and CHD. In addition, they assessed two mortality outcomes – CVD mortality and all-cause mortality. Participants were divided into cohorts on the basis of outcome of interest.
The dataset contains detailed information on socioeconomic status, demographics, anthropometric, behavioral, and biochemical risk factors, disability, and health status and is linked to datasets of primary care records and deaths.
The study included 222,121 participants whose data had been linked to primary care records during 2018 (median age of participants, 56-57 years). About half were women, and 96% were of White ethnicity.
Participants were excluded if they had been prescribed antidepressants 12 months or less before baseline, if they had previously been diagnosed with the outcome of interest, if they had previously been prescribed psychotropic drugs, if they used cardiometabolic drugs at baseline, or if they had undergone treatment with antidepressant polytherapy.
Potential confounders included age, gender, body mass index, waist/hip ratio, smoking and alcohol intake status, physical activity, parental history of outcome, biochemical and hematologic biomarkers, socioeconomic status, and long-term illness, disability, or infirmity.
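Hazard ratios of this kind are typically estimated with a covariate-adjusted Cox proportional hazards model, although the study’s exact modeling code is not described here. The minimal sketch below, using the Python lifelines package on simulated data, shows the general shape of such an adjustment; every column name and number is hypothetical.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter  # pip install lifelines

# Toy cohort standing in for the real data; all names and values are made up.
rng = np.random.default_rng(42)
n = 2000
df = pd.DataFrame({
    "antidepressant": rng.integers(0, 2, n),   # exposure of interest (0/1)
    "age": rng.normal(56, 8, n),               # example confounders
    "bmi": rng.normal(27, 4, n),
    "smoker": rng.integers(0, 2, n),
})
raw_time = rng.exponential(30, n)              # toy time-to-event, in years
df["years"] = np.minimum(raw_time, 10.0)       # administrative censoring at 10 y
df["event"] = (raw_time < 10.0).astype(int)    # 1 = outcome occurred, 0 = censored

cph = CoxPHFitter()
cph.fit(df, duration_col="years", event_col="event")  # adjusts for all other columns
hr = cph.summary.loc["antidepressant", "exp(coef)"]   # adjusted hazard ratio
print(f"adjusted HR for antidepressant use: {hr:.2f}")
```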
Mechanism unclear
By the end of the 5- and 10-year follow-up periods, an average of 8% and 6% of participants in each cohort, respectively, had been prescribed an antidepressant. SSRIs constituted the most commonly prescribed class (80%-82%), and citalopram was the most commonly prescribed SSRI (46%-47%). Mirtazapine was the most frequently prescribed non-SSRI antidepressant (44%-46%).
At 5 years, any antidepressant use was associated with an increased risk for diabetes, CHD, and all-cause mortality, but the findings were attenuated after further adjustment for confounders. In fact, SSRIs were associated with a reduced risk of diabetes at 5 years (hazard ratio, 0.64; 95% confidence interval, 0.49-0.83).
At 10 years, SSRIs were associated with an increased risk of cerebrovascular disease, CVD mortality, and all-cause mortality; non-SSRIs were associated with an increased risk of CHD, CVD, and all-cause mortality.
On the other hand, SSRIs were associated with a decrease in risk of diabetes and hypertension at 10 years (HR, 0.68; 95% CI, 0.53-0.87; and HR, 0.77; 95% CI, 0.66-0.89, respectively).
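For readers connecting these hazard ratios to the percentage figures quoted earlier, the arithmetic is simply (1 − HR) × 100. A one-off check in Python:

```python
# The 32% and 23% reductions quoted earlier are just (1 - HR) x 100.
for outcome, hr in {"diabetes": 0.68, "hypertension": 0.77}.items():
    print(f"{outcome}: {(1 - hr) * 100:.0f}% lower hazard (HR {hr})")
# diabetes: 32% lower hazard (HR 0.68)
# hypertension: 23% lower hazard (HR 0.77)
```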
“While we have taken into account a wide range of pre-existing risk factors for cardiovascular disease, including those that are linked to depression such as excess weight, smoking, and low physical activity, it is difficult to fully control for the effects of depression in this kind of study, partly because there is considerable variability in the recording of depression severity in primary care,” said Dr. Bansal.
“This is important because many people taking antidepressants such as mirtazapine, venlafaxine, duloxetine and trazodone may have a more severe depression. This makes it difficult to fully separate the effects of the depression from the effects of medication,” she said.
Further research “is needed to assess whether the associations we have seen are genuinely due to the drugs; and, if so, why this might be,” she added.
Strengths, limitations
Commenting on the study, Roger McIntyre, MD, professor of psychiatry and pharmacology and head of the mood disorders psychopharmacology unit at the University of Toronto, discussed the strengths and weaknesses of the study.
The UK Biobank is a “well-described, well-phenotyped dataset of good quality,” said Dr. McIntyre, chairperson and executive director of the Brain and Cognition Discovery Foundation, Toronto, who was not involved with the study. Another strength is the “impressive number of variables the database contains, which enabled the authors to go much deeper into the topics.”
A “significant limitation” is the confounding that is inherent to the disorder itself – “people with depression have a much higher intrinsic risk of CVD, [cerebrovascular disease], and cardiovascular mortality,” Dr. McIntyre noted.
The researchers did not adjust for trauma or childhood maltreatment, “which are the biggest risk factors for both depression and CVD; and drug and alcohol misuse were also not accounted for.”
Additionally, “to determine whether something is an association or potentially causative, it must satisfy the Bradford-Hill criteria,” said Dr. McIntyre. “Since we’re moving more toward using these big databases and because we depend on them to give us long-term perspectives, we would want to see coherent, compelling Bradford-Hill criteria regarding causation. If you don’t have any, that’s fine too, but then it’s important to make clear that there is no clear causative line, just an association.”
The research was funded by the National Institute for Health Research School for Primary Care Research and was supported by the NIHR Biomedical Research Centre at University Hospitals Bristol and Weston NHS Foundation Trust and the University of Bristol. Dr. McIntyre has received research grant support from CIHR/GACD/National Natural Science Foundation of China and the Milken Institute and speaker/consultation fees from numerous companies. Dr. McIntyre is CEO of Braxia Scientific.
A version of this article first appeared on Medscape.com.
FROM THE BRITISH JOURNAL OF PSYCHIATRY OPEN
Eating earlier offers health benefits, studies say
New research suggests there may be better times during the day for eating and fasting.
Eating earlier in the day may help you lose weight, and eating meals within a 10-hour window could improve blood sugar and cholesterol levels, according to two new studies published in Cell Metabolism.
“You have this internal biological clock that makes you better at doing different things at different times of the day,” Courtney Peterson, PhD, an associate professor of nutrition sciences at the University of Alabama at Birmingham, told NBC News. Dr. Peterson wasn’t involved with the studies.
“It seems like the best time for your metabolism, in most people, is the mid to late morning,” she said.
In one study, researchers found that eating later in the day made people hungrier during a 24-hour period, as compared with eating the same meals earlier in the day. Combined, the changes may increase the risk for obesity, the study authors found.
In another study, among firefighters as shift workers, researchers found that eating meals within a 10-hour window decreased the size of bad cholesterol particles, which could reduce risk factors for heart disease. The 10-hour eating window also improved blood pressure and blood sugar levels among those with health conditions such as diabetes, high blood pressure, and high cholesterol.
The two new studies confirm findings from previous studies that indicate humans may have an ideal eating window based on the body’s circadian rhythms, which regulate sleep and wake cycles and can affect appetite, metabolism, and blood sugar levels.
In the firefighter study, for instance, the 10-hour window appears to be a “sweet spot” for the body, the authors found. More severe restrictions, as found with many intermittent fasting diets, could be difficult for the body to maintain.
“When we think about 6 or 8 hours, you might see a benefit, but people might not stick to it for a long time,” Satchidananda Panda, PhD, one of the study authors and a professor at the Salk Institute, La Jolla, Calif., told NBC News.
The new studies had small sample sizes, though they offer insight for future research. In the first study, 16 people who were overweight or obese tried two eating plans for 24-hour periods. Some of them began eating an hour after their natural wake-up time, and others waited to begin eating until about 5 hours after waking up. They ate the same meals with the same calories and nutrients.
The researchers measured their hormone levels and found that eating later decreased the levels of leptin, which helps people to feel full. Eating later also doubled the odds that people felt hungry throughout the day. Those in the study who ate later in the day also had more cravings for starchy or salty foods, as well as meat and dairy, which are energy-dense foods.
The research team also found changes in fat tissue, which could lead to a higher chance of building up new fat cells and a lower chance of burning fat. Late eaters burned about 60 fewer calories than early eaters during the day.
“Your body processes calories differently when you eat late in the day. It tips the scale in favor of weight gain and fat gain,” Dr. Peterson said. “From this study, we can get pretty clear recommendations that people shouldn’t skip breakfast.”
The second study followed 137 firefighters in San Diego who ate a Mediterranean diet with fish, vegetables, fruit, and olive oil for 12 weeks. Among those, 70 firefighters ate during a 10-hour window, and the rest ate during a longer window, generally about 13 hours. They logged their meals in an app and wore devices to track blood sugar levels.
In the 10-hour group, most firefighters ate between 8 a.m. or 9 a.m. and 6 p.m. or 7 p.m. The time-restricted eating appeared to be linked with health benefits, such as less buildup of harmful cholesterol and a reduced risk of heart disease.
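The logging app’s internals are not described in the study, but the bookkeeping it implies is simple: a day’s eating window is the span between the first and last logged intake. A toy sketch, with all names and times hypothetical:

```python
from datetime import datetime

def eating_window_hours(meal_times: list[datetime]) -> float:
    """Hours between the first and last logged intake of the day."""
    return (max(meal_times) - min(meal_times)).total_seconds() / 3600

# One hypothetical logged day: breakfast, lunch, dinner.
day = [
    datetime(2022, 10, 3, 8, 15),
    datetime(2022, 10, 3, 12, 30),
    datetime(2022, 10, 3, 18, 45),
]
hours = eating_window_hours(day)
print(f"{hours:.1f}-hour window; within the 10-hour target: {hours <= 10}")
```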
Among firefighters with risk factors for heart disease, such as high blood pressure and high blood sugar, the time-restricted eating decreased their blood pressure and blood sugar levels.
The restricted window appears to allow the body to break down toxins and get rid of sodium and other things that can drive up blood pressure and blood sugar, the authors wrote.
During periods of fasting, “organs get some rest from digesting food so they can divert their energy toward repairing cells,” Dr. Panda said.
A version of this article first appeared on WebMD.com.
FROM CELL METABOLISM
‘Plethora’ of new MCL treatment options
Specific research needs include comparative studies of novel treatment combinations like ibrutinib plus venetoclax, which has shown singular promise in clinical trials, and further investigation of emerging immunotherapies like bi-specific T-cell engagers (BiTEs), said review author Mubarak Al-Mansour, MD.
The review article, published online in Clinical Lymphoma, Myeloma & Leukemia, includes a proposed treatment algorithm based on the latest data.
“Since the introduction of [Bruton’s tyrosine kinase] inhibitors, the treatment algorithm and response of R/RMCL patients have dramatically changed. Nevertheless, Bruton's tyrosine kinase resistance is common, which necessitated further investigations to develop novel agents with a more durable response,” explained Dr. Al-Mansour, a medical oncologist at Princess Noorah Oncology Center, Jeddah, Saudi Arabia.
Modest clinical activity and tolerability observed with novel agents that targeted B-cell receptor signaling led to investigation of combination strategies in preclinical and early clinical settings, in order to assess whether more durable response rates could be achieved than with single-agent therapy, he said.
“[Of] these combinations, ibrutinib plus venetoclax had the highest response rates in the setting of clinical trials, even in high-risk patients,” Dr. Al-Mansour noted.
Other promising therapies include chimeric antigen receptor (CAR) T-cell therapies (CAR-T) and BiTEs, which “appear to be powerful agents in the therapeutic arsenals of R/RMCL, especially among heavily pretreated patients,” he said, adding, however, that “further investigations are still warranted to assess the clinical activity of CAR-T or BiTEs therapies in combination with other agents.”
Comparative studies also will be needed to assess the relative advantages of various treatment approaches, he said.
These investigations are important given the generally short duration of remission among patients with MCL, which now accounts for between 2% and 6% of all non-Hodgkin lymphoma cases, an incidence that has risen steadily over the past few decades, Dr. Al-Mansour pointed out.
Although many patients achieve an adequate response in the upfront treatment setting, with overall response rates ranging from 60% to 97%, remission is generally short-lived, and the rapid relapses that occur pose a challenge. Additionally, most patients are elderly and have a poor prognosis: Reported progression-free survival in older patients ranges from 2 to 3 years and median overall survival ranges from 28.8 to 52 months, compared with 62 and 139 months, respectively, in young, fit patients, he said.
Furthermore, there is no consensus on the best treatment options in the relapsed/refractory setting, and international guidelines vary widely, he added.
For the current review, Dr. Al-Mansour conducted an online bibliographic search for relevant clinical trial data and meeting abstracts published through the end of March 2022. The data addressed treatment pathways, resistance mechanisms, various approved and investigational agents and treatments used alone or in combination regimens, and stem cell transplant (SCT).
Based on the evidence, Dr. Al-Mansour proposed the following “general algorithm” for the management of R/RMCL:
“Fit patients should be categorized according to their time until disease progression into early (< 24 months) and late (> 24 months) groups. In patients with early progression of the disease, Bruton's tyrosine kinase inhibitors should be offered. Other alternatives should be offered in case of relapse or failure, including CAR-T, [allogeneic-SCT (allo-SCT)], or enrollment in a clinical trial.”
For patients with late disease progression, the algorithm calls for offering Bruton's tyrosine kinase inhibitors, rituximab-bendamustine–based chemotherapy, or rituximab-lenalidomide.
“Other alternatives should be offered in case of relapse or failure, including CAR-T, allo-SCT, or enrollment in a clinical trial. Unfit patients can be offered Bruton's tyrosine kinase inhibitors, considering CAR-T or enrollment in a clinical trial in case of failure.”
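Because the quoted algorithm is a small decision tree, it can be transcribed almost mechanically. The sketch below is an illustrative rendering of the quoted text only, with hypothetical function and option names; it is not a clinical decision tool.

```python
def rrmcl_options(fit: bool, months_to_progression: float) -> list[str]:
    """Schematic transcription of the review's proposed R/R MCL algorithm."""
    if not fit:
        # Unfit patients: BTK inhibitor first; CAR-T or a trial on failure.
        return ["BTK inhibitor", "CAR-T or clinical trial (on failure)"]
    on_failure = ["CAR-T", "allo-SCT", "clinical trial enrollment"]
    if months_to_progression < 24:
        # Early progression (< 24 months): BTK inhibitor up front.
        return ["BTK inhibitor"] + [f"{o} (on failure)" for o in on_failure]
    # Late progression (> 24 months): broader front-line choices.
    front_line = ["BTK inhibitor",
                  "rituximab-bendamustine-based chemotherapy",
                  "rituximab-lenalidomide"]
    return front_line + [f"{o} (on failure)" for o in on_failure]

print(rrmcl_options(fit=True, months_to_progression=18))
```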
Dr. Al-Mansour also noted COVID-19 pandemic–related caveats for the management of R/RMCL.
“Recent epidemiological figures demonstrated that cancer patients are at excessive risk of severe COVID-19. In the case of hematological malignancies, patients are usually on immunosuppressants, which further increase the risk of severe disease and death,” he wrote.
For this reason, and because current treatments consist mainly of targeted agents, which “exert negative effects on patients’ humoral and cell-mediated immunity,” the timing and schedules of treatment regimens should be determined with consideration of COVID-19–related risks, he advised.
FROM CLINICAL LYMPHOMA, MYELOMA & LEUKEMIA
A Patient With Recurrent Immune Stromal Keratitis and Adherence Challenges
Herpes simplex keratitis (HSK) is a common yet potentially blinding condition caused by a primary or reactivated herpetic infection of the cornea.1 The Herpetic Eye Disease Study established the standard of care in HSK management.2 Treatments range from oral antivirals and artificial tears to topical antibiotics, amniotic membranes, and corneal transplantation.3 Patients with immune stromal keratitis (ISK) may experience low-grade chronic keratitis for years.4 ISK is classified by a cellular and neovascularization infiltration of the cornea.5 We present a case of a patient with recurrent ISK and review its presentation, diagnosis, and management.
Case Presentation
A 52-year-old man presented to the eye clinic with a watery and itchy right eye with mildly blurred vision. His ocular history was unremarkable. His medical history was notable for hepatitis C, hypertension, alcohol and drug dependence, homelessness, and a COVID-19–induced coma. His medications included trazodone, nifedipine, clonidine HCl, and buprenorphine/naloxone.
On clinical examination, the patient’s best-corrected visual acuity was 20/40 in the right eye and 20/20 in the left. Corneal sensitivity was absent in the right eye and intact in the left. Anterior segment findings in the right eye included 360-degree superficial corneal neovascularization, deep neovascularization temporally, scattered patches of corneal haze, epithelial irregularity, and 2+ diffuse bulbar conjunctival injection (Figure 1). The anterior segment of the left eye and the posterior segments of both eyes were unremarkable. The differential diagnosis included HSK, syphilis, Cogan syndrome, varicella-zoster virus keratitis, Epstein-Barr virus keratitis, and Lyme disease. With consultation from a corneal specialist, the patient was given the presumptive diagnosis of ISK in the right eye based on the unilateral corneal presentation and lack of corneal sensitivity. He was treated with topical prednisolone drops and oral valacyclovir.
The patient returned a week later having only used the prednisolone drops for 2 days before discontinuing. Examination showed no change in his corneal appearance from the previous week. The patient was counseled on the importance of adherence to the regimen of topical prednisolone and oral valacyclovir.
The patient followed up 2 weeks later. He reported good adherence to the ISK medication regimen. His symptoms had resolved, and his visual acuity returned to 20/20 in the right eye. Slit-lamp examination showed improvement in injection, and the superficial corneal neovascularization had cleared. A trace ghost vessel was seen temporally at a site of deep neovascularization (Figure 2). He was instructed to continue valacyclovir once daily and prednisolone drops once daily in the right eye and to follow up in 1 month.
At the 1-month follow-up, the patient’s signs and symptoms had reverted to his original presentation. The patient reported poor adherence to the medication regimen, having missed multiple doses of prednisolone drops as well as valacyclovir. The patient was counseled again on the ISK regimen, and the prednisolone drops and 1-g oral valacyclovir were refilled. A follow-up visit was scheduled for 2 weeks. Additional follow-up revealed a resolved corneal appearance, and bimonthly follow-ups were scheduled thereafter.
Discussion
HSK is the most common infectious cause of unilateral blindness and vision impairment in the world.2 This case highlights the diagnosis and management of a patient with ISK, a type of HSK characterized by decreased corneal sensitivity and unilateral stromal opacification or neovascularization.6
ISK is caused by the herpes simplex virus (HSV), a double-stranded, enveloped DNA virus that occurs worldwide with little variation, replicates in many cell types, grows rapidly, and is cytolytic, causing necrosis of nearby cells. Transmission occurs via direct contact, and the virus establishes lifelong latency in the trigeminal ganglia. Both primary and reactivated HSV infections can affect a broad array of ocular structures, from the lids to the retina. Infectious epithelial keratitis, also known as dendritic keratitis, represents reactivation of live virus and is the most common presentation of HSK. ISK accounts for 20% to 48% of recurrent HSV disease and is the leading cause of vision loss. ISK results from an immune-mediated inflammatory response to retained viral antigen within the stromal tissue.7 Inflammation in the corneal stroma leads to corneal haze and eventually focal or diffuse scarring, reducing visual potential.7 This presentation may occur days to years after the initial epithelial episode and may persist for years. Although this patient did not present with infectious epithelial keratitis, he may have had a previous episode that went unreported: his history was difficult to obtain, and an initial episode can be subtle or innocuous, resembling pink eye.
Symptoms of ISK include unilateral redness, photophobia, tearing, eye pain, and blurred vision, as this patient described. On examination, initial manifestations of ISK include corneal haze, edema, scarring, and neovascularization.7 This patient likewise presented with edema and neovascularization. These signs may improve with prompt diagnosis and treatment. More frequent reactivations increase the propensity for corneal scarring and irregular astigmatism, worsening the visual outcome.
The standard of care established by the Herpetic Eye Disease Study recommends that a patient with presumed ISK be started on oral antiviral therapy and, in the absence of epithelial disease, topical steroids. Oral antivirals, such as acyclovir and valacyclovir, have good ocular penetration, a good safety profile, and a low susceptibility to resistance, and they are well tolerated with long-term treatment.2,8 There were no known interactions between any of the patient’s medications and valacyclovir. Oral antivirals should be used at the initial presentation and for maintenance therapy to help reduce the chance of recurrent disease. Initial treatment for ISK is 1 g of valacyclovir 3 times daily. When the eye becomes quiet, the dosage can be tapered to 1 g twice daily, then 1 g once daily, and eventually to a maintenance dose of 500 mg daily. Topical steroids block the inflammatory cascade, thereby reducing corneal inflammation and potential scarring and further lowering the risk of visual impairment.9 Initial treatment is 1 drop 3 times daily, which can then be tapered on the same schedule as the oral antiviral to help simplify adherence for the patient. After 1 drop once daily, the steroid may be discontinued while the oral antiviral maintenance dosage continues. Follow-ups should be performed on a monthly to bimonthly basis to evaluate intraocular pressure and ensure there is no steroid response.
As seen in this patient, adherence to the treatment regimen, and awareness of factors such as a complex psychosocial history that may affect that adherence, are of utmost importance.7
Conclusions
ISK presents unilaterally with decreased or absent corneal sensitivity and nonspecific symptoms. It should be high on the differential diagnosis for any patient with unilateral corneal edema, opacification, or neovascularization, and such patients should be started on oral antiviral therapy.
1. Sibley D, Larkin DFP. Update on herpes simplex keratitis management. Eye (Lond). 2020;34(12):2219-2226. doi:10.1038/s41433-020-01153-x
2. Chodosh J, Ung L. Adoption of innovation in herpes simplex virus keratitis. Cornea. 2020;39(1)(suppl 1):S7-S18. doi:10.1097/ICO.0000000000002425
3. Pérez-Bartolomé F, Botín DM, de Dompablo P, de Arriba P, Arnalich Montiel F, Muñoz Negrete FJ. Post-herpes neurotrophic keratopathy: pathogenesis, clinical signs and current therapies. Arch Soc Esp Oftalmol. 2019;94(4):171-183. doi:10.1016/j.oftal.2019.01.002
4. Holland EJ, Schwartz GS. Classification of herpes simplex virus keratitis. Cornea. 1999;18(2):144-154.
5. Gauthier AS, Noureddine S, Delbosc B. Interstitial keratitis diagnosis and treatment. J Fr Ophtalmol. 2019;42(6):e229-e237. doi:10.1016/j.jfo.2019.04.001
6. Farooq AV, Shukla D. Herpes simplex epithelial and stromal keratitis: an epidemiologic update. Surv Ophthalmol. 2012;57(5):448-462. doi:10.1016/j.survophthal.2012.01.005
7. Wang L, Wang R, Xu C, Zhou H. Pathogenesis of herpes stromal keratitis: immune inflammatory response mediated by inflammatory regulators. Front Immunol. 2020;11:766. Published 2020 May 13. doi:10.3389/fimmu.2020.00766
8. Tyring SK, Baker D, Snowden W. Valacyclovir for herpes simplex virus infection: long-term safety and sustained efficacy after 20 years’ experience with acyclovir. J Infect Dis. 2002;186(suppl 1):S40-S46. doi:10.1086/342966
9. Dawson CR. The herpetic eye disease study. Arch Ophthalmol. 1990;108(2):191-192. doi:10.1001/archopht.1990.01070040043027