‘Dr. Caveman’ had a leg up on amputation


Monkey see, monkey do (advanced medical procedures)

We don’t tend to think too kindly of our prehistoric ancestors. We throw around the word “caveman” – hardly a term of endearment – and depictions of Paleolithic humans rarely flatter their subjects. In many ways, though, our conceptions are correct. Humans of the Stone Age lived short, often brutish lives, but civilization had to start somewhere, and our prehistoric ancestors were often far more capable than we give them credit for.

Tim Maloney/Nature

Case in point is a recent discovery from an archaeological dig in Borneo: A young adult who lived 31,000 years ago was discovered with the lower third of their left leg amputated. Save the clever retort about the person’s untimely death, because this individual did not die from the surgery. The amputation occurred when the individual was a child and the subject lived for several years after the operation.

Amputation is usually unnecessary given our current level of medical technology, but it’s actually quite an advanced procedure, and this example predates the previous earliest known case of amputation by nearly 25,000 years. Not only did the surgeon need to cut at an appropriate place, they also needed to understand blood loss, the risk of infection, and the need to preserve skin to seal the wound back up. That’s quite a lot for our Paleolithic doctor to know, and it’s even more impressive considering the, shall we say, limited tools they would have had available to perform the operation.

Rocks. They cut off the leg with a rock. And it worked.

This discovery also gives insight into the amputee’s society. Someone knew that amputation was the right move for this person, indicating that it had been done before. In addition, the individual would not have been able to spring back into action hunting mammoths right away; they would have required care for the rest of their life. And clearly the community provided, given the individual’s continued life after the operation and their burial in a place of honor.

If only the American health care system were capable of such feats of compassion, but that would require the majority of politicians to be as clever as cavemen. We’re not hopeful about those odds.
 

The first step is admitting you have a crying baby. The second step is … a step

Knock, knock.

Who’s there?

Crying baby.

Crying baby who?

Current Biology/Ohmura et al.

Crying baby who … umm … doesn’t have a punchline. Let’s try this again.

A priest, a rabbi, and a crying baby walk into a bar and … nope, that’s not going to work.

Why did the crying baby cross the road? Ugh, never mind.

Clearly, crying babies are no laughing matter. What crying babies need is science. And the latest innovation – it’s fresh from a study conducted at the RIKEN Center for Brain Science in Saitama, Japan – in the science of crying babies is … walking. Researchers observed 21 unhappy infants and compared their responses to four strategies: being held by their walking mothers, held by their sitting mothers, lying in a motionless crib, or lying in a rocking cot.

The best strategy is for the mother – the experiment only involved mothers, but the results should apply to any caregiver – to pick up the crying baby, walk around for 5 minutes, sit for another 5-8 minutes, and then put the infant back to bed, the researchers said in a written statement.

The walking strategy, however, isn’t perfect. “Walking for 5 minutes promoted sleep, but only for crying infants. Surprisingly, this effect was absent when babies were already calm beforehand,” lead author Kumi O. Kuroda, MD, PhD, explained in a separate statement from the center.

It also doesn’t work on adults. We could not get a crying LOTME writer to fall asleep no matter how long his mother carried him around the office.
 

 

 

New way to detect Parkinson’s has already passed the sniff test

We humans aren’t generally known for our superpowers, but a woman from Scotland may just be the Smelling Superhero. Not only was she able to literally smell Parkinson’s disease (PD) on her husband 12 years before his diagnosis; she is also the reason that scientists have found a new way to test for PD.

© Siri Stafford/Thinkstock

Joy Milne, a retired nurse, told the BBC that her husband “had this musty rather unpleasant smell especially round his shoulders and the back of his neck and his skin had definitely changed.” She put two and two together after he had been diagnosed with PD and she came in contact with others with the same scent at a support group.

Researchers at the University of Manchester, working with Ms. Milne, have now created a skin test that uses mass spectrometry to analyze a sample of the patient’s sebum in just 3 minutes with 95% accuracy. Using this method, they tested 79 people with Parkinson’s and 71 without and found “specific compounds unique to PD sebum samples when compared to healthy controls. Furthermore, we have identified two classes of lipids, namely, triacylglycerides and diglycerides, as components of human sebum that are significantly differentially expressed in PD,” they said in JACS Au.

This test could be available to general physicians within 2 years, which would provide new options for the people waiting in line for neurologic consults. Ms. Milne’s husband passed away in 2015, but her courageous help and amazing nasal abilities may help millions down the line.
 

The power of flirting

It’s a common office stereotype: Women flirt with the boss to get ahead in the workplace, while men in power sexually harass women in subordinate positions. Nobody ever suspects the guys in the cubicles. A recent study takes a different look and paints a different picture.

Mart Production/Pexels

The investigators conducted multiple online and lab experiments on how social sexual identity drives behavior in a workplace setting in relation to job position. They found that it was most often men in lower-power positions, insecure about their roles, who initiated social sexual behavior, even though they knew it was offensive. Why? Power.

They randomly paired over 200 undergraduate students into male/female duos, placed them in subordinate and boss roles, and asked them to choose from a series of social sexual questions they wanted to ask their teammate. Male participants placed in subordinate positions to a female boss chose social sexual questions more often than did male bosses, female subordinates, and female bosses.

So what does this say about the threat of workplace harassment? The researchers found that men and women differ in their strategy for flirtation. For men, it’s a way to gain more power. But problems arise when they rationalize their behavior with a character trait like being a “big flirt.”

“When we take on that identity, it leads to certain behavioral patterns that reinforce the identity. And then, people use that identity as an excuse,” lead author Laura Kray of the University of California, Berkeley, said in a statement from the school.

The researchers make a point to note that the study isn’t about whether flirting is good or bad, nor are they suggesting that people in powerful positions don’t sexually harass underlings. It’s meant to provide insight to improve corporate sexual harassment training. A comment or conversation held in jest could potentially be a warning sign for future behavior.



Targeted anti-IgE therapy found safe and effective for chronic urticaria


MILAN – The therapeutic value of inhibiting the activity of IgE in patients with chronic spontaneous urticaria (CSU) was reinforced by two large phase 3 trials with ligelizumab, a drug characterized as a new generation anti-IgE monoclonal antibody.

Both doses of ligelizumab evaluated met the primary endpoint of superiority to placebo for a complete response at 16 weeks of therapy, reported Marcus Maurer, MD, director of the Urticaria Center for Reference and Excellence at the Charité Hospital, Berlin.

The data from the two identically designed trials, PEARL 1 and PEARL 2, were presented at the annual congress of the European Academy of Dermatology and Venereology. The two ligelizumab experimental arms (72 mg or 120 mg administered subcutaneously every 4 weeks) and the active comparator arm of omalizumab (300 mg administered subcutaneously every 4 weeks) demonstrated similar efficacy, with all three highly superior to placebo.

The data show that “another anti-IgE therapy – ligelizumab – is effective in CSU,” Dr. Maurer said.

“While the benefit was not different from omalizumab, ligelizumab showed remarkable results in disease activity and by demonstrating just how many patients achieved what we want them to achieve, which is to have no more signs and symptoms,” he added.
 

Majority of participants with severe urticaria

All of the patients entered into the two trials had severe (about 65%) or moderate (about 35%) symptoms at baseline. The results of the two trials were almost identical. Across the randomized arms, a weekly Urticaria Activity Score (UAS7) of 0, which was the primary endpoint, was achieved at week 16 by 31.0% of those receiving 72-mg ligelizumab, 38.3% of those receiving 120-mg ligelizumab, and 34.1% of those receiving omalizumab (Xolair). The placebo response rate was 5.7%.

The UAS7 score is drawn from two components, wheals and itch. The range is 0 (no symptoms) to 42 (most severe). At baseline, patients’ average scores were about 30, which correlates with a substantial symptom burden, according to Dr. Maurer.

The mean reduction in the UAS7 score in PEARL 2, which differed from PEARL 1 by no more than 0.4 points for any treatment group, was 19.2 points in the 72-mg ligelizumab group, 19.3 points in the 120-mg ligelizumab group, 19.6 points in the omalizumab group, and 9.2 points in the placebo group. There were no significant differences between any active treatment arm.

Complete symptom relief, meaning a UAS7 score of 0, was selected as the primary endpoint because, Dr. Maurer said, this is the goal of treatment. Although he acknowledged that a UAS7 score of 0 is analogous to a PASI 100 response in psoriasis (complete clearing), he said, “Chronic urticaria is a debilitating disease, and we want to eliminate the symptoms. Gone is gone.”

Combined, the two phase 3 trials represent “the biggest chronic urticaria program ever,” according to Dr. Maurer. The 1,034 patients enrolled in PEARL 1 and the 1,023 enrolled in PEARL 2 were randomized in a 3:3:3:1 ratio with placebo representing the smaller group.

The planned follow-up is 52 weeks, but the placebo group will be switched to 120 mg ligelizumab every 4 weeks at the end of 24 weeks. The switch is required because “you cannot maintain patients with this disease on placebo over a long period,” Dr. Maurer said.
 

 

 

Ligelizumab associated with low discontinuation rate

Adverse events overall and stratified by severity have been similar across treatment arms, including placebo. The possible exception was a lower rate of moderate events (16.5%) in the placebo arm relative to the 72-mg ligelizumab arm (19.8%), the 120-mg ligelizumab arm (21.6%), and the omalizumab arm (22.3%). Discontinuations because of an adverse event were under 4% in every treatment arm.

Although Dr. Maurer did not present outcomes at 52 weeks, he did note that “only 15% of those who enrolled in these trials have discontinued treatment.” He considered this remarkable given that the study was conducted in the midst of the COVID-19 pandemic, and it appears that at least some of those who left the trial did so because of concerns about clinic visits.

Despite the similar benefit provided by ligelizumab and omalizumab, Dr. Maurer said that subgroup analyses are coming. The possibility that some patients benefit more from one than the other cannot yet be ruled out. There are also, as of yet, no data to determine whether at least some patients respond to one after an inadequate response to the other.

Still, given the efficacy and the safety of ligelizumab, Dr. Maurer indicated that the drug is likely to find a role in routine management of CSU if approved.

“We only have two options for chronic spontaneous urticaria. There are antihistamines, which do not usually work, and omalizumab,” he said. “It is very important we develop more treatment options.”

Adam Friedman, MD, professor and chair of dermatology, George Washington University, Washington, agreed.

“More therapeutic options, especially for disease states that have a small armament – even if equivalent in efficacy to established therapies – is always a win for patients as it almost always increases access to treatment,” Dr. Friedman said in an interview.

“Furthermore, the heterogeneous nature of inflammatory skin diseases is often not captured in even phase 3 studies. Therefore, having additional options could offer relief where previous therapies have failed,” he added.

Dr. Maurer reports financial relationships with more than 10 pharmaceutical companies, including Novartis, which is developing ligelizumab. Dr. Friedman has a financial relationship with more than 20 pharmaceutical companies but has no current financial association with Novartis and was not involved in the PEARL 1 and 2 trials.


Dr. Maurer reports financial relationships with more than 10 pharmaceutical companies, including Novartis, which is developing ligelizumab. Dr. Friedman has a financial relationship with more than 20 pharmaceutical companies but has no current financial association with Novartis and was not involved in the PEARL 1 and 2 trials.

MILAN – The therapeutic value of inhibiting the activity of IgE in patients with chronic spontaneous urticaria (CSU) was reinforced by two large phase 3 trials with ligelizumab, a drug characterized as a new generation anti-IgE monoclonal antibody.

Both doses of ligelizumab evaluated met the primary endpoint of superiority to placebo for a complete response at 16 weeks of therapy, reported Marcus Maurer, MD, director of the Urticaria Center for Reference and Excellence at the Charité Hospital, Berlin.

The data from the two identically designed trials, PEARL 1 and PEARL 2, were presented at the annual congress of the European Academy of Dermatology and Venereology. The two ligelizumab experimental arms (72 mg or 120 mg administered subcutaneously every 4 weeks) and the active comparative arm of omalizumab (300 mg administered subcutaneously every 4 weeks) demonstrated similar efficacy, all three of which were highly superior to placebo.

The data show that “another anti-IgE therapy – ligelizumab – is effective in CSU,” Dr. Maurer said.

“While the benefit was not different from omalizumab, ligelizumab showed remarkable results in disease activity and by demonstrating just how many patients achieved what we want them to achieve, which is to have no more signs and symptoms,” he added.
 

Majority of participants with severe urticaria

All of the patients entered into the two trials had severe (about 65%) or moderate (about 35%) symptoms at baseline. The results of the two trials were almost identical. In the randomization arms, a weekly Urticaria Activity Score (UAS7) of 0, which was the primary endpoint, was achieved at week 16 by 31.0% of those receiving 72-mg ligelizumab, 38.3% of those receiving 120-mg ligelizumab, and 34.1% of those receiving omalizumab (Xolair). The placebo response was 5.7%.

The UAS7 score is drawn from two components, wheals and itch. The range is 0 (no symptoms) to 42 (most severe). At baseline, patients’ average scores were about 30, indicating a substantial symptom burden, according to Dr. Maurer.
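The arithmetic behind that 0–42 range can be sketched as follows. This is an illustrative helper based on the standard UAS7 convention (wheals and itch each graded 0–3 per day, the daily sums added over 7 days), not code from the trials; the function name `uas7` is hypothetical.

```python
def uas7(daily_scores):
    """Compute a weekly UAS7 score.

    daily_scores: list of 7 (wheals, itch) tuples, each component
    graded 0 (none) to 3 (severe). Daily score is 0-6; the weekly
    total therefore ranges from 0 to 42.
    """
    if len(daily_scores) != 7:
        raise ValueError("UAS7 requires exactly 7 daily scores")
    for wheals, itch in daily_scores:
        if not (0 <= wheals <= 3 and 0 <= itch <= 3):
            raise ValueError("each component is graded 0-3")
    return sum(wheals + itch for wheals, itch in daily_scores)

# A symptom-free week - the trials' primary endpoint - scores 0:
print(uas7([(0, 0)] * 7))  # 0
# A week of maximal symptoms scores 42:
print(uas7([(3, 3)] * 7))  # 42
```

A baseline score of about 30, as in these trials, corresponds to roughly moderate-to-severe symptoms on most days of the week.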

The mean reduction in the UAS7 score in PEARL 2, which differed from PEARL 1 by no more than 0.4 points for any treatment group, was 19.2 points in the 72-mg ligelizumab group, 19.3 points in the 120-mg ligelizumab group, 19.6 points in the omalizumab group, and 9.2 points in the placebo group. There were no significant differences among the active treatment arms.

Complete symptom relief, meaning a UAS7 score of 0, was selected as the primary endpoint because, Dr. Maurer said, it is the goal of treatment. Acknowledging that a UAS7 score of 0 is analogous to a PASI score of 100 in psoriasis (complete clearing), he said, “Chronic urticaria is a debilitating disease, and we want to eliminate the symptoms. Gone is gone.”

Combined, the two phase 3 trials represent “the biggest chronic urticaria program ever,” according to Dr. Maurer. The 1,034 patients enrolled in PEARL 1 and the 1,023 enrolled in PEARL 2 were randomized in a 3:3:3:1 ratio, with placebo representing the smallest group.
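For readers unfamiliar with ratio notation, a 3:3:3:1 split works out roughly as follows. This is a hypothetical sketch of the arithmetic, not the trials' actual allocation code, and real arm sizes may differ slightly from exact proportionality.

```python
def arm_sizes(total, ratio=(3, 3, 3, 1)):
    """Split a total enrollment according to a randomization ratio."""
    parts = sum(ratio)
    return [round(total * r / parts) for r in ratio]

# PEARL 1 enrolled 1,034 patients; a 3:3:3:1 split gives about
# 310 patients per active arm and about 103 on placebo.
print(arm_sizes(1034))  # [310, 310, 310, 103]
```

The small placebo arm reflects the point Dr. Maurer made about follow-up: as few patients as possible are left on placebo, and even they are switched to active drug at 24 weeks.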

The planned follow-up is 52 weeks, but the placebo group will be switched to 120 mg ligelizumab every 4 weeks at the end of 24 weeks. The switch is required because “you cannot maintain patients with this disease on placebo over a long period,” Dr. Maurer said.

Ligelizumab associated with low discontinuation rate

Adverse events overall and stratified by severity have been similar across treatment arms, including placebo. The possible exception was a lower rate of moderate events (16.5%) in the placebo arm relative to the 72-mg ligelizumab arm (19.8%), the 120-mg ligelizumab arm (21.6%), and the omalizumab arm (22.3%). Discontinuations because of an adverse event were under 4% in every treatment arm.

Although Dr. Maurer did not present outcomes at 52 weeks, he did note that “only 15% of those who enrolled in these trials have discontinued treatment.” He considered this remarkable given that the study was conducted in the midst of the COVID-19 pandemic, and it appears that at least some of those who left the trial did so because of concerns about clinic visits.

Despite the similar benefit provided by ligelizumab and omalizumab, Dr. Maurer said that subgroup analyses are coming. The possibility that some patients benefit more from one than the other cannot yet be ruled out. There are also, as yet, no data to determine whether at least some patients respond to one after an inadequate response to the other.

Still, given the efficacy and the safety of ligelizumab, Dr. Maurer indicated that the drug is likely to find a role in routine management of CSU if approved.

“We only have two options for chronic spontaneous urticaria. There are antihistamines, which do not usually work, and omalizumab,” he said. “It is very important we develop more treatment options.”

Adam Friedman, MD, professor and chair of dermatology, George Washington University, Washington, agreed.

“More therapeutic options, especially for disease states that have a small armament – even if equivalent in efficacy to established therapies – is always a win for patients as it almost always increases access to treatment,” Dr. Friedman said in an interview.

“Furthermore, the heterogeneous nature of inflammatory skin diseases is often not captured in even phase 3 studies. Therefore, having additional options could offer relief where previous therapies have failed,” he added.

Dr. Maurer reports financial relationships with more than 10 pharmaceutical companies, including Novartis, which is developing ligelizumab. Dr. Friedman has a financial relationship with more than 20 pharmaceutical companies but has no current financial association with Novartis and was not involved in the PEARL 1 and 2 trials.

AT THE EADV CONGRESS


Your poop may hold the secret to long life


Lots of things can disrupt your gut health over the years. A high-sugar diet, stress, antibiotics – all are linked to bad changes in the gut microbiome, the microbes that live in your intestinal tract. And this can raise the risk of diseases.

But what if you could erase all that damage, restoring your gut to a time when you were younger and healthier?

It could be possible, scientists say, by having people take a sample of their own stool when they are young to be put back into their colons when they are older.

While the science to back this up isn’t quite there yet, some researchers are saying we shouldn’t wait. They are calling on existing stool banks to let people start banking their stool now, so it’s there for them to use if the science becomes available.

But how would that work?

First, you’d go to a stool bank and provide a fresh sample of your poop, which would be screened for diseases, washed, processed, and deposited into a long-term storage facility.

Then, down the road, if you get a condition such as inflammatory bowel disease, heart disease, or type 2 diabetes – or if you have a procedure that wipes out your microbiome, like a course of antibiotics or chemotherapy – doctors could use your preserved stool to “re-colonize” your gut, restoring it to its earlier, healthier state, said Scott Weiss, MD, professor of medicine at Harvard Medical School, Boston, and a coauthor of a recent paper on the topic. They would do that using fecal microbiota transplantation, or FMT.

Timing is everything. You’d want a sample from when you’re healthy – say, between the ages of 18 and 35, or before a chronic condition is likely, said Dr. Weiss. But if you’re still healthy into your late 30s, 40s, or even 50s, providing a sample then could still benefit you later in life.

If we could pull off a banking system like this, it could potentially treat autoimmune disease, inflammatory bowel disease, diabetes, obesity, and heart disease – or even reverse the effects of aging. How can we make this happen?
 

Stool banks of today

While stool banks do exist today, the samples inside are destined not for the original donors but rather for sick patients hoping to treat an illness. Using FMT, doctors transfer the fecal material to the patient’s colon, restoring helpful gut microbiota.

Some research shows FMT may help treat inflammatory bowel diseases, such as Crohn’s or ulcerative colitis. Animal studies suggest it could help treat obesity, lengthen lifespan, and reverse some effects of aging, such as age-related decline in brain function. Other clinical trials are looking into its potential as a cancer treatment, said Dr. Weiss.

But outside the lab, FMT is mainly used for one purpose: to treat Clostridioides difficile infection. It works even better than antibiotics, research shows.

But first you need to find a healthy donor, and that’s harder than you might think.
 

Finding healthy stool samples

Banking our bodily substances is nothing new. Blood banks, for example, are common throughout the United States, and cord blood banking – preserving blood from a baby’s umbilical cord to aid possible future medical needs of the child – is becoming more popular. Sperm donors are highly sought after, and doctors regularly transplant kidneys and bone marrow to patients in need.

So why are we so particular about poop?

Part of the reason may be that feces (like blood, for that matter) can harbor disease – which is why it’s so important to find healthy stool donors. The problem is, this can be surprisingly hard to do.

To donate fecal matter, people must go through a rigorous screening process, said Majdi Osman, MD, chief medical officer for OpenBiome, a nonprofit microbiome research organization.

Until recently, OpenBiome operated a stool donation program, though it has since shifted its focus to research. Potential donors were screened for diseases and mental health conditions, pathogens, and antibiotic resistance. The pass rate was less than 3%.

“We take a very cautious approach because the association between diseases and the microbiome is still being understood,” Dr. Osman said.

FMT also carries risks – though so far, they seem mild. Side effects include mild diarrhea, nausea, belly pain, and fatigue. (The reason? Even the healthiest donor stool may not mix perfectly with your own.)

That’s where the idea of using your own stool comes in, said Yang-Yu Liu, PhD, a Harvard researcher who studies the microbiome and the lead author of the paper mentioned above. It’s not just more appealing but may also be a better “match” for your body.
 

Should you bank your stool?

While the researchers say we have reason to be optimistic about the future, it’s important to remember that many challenges remain. FMT is early in development, and there’s a lot about the microbiome we still don’t know.

There’s no guarantee, for example, that restoring a person’s microbiome to its formerly disease-free state will keep diseases at bay forever, said Dr. Weiss. If your genes raise your odds of having Crohn’s, for instance, it’s possible the disease could come back.

We also don’t know how long stool samples can be preserved, said Dr. Liu. Stool banks currently store fecal matter for 1 or 2 years, not decades. To protect the proteins and DNA structures for that long, samples would likely need to be stashed at the liquid nitrogen storage temperature of –196° C. (Currently, samples are stored at about –80° C.) Even then, testing would be needed to confirm whether the fragile microorganisms in the stool can survive.

This raises another question: Who’s going to regulate all this?

The FDA regulates the use of FMT as a drug for the treatment of C. diff, but as Dr. Liu pointed out, many gastroenterologists consider the gut microbiota an organ. In that case, human fecal matter could be regulated the same way blood, bone, or even egg cells are.

Cord blood banking may be a helpful model, Dr. Liu said.

“We don’t have to start from scratch.”

Then there’s the question of cost. Cord blood banks could be a point of reference for that too, the researchers say. They charge about $1,500 to $2,820 for the first collection and processing, plus a yearly storage fee of $185 to $370.

Despite the unknowns, one thing is for sure: The interest in fecal banking is real – and growing. At least one microbiome firm, Cordlife Group Limited, based in Singapore, announced that it has started to allow people to bank their stool for future use.

“More people should talk about it and think about it,” said Dr. Liu.

A version of this article first appeared on WebMD.com.


FDA warns of cancer risk in scar tissue around breast implants


The Food and Drug Administration has issued a safety alert, warning of a rare but concerning potential risk of squamous cell carcinoma (SCC) and various lymphomas in the scar tissue around breast implants.

The FDA safety communication is based on several dozen reports of these cancers occurring in the capsule or scar tissue around breast implants. This issue differs from breast implant–associated anaplastic large-cell lymphoma (BIA-ALCL) – a known risk among implant recipients.

“After preliminary review of published literature as part of our ongoing monitoring of the safety of breast implants, the FDA is aware of less than 20 cases of SCC and less than 30 cases of various lymphomas in the capsule around the breast implant,” the agency’s alert explains.

One avenue through which the FDA has identified cases is via medical device reports. As of Sept. 1, the FDA has received 10 medical device reports about SCC related to breast implants and 12 about various lymphomas.

The incidence rate and risk factors for these events are currently unknown, but SCC and various lymphomas in the capsule around breast implants have been reported for both textured and smooth breast implants, as well as for both saline and silicone breast implants. In some cases, the cancers were diagnosed years after breast implant surgery.

Reported signs and symptoms included swelling, pain, lumps, or skin changes. 

Although the risks of SCC and lymphomas in the tissue around breast implants appear rare, “when safety risks with medical devices are identified, we wanted to provide clear and understandable information to the public as quickly as possible,” Binita Ashar, MD, director of the Office of Surgical and Infection Control Devices, FDA Center for Devices and Radiological Health, explained in a press release.

Patients and providers are strongly encouraged to report breast implant–related problems and cases of SCC or lymphoma of the breast implant capsule to MedWatch, the FDA’s adverse event reporting program.

The FDA plans to complete “a thorough literature review” as well as “identify ways to collect more detailed information regarding patient cases.”

A version of this article first appeared on Medscape.com.

Publications
Topics
Sections

The Food and Drug Administration has issued a safety alert, warning of a rare but concerning potential risk of squamous cell carcinoma (SCC) and various lymphomas in the scar tissue around breast implants.

The FDA safety communication is based on several dozen reports of these cancers occurring in the capsule or scar tissue around breast implants. This issue differs from breast implant–associated anaplastic large-cell lymphoma (BIA-ALCL) – a known risk among implant recipients.

“After preliminary review of published literature as part of our ongoing monitoring of the safety of breast implants, the FDA is aware of less than 20 cases of SCC and less than 30 cases of various lymphomas in the capsule around the breast implant,” the agency’s alert explains.

One avenue through which the FDA has identified cases is via medical device reports. As of Sept. 1, the FDA has received 10 medical device reports about SCC related to breast implants and 12 about various lymphomas.

The incidence rate and risk factors for these events are currently unknown, but reports of SCC and various lymphomas in the capsule around the breast implants have been reported for both textured and smooth breast implants, as well as for both saline and silicone breast implants. In some cases, the cancers were diagnosed years after breast implant surgery.

Reported signs and symptoms included swelling, pain, lumps, or skin changes. 

Although the risks of SCC and lymphomas in the tissue around breast implants appear rare, “when safety risks with medical devices are identified, we wanted to provide clear and understandable information to the public as quickly as possible,” Binita Ashar, MD, director of the Office of Surgical and Infection Control Devices, FDA Center for Devices and Radiological Health, explained in a press release.

Patients and providers are strongly encouraged to report breast implant–related problems and cases of SCC or lymphoma of the breast implant capsule to MedWatch, the FDA’s adverse event reporting program.

The FDA plans to complete “a thorough literature review” as well as “identify ways to collect more detailed information regarding patient cases.”

A version of this article first appeared on Medscape.com.


Fish oil pills do not reduce fractures in healthy seniors: VITAL


Omega-3 supplements did not reduce fractures during a median 5.3-year follow-up of the more than 25,000 generally healthy men and women (men aged ≥ 50 years and women aged ≥ 55 years) in the Vitamin D and Omega-3 Trial (VITAL).

The large randomized controlled trial tested whether omega-3 fatty acid or vitamin D supplements prevented cardiovascular disease or cancer in a representative sample of midlife and older adults from 50 U.S. states – which they did not. In a further analysis of VITAL, vitamin D supplements (cholecalciferol, 2,000 IU/day) did not lower the risk of incident total, nonvertebral, and hip fractures, compared with placebo.

Dmitriy Danilchenko/Shutterstock

Now this new analysis shows that omega-3 fatty acid supplements (1 g/day of fish oil) did not reduce the risk of such fractures in the VITAL population either. Meryl S. LeBoff, MD, presented the latest findings during an oral session at the annual meeting of the American Society for Bone and Mineral Research.

“In this, the largest randomized controlled trial in the world, we did not find an effect of omega-3 fatty acid supplements on fractures,” Dr. LeBoff, from Brigham and Women’s Hospital and Harvard Medical School, both in Boston, told this news organization.

The current analysis did “unexpectedly” show that among participants who received the omega-3 fatty acid supplements, there was an increase in fractures in men, and fracture risk was higher in people with a normal or low body mass index and lower in people with higher BMI.

However, these subgroup findings need to be interpreted with caution and may be caused by chance, Dr. LeBoff warned. The researchers will be investigating these findings in further analyses.
 

Should patients take omega-3 supplements or not?

Asked whether, in the meantime, patients should start or keep taking fish oil supplements for possible health benefits, she noted that certain individuals might benefit.

For example, in VITAL, participants who ate less than 1.5 servings of fish per week and received omega-3 fatty acid supplements had a decrease in the combined cardiovascular endpoint, and Black participants who took fish oil supplements had a substantially reduced risk of the outcome, regardless of fish intake.

“I think everybody needs to review [the study findings] with clinicians and make a decision in terms of what would be best for them,” she said.

Session comoderator Bente Langdahl, MD, PhD, commented that “many people take omega-3 because they think it will help” knee, hip, or other joint pain.

Perhaps men are more prone to joint pain because of osteoarthritis and the supplements lessen the pain, so these men became more physically active and more prone to fractures, she speculated.

The current study shows that, “so far, we haven’t been able to demonstrate a reduced rate of fractures with fish oil supplements in clinical randomized trials” conducted in relatively healthy and not the oldest patients, she summarized. “We’re not talking about 80-year-olds.”

In this “well-conducted study, they were not able to see any difference” with omega-3 fatty acid supplements versus placebo, but apparently, there are no harms associated with taking these supplements, she said.

To patients who ask her about such supplements, Dr. Langdahl advised: “Try it out for 3 months. If it really helps you, if it takes away your joint pain or whatever, then that might work for you. But then remember to stop again because it might just be a temporary effect.”

Could fish oil supplements protect against fractures?

An estimated 22% of U.S. adults aged 60 and older take omega-3 fatty acid supplements, Dr. LeBoff noted.

Preclinical studies have shown that omega-3 fatty acids reduce bone resorption and have anti-inflammatory effects, but observational studies have reported conflicting findings.

The researchers conducted this ancillary study of VITAL to fill these knowledge gaps.

VITAL enrolled a national sample of 25,871 U.S. men and women, including 5,106 Black participants, with a mean age of 67 and a mean BMI of 28 kg/m².

Importantly, participants were not recruited on the basis of low bone density, fractures, or vitamin D deficiency. Prior to entry, participants were required to stop taking omega-3 supplements and to limit nonstudy vitamin D and calcium supplements.

The omega-3 fatty acid supplements used in the study contained eicosapentaenoic acid and docosahexaenoic acid in a 1.2:1 ratio.

VITAL had a 2 × 2 factorial design whereby 6,463 participants were randomized to receive the omega-3 fatty acid supplement and 6,474 were randomized to placebo. (The remaining participants were randomized to receive vitamin D or placebo.)
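For context, a 2 × 2 factorial design crosses two independent binary randomizations, so every participant lands in one of four arms. A minimal illustrative sketch (arm labels are mine, not the trial’s):

```python
import itertools

# Cross the two binary randomizations to enumerate the four possible arms
# of a 2 x 2 factorial trial (labels are illustrative, not the trial's own).
factor_a = ["omega-3", "omega-3 placebo"]
factor_b = ["vitamin D", "vitamin D placebo"]
arms = list(itertools.product(factor_a, factor_b))

for arm in arms:
    print(arm)
# Four arms in total: each combination of the two independent assignments.
```

One practical consequence of this design is that each factor can be analyzed against its own placebo using the full cohort, which is why the omega-3 comparison above spans the whole trial population.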

Participants in the omega-3 fatty acid and placebo groups had similar baseline characteristics. For example, about half (50.5%) were women, and on average, they ate 1.1 servings of dark-meat fish (such as salmon) per week.

Participants completed detailed questionnaires at baseline and each year.

Plasma omega-3 levels were measured at baseline and, in 1,583 participants, at 1 year of follow-up. The mean omega-3 index rose 54.7% in the omega-3 fatty acid group and changed less than 2% in the placebo group at 1 year.

Study pill adherence was 87.0% at 2 years and 85.7% at 5 years.

Fractures were self-reported on annual questionnaires and centrally adjudicated in medical record review.
 

No clinically meaningful effect of omega-3 fatty acids on fractures

During a median 5.3-year follow-up, researchers adjudicated 2,133 total fractures and confirmed 1,991 fractures (93%) in 1,551 participants.

Incidences of total, nonvertebral, and hip fractures were similar in both groups.

Compared with placebo, omega-3 fatty acid supplements had no significant effect on risk of total fractures (hazard ratio, 1.02; 95% confidence interval, 0.92-1.13), nonvertebral fractures (HR, 1.01; 95% CI, 0.91-1.12), or hip fractures (HR, 0.89; 95% CI, 0.61-1.30), all adjusted for age, sex, and race.

The “confidence intervals were narrow, likely excluding a clinically meaningful effect,” Dr. LeBoff noted.

Among men, those who received fish oil supplements had a greater risk of fracture than those who received placebo (HR, 1.27; 95% CI, 1.07-1.51), but this result “was not corrected for multiple hypothesis testing,” Dr. LeBoff cautioned.

In the overall population, participants with a BMI less than 25 who received fish oil versus placebo had an increased risk of fracture, and those with a BMI of at least 30 who received fish oil versus placebo had a decreased risk of fracture, but the limits of the confidence intervals crossed 1.00.
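As a reminder of how these figures are read, a hazard ratio is conventionally judged significant at the 95% level only when its confidence interval excludes 1.0 (the no-effect value). A minimal sketch using the numbers reported above:

```python
def ci_excludes_null(ci_low: float, ci_high: float, null: float = 1.0) -> bool:
    """True when a confidence interval excludes the null value (HR = 1.0),
    the usual shorthand for 'statistically significant'."""
    return not (ci_low <= null <= ci_high)

# Total fractures: HR 1.02 (95% CI, 0.92-1.13) -> interval includes 1.0
print(ci_excludes_null(0.92, 1.13))  # False: no significant effect
# Men subgroup: HR 1.27 (95% CI, 1.07-1.51) -> interval excludes 1.0
print(ci_excludes_null(1.07, 1.51))  # True: nominally significant
```

This is why the subgroup results whose interval limits crossed 1.00 are described as nonsignificant, and why the men’s subgroup finding is treated cautiously despite its interval excluding 1.0: it was not corrected for multiple testing.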

After excluding digit, skull, and pathologic fractures, there was no significant reduction in total fractures (HR, 1.02; 95% CI, 0.92-1.14), nonvertebral fractures (HR, 1.02; 95% CI, 0.92-1.14), or hip fractures (HR, 0.90; 95% CI, 0.61-1.33), with omega-3 supplements versus placebo.

Similarly, there was no significant reduction in risk of major osteoporotic fractures (hip, wrist, humerus, and clinical spine fractures) or wrist fractures with omega-3 supplements versus placebo.

VITAL only studied one dose of omega-3 fatty acid supplements, and results may not be generalizable to younger adults, or older adults living in residential communities, Dr. LeBoff noted.

The study was supported by grants from the National Institute of Arthritis and Musculoskeletal and Skin Diseases. VITAL was funded by the National Cancer Institute and the National Heart, Lung, and Blood Institute. Dr. LeBoff and Dr. Langdahl have reported no relevant financial relationships.

A version of this article first appeared on Medscape.com.



Article Source

FROM ASBMR 2022


When do we stop using BMI to diagnose obesity?


“BMI is trash. Full stop.” This controversial tweet received 26,500 likes and almost 3,000 retweets. The 400 comments from medical and non–health care personnel ranged from agreeable to contrary to offensive.

Regardless of your opinion on BMI (body mass index), this conversation highlighted that the medical community needs to discuss the limitations of BMI and decide its future.

As a Black woman who is an obesity expert living with the impact of obesity in my own life, I know the emotion that a BMI conversation can evoke. Before emotions hijack the conversation, let’s discuss BMI’s past, present, and future.
 

BMI: From observational measurement to clinical use

Imagine walking into your favorite clothing store where an eager clerk greets you with a shirt to try on. The fit is off, but the clerk insists that the shirt must fit because everyone who’s your height should be able to wear it. This scenario seems ridiculous. But this is how we’ve come to use the BMI. Instead of thinking that people of the same height may be the same size, we declare that they must be the same size.

The idea behind the BMI was conceived in 1832 by Belgian anthropologist and mathematician Adolphe Quetelet, but he didn’t intend for it to be a health measure. Instead, it was simply an observation of how people’s weight changed in proportion to height over their lifetime.

Fast-forward to the 20th century, when insurance companies began using weight as an indicator of health status. Weights were recorded in a “Life Table.” Individual health status was determined on the basis of arbitrary cut-offs for weight on the Life Tables. Furthermore, White men set the “normal” weight standards because they were the primary insurance holders.

In 1972, Dr. Ancel Keys, a physiologist and leading expert in body composition at the time, cried foul on this practice and sought to standardize the use of weight as a health indicator. Dr. Keys used Quetelet’s calculation and termed it the Body Mass Index.
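Quetelet’s calculation, as Keys standardized it, is simply body weight divided by the square of height. A minimal sketch (the example values are illustrative, not drawn from any study):

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight in kilograms divided by height in meters, squared."""
    return weight_kg / height_m ** 2

# Two people of identical height can land in very different BMI categories.
print(round(bmi(70, 1.75), 1))  # 22.9 -> falls in the conventional "normal" range
print(round(bmi(95, 1.75), 1))  # 31.0 -> meets the conventional obesity cutoff (>= 30)
```

The formula’s simplicity is exactly the problem the rest of this article takes up: it sees only height and weight, not body composition, metabolic health, or lifestyle.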

By 1985, the U.S. National Institutes of Health and the World Health Organization adopted the BMI. By the 21st century, BMI had become widely used in clinical settings. For example, the Centers for Medicare & Medicaid Services adopted BMI as a quality-of-care measure, placing even more pressure on clinicians to use BMI as a health screening tool.
 

BMI as a tool to diagnose obesity

We can’t discuss BMI without discussing the disease of obesity. BMI is the most widely used tool to diagnose obesity. In the United States, one-third of Americans meet the criteria for obesity. Another one-third are at risk for obesity.

Compared with BMI’s relatively quick acceptance into clinical practice, however, obesity was only recently recognized as a disease.

Historically, obesity has been viewed as a lifestyle choice, fueled by misinformation and multiple forms of bias. This history of bias and discrimination associated with BMI has led some public health officials and scholars to dismiss BMI altogether or to fail to recognize obesity as a disease.

This is a dangerous conclusion, because it comes at the expense of the very people disproportionately affected by obesity-related health disparities.

Furthermore, weight bias continues to prevent people living with obesity from receiving insurance coverage for life-enhancing obesity medications and interventions.

Is it time to phase out BMI?

The BMI is intertwined with many forms of bias: age, gender, racial, ethnic, and even weight. Therefore, it is time to phase out BMI. However, phasing out BMI is complex and will take time, given that:

  • Obesity is still a relatively “young” disease. 2023 marks the 10th anniversary of obesity’s recognition as a disease by the American Medical Association. Currently, BMI is the most widely used tool to diagnose obesity. Tools such as waist circumference, body composition, and metabolic health assessment will need to replace the BMI. Shifting from BMI emphasizes that obesity is more than a number on the scale. Obesity, as defined by the Obesity Medicine Association, is indeed a “chronic, relapsing, multi-factorial, neurobehavioral disease, wherein an increase in body fat promotes adipose tissue dysfunction and abnormal fat mass physical forces, resulting in adverse metabolic, biomechanical, and psychosocial health consequences.”
  • Much of our health research is tied to BMI. There have been some shifts in looking at non–weight-related health indicators. However, we need more robust studies evaluating other health indicators beyond weight and BMI. The availability of this data will help eliminate the need for BMI and promote individualized health assessment.
  • Current treatment guidelines for obesity medications are based on BMI. (Note: Medications to treat obesity are called “anti-obesity” medications or AOMs. However, given the stigma associated with obesity, I prefer not to use the term “anti-obesity.”) Presently this interferes with long-term obesity treatment. Once BMI is “normal,” many patients lose insurance coverage for their obesity medication, despite needing long-term metabolic support to overcome the compensatory mechanism of weight regain. Obesity is a chronic disease that exists independent of weight status. Therefore, using non-BMI measures will help ensure appropriate lifetime support for obesity.

The preceding are barriers, not impossibilities. In the interim, if BMI is still used in any capacity, the BMI reference chart should be an adjusted BMI chart based on agerace, ethnicity, biological sex, and obesity-related conditions. Furthermore, BMI isn’t the sole determining factor of health status.

Instead, an “abnormal” BMI should initiate conversation and further testing, if needed, to determine an individual’s health. For example, compare two people of the same height with different BMIs and lifestyles. Current studies support that a person flagged as having a high adjusted BMI but practicing a healthy lifestyle and having no metabolic diseases is less at risk than a person with a “normal” BMI but high waist circumference and an unhealthy lifestyle.

Regardless of your personal feelings, the facts are clear. Technology empowers us with better tools than BMI to determine health status. Therefore, it’s not a matter of if we will stop using BMI but when.

Sylvia Gonsahn-Bollie, MD, DipABOM, is an integrative obesity specialist who specializes in individualized solutions for emotional and biological overeating. Connect with her at www.embraceyouweightloss.com or on Instagram @embraceyoumd. Her bestselling book, “Embrace You: Your Guide to Transforming Weight Loss Misconceptions Into Lifelong Wellness,” is Healthline.com’s Best Overall Weight Loss Book 2022 and one of Livestrong.com’s picks for the 8 Best Weight-Loss Books to Read in 2022.

A version of this article first appeared on Medscape.com.

Publications
Topics
Sections

“BMI is trash. Full stop.” This controversial tweet received 26,500 likes and almost 3,000 retweets. The 400 comments from medical and non–health care personnel ranged from agreeable to contrary to offensive.

Regardless of your opinion on BMI (body mass index), this conversation highlighted that the medical community needs to discuss the limitations of BMI and decide its future.

As a Black woman who is an obesity expert living with the impact of obesity in my own life, I know the emotion that a BMI conversation can evoke. Before emotions hijack the conversation, let’s discuss BMI’s past, present, and future.
 

BMI: From observational measurement to clinical use

Imagine walking into your favorite clothing store where an eager clerk greets you with a shirt to try on. The fit is off, but the clerk insists that the shirt must fit because everyone who’s your height should be able to wear it. This scenario seems ridiculous. But this is how we’ve come to use the BMI. Instead of thinking that people of the same height may be the same size, we declare that they must be the same size.

The idea behind the BMI was conceived in 1832 by Belgian anthropologist and mathematician Adolphe Quetelet, but he didn’t intend for it to be a health measure. Instead, it was simply an observation of how people’s weight changed in proportion to height over their lifetime.

Fast-forward to the 20th century, when insurance companies began using weight as an indicator of health status. Weights were recorded in a “Life Table.” Individual health status was determined on the basis of arbitrary cut-offs for weight on the Life Tables. Furthermore, White men set the “normal” weight standards because they were the primary insurance holders.

In 1972, Dr. Ancel Keys, a physician and leading expert in body composition at the time, cried foul on this practice and sought to standardize the use of weight as a health indicator. Dr. Keys used Quetelet’s calculation and termed it the Body Mass Index.

By 1985, the U.S. National Institutes of Health and the World Health Organization adopted the BMI. By the 21st century, BMI had become widely used in clinical settings. For example, the Centers for Medicare & Medicaid Services adopted BMI as a quality-of-care measure, placing even more pressure on clinicians to use BMI as a health screening tool.
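Quetelet's calculation, as Keys adopted it, is simply weight in kilograms divided by height in meters squared. A minimal sketch (the category cut-offs shown are the standard WHO adult thresholds; the function names are illustrative, not from any clinical library):

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Quetelet's index: weight (kg) divided by height (m) squared."""
    return weight_kg / height_m ** 2

def who_category(bmi_value: float) -> str:
    """Standard WHO adult cut-offs. Note what the formula ignores:
    age, sex, ethnicity, and body composition -- the core criticism of BMI."""
    if bmi_value < 18.5:
        return "underweight"
    if bmi_value < 25:
        return "normal"
    if bmi_value < 30:
        return "overweight"
    return "obesity"

# Two people of the same height can receive different labels despite
# very different body composition (muscle vs. fat), and the same label
# despite very different metabolic health.
value = bmi(95, 1.80)
print(round(value, 1), who_category(value))  # 29.3 overweight
```

The sketch makes the article's point concrete: height and weight are the only inputs, so everything else about an individual's health is invisible to the index.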
 

BMI as a tool to diagnose obesity

We can’t discuss BMI without discussing the disease of obesity. BMI is the most widely used tool to diagnose obesity. In the United States, one-third of Americans meet the criteria for obesity. Another one-third are at risk for obesity.

Compared with BMI’s relatively quick acceptance into clinical practice, however, obesity was only recently recognized as a disease.

Historically, obesity has been viewed as a lifestyle choice, fueled by misinformation and multiple forms of bias. The historical bias and discrimination associated with BMI have led some public health officials and scholars to dismiss the use of BMI or fail to recognize obesity as a disease.

This is a dangerous conclusion, because it works to the detriment of the very people disproportionately impacted by obesity-related health disparities.

Furthermore, weight bias continues to prevent people living with obesity from receiving insurance coverage for life-enhancing obesity medications and interventions.
 

 

 

Is it time to phase out BMI?

The BMI is intertwined with many forms of bias: age, gender, racial, ethnic, and even weight. Therefore, it is time to phase out BMI. However, phasing out BMI is complex and will take time, given that:

  • Obesity is still a relatively “young” disease. 2023 marks the 10th anniversary of obesity’s recognition as a disease by the American Medical Association. Currently, BMI is the most widely used tool to diagnose obesity. Tools such as waist circumference, body composition, and metabolic health assessment will need to replace the BMI. Shifting from BMI emphasizes that obesity is more than a number on the scale. Obesity, as defined by the Obesity Medicine Association, is indeed a “chronic, relapsing, multi-factorial, neurobehavioral disease, wherein an increase in body fat promotes adipose tissue dysfunction and abnormal fat mass physical forces, resulting in adverse metabolic, biomechanical, and psychosocial health consequences.”
  • Much of our health research is tied to BMI. There have been some shifts in looking at non–weight-related health indicators. However, we need more robust studies evaluating other health indicators beyond weight and BMI. The availability of this data will help eliminate the need for BMI and promote individualized health assessment.
  • Current treatment guidelines for obesity medications are based on BMI. (Note: Medications to treat obesity are called “anti-obesity” medications or AOMs. However, given the stigma associated with obesity, I prefer not to use the term “anti-obesity.”) Presently this interferes with long-term obesity treatment. Once BMI is “normal,” many patients lose insurance coverage for their obesity medication, despite needing long-term metabolic support to overcome the compensatory mechanism of weight regain. Obesity is a chronic disease that exists independent of weight status. Therefore, using non-BMI measures will help ensure appropriate lifetime support for obesity.

The preceding are barriers, not impossibilities. In the interim, if BMI is still used in any capacity, the BMI reference chart should be an adjusted BMI chart based on age, race, ethnicity, biological sex, and obesity-related conditions. Furthermore, BMI shouldn’t be the sole determining factor of health status.

Instead, an “abnormal” BMI should initiate conversation and further testing, if needed, to determine an individual’s health. For example, compare two people of the same height with different BMIs and lifestyles. Current studies support that a person flagged as having a high adjusted BMI but practicing a healthy lifestyle and having no metabolic diseases is less at risk than a person with a “normal” BMI but high waist circumference and an unhealthy lifestyle.

Regardless of your personal feelings, the facts are clear. Technology empowers us with better tools than BMI to determine health status. Therefore, it’s not a matter of if we will stop using BMI but when.

Sylvia Gonsahn-Bollie, MD, DipABOM, is an integrative obesity specialist who specializes in individualized solutions for emotional and biological overeating. Connect with her at www.embraceyouweightloss.com or on Instagram @embraceyoumd. Her bestselling book, “Embrace You: Your Guide to Transforming Weight Loss Misconceptions Into Lifelong Wellness,” is Healthline.com’s Best Overall Weight Loss Book 2022 and one of Livestrong.com’s picks for the 8 Best Weight-Loss Books to Read in 2022.

A version of this article first appeared on Medscape.com.



Catheter-Directed Retrieval of an Infected Fragment in a Vietnam War Veteran


Shrapnel injuries are commonly encountered in war zones.1 They can remain asymptomatic or become symptomatic, with health effects of the retained foreign body ranging from local to systemic toxicities depending on the patient’s reaction to the chemical composition and corrosiveness of the fragments in vivo.2 We present a case of a reactivating shrapnel injury in the form of a retroperitoneal infection and subsequent iliopsoas abscess. A collaborative procedure was performed between surgery and interventional radiology to snare and remove the infected fragment and drain the abscess.

Case Presentation

While serving in Vietnam, a soldier sustained a fragment injury to his left lower abdomen. He underwent a laparotomy, small bowel resection, and a temporary ileostomy at the time of the injury. Nearly 50 years later, the patient presented with chronic left lower quadrant pain and a low-grade fever. He was diagnosed clinically in the emergency department (ED) with diverticulitis and treated with antibiotics. The patient initially responded to treatment but returned 6 months later with similar symptoms, low-grade fever, and mild leukocytosis. A computed tomography (CT) scan without IV contrast during that encounter revealed a few scattered colonic diverticula without definite diverticulitis, as well as a metallic fragment embedded in the left iliopsoas with increased soft tissue density.

The patient was diagnosed with a pelvic/abdominal wall hematoma and was discharged with pain medication. The patient reported recurrent attacks of left lower quadrant pain, fever, and changes in bowel habits, prompting gastrointestinal consultation and a colonoscopy that was unremarkable. Ten months later, the patient again presented to the ED with recurrent symptoms, a fever of 102 °F, and leukocytosis with a white blood cell count of 11.7 × 10⁹/L. A CT scan with IV contrast revealed a large left iliopsoas abscess associated with an approximately 1-cm metallic fragment (Figure 1). A drainage catheter was placed under CT guidance and approximately 270 mL of purulent fluid was drained. Culture of the fluid was positive for Escherichia coli (E coli). Two days after drain placement, the fragment was removed as a joint procedure with interventional radiology and surgery. Using the drainage catheter tract as a point of entry, multiple attempts were made to retrieve the fragment with Olympus EndoJaw endoscopic forceps without success.



Ultimately, a stiff directional sheath from a Cook Medical transjugular liver biopsy kit was used with a Merit Medical EnSnare to relocate the fragment to the left inguinal region for surgical excision (Figures 2, 3, and 4). The fragment was removed and swabbed for culture and sensitivity, and a BLAKE drain was placed in the evacuated abscess cavity. The patient tolerated the procedure well and was discharged the following day. Three days later, culture and sensitivity grew E coli and Acinetobacter, confirming infection and a nidus for the surrounding abscess formation. On follow-up with general surgery 7 days later, the patient reported he was doing well, and the drain was removed without difficulty.

Discussion

Foreign body injuries can be benign or debilitating depending on the initial damage, anatomical location of the foreign body, composition of the foreign body, and the patient’s response to it. Retained shrapnel deep within the muscle tissue rarely causes complications. Although embedded objects can often be asymptomatic and require no further management, migration of the foreign body or the formation of a fistula is possible, causing symptoms and requiring surgical intervention.1 One case involved the formation of a purulent fistula appearing a year after an explosive wound to the lumbosacral spine, which was treated with antimicrobials. Recurrence of the fistula several times after treatment led to surgical removal of the shrapnel along with antibiotic treatment of the osteomyelitis.3 Although uncommon, lead exposure from retained foreign body fragments from gunshot or military-related injuries can cause systemic lead toxicity. Symptoms may range from abdominal pain, nausea, and constipation to jaundice and hepatitis.4 Severity has also been reported to correlate with the surface area of the lead exposed for dissolution.5 Migration of foreign bodies and shrapnel to other sites in the body, such as movement from soft tissues into distantly located body cavities, has been reported as well. One such case involved the spontaneous onset of knee synovitis due to an intra-articular metallic object that was introduced via a blast injury to the upper third of the ipsilateral thigh.1

 

In this patient’s case, a large intramuscular abscess had formed nearly 50 years after the initial combat injury, requiring drainage of the abscess and removal of the fragment. By snaring the foreign body to a more superficial site, the surgical removal required only a minor incision, decreasing recovery time and the likelihood of postoperative complications that would have been associated with a large retroperitoneal dissection. While the loop snare is often the first-line technique for the removal of intravascular foreign bodies, its use for materials retained in soft tissue is scarcely reported.6 The more typical uses involve the removal of intraluminal materials, such as partially fractured venous catheters, guide wires, stents, and vena cava filters. One report noted that in all 16 cases of percutaneous foreign body retrieval, no surgical intervention was required.7 For most nonvascular foreign bodies, however, retrieval is usually surgical.8

Surgical removal of foreign bodies can be difficult in cases where a foreign body is anatomically located next to vital structures.9 An additional challenge with a solely surgical approach to foreign body retrieval arises when the foreign body is small and lies deep within the soft tissue, as was the case for our patient. In such cases, the surgical procedure can be time consuming and lead to more trauma to the surrounding tissues.10 These factors alone necessitate consideration of postoperative morbidity and mortality.

 

 



In our patient, the retained fragment was embedded in the wall of an abscess located retroperitoneally in his iliopsoas muscle. When considering the proximity of the iliopsoas muscle to the digestive tract, urinary tract, and iliac lymph nodes, it is reasonable for infectious material to come in contact with the foreign body from these nearby structures, resulting in secondary infection.11 Surgery was previously considered the first-line treatment for retroperitoneal abscesses until the advent of imaging-guided percutaneous drainage.12

In some instances, surgical drainage may still be attempted, such as if there are different disease processes requiring open surgery or if percutaneous catheter drainage is not technically possible due to the location of the abscess, thick exudate, loculation/septations, or phlegmon. In these cases, laparoscopic drainage as opposed to open surgical drainage can provide the benefits of an open procedure (ie, total drainage and resection of infected tissue) but is less invasive, requires a smaller incision, and heals faster.13 Percutaneous drainage is the current first-line treatment due to the lack of need for general anesthesia, lower cost, and better morbidity and mortality outcomes compared to surgical methods.12 While percutaneous drainage proved to be immediately therapeutic for our patient, the risk of abscess recurrence with the retained infected fragment necessitated coordination of procedures across specialties to provide the best outcome for the patient.

Conclusions

This case demonstrates a multidisciplinary approach to transforming an otherwise large retroperitoneal dissection to a minimally invasive and technically efficient abscess drainage and foreign body retrieval.

References

1. Schroeder JE, Lowe J, Chaimsky G, Liebergall M, Mosheiff R. Migrating shrapnel: a rare cause of knee synovitis. Mil Med. 2010;175(11):929-930. doi:10.7205/milmed-d-09-00254

2. Centeno JA, Rogers DA, van der Voet GB, et al. Embedded fragments from U.S. military personnel—chemical analysis and potential health implications. Int J Environ Res Public Health. 2014;11(2):1261-1278. Published 2014 Jan 23. doi:10.3390/ijerph110201261

3. Carija R, Busic Z, Bradaric N, Bulovic B, Borzic Z, Pavicic-Perkovic S. Surgical removal of metallic foreign body (shrapnel) from the lumbosacral spine and the treatment of chronic osteomyelitis: a case report. West Indian Med J. 2014;63(4):373-375. doi:10.7727/wimj.2012.290

4. Grasso I, Blattner M, Short T, Downs J. Severe systemic lead toxicity resulting from extra-articular retained shrapnel presenting as jaundice and hepatitis: a case report and review of the literature. Mil Med. 2017;182(3-4):e1843-e1848. doi:10.7205/MILMED-D-16-00231

5. Dillman RO, Crumb CK, Lidsky MJ. Lead poisoning from a gunshot wound: report of a case and review of the literature. Am J Med. 1979;66(3):509-514. doi:10.1016/0002-9343(79)91083-0

6. Woodhouse JB, Uberoi R. Techniques for intravascular foreign body retrieval. Cardiovasc Intervent Radiol. 2013;36(4):888-897. doi:10.1007/s00270-012-0488-8

7. Mallmann CV, Wolf KJ, Wacker FK. Retrieval of vascular foreign bodies using a self-made wire snare. Acta Radiol. 2008;49(10):1124-1128. doi:10.1080/02841850802454741

8. Nosher JL, Siegel R. Percutaneous retrieval of nonvascular foreign bodies. Radiology. 1993;187(3):649-651. doi:10.1148/radiology.187.3.8497610

9. Fu Y, Cui LG, Romagnoli C, Li ZQ, Lei YT. Ultrasound-guided removal of retained soft tissue foreign body with late presentation. Chin Med J (Engl). 2017;130(14):1753-1754. doi:10.4103/0366-6999.209910

10. Liang HD, Li H, Feng H, Zhao ZN, Song WJ, Yuan B. Application of intraoperative navigation and positioning system in the removal of deep foreign bodies in the limbs. Chin Med J (Engl). 2019;132(11):1375-1377. doi:10.1097/CM9.0000000000000253

11. Moriarty CM, Baker RJ. A pain in the psoas. Sports Health. 2016;8(6):568-572. doi:10.1177/1941738116665112

12. Akhan O, Durmaz H, Balcı S, Birgi E, Çiftçi T, Akıncı D. Percutaneous drainage of retroperitoneal abscesses: variables for success, failure, and recurrence. Diagn Interv Radiol. 2020;26(2):124-130. doi:10.5152/dir.2019.19199

13. Hong CH, Hong YC, Bae SH, et al. Laparoscopic drainage as a minimally invasive treatment for a psoas abscess: a single center case series and literature review. Medicine (Baltimore). 2020;99(14):e19640. doi:10.1097/MD.0000000000019640

Author and Disclosure Information

Ahmed Elgazzar^a; Abeer Chaudhary^b; Lance Klosterman, MD^c
Correspondence: Lance Klosterman (lance.klosterman@va.gov)

aEast Tennessee State University Quillen College of Medicine, Johnson City
bMidwestern University Chicago College of Osteopathic Medicine, Downers Grove, Illinois
cMountain Home Veterans Affairs Medical Center, Johnson City, Tennessee

Author disclosures

The authors report no actual or potential conflicts of interest or outside sources of funding with regard to this article.

Disclaimer

The opinions expressed herein are those of the authors and do not necessarily reflect those of Federal Practitioner, Frontline Medical Communications Inc., the US Government, or any of its agencies.

Ethics and consent

No identifiable information or patient photographs are included in this case report. The patient gave consent to have the radiographic and foreign body images published.

Issue: Federal Practitioner - 39(9)a, pages 372-375
Author and Disclosure Information

Ahmed Elgazzara; Abeer Chaudharyb; Lance Klosterman, MDc
Correspondence
: Lance Klosterman (lance.klosterman@va.gov)

aEast Tennessee State University Quillen College of Medicine, Johnson City
bMidwestern University Chicago College of Osteopathic Medicine, Downers Grove, Illinois
cMountain Home Veterans Affairs Medical Center, Johnson City, Tennessee

Author disclosures

The authors report no actual or potential conflicts of interest or outside sources of funding with regard to this article.

Disclaimer

The opinions expressed herein are those of the authors and do not necessarily reflect those of Federal Practitioner, Frontline Medical Communications Inc., the US Government, or any of its agencies.

Ethics and consent

No identifiable information or patient photographs included in this case report. The patient gave consent to have the radiographic and foreign body images published.

Author and Disclosure Information

Ahmed Elgazzara; Abeer Chaudharyb; Lance Klosterman, MDc
Correspondence
: Lance Klosterman (lance.klosterman@va.gov)

aEast Tennessee State University Quillen College of Medicine, Johnson City
bMidwestern University Chicago College of Osteopathic Medicine, Downers Grove, Illinois
cMountain Home Veterans Affairs Medical Center, Johnson City, Tennessee

Author disclosures

The authors report no actual or potential conflicts of interest or outside sources of funding with regard to this article.

Disclaimer

The opinions expressed herein are those of the authors and do not necessarily reflect those of Federal Practitioner, Frontline Medical Communications Inc., the US Government, or any of its agencies.

Ethics and consent

No identifiable information or patient photographs included in this case report. The patient gave consent to have the radiographic and foreign body images published.

Article PDF
Article PDF

Shrapnel injuries are commonly encountered in war zones.1 Shrapnel injuries can remain asymptomatic or become systemic, with health effects of the retained foreign body ranging from local to systemic toxicities depending on the patient’s reaction to the chemical composition and corrosiveness of the fragments in vivo.2 We present a case of a reactivating shrapnel injury in the form of a retroperitoneal infection and subsequent iliopsoas abscess. A collaborative procedure was performed between surgery and interventional radiology to snare and remove the infected fragment and drain the abscess.

Case Presentation

While serving in Vietnam, a soldier sustained a fragment injury to his left lower abdomen. He underwent a laparotomy, small bowel resection, and a temporary ileostomy at the time of the injury. Nearly 50 years later, the patient presented with chronic left lower quadrant pain and a low-grade fever. He was diagnosed clinically in the emergency department (ED) with diverticulitis and treated with antibiotics. The patient initially responded to treatment but returned 6 months later with similar symptoms, low-grade fever, and mild leukocytosis. A computed tomography (CT) scan during that encounter without IV contrast revealed a few scattered colonic diverticula without definite diverticulitis as well as a metallic fragment embedded in the left iliopsoas with increased soft tissue density.

The patient was diagnosed with a pelvic/abdominal wall hematoma and was discharged with pain medication. The patient reported recurrent attacks of left lower quadrant pain, fever, and changes in bowel habits, prompting gastrointestinal consultation and a colonoscopy that was unremarkable. Ten months later, the patient again presented to the ED, with recurrent symptoms, a fever of 102 °F, and leukocytosis with a white blood cell count of 11.7 × 109/L. CT scan with IV contrast revealed a large left iliopsoas abscess associated with an approximately 1-cm metallic fragment (Figure 1). A drainage catheter was placed under CT guidance and approximately 270 mL of purulent fluid was drained. Culture of the fluid was positive for Escherichia coli (E coli). Two days after drain placement, the fragment was removed as a joint procedure with interventional radiology and surgery. Using the drainage catheter tract as a point of entry, multiple attempts were made to retrieve the fragment with Olympus EndoJaw endoscopic forceps without success.



Ultimately, a stiff directional sheath from a Cook Medical transjugular liver biopsy kit was used with a Merit Medical EnSnare to relocate the fragment to the left inguinal region for surgical excision (Figures 2, 3, and 4). The fragment was removed and swabbed for culture and sensitivity, and a BLAKE drain was placed in the evacuated abscess cavity. The patient tolerated the procedure well and was discharged the following day. Three days later, culture and sensitivity grew E coli and Acinetobacter, confirming infection and a nidus for the surrounding abscess formation. At follow-up with general surgery 7 days later, the patient reported he was doing well, and the drain was removed without difficulty.

Discussion

Foreign body injuries can be benign or debilitating depending on the initial damage, the anatomical location and composition of the foreign body, and the patient’s response to it. Retained shrapnel deep within muscle tissue rarely causes complications. Although embedded objects are often asymptomatic and require no further management, migration of the foreign body or formation of a fistula can cause symptoms requiring surgical intervention.1 One case involved a purulent fistula that appeared a year after an explosive wound to the lumbosacral spine and was treated with antimicrobials; after the fistula recurred several times, the shrapnel was surgically removed and the osteomyelitis treated with antibiotics.3 Although uncommon, lead exposure from retained foreign body fragments sustained in gunshot or military-related injuries can cause systemic lead toxicity, with symptoms ranging from abdominal pain, nausea, and constipation to jaundice and hepatitis.4 The severity has been reported to correlate with the surface area of lead exposed for dissolution.5 Migration of foreign bodies and shrapnel to other sites in the body, such as movement from soft tissues into distantly located body cavities, has also been reported. One such case involved the spontaneous onset of knee synovitis due to an intra-articular metallic object introduced via a blast injury to the upper third of the ipsilateral thigh.1

 

In this patient’s case, a large intramuscular abscess had formed nearly 50 years after the initial combat injury, requiring drainage of the abscess and removal of the fragment. By snaring the foreign body to a more superficial site, the surgical removal required only a minor incision, decreasing recovery time and the likelihood of the postoperative complications that would have been associated with a large retroperitoneal dissection. While the loop snare is often the first-line technique for removing intravascular foreign bodies, its use for materials retained in soft tissue is scarcely reported.6 Its more typical uses involve the removal of intraluminal materials such as partially fractured venous catheters, guide wires, stents, and vena cava filters. One report noted that in all 16 cases of percutaneous foreign body retrieval, no surgical intervention was required.7 For most nonvascular foreign bodies, however, surgical retrieval is usually performed.8

Surgical removal of foreign bodies can be difficult when a foreign body is located next to vital structures.9 A sole surgical approach is further challenged when the foreign body is small and lies deep within the soft tissue, as was the case for our patient. In such cases, the procedure can be time consuming and cause additional trauma to the surrounding tissues.10 These factors warrant careful consideration of postoperative morbidity and mortality.

 

 



In our patient, the retained fragment was embedded in the wall of an abscess located retroperitoneally in his iliopsoas muscle. Given the proximity of the iliopsoas muscle to the digestive tract, urinary tract, and iliac lymph nodes, it is plausible that infectious material from these nearby structures came in contact with the foreign body, resulting in secondary infection.11 Surgery was considered the first-line treatment for retroperitoneal abscesses until the advent of imaging-guided percutaneous drainage.12

In some instances, surgical drainage may still be attempted, such as when another disease process requires open surgery or when percutaneous catheter drainage is not technically feasible due to the location of the abscess, thick exudate, loculations/septations, or phlegmon. In these cases, laparoscopic drainage can provide the benefits of an open procedure (ie, total drainage and resection of infected tissue) while being less invasive, requiring a smaller incision, and allowing faster healing.13 Percutaneous drainage is the current first-line treatment because it does not require general anesthesia and offers lower cost and better morbidity and mortality outcomes than surgical methods.12 While percutaneous drainage proved immediately therapeutic for our patient, the risk of abscess recurrence from the retained infected fragment necessitated coordination of procedures across specialties to provide the best outcome.

Conclusions

This case demonstrates a multidisciplinary approach that transformed an otherwise large retroperitoneal dissection into a minimally invasive, technically efficient abscess drainage and foreign body retrieval.

References

1. Schroeder JE, Lowe J, Chaimsky G, Liebergall M, Mosheiff R. Migrating shrapnel: a rare cause of knee synovitis. Mil Med. 2010;175(11):929-930. doi:10.7205/milmed-d-09-00254

2. Centeno JA, Rogers DA, van der Voet GB, et al. Embedded fragments from U.S. military personnel—chemical analysis and potential health implications. Int J Environ Res Public Health. 2014;11(2):1261-1278. Published 2014 Jan 23. doi:10.3390/ijerph110201261

3. Carija R, Busic Z, Bradaric N, Bulovic B, Borzic Z, Pavicic-Perkovic S. Surgical removal of metallic foreign body (shrapnel) from the lumbosacral spine and the treatment of chronic osteomyelitis: a case report. West Indian Med J. 2014;63(4):373-375. doi:10.7727/wimj.2012.290

4. Grasso I, Blattner M, Short T, Downs J. Severe systemic lead toxicity resulting from extra-articular retained shrapnel presenting as jaundice and hepatitis: a case report and review of the literature. Mil Med. 2017;182(3-4):e1843-e1848. doi:10.7205/MILMED-D-16-00231

5. Dillman RO, Crumb CK, Lidsky MJ. Lead poisoning from a gunshot wound: report of a case and review of the literature. Am J Med. 1979;66(3):509-514. doi:10.1016/0002-9343(79)91083-0

6. Woodhouse JB, Uberoi R. Techniques for intravascular foreign body retrieval. Cardiovasc Intervent Radiol. 2013;36(4):888-897. doi:10.1007/s00270-012-0488-8

7. Mallmann CV, Wolf KJ, Wacker FK. Retrieval of vascular foreign bodies using a self-made wire snare. Acta Radiol. 2008;49(10):1124-1128. doi:10.1080/02841850802454741

8. Nosher JL, Siegel R. Percutaneous retrieval of nonvascular foreign bodies. Radiology. 1993;187(3):649-651. doi:10.1148/radiology.187.3.8497610

9. Fu Y, Cui LG, Romagnoli C, Li ZQ, Lei YT. Ultrasound-guided removal of retained soft tissue foreign body with late presentation. Chin Med J (Engl). 2017;130(14):1753-1754. doi:10.4103/0366-6999.209910

10. Liang HD, Li H, Feng H, Zhao ZN, Song WJ, Yuan B. Application of intraoperative navigation and positioning system in the removal of deep foreign bodies in the limbs. Chin Med J (Engl). 2019;132(11):1375-1377. doi:10.1097/CM9.0000000000000253

11. Moriarty CM, Baker RJ. A pain in the psoas. Sports Health. 2016;8(6):568-572. doi:10.1177/1941738116665112

12. Akhan O, Durmaz H, Balcı S, Birgi E, Çiftçi T, Akıncı D. Percutaneous drainage of retroperitoneal abscesses: variables for success, failure, and recurrence. Diagn Interv Radiol. 2020;26(2):124-130. doi:10.5152/dir.2019.19199

13. Hong CH, Hong YC, Bae SH, et al. Laparoscopic drainage as a minimally invasive treatment for a psoas abscess: a single center case series and literature review. Medicine (Baltimore). 2020;99(14):e19640. doi:10.1097/MD.0000000000019640

Issue
Federal Practitioner - 39(9)a
Page Number
372-375

Successful Use of Lanadelumab in an Older Patient With Type II Hereditary Angioedema

Hereditary angioedema (HAE) is a rare genetic disorder affecting about 1 in 67,000 individuals and may lead to increased morbidity and mortality.1,2 HAE is characterized by recurring episodes of subcutaneous and/or submucosal edema without urticaria due to an excess of bradykinin.2,3 Inheritance is autosomal dominant in 75% of patients, and HAE is classified into 2 main types.2 Type I HAE is caused by deficiency of C1 esterase inhibitor and accounts for 85% of cases.2 Type II HAE is marked by normal to elevated levels of C1 esterase inhibitor but with reduced activity.2

Cutaneous and abdominal angioedema attacks are the most common presentation.1 However, any location may be affected, including the face, oropharynx, and larynx.1 Only 0.9% of all HAE attacks cause laryngeal edema, but 50% of patients with HAE have experienced a laryngeal attack, which may be lethal.1 An angioedema attack can range in severity depending on the location and degree of edema.3 In addition, patients with HAE are often diagnosed with anxiety and depression secondary to their poor quality of life.4 Thus, long-term prophylaxis of attacks is crucial to reduce both the physical and psychological burden.

Previously, HAE was treated with antifibrinolytic agents and attenuated androgens for short- and long-term prophylaxis.1 These treatment modalities are now considered second-line since the development of novel medications with improved efficacy and limited adverse effects (AEs).1 For long-term prophylaxis, subcutaneous and IV C1 esterase inhibitor has been proven effective in both types I and II HAE.1 Another option, lanadelumab, a subcutaneously delivered monoclonal antibody inhibitor of plasma kallikrein, has been proven to decrease the frequency of HAE attacks without significant AEs.5 Lanadelumab works by binding to the active site of plasma kallikrein, which reduces its activity and slows the production of bradykinin.6 This results in decreasing vascular permeability and swelling episodes in patients with HAE.7 Data, however, are limited, specifically regarding patients with type II HAE and patients aged ≥ 65 years.5 This article reports on an older male with type II HAE successfully treated with lanadelumab.

Case Presentation

An 81-year-old male patient with hypertension, hypertriglyceridemia, and aortic aneurysm had recurrent, frequent episodes of severe abdominal pain with a remote history of extremity and scrotal swelling since adolescence. He was misdiagnosed for years and was eventually determined to have HAE at age 75 years, after his niece was diagnosed, prompting reevaluation of his frequent bouts of abdominal pain. His laboratory findings were consistent with type II HAE: low C4 (7.8 mg/dL), a normal C1 esterase inhibitor level (24 mg/dL), and low C1 esterase inhibitor activity (28% of normal).

Initially, he described having weekly attacks of abdominal pain that could last 1 to several days. At worst, these attacks would last up to a month, causing a decrease in appetite and weight loss. At age 77 years, he began an on-demand treatment, icatibant, a bradykinin receptor blocker. After initiating icatibant during an acute attack, the pain would diminish within 1 to 2 hours, and within several hours, he would be pain free. Previously, pain relief would take several days to weeks. He continued to use icatibant on-demand, typically requiring treatment every 1 to 2 months for only the more severe attacks.

After an increasing frequency of abdominal pain attacks, prophylactic medication was recommended. Therefore, subcutaneous lanadelumab 300 mg every 2 weeks was initiated for long-term prophylaxis. The patient went from requiring on-demand treatment 2 to 3 times per month to once in 6 months after starting lanadelumab. In addition, he tolerated the medication well without any AEs.

 

 

Discussion

According to the international WAO/EAACI 2021 guidelines, HAE treatment goals are “to achieve complete control of the disease and to normalize patients’ lives.”8 On-demand treatment options include C1 esterase inhibitor, icatibant, or ecallantide (a kallikrein inhibitor).8 Long-term prophylaxis in HAE should be considered, accounting for disease activity, burden, control, and patient preference. Five medications have been used for long-term prophylaxis: antifibrinolytic agents (not recommended), attenuated androgens (considered second-line), C1 esterase inhibitor, berotralstat, and lanadelumab.8

Antifibrinolytics are no longer recommended for long-term prophylactic treatment because their efficacy is poor, and they were not considered for our patient. Attenuated androgens, such as danazol, have a history of prophylactic use in patients with HAE due to their good efficacy but are suboptimal due to their significant AE profile and many drug-drug interactions.8 In addition, androgens have many contraindications, including hypertension and hypertriglyceridemia, both of which were present in our patient. Consequently, danazol was not advised for our patient. C1 esterase inhibitor is often used to prevent HAE attacks and can be given intravenously or subcutaneously, typically administered biweekly. A potential AE of C1 esterase inhibitor is thrombosis. Therefore, C1 esterase inhibitor was not a preferred choice in our older patient with a history of hypercoagulability. Berotralstat, a plasma kallikrein inhibitor, is an oral option that has also shown efficacy in long-term prophylaxis. Its most common AEs are gastrointestinal symptoms, and it requires dose adjustment in patients with hepatic impairment.8 Berotralstat was not considered because it was not an approved treatment option at the time of this patient’s treatment. Lanadelumab is a human monoclonal antibody against plasma kallikrein that decreases bradykinin production in patients with HAE, thus preventing angioedema attacks.5 Data on the use of lanadelumab in patients with type II HAE are limited, but because HAE with normal C1 esterase inhibitor levels involves the production of bradykinin via kallikrein, lanadelumab should still be effective.1 Lanadelumab was chosen for our patient because of its minimal AEs and because it is not known to increase the risk of thrombosis.

Lanadelumab is a novel medication, approved in 2018 by the US Food and Drug Administration for the treatment of type I and type II HAE in patients aged ≥ 12 years.7 The phase 3 Hereditary Angioedema Long-term Prophylaxis (HELP) study concluded that treatment with subcutaneous lanadelumab for 26 weeks significantly decreased the frequency of angioedema attacks compared with placebo.5 However, 113 patients (90.4%) in the HELP study had type I HAE; of the 125 patients who completed this randomized, double-blind study, only 12 had type II HAE.5 Moreover, the study included only 5 patients aged ≥ 65 years, and none of them were in the treatment arms that received a lanadelumab dose of 300 mg.5 In a case series of 12 patients in Canada, treatment with lanadelumab decreased angioedema attacks by 72%; however, the series included only 1 patient with type II HAE, who was aged 36 years.9 Therefore, our case demonstrates the efficacy of lanadelumab in a patient aged ≥ 65 years with type II HAE.

Conclusions

HAE is a rare and potentially fatal disease characterized by recurrent, unpredictable attacks of edema throughout the body. The disease burden adversely affects a patient’s quality of life. Therefore, long-term prophylaxis is critical to managing patients with HAE. Lanadelumab has been proven as an effective long-term prophylactic treatment option for HAE attacks. This case supports the use of lanadelumab in patients with type II HAE and patients aged ≥ 65 years.

Acknowledgments

This patient was initially written up as a case report based on his delayed diagnosis.3 An earlier version of this article was presented by Samuel Weiss, MD, and Derek Smith, MD, as a poster at the American Academy of Allergy, Asthma, and Immunology virtual conference, February 26 to March 1, 2021.

References

1. Busse PJ, Christiansen SC. Hereditary angioedema. N Engl J Med. 2020;382(12):1136-1148. doi:10.1056/NEJMra1808012

2. Bernstein JA. Severity of hereditary angioedema, prevalence, and diagnostic considerations. Am J Manag Care. 2018;24(14)(suppl):S292-S298.

3. Berger J, Carroll MP Jr, Champoux E, Coop CA. Extremely delayed diagnosis of type II hereditary angioedema: case report and review of the literature. Mil Med. 2018;183(11-12):e765-e767. doi:10.1093/milmed/usy031

4. Fouche AS, Saunders EF, Craig T. Depression and anxiety in patients with hereditary angioedema. Ann Allergy Asthma Immunol. 2014;112(4):371-375. doi:10.1016/j.anai.2013.05.028

5. Banerji A, Riedl MA, Bernstein JA, et al; HELP Investigators. Effect of lanadelumab compared with placebo on prevention of hereditary angioedema attacks: a randomized clinical trial. JAMA. 2018;320(20):2108-2121. doi:10.1001/jama.2018.16773

6. Busse PJ, Farkas H, Banerji A, et al. Lanadelumab for the prophylactic treatment of hereditary angioedema with C1 inhibitor deficiency: a review of preclinical and phase I studies. BioDrugs. 2019;33(1):33-43. doi:10.1007/s40259-018-0325-y

7. Riedl MA, Maurer M, Bernstein JA, et al. Lanadelumab demonstrates rapid and sustained prevention of hereditary angioedema attacks. Allergy. 2020;75(11):2879-2887. doi:10.1111/all.14416

8. Maurer M, Magerl M, Betschel S, et al. The international WAO/EAACI guideline for the management of hereditary angioedema—the 2021 revision and update. Allergy. 2022;77(7):1961-1990. doi:10.1111/all.15214

9. Iaboni A, Kanani A, Lacuesta G, Song C, Kan M, Betschel SD. Impact of lanadelumab in hereditary angioedema: a case series of 12 patients in Canada. Allergy Asthma Clin Immunol. 2021;17(1):78. Published 2021 Jul 23. doi:10.1186/s13223-021-00579-6

Author and Disclosure Information

Maj Tasha Hellu, DOa; Maj Samuel Weiss, MDa; Lt Col Derek Smith, MDa
Correspondence: Tasha Hellu (tasha.s.hellu.mil@mail.mil)

aDepartment of Medicine, Allergy and Immunology Division, Wilford Hall Ambulatory Surgical Center, Lackland Air Force Base, Texas

Author disclosures

The authors report no actual or potential conflicts of interest or outside sources of funding with regard to this article.

Disclaimer

The opinions expressed herein are those of the authors and do not necessarily reflect those of Federal Practitioner, Frontline Medical Communications Inc., the US Government, or any of its agencies. This article may discuss unlabeled or investigational use of certain drugs. Please review the complete prescribing information for specific drugs or drug combinations—including indications, contraindications, warnings, and adverse effects—before administering pharmacologic therapy to patients.

Ethics and consent

No informed consent was obtained from the patient; patient identifiers were removed to protect the patient’s identity.

Issue
Federal Practitioner - 39(9)a
Page Number
390-392
Article PDF
Article PDF

Hereditary angioedema (HAE) is a rare genetic disorder that affects about 1 in 67,000 individuals and may lead to increased morbidity and mortality.1,2 HAE is characterized by recurring episodes of subcutaneous and/or submucosal edema without urticaria due to an excess of bradykinin.2,3 HAE shows autosomal dominant inheritance in 75% of patients and is classified into 2 main types.2 Type I HAE is caused by deficiency of C1 esterase inhibitor and accounts for 85% of cases.2 Type II HAE is marked by normal to elevated levels of C1 esterase inhibitor but with reduced activity.2

Cutaneous and abdominal angioedema attacks are the most common presentation.1 However, any location may be affected, including the face, oropharynx, and larynx.1 Only 0.9% of all HAE attacks cause laryngeal edema, but 50% of HAE patients have experienced a laryngeal attack, which may be lethal.1 An angioedema attack can range in severity, depending on the location and degree of edema.3 In addition, patients with HAE often are diagnosed with anxiety and depression secondary to their poor quality of life.4 Thus, long-term prophylaxis of attacks is crucial to reduce the physical and psychological implications.

Previously, HAE was treated with antifibrinolytic agents and attenuated androgens for short- and long-term prophylaxis.1 These treatment modalities are now considered second-line since the development of novel medications with improved efficacy and limited adverse effects (AEs).1 For long-term prophylaxis, subcutaneous and IV C1 esterase inhibitor has been proven effective in both types I and II HAE.1 Another option, lanadelumab, a subcutaneously delivered monoclonal antibody inhibitor of plasma kallikrein, has been proven to decrease the frequency of HAE attacks without significant AEs.5 Lanadelumab works by binding to the active site of plasma kallikrein, which reduces its activity and slows the production of bradykinin.6 This results in decreasing vascular permeability and swelling episodes in patients with HAE.7 Data, however, are limited, specifically regarding patients with type II HAE and patients aged ≥ 65 years.5 This article reports on an older male with type II HAE successfully treated with lanadelumab.

Case Presentation

An 81-year-old male patient with hypertension, hypertriglyceridemia, and aortic aneurysm had recurrent, frequent episodes of severe abdominal pain with a remote history of extremity and scrotal swelling since adolescence. He was misdiagnosed for years and was eventually determined to have HAE at age 75 years after his niece was diagnosed, prompting him to be reevaluated for his frequent bouts of abdominal pain. His laboratory findings were consistent with HAE type II with low C4 (7.8 mg/dL), normal C1 esterase inhibitor levels (24 mg/dL), and low levels of C1 esterase inhibitor activity (28% of normal).

Initially, he described having weekly attacks of abdominal pain that could last 1 to several days. At worst, these attacks would last up to a month, causing a decrease in appetite and weight loss. At age 77 years, he began an on-demand treatment, icatibant, a bradykinin receptor blocker. After initiating icatibant during an acute attack, the pain would diminish within 1 to 2 hours, and within several hours, he would be pain free. Previously, pain relief would take several days to weeks. He continued to use icatibant on-demand, typically requiring treatment every 1 to 2 months for only the more severe attacks.

After an increasing frequency of abdominal pain attacks, prophylactic medication was recommended. Therefore, subcutaneous lanadelumab 300 mg every 2 weeks was initiated for long-term prophylaxis. The patient went from requiring on-demand treatment 2 to 3 times per month to once in 6 months after starting lanadelumab. In addition, he tolerated the medication well without any AEs.


Discussion

According to the international WAO/EAACI 2021 guidelines, HAE treatment goals are “to achieve complete control of the disease and to normalize patients’ lives.”8 On-demand treatment options include C1 esterase inhibitor, icatibant, or ecallantide (a kallikrein inhibitor).8 Long-term prophylaxis in HAE should be considered, accounting for disease activity, burden, control, and patient preference. Five medications have been used for long-term prophylaxis: antifibrinolytic agents (not recommended), attenuated androgens (considered second-line), C1 esterase inhibitor, berotralstat, and lanadelumab.8

Antifibrinolytics are no longer recommended for long-term prophylaxis because of their poor efficacy and were not considered for our patient. Attenuated androgens, such as danazol, have a history of prophylactic use in patients with HAE due to their good efficacy but are suboptimal because of their significant AE profile and many drug-drug interactions.8 In addition, androgens have many contraindications, including hypertension and hypertriglyceridemia, both of which were present in our patient. Consequently, danazol was not advised for our patient. C1 esterase inhibitor is often used to prevent HAE attacks and can be given intravenously or subcutaneously, typically administered biweekly. A potential AE of C1 esterase inhibitor is thrombosis. Therefore, C1 esterase inhibitor was not a preferred choice in our older patient with a history of hypercoagulability. Berotralstat, a plasma kallikrein inhibitor, is an oral option that also has shown efficacy in long-term prophylaxis. The most common AEs of berotralstat are gastrointestinal symptoms, and the medication requires dose adjustment in patients with hepatic impairment.8 Berotralstat was not considered because it was not an approved treatment option at the time of this patient’s treatment. Lanadelumab is a human monoclonal antibody against plasma kallikrein that decreases bradykinin production in patients with HAE, thus preventing angioedema attacks.5 Data regarding the use of lanadelumab in patients with type II HAE are limited, but because HAE with normal C1 esterase inhibitor levels still involves kallikrein-mediated production of bradykinin, lanadelumab should remain effective.1 Lanadelumab was chosen for our patient because of its minimal AEs and because it is not known to increase the risk of thrombosis.

Lanadelumab is a novel medication, approved in 2018 by the US Food and Drug Administration for the treatment of types I and II HAE in patients aged ≥ 12 years.7 The phase 3 Hereditary Angioedema Long-term Prophylaxis (HELP) study concluded that treatment with subcutaneous lanadelumab for 26 weeks significantly decreased the frequency of angioedema attacks compared with placebo.5 However, 113 patients (90.4%) in the HELP study had type I HAE.5 Of the 125 patients who completed this randomized, double-blind study, only 12 had type II HAE.5 In addition, the study included only 5 patients aged ≥ 65 years, and no patients aged ≥ 65 years were in the treatment arms that received a lanadelumab dose of 300 mg.5 In a case series of 12 patients in Canada, treatment with lanadelumab decreased angioedema attacks by 72%.9 However, that series included only 1 patient with type II HAE, who was aged 36 years.9 Therefore, our case demonstrates the efficacy of lanadelumab in a patient aged ≥ 65 years with type II HAE.

Conclusions

HAE is a rare and potentially fatal disease characterized by recurrent, unpredictable attacks of edema throughout the body. The disease burden adversely affects a patient’s quality of life. Therefore, long-term prophylaxis is critical to managing patients with HAE. Lanadelumab has been proven as an effective long-term prophylactic treatment option for HAE attacks. This case supports the use of lanadelumab in patients with type II HAE and patients aged ≥ 65 years.

Acknowledgments

The patient was initially written up based on his delayed diagnosis as a case report.3 An earlier version of this article was presented by Samuel Weiss, MD, and Derek Smith, MD, as a poster at the American Academy of Allergy, Asthma, and Immunology virtual conference February 26 to March 1, 2021.


References

1. Busse PJ, Christiansen SC. Hereditary angioedema. N Engl J Med. 2020;382(12):1136-1148. doi:10.1056/NEJMra1808012

2. Bernstein JA. Severity of hereditary angioedema, prevalence, and diagnostic considerations. Am J Manag Care. 2018;24(14)(suppl):S292-S298.

3. Berger J, Carroll MP Jr, Champoux E, Coop CA. Extremely delayed diagnosis of type II hereditary angioedema: case report and review of the literature. Mil Med. 2018;183(11-12):e765-e767. doi:10.1093/milmed/usy031

4. Fouche AS, Saunders EF, Craig T. Depression and anxiety in patients with hereditary angioedema. Ann Allergy Asthma Immunol. 2014;112(4):371-375. doi:10.1016/j.anai.2013.05.028

5. Banerji A, Riedl MA, Bernstein JA, et al; HELP Investigators. Effect of lanadelumab compared with placebo on prevention of hereditary angioedema attacks: a randomized clinical trial. JAMA. 2018;320(20):2108-2121. doi:10.1001/jama.2018.16773

6. Busse PJ, Farkas H, Banerji A, et al. Lanadelumab for the prophylactic treatment of hereditary angioedema with C1 inhibitor deficiency: a review of preclinical and phase I studies. BioDrugs. 2019;33(1):33-43. doi:10.1007/s40259-018-0325-y

7. Riedl MA, Maurer M, Bernstein JA, et al. Lanadelumab demonstrates rapid and sustained prevention of hereditary angioedema attacks. Allergy. 2020;75(11):2879-2887. doi:10.1111/all.14416

8. Maurer M, Magerl M, Betschel S, et al. The international WAO/EAACI guideline for the management of hereditary angioedema—the 2021 revision and update. Allergy. 2022;77(7):1961-1990. doi:10.1111/all.15214

9. Iaboni A, Kanani A, Lacuesta G, Song C, Kan M, Betschel SD. Impact of lanadelumab in hereditary angioedema: a case series of 12 patients in Canada. Allergy Asthma Clin Immunol. 2021;17(1):78. Published 2021 Jul 23. doi:10.1186/s13223-021-00579-6



75 Years of the Historic Partnership Between the VA and Academic Medical Centers


The US government has a legacy of providing support for veterans. Pensions were offered to disabled veterans as early as 1776, and benefits were expanded to cover medical needs as the country grew and modernized.1,2 Enacted during the Civil War, the General Pension Act increased benefits for widows and dependents.2 Rehabilitation and vocational training assistance benefits were added after World War I, and the US Department of Veterans Affairs (VA) was created in 1930 to consolidate all benefits under one umbrella organization.2,3

Prior to World War II, the VA lacked the bed capacity for the 4 million veterans who were eligible for care. This shortage became more acute by the end of the war, when the number of eligible veterans increased by 15 million.4 Although the VA successfully built bed capacity through acquisition of military hospitals, VA hospitals struggled to recruit clinical staff.2 Physicians were hesitant to join the VA because civil service salaries were lower than those for comparable positions in the community, and the VA offered limited opportunities for research or continuing education. These limitations negatively impacted the overall reputation of the VA. The American Medical Association (AMA) was reluctant to directly admit VA physicians for membership because of a “lower” standard of care at VA hospitals.2 This review describes how the passage of 2 legislative actions, the Servicemen’s Readjustment Act and Public Law (PL) 79-293, and a key policy memorandum set the foundation for the partnership between the VA and academic medical centers. This partnership led to improved medical care for veterans and expansion of health professions education for the VA and the nation.5,6

GI Bill of Rights

The passage of the Servicemen’s Readjustment Act of 1944, better known as the GI Bill of Rights, provided education assistance, guaranteed home loans, and unemployment payments to veterans.5 All medical officers serving during the war were eligible for this benefit, which effectively increased the number of potential physician trainees at the end of World War II by almost 60,000.7 Medical education at the time was simultaneously undergoing a transformation with more rigorous training and a push to standardize medical education across state lines. While prerequisite training was not required for admission to many medical schools and curricula varied in length based on state licensing requirements, more programs were adding premedical education requirements and transitioning to the 4-year curricula seen today. At this time, only 23 states required postgraduate internships for licensure, but this number was growing.8 The American Board of Medical Specialties was established several years prior to World War II in 1934 to elevate the quality of care; the desire for residency training and board certification continued to gain traction during the 1940s.9


Medical Training

In anticipation of an influx of medical trainees, the Committee on Postwar Medical Service conducted a comprehensive survey to understand the training needs of physician veterans returning from World War II.7 The survey collected data from medical officers on their desired length of training, interest in specialty board certification, time served, and type of medical practice prior to enlisting. Length of desired training was categorized as short (up to 6 months), which would serve as a refresher course and provide updates on recent advances in medicine and surgery, and long (> 6 months), which resembled a modern internship or residency. Nineteen percent did not want additional training, 22% wished to pursue short courses, and 51% were interested in longer courses. Most respondents also wished to obtain board certification.7 The AMA played a significant role in supporting the expansion of training opportunities, encouraging all accredited hospitals to assess their capacity to determine the number of additional residents they could accommodate. The AMA also awarded hospitals with existing internship programs temporary accreditation to allow them to add extended training through residency programs.7

Medical schools devised creative solutions to meet the needs of returning physician veterans and capitalize on the available educational benefits. Postgraduate refresher courses that varied in length from hours to months were developed focusing on an array of topics. In addition to basic medical principles, courses covered general topics, such as advances in medicine, to specialty topics, such as nutrition or ophthalmology.7 Although the courses could not be counted toward board certification, participation increased by almost 300% in the 1945/1946 academic year relative to the previous year.7 Increasing access to the longer training courses, including internships and residencies, was often achieved through experiences outside the clinical setting. Yale University modified its curriculum to reduce time devoted to lectures on published materials and encourage active learning and community outreach.10 Northwestern University assigned residents to spend 1 of their 3 years “out of residence” in basic science and clinical instruction provided by the medical school. Tuition assistance from the GI Bill supported the additional expenses incurred by the medical school to fund laboratory space, equipment, and the salaries of the basic science instructors and administrative staff.11

Public Law 79-293

Public Law 79-293 was passed on January 3, 1946, establishing the Department of Medicine and Surgery within the VA. The law, which became the basis for Title 38 chapters 73 and 74, allowed VA hospitals flexibility to hire doctors, dentists, and nurses without regard to the civil service regulations and salary restrictions associated with other federal positions.6

Concerns about quality of care had been mounting for years, and the release of several sensationalized and critical articles motivated VA leadership to make sweeping changes. One article described neglect at VA hospitals.12 Excessive paperwork and low economic benefits were identified as barriers to the recruitment of qualified clinicians at the VA.2 The VA Special Medical Advisory Group investigating the claims recommended that the VA encourage their hospitals to affiliate with medical schools to improve the quality of care. This group also recommended that new VA hospitals be constructed near academic medical centers to allow access to consultants.2 Three large veterans service organizations (American Legion, Veterans of Foreign Wars, and Disabled American Veterans) conducted their own investigations in response to the media reports. The organizations reported that the quality of care in most VA hospitals was already on par with the community but indicated that the VA would benefit from expansion of medical research and training, increased bed capacity, reduction in the administrative burden on clinicians, and increased salaries for clinical staff.2


Policy Memorandum No. 2

The relationship between VA and academic medical centers was solidified on January 30, 1946, with adoption of Policy Memorandum No. 2.13 This memorandum allowed for the establishment of relationships with academic medical centers to provide “the veteran a much higher standard of medical care than could be given him with a wholly full-time medical staff.” Shortly after this memorandum was signed, residents from Northwestern University and the University of Illinois at Chicago began clinical rotations at the Hines VA facility in Chicago, Illinois.2 By 1947, 62 medical schools had committed to an affiliation with local VA hospitals and 21 deans’ committees were in operation, which were responsible for the appointment of physician residents and consultants. The AMA extended direct membership privileges to VA physicians, and by 1947 the number of residency positions doubled nationally.14,15 The almost universal support of the relationship between VA and academic affiliates provided educational opportunities for returning veterans and raised standards for medical education nationally.

Current State

Since the passage of PL 79-293 and Policy Memorandum No. 2, the VA-academic health professions education partnership has grown to include 113,000 trainees rotating through 150 VA medical centers annually from more than 1400 colleges and universities.16 Most VA podiatrists, psychologists, optometrists, and physicians working in VA medical centers also trained at the VA, and trainees are 37% more likely to consider a job at the VA after completing their clinical rotations. This unique partnership began 76 years ago and continues to provide clinicians “for VA and the nation.”

References

1. Glasson WH. History of military pension legislation in the United States. Columbia University Press; 1900.

2. Lewis BJ. Veterans Administration medical program relationship with medical schools in the United States. Dissertation. The American University; 1969.

3. Kracke RR. The role of the medical college in the medical care of the veteran. J Med Assoc State Ala. 1950;19(8):225-230.

4. US Department of Veterans Affairs, Office of Public Affairs. VA History in Brief. VA Pamphlet 80-97-2. Washington, DC: United States Department of Veterans Affairs; 1997.

5. Servicemen’s Readjustment Act of 1944. 38 USC § 370 (1944).

6. To establish a Department of Medicine and Surgery in the Veterans’ Administration. 38 USC § 73-74 (1946). Accessed August 2, 2022.

7. Lueth HC. Postgraduate wishes of medical officers: final report on 21,029 questionnaires. J Am Med Assoc. 1945; 127(13):759-770.

8. Johnson V, Arestad FH, Tipner A. Medical education in the United States and Canada: forty-sixth annual report on medical education in the United States and Canada by the Council on Medical Education and Hospitals of the American Medical Association. J Am Med Assoc. 1946;131(16):1277-1310.

9. Chesney AM. Some impacts of the specialty board movement on medical education. J Assoc Am Med Coll. 1948;23(2):83-89.

10. Hiscock IV. New frontiers in health education. Can J Public Health. 1946;37(11):452-457.

11. Colwell AR. Principles of graduate medical instruction: with a specific plan of application in a medical school. J Am Med Assoc. 1945;127(13):741-746.

12. Maisel AQ. The veteran betrayed. How long will the Veterans’ Administration continue to give third-rate medical care to first-rate men? Cosmopolitan. 1945(3):45.

13. US Veterans Administration. Policy Memorandum No. 2: Policy in association of veterans’ hospitals with medical schools. January 30, 1946.

14. American Medical Association. Digest of Official Actions: 1846-1958. JAMA. 1946;132:1094.

15. Wentz DK, Ford CV. A brief history of the internship. JAMA. 1984;252(24):3390-3394. doi:10.1001/jama.1984.03350240036035

16. US Department of Veterans Affairs, Veterans Health Administration, Office of Academic Affiliations. Health professions education academic year 2020-2021. Accessed August 8, 2022. https://www.va.gov/OAA/docs/OAA_Stats_AY_2020_2021_FINAL.pdf

Author and Disclosure Information

Andrea D. Birnbaum, MD, PhDa,b; Paul B. Greenberg, MD, MPHa,c; Karen M. Sanders, MDa,d
Correspondence: Andrea Birnbaum (andrea.birnbaum@va.gov)

aOffice of Academic Affiliations, Veterans Health Administration, US Department of Veterans Affairs, Washington, DC
bDepartment of Ophthalmology, Feinberg School of Medicine, Northwestern University, Chicago, Illinois
cDivision of Ophthalmology, Warren Alpert Medical School, Brown University, Providence, Rhode Island
dDepartment of Internal Medicine, Virginia Commonwealth University School of Medicine, Richmond

Author disclosures

The authors report no actual or potential conflicts of interest or outside sources of funding with regard to this article.

Disclaimer

The opinions expressed herein are those of the authors and do not necessarily reflect those of Federal Practitioner, Frontline Medical Communications Inc., the US Government, or any of its agencies.

Issue
Federal Practitioner - 39(9)a
Page Number
368-370



Parent training pays off for children with autism


There’s strong evidence that training parents to guide the development of children with autism reaps consistent benefits, according to a systematic review and meta-analysis of more than 50 high-quality studies.

“Referrals for parent training should now be considered the expected standard for medical practice,” said a member of the research team, Timothy B. Smith, PhD, a professor of psychology at Brigham Young University, Provo, Utah.

Programs that show parents how to teach functional skills and address maladaptive behaviors, also known as parent-mediated or parent-implemented interventions, offer an alternative to one-on-one professional services, which are in short supply, according to the paper, which was published in the Journal of Autism and Developmental Disorders.

Methods and results

The meta-analysis included 54 papers based on randomized clinical trials involving 2,895 children, which compared the effects of various parent interventions with professional treatment, treatment as usual, or being on a wait-list to receive an intervention.

Overall, the research team reported “moderately strong” average benefits from the parent-mediated interventions (Hedges’ g, 0.553), indicating a medium effect size. Parent interventions had the greatest effect on outcomes involving positive behavior and social skills (0.603), followed by language and communication (0.545), maladaptive behavior (0.519), and life skills (0.239).

Similar benefits were observed regardless of a child’s age or sex or which parent or parents implemented an intervention. The effects also appeared to be consistent regardless of intervention characteristics, such as the number of training sessions parents received, although the researchers noted that many studies did not provide data on such details.

Paul Carbone, MD, a professor of pediatrics at the University of Utah, Salt Lake City, who was not involved in the review, said it demonstrates that such parental engagement is “vitally important” and pediatricians “should not hesitate to refer interested families.”

Dr. Carbone, who is the medical director of an assessment program for children with suspected developmental disabilities, said many training programs for parents have adopted telehealth, adding to their convenience. To make appropriate referrals, primary care clinicians should become acquainted with local programs and learn which outcomes they target, he said.

Dr. Smith noted that primary care physicians are “better trained now than ever” to identify autism spectrum disorder and therefore are among the first to identify those conditions and help parents understand “that their actions at home absolutely make a difference in the child’s development.”

Overcoming limitations, future research needs

The research team attempted to overcome the limitations of previous reviews by using comprehensive search terms and other methods to identify relevant studies, including some that had not been published. They included only studies that reflect the common practice of training multiple parents simultaneously, they wrote.

Dr. Smith noted that long-term outcomes data and further study to compare effects on children with mild, moderate, and severe autism are needed.

Although logic would suggest greater benefits for children with more severe autism, there are no data to demonstrate that, he said.

The authors of the study and Dr. Carbone reported no relevant competing interests.

Publications
Topics
Sections

There’s strong evidence that training parents to guide the development of children with autism reaps consistent benefits, according to a systematic review and meta-analysis of more than 50 high-quality studies.

“Referrals for parent training should now be considered the expected standard for medical practice,” said a member of the research team, Timothy B. Smith, PhD, a professor of psychology at Brigham Young University, Provo, Utah.

Dr. Timothy B. Smith


There’s strong evidence that training parents to guide the development of children with autism reaps consistent benefits, according to a systematic review and meta-analysis of more than 50 high-quality studies.

“Referrals for parent training should now be considered the expected standard for medical practice,” said a member of the research team, Timothy B. Smith, PhD, a professor of psychology at Brigham Young University, Provo, Utah.

Programs that show parents how to teach functional skills and address maladaptive behaviors, also known as parent-mediated or parent-implemented interventions, offer an alternative to one-on-one professional services, which are in short supply, according to the paper, which was published in the Journal of Autism and Developmental Disorders.

Methods and results

The meta-analysis included 54 papers reporting randomized clinical trials that involved 2,895 children; the trials compared various parent-mediated interventions with professional treatment, treatment as usual, or placement on a wait list.

Overall, the research team reported “moderately strong” average benefits from the parent-mediated interventions (Hedges’ g, 0.553), indicating a medium effect size. Parent interventions had the greatest effect on outcomes involving positive behavior and social skills (0.603), followed by language and communication (0.545), maladaptive behavior (0.519), and life skills (0.239).
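
Hedges’ g, the effect-size metric cited above, is a standardized mean difference: the gap between group means divided by their pooled standard deviation, with a correction for small-sample bias. Values around 0.5 are conventionally read as a medium effect. A minimal sketch of the calculation in Python, using invented scores rather than any data from the review:

```python
import math

def hedges_g(group1, group2):
    """Standardized mean difference with Hedges' small-sample
    correction factor J = 1 - 3 / (4*df - 1)."""
    n1, n2 = len(group1), len(group2)
    m1 = sum(group1) / n1
    m2 = sum(group2) / n2
    # Pooled standard deviation from the two sample variances (ddof=1)
    v1 = sum((x - m1) ** 2 for x in group1) / (n1 - 1)
    v2 = sum((x - m2) ** 2 for x in group2) / (n2 - 1)
    pooled_sd = math.sqrt(((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2))
    d = (m1 - m2) / pooled_sd          # Cohen's d
    df = n1 + n2 - 2
    return d * (1 - 3 / (4 * df - 1))  # bias-corrected g

# Hypothetical outcome scores: intervention group vs. wait-list control
treated = [14, 16, 15, 18, 17, 15, 16, 19]
control = [13, 14, 12, 15, 14, 13, 15, 14]
print(round(hedges_g(treated, control), 3))
```

With these made-up numbers the intervention group scores about 1.7 pooled standard deviations above the control group, a large effect; the review’s pooled g of 0.553 corresponds to a more moderate separation between groups.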

Similar benefits were observed regardless of a child’s age or sex, or of which parent or parents implemented the intervention. The effects also appeared to be consistent across intervention characteristics, such as the number of training sessions parents received, although the researchers noted that many studies did not provide data on such details.

Paul Carbone, MD, a professor of pediatrics at the University of Utah, Salt Lake City, who was not involved in the review, said it demonstrates that such parental engagement is “vitally important” and pediatricians “should not hesitate to refer interested families.”

Dr. Carbone, who is the medical director of an assessment program for children with suspected developmental disabilities, said many training programs for parents have adopted telehealth, adding to their convenience. To make appropriate referrals, primary care clinicians should become acquainted with local programs and learn which outcomes they target, he said.

Dr. Smith noted that primary care physicians are “better trained now than ever” to identify autism spectrum disorder and therefore are among the first to identify those conditions and help parents understand “that their actions at home absolutely make a difference in the child’s development.”

Overcoming limitations, future research needs

The research team attempted to overcome limitations of previous reviews by using comprehensive search terms and other methods to identify relevant studies, including some that had not been published. They included only studies that reflect the common practice of training multiple parents simultaneously, they wrote.

Dr. Smith noted that long-term outcomes data and further study to compare effects on children with mild, moderate, and severe autism are needed.

Although logic would suggest greater benefits for children with severe autism, there are no data to demonstrate that, he said.

The authors of the study and Dr. Carbone reported no relevant competing interests.

FROM JOURNAL OF AUTISM AND DEVELOPMENTAL DISORDERS
