ICD-10-CM Codes for CCCA, FFA Now Available
Two new ICD-10-CM diagnosis codes — for central centrifugal cicatricial alopecia (CCCA) and frontal fibrosing alopecia (FFA) — took effect on October 1, 2024, a development that clinicians and advocates say will improve diagnosis, treatment, and research in the field of hair loss disorders.
“CCCA and FFA are conditions that require early diagnosis and intervention to prevent irreversible hair loss,” Maria Hordinsky, MD, professor of dermatology at the University of Minnesota, Minneapolis, and a member of the Board of Directors, Scarring Alopecia Foundation (SAF), said in an interview.
“The use of these new codes will make it easier for clinicians to identify affected patients and improve treatment outcomes. It also opens the door for more robust research efforts aimed at understanding the etiology and progression of CCCA and FFA, which could lead to new and more effective treatments in the future. Overall, this development represents a positive step toward improving care for individuals affected by these challenging conditions.”
The new codes — L66.81 for CCCA and L66.12 for FFA — were approved by the Centers for Disease Control and Prevention (CDC) on June 15, 2023, but not implemented until October 1, 2024.
Amy J. McMichael, MD, professor of dermatology at Wake Forest University School of Medicine, Winston-Salem, North Carolina, and a scientific advisor to SAF, told this news organization that Itisha Jefferson, a medical student at Loyola University Chicago’s Stritch School of Medicine, and her peers on the SAF’s Medical Student Executive Board, played a pivotal role in advocating for the codes.
In 2022, Jefferson, who has CCCA, and her fellow medical students helped create the proposals that were ultimately submitted to the CDC.
“They were critical in working with the CDC leaders to get the necessary information submitted and processed,” McMichael said. “They were also amazing at corralling our dermatologist group for the development of the necessary presentations and helped to shepherd us to the finish line for all logistic issues.”
On March 8, 2023, McMichael and Hordinsky made their pitch for the codes in person at the CDC’s ICD-10 Coordination and Maintenance Committee meeting, with McMichael discussing CCCA and Hordinsky discussing FFA.
“We also discussed the lack of standardized tracking, which has contributed to misdiagnoses and inadequate treatment options,” Hordinsky recalled. “We highlighted the importance of having distinct codes for these conditions to improve clinical outcomes, ensure that patients have access to appropriate care, better tracking of disease prevalence, and greater epidemiologic monitoring with access to electronic medical records and other large real-world evidence datasets and databases, the results of which could contribute to health policy decision-making.”
To spread the word about the new codes, McMichael, Hordinsky, and other members of the SAF are working with the original team of medical students, some of whom are now dermatology residents, to develop an information guide to send to the societies and organizations that supported the codes. A publication in the dermatology literature is also planned.
For her part, Jefferson said that she will continue to advocate for patients with scarring alopecia as a medical student and when she becomes a physician. “I hope in the near future we will see an externally led FDA Patient-Focused Drug Development meeting for both CCCA and FFA, further advancing care and research for these conditions,” she said in an interview.
McMichael, Hordinsky, and Jefferson had no relevant disclosures to report.
A version of this article appeared on Medscape.com.
Are Three Cycles of Chemotherapy as Effective as Six for Retinoblastoma?
TOPLINE:
Three cycles of adjuvant chemotherapy were noninferior to six cycles for 5-year disease-free survival in patients with pathologically high-risk retinoblastoma. The three-cycle regimen also resulted in fewer adverse events and lower costs.
METHODOLOGY:
- The introduction of chemotherapy has increased survival rates for patients with retinoblastoma, but the optimal number of postoperative adjuvant cycles remains unclear due to scant randomized clinical trial data for high-risk patients.
- In the new trial, participants at two premier eye centers in China were randomly assigned to receive either three (n = 94) or six (n = 93) cycles of carboplatin, etoposide, and vincristine (CEV) chemotherapy after enucleation.
- The primary endpoint was 5-year disease-free survival (DFS), and the secondary endpoints were overall survival, safety, economic burden, and quality of life.
- Patients were followed up every 3 months for the first 2 years and then every 6 months thereafter, with a median follow-up of 79 months.
- Adverse events were graded using the National Cancer Institute Common Terminology Criteria for Adverse Events (version 5.0).
TAKEAWAY:
- The 5-year DFS rates were 90.4% and 89.2% for the three- and six-cycle groups, respectively, meeting the noninferiority criterion (P = .003).
- The six-cycle group experienced a higher frequency of adverse events, including neutropenia, anemia, and nausea, than the three-cycle group.
- The quality-of-life scores were higher in the three-cycle group, particularly in physical, emotional, and social functioning parameters.
- The total, direct, and indirect costs were significantly lower in the three-cycle group than in the six-cycle group.
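As a back-of-envelope illustration of how a noninferiority criterion like this one works, the sketch below treats the reported 5-year DFS rates as simple proportions and uses a normal-approximation confidence interval. This is an assumption for illustration only: the trial's actual analysis used time-to-event methods that account for censoring, and the figures here are the published group sizes and rates, not patient-level data.

```python
# Rough noninferiority check: is the three-cycle arm's DFS no worse than
# the six-cycle arm's by more than the prespecified 12% margin?
import math

n3, n6 = 94, 93          # randomized group sizes from the trial
p3, p6 = 0.904, 0.892    # reported 5-year DFS rates
margin = 0.12            # prespecified noninferiority margin

diff = p3 - p6           # positive values favor the three-cycle arm
# Standard error of a difference in two independent proportions
se = math.sqrt(p3 * (1 - p3) / n3 + p6 * (1 - p6) / n6)
lower = diff - 1.96 * se # lower bound of an approximate 95% CI

# Noninferiority is demonstrated if the CI lower bound stays above -margin.
print(f"difference = {diff:.3f}, 95% CI lower bound = {lower:.3f}")
print("noninferior" if lower > -margin else "not demonstrated")
```

Even under this crude approximation, the lower bound of the confidence interval sits well above the −12% margin, which is consistent with the trial's conclusion.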
IN PRACTICE:
“A three-cycle CEV regimen demonstrated noninferiority compared with a six-cycle approach and proved to be an efficacious adjuvant chemotherapy regimen for individuals diagnosed with pathologically high-risk retinoblastoma,” the authors of the study wrote.
In an accompanying editorial, Ning Li, MD, and colleagues wrote that the findings “could lead to changes in clinical practice, reducing treatment burden and costs without compromising patient outcomes.”
SOURCE:
This study was led by Huijing Ye, MD, PhD, State Key Laboratory of Ophthalmology, Zhongshan Ophthalmic Center, Sun Yat-sen University in Guangzhou, China. Both the study and editorial were published online in JAMA.
LIMITATIONS:
The open-label design of the study might introduce bias, although an independent, blinded committee evaluated the clinical outcomes. The 12% noninferiority margin was notably substantial, considering the rarity of retinoblastoma and the wide range of survival rates. The criteria for adjuvant therapy, especially regarding choroidal invasion, were debatable and required further follow-up to clarify the prognosis related to various pathologic features.
DISCLOSURES:
This study was supported by the Sun Yat-Sen University Clinical Research 5010 Program and the Shanghai Committee of Science and Technology. No relevant conflict of interest was disclosed by the authors of the paper or the editorial.
This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article first appeared on Medscape.com.
Humans and Carbs: A Complicated 800,000-Year Relationship
Trying to reduce your carbohydrate intake means going against nearly a million years of evolution.
Humans are among a few species with multiple copies of certain genes that help us break down starch — carbs like potatoes, beans, corn, and grains — so that we can turn it into energy our bodies can use.
However, it’s been difficult for researchers to pinpoint when in human history we acquired multiple copies of these genes because they’re in a region of the genome that’s hard to sequence.
A recent study published in Science suggests that humans may have developed multiple copies of the gene for amylase — an enzyme that’s the first step in starch digestion — over 800,000 years ago, long before the agricultural revolution. This genetic change could have helped us adapt to eating starchy foods.
The study shows how “what your ancestors ate thousands of years ago could be affecting our genetics today,” said Kelsey Jorgensen, PhD, a biological anthropologist at The University of Kansas, Lawrence, who was not involved in the study.
The double-edged sword has sharpened over all those millennia. On one hand, the human body needs and craves carbs to function. On the other hand, our modern-day consumption of carbs, especially calorie-dense, nutritionally barren processed carbs, has long since passed “healthy.”
How Researchers Found Our Carb-Lover Gene
The enzyme amylase turns complex carbs into maltose, a sweet-tasting sugar that is made of two glucose molecules linked together. We make two kinds of amylases: Salivary amylase that breaks down carbs in our mouths and pancreatic amylase that is secreted into our small intestines.
Modern humans have multiple copies of both amylases. Past research showed that human populations with diets high in starch can have up to nine copies of the gene for salivary amylase, called AMY1.
To pinpoint when in human history we acquired multiple copies of AMY1, the new study utilized novel techniques, called optical genome mapping and long-read sequencing, to sequence and analyze the genes. They sequenced 98 modern-day samples and 68 ancient DNA samples, including one from a Siberian person who lived 45,000 years ago.
The ancient DNA data in the study allowed the researchers to track how the number of amylase genes changed over time, said George Perry, PhD, an anthropological geneticist at The Pennsylvania State University-University Park (he was not involved in the study).
Based on the sequencing, the team analyzed changes in the genes in their samples to gauge evolutionary timelines. Perry noted that this was a “very clever approach to estimating the amylase copy number mutation rate, which in turn can really help in testing evolutionary hypotheses.”
The researchers found that even before farming, hunter-gatherers had between four and eight AMY1 genes in their cells. This suggests that people across Eurasia already had a number of these genes long before they started growing crops. (Recent research indicates that Neanderthals also ate starchy foods.)
“Even archaic hominins had these [genetic] variations and that indicates that they were consuming starch,” said Feyza Yilmaz, PhD, an associate computational scientist at The Jackson Laboratory in Bar Harbor, Maine, and a lead author of the study.
However, the research indicates that by 4000 years ago, after the agricultural revolution, people had acquired even more AMY1 copies. Yilmaz noted, “with the advance of agriculture, we see an increase in high amylase copy number haplotypes. So genetic variation goes hand in hand with adaptation to the environment.”
A previous study showed that species that share an environment with humans, such as dogs and pigs, also have copy number variation of amylase genes, said Yilmaz, indicating a link between genome changes and an increase in starch consumption.
Potential Health Impacts on Modern Humans
The duplications in the AMY1 gene could have allowed humans to better digest starches. And it’s conceivable that having more copies of the gene means being able to break down starches even more efficiently, and those with more copies “may be more prone to having high blood sugar, prediabetes, that sort of thing,” Jorgensen said.
Whether those with more AMY1 genes have more health risks is an active area of research. “Researchers tested whether there’s a correlation between AMY1 gene copies and diabetes or BMI [body mass index]. And so far, some studies show that there is indeed correlation, but other studies show that there is no correlation at all,” said Yilmaz.
Yilmaz pointed out that only 5%-10% of carb digestion happens in our mouths; the rest occurs in our small intestine, and many other factors are involved in eating and metabolism.
“I am really looking forward to seeing studies which truly figure out the connection between AMY1 copy number and metabolic health and also what type of factors play a role in metabolic health,” said Yilmaz.
It’s also possible that having more AMY1 copies could lead to more carb cravings as the enzyme creates a type of sugar in our mouths. “Previous studies show that there’s a correlation between AMY1 copy number and also the amylase enzyme levels, so the faster we process the starch, the taste [of starches] will be sweeter,” said Yilmaz.
However, the link between cravings and copy numbers isn’t clear. And we don’t exactly know what came first — did the starch in humans’ diet lead to more copies of amylase genes, or did the copies of the amylase genes drive cravings that lead us to cultivate more carbs? We’ll need more research to find out.
How Will Today’s Processed Carbs Affect Our Genes Tomorrow?
As our diet changes to increasingly include processed carbs, what will happen to our AMY1 genes is fuzzy. “I don’t know what this could do to our genomes in the next 1000 years or more than 1000 years,” Yilmaz noted, but she said from the evidence it seems as though we may have peaked in AMY1 copies.
Jorgensen noted that this research is focused on a European population. She wonders whether the pattern of AMY1 duplication will be repeated in other populations “because the rise of starch happened first in the Middle East and then Europe and then later in the Americas,” she said.
“There’s individual variation and then there’s population-wide variation,” Jorgensen pointed out. She speculates that the historical diet of different cultures could explain population-based variations in AMY1 genes — it’s something future research could investigate. Other populations may also experience genetic changes as much of the world shifts to a more carb-heavy Western diet.
Overall, this research adds to the growing evidence that humans have a long history of loving carbs — for better and, at least over our most recent history and immediate future, for worse.
A version of this article appeared on Medscape.com.
A recent study published in Science suggests that humans may have developed multiple copies of the gene for amylase — an enzyme that’s the first step in starch digestion — over 800,000 years ago, long before the agricultural revolution. This genetic change could have helped us adapt to eating starchy foods.
The study shows how “what your ancestors ate thousands of years ago could be affecting our genetics today,” said Kelsey Jorgensen, PhD, a biological anthropologist at The University of Kansas, Lawrence, who was not involved in the study.
That long history is a double-edged sword, and it has only sharpened over the centuries. On one hand, the human body needs and craves carbs to function. On the other hand, our modern-day consumption of carbs, especially calorie-dense, nutritionally barren processed carbs, has long since passed “healthy.”
How Researchers Found Our Carb-Lover Gene
The enzyme amylase turns complex carbs into maltose, a sweet-tasting sugar that is made of two glucose molecules linked together. We make two kinds of amylases: Salivary amylase that breaks down carbs in our mouths and pancreatic amylase that is secreted into our small intestines.
Modern humans have multiple copies of both amylases. Past research showed that human populations with diets high in starch can have up to nine copies of the gene for salivary amylase, called AMY1.
To pinpoint when in human history we acquired multiple copies of AMY1, the new study utilized novel techniques, called optical genome mapping and long-read sequencing, to sequence and analyze the genes. They sequenced 98 modern-day samples and 68 ancient DNA samples, including one from a Siberian person who lived 45,000 years ago.
The ancient DNA data in the study allowed the researchers to track how the number of amylase genes changed over time, said George Perry, PhD, an anthropological geneticist at The Pennsylvania State University-University Park (he was not involved in the study).
Based on the sequencing, the team analyzed changes in the genes in their samples to gauge evolutionary timelines. Perry noted that this was a “very clever approach to estimating the amylase copy number mutation rate, which in turn can really help in testing evolutionary hypotheses.”
The researchers found that even before farming, hunter-gatherers had between four and eight AMY1 genes in their cells. This suggests that people across Eurasia already had a number of these genes long before they started growing crops. (Recent research indicates that Neanderthals also ate starchy foods.)
“Even archaic hominins had these [genetic] variations and that indicates that they were consuming starch,” said Feyza Yilmaz, PhD, an associate computational scientist at The Jackson Laboratory in Bar Harbor, Maine, and a lead author of the study.
However, the research indicates that humans acquired even more AMY1 copies after the agricultural revolution, roughly 4000 years ago. Yilmaz noted, “with the advance of agriculture, we see an increase in high amylase copy number haplotypes. So genetic variation goes hand in hand with adaptation to the environment.”
A previous study showed that species that share an environment with humans, such as dogs and pigs, also have copy number variation of amylase genes, said Yilmaz, indicating a link between genome changes and an increase in starch consumption.
Potential Health Impacts on Modern Humans
The duplications in the AMY1 gene could have allowed humans to better digest starches. And it’s conceivable that having more copies of the gene means being able to break down starches even more efficiently, and those with more copies “may be more prone to having high blood sugar, prediabetes, that sort of thing,” Jorgensen said.
Whether those with more AMY1 genes have more health risks is an active area of research. “Researchers tested whether there’s a correlation between AMY1 gene copies and diabetes or BMI [body mass index]. And so far, some studies show that there is indeed correlation, but other studies show that there is no correlation at all,” said Yilmaz.
Yilmaz pointed out that only 5%-10% of carb digestion happens in our mouths; the rest occurs in our small intestine, and many other factors also shape how we eat and metabolize starch.
“I am really looking forward to seeing studies which truly figure out the connection between AMY1 copy number and metabolic health and also what type of factors play a role in metabolic health,” said Yilmaz.
It’s also possible that having more AMY1 copies could lead to more carb cravings as the enzyme creates a type of sugar in our mouths. “Previous studies show that there’s a correlation between AMY1 copy number and also the amylase enzyme levels, so the faster we process the starch, the taste [of starches] will be sweeter,” said Yilmaz.
However, the link between cravings and copy numbers isn’t clear. And we don’t exactly know what came first — did the starch in humans’ diet lead to more copies of amylase genes, or did the copies of the amylase genes drive cravings that lead us to cultivate more carbs? We’ll need more research to find out.
How Will Today’s Processed Carbs Affect Our Genes Tomorrow?
As our diet changes to increasingly include processed carbs, what will happen to our AMY1 genes is fuzzy. “I don’t know what this could do to our genomes in the next 1000 years or more than 1000 years,” Yilmaz noted, but she said from the evidence it seems as though we may have peaked in AMY1 copies.
Jorgensen noted that this research is focused on a European population. She wonders whether the pattern of AMY1 duplication will be repeated in other populations “because the rise of starch happened first in the Middle East and then Europe and then later in the Americas,” she said.
“There’s individual variation and then there’s population-wide variation,” Jorgensen pointed out. She speculates that the historical diet of different cultures could explain population-based variations in AMY1 genes — it’s something future research could investigate. Other populations may also experience genetic changes as much of the world shifts to a more carb-heavy Western diet.
Overall, this research adds to the growing evidence that humans have a long history of loving carbs — for better and, at least over our most recent history and immediate future, for worse.
A version of this article appeared on Medscape.com.
Rising Stroke Rates in Californians With Sickle Cell Disease
TOPLINE:
METHODOLOGY:
- Researchers analyzed data from the California Department of Health Care Access and Innovation (HCAI), covering emergency department and hospitalization records from 1991 to 2019.
- A total of 7636 patients with SCD were included in the study cohort.
- Cumulative incidence and rates for primary and recurrent strokes and transient ischemic attacks (TIAs) were determined pre- and post-STOP trial.
- Patients with SCD were identified using ICD-9 and ICD-10 codes, with specific criteria for inclusion based on hospitalization records.
- The study utilized Fine and Gray methodology to calculate cumulative incidence functions, accounting for the competing risk for death.
TAKEAWAY:
- The cumulative incidence of first ischemic stroke in patients with SCD was 2.1% by age 20 and 13.5% by age 60.
- Ischemic stroke rates increased significantly in children and adults in the 2010-2019 period, compared with the preceding decade.
- Risk factors for stroke and TIA included increasing age, hypertension, and hyperlipidemia.
- The study found a significant increase in rates of intracranial hemorrhage in adults aged 18-30 years and TIAs in children younger than 18 years from 2010 to 2019, compared with the prior decade.
IN PRACTICE:
“Neurovascular complications, including strokes and transient ischemic attacks (TIAs), are common and cause significant morbidity in individuals with sickle cell disease (SCD). The STOP trial (1998) established chronic transfusions as the standard of care for children with SCD at high risk for stroke,” the study’s authors wrote.
SOURCE:
This study was led by Olubusola B. Oluwole, MD, MS, University of Pittsburgh in Pennsylvania, and was published online in Blood.
LIMITATIONS:
This study’s reliance on administrative data may have introduced systematic errors, particularly with the transition from ICD-9 to ICD-10 codes. The lack of laboratory results and medication data in the HCAI database limited the ability to fully assess patient conditions and treatments. Additionally, the methodology changes in 2014 likely underreported death rates in people without PDD/EDU encounters in the calendar year preceding their death.
DISCLOSURES:
The authors reported no relevant conflicts of interest.
This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article first appeared on Medscape.com.
Community Outreach Benefits Dermatology Residents and Their Patients
The sun often is rising in the rearview mirror as I travel with the University of New Mexico dermatology team from Albuquerque to our satellite clinic in Gallup, New Mexico. This twice-monthly trip—with a group usually comprising an attending physician, residents, and medical students—provides an invaluable opportunity for me to take part in delivering care to a majority Native American population and connects our institution and its trainees to the state’s rural and indigenous cultures and communities.
Community outreach is an important initiative for many dermatology residency training programs. Engaging with the community outside the clinic setting allows residents to hone their clinical skills, interact with and meet new people, and help to improve access to health care, especially for members of underserved populations.
Limited access to health care remains a pressing issue in the United States, especially for underserved and rural communities. There currently is no standardized way to measure access to care, but multiple contributing factors have been identified, including but not limited to patient wait times and throughput, provider turnover, ratio of dermatologists to patient population, insurance type, and patient outcomes.1 Fortunately, there are many ways for dermatology residents to get involved and improve access to dermatologic services in their communities, including skin cancer screenings, free clinics, and teledermatology.
Skin Cancer Screenings
More than 40% of community outreach initiatives offered by dermatology residency programs are related to skin cancer screening and prevention.2 The American Academy of Dermatology’s free skin cancer check program (https://www.aad.org/member/career/volunteer/spot) offers a way to participate in or even host a skin cancer screening in your community. Since 1985, this program has identified nearly 300,000 suspicious lesions and more than 30,000 suspected melanomas. Resources for setting up a skin cancer screening in your community are available on the program’s website. Residents may take this opportunity to teach medical students how to perform full-body skin examinations and/or practice making independent decisions as the supervisor for medical trainees. Skin cancer screening events not only expand access to care in underserved communities but also help residents feel more connected to the local community, especially if they have moved to a new location for their residency training.
Free Clinics
Engaging in educational opportunities offered through residency programs is another way to participate in community outreach. In particular, many programs are affiliated with a School of Medicine within their institution that allows residents to spearhead volunteer opportunities such as working at a free clinic. In fact, more than 30% of initiatives offered at dermatology residency programs are free general dermatology clinics.2 Residents are in the unique position of being both learners themselves as well as educators to trainees.3 As part of our role, we can provide crucial specialty care to the community by working in concert with medical students while also familiarizing ourselves with treating populations that we may not reach in our daily clinical work. For example, by participating in free clinics, we can provide care to vulnerable populations who typically may have financial or time barriers that prevent them from seeking care at the institution-associated clinic, including individuals experiencing homelessness, patients who are uninsured, and individuals who cannot take time off work to pursue medical care. Our presence in the community helps to reduce barriers to specialty care, particularly in the field of dermatology where the access shortage in the context of rising skin cancer rates prompts public health concerns.4
Teledermatology
Teledermatology became a way to extend our reach in the community more than ever before during the COVID-19 pandemic. Advances in audio, visual, and data telecommunication have been particularly helpful in dermatology, a specialty that relies heavily on visual cues for diagnosis. Synchronous, asynchronous, and hybrid teledermatology services implemented during the pandemic have gained favor among patients and dermatologists and are still applied in current practice.5,6
For example, in the state of New Mexico (where there is a severe shortage of board-certified dermatologists to care for the state’s population), teledermatology has allowed rural providers of all specialties to consult University of New Mexico dermatologists by sending clinical photographs along with patient information and history via secure messaging. Instead of having the patient travel hundreds of miles to see the nearest dermatologist for their skin condition or endure long wait times to get in to see a specialist, primary providers now can initiate treatment or work-up for their patient’s skin issue in a timely manner with the use of teledermatology to consult specialists.
Teledermatology has demonstrated cost-effectiveness, accuracy, and efficiency in conveniently expanding access to care. It offers patients and dermatologists flexibility in receiving and delivering health care, respectively.7 As residents, learning how to navigate this technologic frontier in health care delivery is imperative, as it will remain a prevalent tool in the future care of our communities, particularly in underserved areas.
Final Thoughts
Through community outreach initiatives, dermatology residents have an opportunity not only to enrich our education but also to connect with and become closer to our patients. Skin cancer screenings, free clinics, and teledermatology have provided ways to reach more communities and remain important aspects of dermatology residency.
- Patel B, Blalock TW. Defining “access to care” for dermatology at academic medical institutions. J Am Acad Dermatol. 2023;89:627-628. doi:10.1016/j.jaad.2023.03.014
- Fritsche M, Maglakelidze N, Zaenglein A, et al. Community outreach initiatives in dermatology: cross-sectional study. Arch Dermatol Res. 2023;315:2693-2695. doi:10.1007/s00403-023-02629-y
- Chiu LW. Teaching tips for dermatology residents. Cutis. 2024;113:E17-E19. doi:10.12788/cutis.1046
- Duniphin DD. Limited access to dermatology specialty care: barriers and teledermatology. Dermatol Pract Concept. 2023;13:E2023031. doi:10.5826/dpc.1301a31
- Ibrahim AE, Magdy M, Khalaf EM, et al. Teledermatology in the time of COVID-19. Int J Clin Pract. 2021;75:e15000. doi:10.1111/ijcp.15000
- Farr MA, Duvic M, Joshi TP. Teledermatology during COVID-19: an updated review. Am J Clin Dermatol. 2021;22:467-475. doi:10.1007/s40257-021-00601-y
- Lipner SR. Optimizing patient care with teledermatology: improving access, efficiency, and satisfaction. Cutis. 2024;114:63-64. doi:10.12788/cutis.1073
- Fritsche M, Maglakelidze N, Zaenglein A, et al. Community outreach initiatives in dermatology: cross-sectional study. Arch Dermatol Res. 2023;315:2693-2695. doi:10.1007/s00403-023-02629-y
- Chiu LW. Teaching tips for dermatology residents. Cutis. 2024;113:E17-E19. doi:10.12788/cutis.1046
- Duniphin DD. Limited access to dermatology specialty care: barriers and teledermatology. Dermatol Pract Concept. 2023;13:E2023031. doi:10.5826/dpc.1301a31
- Ibrahim AE, Magdy M, Khalaf EM, et al. Teledermatology in the time of COVID-19. Int J Clin Pract. 2021;75:e15000. doi:10.1111/ijcp.15000
- Farr MA, Duvic M, Joshi TP. Teledermatology during COVID-19: an updated review. Am J Clin Dermatol. 2021;22:467-475. doi:10.1007/s40257-021-00601-y
- Lipner SR. Optimizing patient care with teledermatology: improving access, efficiency, and satisfaction. Cutis. 2024;114:63-64. doi:10.12788/cutis.1073
Resident Pearls
- Outreach initiatives can help residents feel more connected to their community and expand access to care.
- Skin cancer screenings, free clinics, and teledermatology are a few ways residents may get involved in their local communities.
New Data on DOAC Initiation After Stroke in AF: Final Word?
ABU DHABI, UAE — The long-standing debate as to when to start anticoagulation in patients with an acute ischemic stroke and atrial fibrillation (AF) looks as though it’s settled.
Results of the OPTIMAS trial, the largest trial to address this question, showed that early initiation of a direct oral anticoagulant (DOAC) was noninferior to delayed initiation with respect to the composite of recurrent ischemic stroke, symptomatic intracranial hemorrhage, unclassifiable stroke, or systemic embolism at 90 days.
In addition, a new meta-analysis, known as CATALYST, which included all four randomized trials now available on this issue, showed a clear benefit of earlier initiation (within 4 days) versus later (5 days and up) on its primary endpoint of new ischemic stroke, symptomatic intracerebral hemorrhage, and unclassified stroke at 30 days.
The results of the OPTIMAS trial and the meta-analysis were both presented at the 16th World Stroke Congress (WSC) 2024. The OPTIMAS trial was also simultaneously published online in The Lancet.
“Our findings do not support the guideline recommended practice of delaying DOAC initiation after ischemic stroke with AF regardless of clinical stroke severity, reperfusion or prior anticoagulation,” said OPTIMAS investigator David Werring, PhD, University College London in England.
Presenting the meta-analysis, Signild Åsberg, MD, Uppsala University, Uppsala, Sweden, said her group’s findings “support the early start of DOACs (within 4 days) in clinical practice.”
Werring pointed out that starting anticoagulation early also had important logistical advantages.
“This means we can start anticoagulation before patients are discharged from hospital, thus ensuring that this important secondary prevention medication is always prescribed, when appropriate. That’s going to be a key benefit in the real world.”
Clinical Dilemma
Werring noted that AF accounts for 20%-30% of ischemic strokes, which tend to be more severe than other stroke types. The pivotal trials of DOACs did not include patients within 30 days of an acute ischemic stroke, creating a clinical dilemma on when to start this treatment.
“On the one hand, we wish to start anticoagulation early to reduce early recurrence of ischemic stroke. But on the other hand, there are concerns that if we start anticoagulation early, it could cause intracranial bleeding, including hemorrhagic transformation of the acute infarct. Guidelines on this issue are inconsistent and have called for randomized control trials in this area,” he noted.
So far, three randomized trials on DOAC timing have been conducted, which Werring said suggested early DOAC treatment is safe. However, these trials have provided limited data on moderate to severe stroke, patients with hemorrhagic transformation, or those already taking oral anticoagulants — subgroups in which there are particular concerns about early oral anticoagulation.
The OPTIMAS trial included a broad population of patients with acute ischemic stroke associated with AF including these critical subgroups.
The trial, conducted at 100 hospitals in the United Kingdom, included 3648 patients with AF and acute ischemic stroke who were randomly assigned to early (≤ 4 days from stroke symptom onset) or delayed (7-14 days) anticoagulation initiation with any DOAC.
There was no restriction on stroke severity, and patients with hemorrhagic transformation were allowed, with the exception of parenchymal hematoma type 2, a rare and severe type of hemorrhagic transformation.
Approximately 35% of patients had been taking an oral anticoagulant, mainly DOACs, prior to their stroke, and about 30% had revascularization with thrombolysis, thrombectomy, or both. Nearly 900 participants (25%) had moderate to severe stroke (National Institutes of Health Stroke Scale [NIHSS] score ≥ 11).
The primary outcome was a composite of recurrent ischemic stroke, symptomatic intracranial hemorrhage, unclassifiable stroke, or systemic embolism incidence at 90 days. The initial analysis aimed to show noninferiority of early DOAC initiation, with a noninferiority margin of 2 percentage points, followed by testing for superiority.
Results showed that the primary outcome occurred in 3.3% of both groups (adjusted risk difference, 0.000; 95% CI, −0.011 to 0.012), with noninferiority criteria fulfilled. Superiority was not achieved.
Symptomatic intracranial hemorrhage occurred in 0.6% of patients in the early DOAC initiation group vs 0.7% of those in the delayed group — a nonsignificant difference.
Applicable to Real-World Practice
A time-to-event analysis of the primary outcome showed that there were fewer outcomes in the first 30 days in the early DOAC initiation group, but the curves subsequently came together.
Subgroup analysis showed consistent results across the whole trial population, with no modification of the effect of early DOAC initiation according to stroke severity, reperfusion treatment, or previous anticoagulation.
Werring said that strengths of the OPTIMAS trial included a large sample size, a broad population with generalizability to real-world practice, and the inclusion of patients at higher bleeding risk than included in previous studies.
During the discussion, it was noted that the trial included few patients (about 3%) with very severe stroke (NIHSS score > 21), raising the question of whether the findings could be applied to this group.
Werring noted that there was no evidence of heterogeneity, and if anything, patients with more severe strokes may have had a slightly greater benefit with early DOAC initiation. “So my feeling is probably these results do generalize to the more severe patients,” he said.
In a commentary accompanying The Lancet publication of the OPTIMAS trial, Else Charlotte Sandset, MD, University of Oslo, in Norway, and Diana Aguiar de Sousa, MD, Central Lisbon University Hospital Centre, Lisbon, Portugal, noted that the “increasing body of evidence strongly supports the message that initiating anticoagulation early for patients with ischaemic stroke is safe. The consistent absence of heterogeneity in safety outcomes suggests that the risk of symptomatic intracranial haemorrhage is not a major concern, even in patients with large infarcts.”
Regardless of the size of the treatment effect, initiating early anticoagulation makes sense when it can be done safely, as it helps prevent recurrent ischemic strokes and other embolic events. Early intervention reduces embolization risk, particularly in high-risk patients, and allows secondary prevention measures to begin while patients are still hospitalized, they added.
CATALYST Findings
The CATALYST meta-analysis included four trials (TIMING, ELAN, OPTIMAS, and START) of early versus later DOAC administration in a total of 5411 patients with acute ischemic stroke and AF. In this meta-analysis, early was defined as within 4 days of stroke and later as 5 days or more.
The primary outcome was a composite of ischemic stroke, symptomatic intracerebral hemorrhage, or unclassified stroke at 30 days. The rate was significantly lower in the early group than in the later group (2.12% vs 3.02%), giving an odds ratio of 0.70 (95% CI, 0.50-0.98; P = .04).
The results were consistent across all subgroups, all suggesting an advantage for early DOAC.
Further analysis showed a clear benefit of early DOAC initiation in ischemic stroke with the curves separating early.
The rate of symptomatic intracerebral hemorrhage was low in both groups (0.45% in the early group and 0.40% in the later group) as was extracranial hemorrhage (0.45% vs 0.55%).
At 90 days, there were still lower event rates in the early group than the later one, but the difference was no longer statistically significant.
‘Practice Changing’ Results
Commenting on both studies, Craig Anderson, MD, The George Institute for Global Health, Sydney, Australia, chair of the WSC session where the results of both the OPTIMAS trial and the meta-analysis were presented, described these latest results as “practice changing.”
“When to start anticoagulation in acute ischemic stroke patients with AF has been uncertain for a long time. The dogma has always been that we should wait. Over the years, we’ve become a little bit more confident, but now we’ve got good data from randomized trials showing that early initiation is safe, with the meta-analysis showing benefit,” he said.
“These new data from OPTIMAS will reassure clinicians that there’s no excessive harm and, more importantly, no excessive harm across all patient groups. And the meta-analysis clearly showed an upfront benefit of starting anticoagulation early. That’s a very convincing result,” he added.
Anderson cautioned that there still may be concerns about starting DOACs early in some groups, including Asian populations that have a higher bleeding risk (these trials included predominantly White patients) and people who are older or frail, who may have extensive small vessel disease.
During the discussion, several questions centered on the lack of imaging data available on the patients in the studies. Anderson said imaging data would help reassure clinicians on the safety of early anticoagulation in patients with large infarcts.
“Stroke clinicians make decisions on the basis of the patient and on the basis of the brain, and we only have the patient information at the moment. We don’t have information on the brain — that comes from imaging.”
Regardless, he believes these new data will lead to a shift in practice. “But maybe, it won’t be as dramatic as we would hope because I think some clinicians may still hesitate to apply these results to patients at high risk of bleeding. With imaging data from the studies that might change.”
The OPTIMAS trial was funded by University College London and the British Heart Foundation. Werring reported consulting fees from Novo Nordisk, National Institute for Health and Care Excellence, and Alnylam; payments or speaker honoraria from Novo Nordisk, Bayer, and AstraZeneca/Alexion; participation on a data safety monitoring board for the OXHARP trial; and participation as steering committee chair for the MACE-ICH and PLINTH trials. Åsberg received institutional research grants and lecture fees to her institution from AstraZeneca, Boehringer Ingelheim, Bristol Myers Squibb, and Institut Produits Synthése. Sandset and de Sousa were both steering committee members of the ELAN trial. Anderson reported grant funding from Penumbra and Takeda China.
A version of this article appeared on Medscape.com.
ABU DHABI, UAE — The long-standing debate as to when to start anticoagulation in patients with an acute ischemic stroke and atrial fibrillation (AF) looks as though it’s settled.
Results of the OPTIMAS trial, the largest trial to address this question, showed that
In addition, a new meta-analysis, known as CATALYST, which included all four randomized trials now available on this issue, showed a clear benefit of earlier initiation (within 4 days) versus later (5 days and up) on its primary endpoint of new ischemic stroke, symptomatic intracerebral hemorrhage, and unclassified stroke at 30 days.
The results of the OPTIMAS trial and the meta-analysis were both presented at the 16th World Stroke Congress (WSC) 2024. The OPTIMAS trial was also simultaneously published online in The Lancet.
“Our findings do not support the guideline recommended practice of delaying DOAC initiation after ischemic stroke with AF regardless of clinical stroke severity, reperfusion or prior anticoagulation,” said OPTIMAS investigator David Werring, PhD, University College London in England.
Presenting the meta-analysis, Signild Åsberg, MD, Uppsala University, Uppsala, Sweden, said his group’s findings “support the early start of DOACs (within 4 days) in clinical practice.”
Werring pointed out that starting anticoagulation early also had important logistical advantages.
“This means we can start anticoagulation before patients are discharged from hospital, thus ensuring that this important secondary prevention medication is always prescribed, when appropriate. That’s going to be a key benefit in the real world.”
Clinical Dilemma
Werring noted that AF accounts for 20%-30% of ischemic strokes, which tend to be more severe than other stroke types. The pivotal trials of DOACs did not include patients within 30 days of an acute ischemic stroke, creating a clinical dilemma on when to start this treatment.
“On the one hand, we wish to start anticoagulation early to reduce early recurrence of ischemic stroke. But on the other hand, there are concerns that if we start anticoagulation early, it could cause intracranial bleeding, including hemorrhagic transformation of the acute infarct. Guidelines on this issue are inconsistent and have called for randomized control trials in this area,” he noted.
So far, three randomized trials on DOAC timing have been conducted, which Werring said suggested early DOAC treatment is safe. However, these trials have provided limited data on moderate to severe stroke, patients with hemorrhagic transformation, or those already taking oral anticoagulants — subgroups in which there are particular concerns about early oral anticoagulation.
The OPTIMAS trial included a broad population of patients with acute ischemic stroke associated with AF including these critical subgroups.
The trial, conducted at 100 hospitals in the United Kingdom, included 3648 patients with AF and acute ischemic stroke who were randomly assigned to early (≤ 4 days from stroke symptom onset) or delayed (7-14 days) anticoagulation initiation with any DOAC.
There was no restriction on stroke severity, and patients with hemorrhagic transformation were allowed, with the exception of parenchymal hematoma type 2, a rare and severe type of hemorrhagic transformation.
Approximately 35% of patients had been taking an oral anticoagulant, mainly DOACs, prior to their stroke, and about 30% had revascularization with thrombolysis, thrombectomy, or both. Nearly 900 participants (25%) had moderate to severe stroke (National Institutes of Health Stroke Scale [NIHSS] score ≥ 11).
The primary outcome was a composite of recurrent ischemic stroke, symptomatic intracranial hemorrhage, unclassifiable stroke, or systemic embolism incidence at 90 days. The initial analysis aimed to show noninferiority of early DOAC initiation, with a noninferiority margin of 2 percentage points, followed by testing for superiority.
Results showed that the primary outcome occurred in 3.3% of both groups (adjusted risk difference, 0.000; 95% CI, −0.011 to 0.012), with noninferiority criteria fulfilled. Superiority was not achieved.
Symptomatic intracranial hemorrhage occurred in 0.6% of patients in the early DOAC initiation group vs 0.7% of those in the delayed group — a nonsignificant difference.
Applicable to Real-World Practice
A time-to-event analysis of the primary outcome showed that there were fewer outcomes in the first 30 days in the early DOAC initiation group, but the curves subsequently came together.
Subgroup analysis showed consistent results across all whole trial population, with no modification of the effect of early DOAC initiation according to stroke severity, reperfusion treatment, or previous anticoagulation.
Werring said that strengths of the OPTIMAS trial included a large sample size, a broad population with generalizability to real-world practice, and the inclusion of patients at higher bleeding risk than included in previous studies.
During the discussion, it was noted that the trial included few (about 3%) patients — about 3% — with very severe stroke (NIHSS score > 21), with the question of whether the findings could be applied to this group.
Werring noted that there was no evidence of heterogeneity, and if anything, patients with more severe strokes may have had a slightly greater benefit with early DOAC initiation. “So my feeling is probably these results do generalize to the more severe patients,” he said.
In a commentary accompanying The Lancet publication of the OPTIMAS trial, Else Charlotte Sandset, MD, University of Oslo, in Norway, and Diana Aguiar de Sousa, MD, Central Lisbon University Hospital Centre, Lisbon, Portugal, noted that the “increasing body of evidence strongly supports the message that initiating anticoagulation early for patients with ischaemic stroke is safe. The consistent absence of heterogeneity in safety outcomes suggests that the risk of symptomatic intracranial haemorrhage is not a major concern, even in patients with large infarcts.”
Regardless of the size of the treatment effect, initiating early anticoagulation makes sense when it can be done safely, as it helps prevent recurrent ischemic strokes and other embolic events. Early intervention reduces embolization risk, particularly in high-risk patients, and allows secondary prevention measures to begin while patients are still hospitalized, they added.
CATALYST Findings
The CATALYST meta-analysis included four trials, namely, TIMING, ELAN, OPTIMAS, and START, of early versus later DOAC administration in a total of 5411 patients with acute ischemic stroke and AF. In this meta-analysis, early was defined as within 4 days of stroke and later as 5 days or more.
The primary outcome was a composite of ischemic stroke, symptomatic, intracerebral hemorrhage, or unclassified stroke at 30 days. This was significantly reduced in the early group (2.12%) versus 3.02% in the later group, giving an odds ratio of 0.70 (95% CI, 0.50-0.98; P =.04).
The results were consistent across all subgroups, all suggesting an advantage for early DOAC.
Further analysis showed a clear benefit of early DOAC initiation in ischemic stroke with the curves separating early.
The rate of symptomatic intracerebral hemorrhage was low in both groups (0.45% in the early group and 0.40% in the later group) as was extracranial hemorrhage (0.45% vs 0.55%).
At 90 days, there were still lower event rates in the early group than the later one, but the difference was no longer statistically significant.
‘Practice Changing’ Results
Commenting on both studies, chair of the WSC session where the results of both OPTIMAS trial and the meta-analysis were presented, Craig Anderson, MD, The George Institute for Global Health, Sydney, Australia, described these latest results as “practice changing.”
“When to start anticoagulation in acute ischemic stroke patients with AF has been uncertain for a long time. The dogma has always been that we should wait. Over the years, we’ve become a little bit more confident, but now we’ve got good data from randomized trials showing that early initiation is safe, with the meta-analysis showing benefit,” he said.
“These new data from OPTIMAS will reassure clinicians that there’s no excessive harm and, more importantly, no excessive harm across all patient groups. And the meta-analysis clearly showed an upfront benefit of starting anticoagulation early. That’s a very convincing result,” he added.
Anderson cautioned that there still may be concerns about starting DOACs early in some groups, including Asian populations that have a higher bleeding risk (these trials included predominantly White patients) and people who are older or frail, who may have extensive small vessel disease.
During the discussion, several questions centered on the lack of imaging data available on the patients in the studies. Anderson said imaging data would help reassure clinicians on the safety of early anticoagulation in patients with large infarcts.
“Stroke clinicians make decisions on the basis of the patient and on the basis of the brain, and we only have the patient information at the moment. We don’t have information on the brain — that comes from imaging.”
Regardless, he believes these new data will lead to a shift in practice. “But maybe, it won’t be as dramatic as we would hope because I think some clinicians may still hesitate to apply these results to patients at high risk of bleeding. With imaging data from the studies that might change.”
The OPTIMAS trial was funded by University College London and the British Heart Foundation. Werring reported consulting fees from Novo Nordisk, National Institute for Health and Care Excellence, and Alnylam; payments or speaker honoraria from Novo Nordisk, Bayer, and AstraZeneca/Alexion; participation on a data safety monitoring board for the OXHARP trial; and participation as steering committee chair for the MACE-ICH and PLINTH trials. Åsberg received institutional research grants and lecture fees to her institution from AstraZeneca, Boehringer Ingelheim, Bristol Myers Squibb, and Institut Produits Synthése. Sandset and de Sousa were both steering committee members of the ELAN trial. Anderson reported grant funding from Penumbra and Takeda China.
A version of this article appeared on Medscape.com.
ABU DHABI, UAE — The long-standing debate as to when to start anticoagulation in patients with an acute ischemic stroke and atrial fibrillation (AF) looks as though it’s settled.
Results of the OPTIMAS trial, the largest trial to address this question, showed that
In addition, a new meta-analysis, known as CATALYST, which included all four randomized trials now available on this issue, showed a clear benefit of earlier initiation (within 4 days) versus later (5 days and up) on its primary endpoint of new ischemic stroke, symptomatic intracerebral hemorrhage, and unclassified stroke at 30 days.
The results of the OPTIMAS trial and the meta-analysis were both presented at the 16th World Stroke Congress (WSC) 2024. The OPTIMAS trial was also simultaneously published online in The Lancet.
“Our findings do not support the guideline recommended practice of delaying DOAC initiation after ischemic stroke with AF regardless of clinical stroke severity, reperfusion or prior anticoagulation,” said OPTIMAS investigator David Werring, PhD, University College London in England.
Presenting the meta-analysis, Signild Åsberg, MD, Uppsala University, Uppsala, Sweden, said his group’s findings “support the early start of DOACs (within 4 days) in clinical practice.”
Werring pointed out that starting anticoagulation early also had important logistical advantages.
“This means we can start anticoagulation before patients are discharged from hospital, thus ensuring that this important secondary prevention medication is always prescribed, when appropriate. That’s going to be a key benefit in the real world.”
Clinical Dilemma
Werring noted that AF accounts for 20%-30% of ischemic strokes, which tend to be more severe than other stroke types. The pivotal trials of DOACs did not include patients within 30 days of an acute ischemic stroke, creating a clinical dilemma on when to start this treatment.
“On the one hand, we wish to start anticoagulation early to reduce early recurrence of ischemic stroke. But on the other hand, there are concerns that if we start anticoagulation early, it could cause intracranial bleeding, including hemorrhagic transformation of the acute infarct. Guidelines on this issue are inconsistent and have called for randomized control trials in this area,” he noted.
So far, three randomized trials on DOAC timing have been conducted, which Werring said suggested early DOAC treatment is safe. However, these trials have provided limited data on moderate to severe stroke, patients with hemorrhagic transformation, or those already taking oral anticoagulants — subgroups in which there are particular concerns about early oral anticoagulation.
The OPTIMAS trial included a broad population of patients with acute ischemic stroke associated with AF including these critical subgroups.
The trial, conducted at 100 hospitals in the United Kingdom, included 3648 patients with AF and acute ischemic stroke who were randomly assigned to early (≤ 4 days from stroke symptom onset) or delayed (7-14 days) anticoagulation initiation with any DOAC.
There was no restriction on stroke severity, and patients with hemorrhagic transformation were allowed, with the exception of parenchymal hematoma type 2, a rare and severe type of hemorrhagic transformation.
Approximately 35% of patients had been taking an oral anticoagulant, mainly DOACs, prior to their stroke, and about 30% had revascularization with thrombolysis, thrombectomy, or both. Nearly 900 participants (25%) had moderate to severe stroke (National Institutes of Health Stroke Scale [NIHSS] score ≥ 11).
The primary outcome was a composite of recurrent ischemic stroke, symptomatic intracranial hemorrhage, unclassifiable stroke, or systemic embolism incidence at 90 days. The initial analysis aimed to show noninferiority of early DOAC initiation, with a noninferiority margin of 2 percentage points, followed by testing for superiority.
Results showed that the primary outcome occurred in 3.3% of both groups (adjusted risk difference, 0.000; 95% CI, −0.011 to 0.012), with noninferiority criteria fulfilled. Superiority was not achieved.
Symptomatic intracranial hemorrhage occurred in 0.6% of patients in the early DOAC initiation group vs 0.7% of those in the delayed group — a nonsignificant difference.
Applicable to Real-World Practice
A time-to-event analysis of the primary outcome showed that there were fewer outcomes in the first 30 days in the early DOAC initiation group, but the curves subsequently came together.
Subgroup analysis showed consistent results across all whole trial population, with no modification of the effect of early DOAC initiation according to stroke severity, reperfusion treatment, or previous anticoagulation.
Werring said that strengths of the OPTIMAS trial included a large sample size, a broad population with generalizability to real-world practice, and the inclusion of patients at higher bleeding risk than included in previous studies.
During the discussion, it was noted that the trial included few (about 3%) patients — about 3% — with very severe stroke (NIHSS score > 21), with the question of whether the findings could be applied to this group.
Werring noted that there was no evidence of heterogeneity, and if anything, patients with more severe strokes may have had a slightly greater benefit with early DOAC initiation. “So my feeling is probably these results do generalize to the more severe patients,” he said.
In a commentary accompanying The Lancet publication of the OPTIMAS trial, Else Charlotte Sandset, MD, University of Oslo, in Norway, and Diana Aguiar de Sousa, MD, Central Lisbon University Hospital Centre, Lisbon, Portugal, noted that the “increasing body of evidence strongly supports the message that initiating anticoagulation early for patients with ischaemic stroke is safe. The consistent absence of heterogeneity in safety outcomes suggests that the risk of symptomatic intracranial haemorrhage is not a major concern, even in patients with large infarcts.”
Regardless of the size of the treatment effect, initiating early anticoagulation makes sense when it can be done safely, as it helps prevent recurrent ischemic strokes and other embolic events. Early intervention reduces embolization risk, particularly in high-risk patients, and allows secondary prevention measures to begin while patients are still hospitalized, they added.
CATALYST Findings
The CATALYST meta-analysis included four trials, namely, TIMING, ELAN, OPTIMAS, and START, of early versus later DOAC administration in a total of 5411 patients with acute ischemic stroke and AF. In this meta-analysis, early was defined as within 4 days of stroke and later as 5 days or more.
The primary outcome was a composite of ischemic stroke, symptomatic, intracerebral hemorrhage, or unclassified stroke at 30 days. This was significantly reduced in the early group (2.12%) versus 3.02% in the later group, giving an odds ratio of 0.70 (95% CI, 0.50-0.98; P =.04).
The results were consistent across all subgroups, all suggesting an advantage for early DOAC initiation.
Further analysis showed a clear benefit of early DOAC initiation in ischemic stroke with the curves separating early.
The rate of symptomatic intracerebral hemorrhage was low in both groups (0.45% in the early group and 0.40% in the later group) as was extracranial hemorrhage (0.45% vs 0.55%).
At 90 days, there were still lower event rates in the early group than the later one, but the difference was no longer statistically significant.
‘Practice Changing’ Results
Commenting on both studies, Craig Anderson, MD, The George Institute for Global Health, Sydney, Australia, who chaired the WSC session where the OPTIMAS trial and the meta-analysis were presented, described these latest results as “practice changing.”
“When to start anticoagulation in acute ischemic stroke patients with AF has been uncertain for a long time. The dogma has always been that we should wait. Over the years, we’ve become a little bit more confident, but now we’ve got good data from randomized trials showing that early initiation is safe, with the meta-analysis showing benefit,” he said.
“These new data from OPTIMAS will reassure clinicians that there’s no excessive harm and, more importantly, no excessive harm across all patient groups. And the meta-analysis clearly showed an upfront benefit of starting anticoagulation early. That’s a very convincing result,” he added.
Anderson cautioned that there still may be concerns about starting DOACs early in some groups, including Asian populations that have a higher bleeding risk (these trials included predominantly White patients) and people who are older or frail, who may have extensive small vessel disease.
During the discussion, several questions centered on the lack of imaging data available on the patients in the studies. Anderson said imaging data would help reassure clinicians on the safety of early anticoagulation in patients with large infarcts.
“Stroke clinicians make decisions on the basis of the patient and on the basis of the brain, and we only have the patient information at the moment. We don’t have information on the brain — that comes from imaging.”
Regardless, he believes these new data will lead to a shift in practice. “But maybe, it won’t be as dramatic as we would hope because I think some clinicians may still hesitate to apply these results to patients at high risk of bleeding. With imaging data from the studies that might change.”
The OPTIMAS trial was funded by University College London and the British Heart Foundation. Werring reported consulting fees from Novo Nordisk, National Institute for Health and Care Excellence, and Alnylam; payments or speaker honoraria from Novo Nordisk, Bayer, and AstraZeneca/Alexion; participation on a data safety monitoring board for the OXHARP trial; and participation as steering committee chair for the MACE-ICH and PLINTH trials. Åsberg received institutional research grants and lecture fees to her institution from AstraZeneca, Boehringer Ingelheim, Bristol Myers Squibb, and Institut Produits Synthése. Sandset and de Sousa were both steering committee members of the ELAN trial. Anderson reported grant funding from Penumbra and Takeda China.
A version of this article appeared on Medscape.com.
FROM WSC 2024
Eruption of Multiple Linear Hyperpigmented Plaques
THE DIAGNOSIS: Chemotherapy-Induced Flagellate Dermatitis
Based on the clinical presentation and temporal relation with chemotherapy, a diagnosis of bleomycin-induced flagellate dermatitis (FD) was made, as bleomycin is the only chemotherapeutic agent from this regimen that has been linked with FD.1,2 Laboratory findings revealed eosinophilia, further supporting a drug-induced dermatitis. The patient was treated with oral steroids and diphenhydramine to alleviate itching and discomfort. The chemotherapy was temporarily discontinued, and symptomatic improvement was observed within 2 to 3 days.
Flagellate dermatitis is characterized by unique erythematous, linear, intermingled streaks of adjoining firm papules—often preceded by a prodrome of global pruritus—that eventually become hyperpigmented as the erythema subsides. The clinical manifestation of FD can be idiopathic; true/mechanical (dermatitis artefacta, abuse, sadomasochism); chemotherapy induced (peplomycin, trastuzumab, cisplatin, docetaxel, bendamustine); toxin induced (shiitake mushroom, cnidarian stings, Paederus insects); related to rheumatologic diseases (dermatomyositis, adult-onset Still disease), dermatographism, phytophotodermatitis, or poison ivy dermatitis; or induced by chikungunya fever.1
The term flagellate originates from the Latin word flagellum, which pertains to the distinctive whiplike pattern. It was first described by Moulin et al3 in 1970 in reference to bleomycin-induced linear hyperpigmentation. Bleomycin, a glycopeptide antibiotic derived from Streptomyces verticillus, is used to treat Hodgkin lymphoma, squamous cell carcinoma, and germ cell tumors. The worldwide incidence of bleomycin-induced FD is 8% to 22% and commonly is associated with a cumulative dose greater than 100 U.2 Clinical presentation is variable in terms of onset, distribution, and morphology of the eruption and could be independent of dose, route of administration, or type of malignancy being treated. The flagellate rash commonly involves the trunk, arms, and legs; can develop within hours to 6 months of starting bleomycin therapy; often is preceded by generalized itching; and eventually heals with hyperpigmentation.
Possible mechanisms of bleomycin-induced FD include localized melanogenesis, inflammatory pigmentary incontinence, alterations to normal pigmentation patterns, cytotoxic effects of the drug itself, minor trauma/scratching leading to increased blood flow and causing local accumulation of bleomycin, heat recall, and reduced epidermal turnover leading to extended interaction between keratinocytes and melanocytes.2 Heat exposure can act as a trigger for bleomycin-induced skin rash recall even months after the treatment is stopped.
Apart from discontinuing the drug, there is no specific treatment available for bleomycin-induced FD. The primary objective of treatment is to alleviate pruritus, which often involves the use of topical or systemic corticosteroids and oral antihistamines. The duration of treatment depends on the patient’s clinical response. Once treatment is discontinued, FD typically resolves within 6 to 8 months. However, there can be a permanent postinflammatory hyperpigmentation in the affected area.4 Although there is a concern for increased mortality after postponement of chemotherapy,5 the decision to proceed with or discontinue the chemotherapy regimen necessitates a comprehensive interdisciplinary discussion and a meticulous assessment of the risks and benefits that is customized to each individual patient. Flagellate dermatitis can reoccur with bleomycin re-exposure; a combined approach of proactive topical and systemic steroid treatment seems to diminish the likelihood of FD recurrence.5
Our case underscores the importance of recognizing, detecting, and managing FD promptly in individuals undergoing bleomycin-based chemotherapy. Medical professionals should familiarize themselves with this distinct adverse effect linked to bleomycin, enabling prompt discontinuation if necessary, and educate patients about the condition’s typically temporary nature, thereby alleviating their concerns.
- Bhushan P, Manjul P, Baliyan V. Flagellate dermatoses. Indian J Dermatol Venereol Leprol. 2014;80:149-152.
- Ziemer M, Goetze S, Juhasz K, et al. Flagellate dermatitis as a bleomycin-specific adverse effect of cytostatic therapy: a clinical-histopathologic correlation. Am J Clin Dermatol. 2011;12:68-76. doi:10.2165/11537080-000000000-00000
- Moulin G, Fière B, Beyvin A. Cutaneous pigmentation caused by bleomycin. Article in French. Bull Soc Fr Dermatol Syphiligr. 1970;77:293-296.
- Biswas A, Chaudhari PB, Sharma P, et al. Bleomycin induced flagellate erythema: revisiting a unique complication. J Cancer Res Ther. 2013;9:500-503. doi:10.4103/0973-1482.119358
- Hanna TP, King WD, Thibodeau S, et al. Mortality due to cancer treatment delay: systematic review and meta-analysis. BMJ. 2020;371:m4087. doi:10.1136/bmj.m4087
A 28-year-old man presented for evaluation of an intensely itchy rash of 5 days’ duration involving the face, trunk, arms, and legs. The patient recently had been diagnosed with classical Hodgkin lymphoma and was started on a biweekly chemotherapy regimen of Adriamycin, bleomycin, vinblastine, and dacarbazine 3 weeks prior. He reported that a red, itchy, papular rash had developed on the hands 1 week after starting chemotherapy and improved with antihistamines. Symptoms accompanying the current rash included night sweats, occasional fever, substantial unintentional weight loss, and fatigue. He had no history of urticaria, angioedema, anaphylaxis, or nail changes.
Physical examination revealed widespread, itchy, linear and curvilinear hyperpigmented plaques on the upper arms, shoulders, back (top), face, and thighs, as well as erythematous grouped papules on the bilateral palms (bottom). There was no mucosal or systemic involvement.
Six Tips for Media Interviews
As a physician, you might be contacted by the media to provide your professional opinion and advice. Or you might be looking for media interview opportunities to market your practice or side project. And if you do research, media interviews can be an effective way to spread the word. It’s important to prepare for a media interview so that you achieve the outcome you are looking for.
Keep your message simple. When you are a subject expert, you might think that the basics are obvious or even boring, and that the nuances are more important. However, most of the audience is looking for big-picture information that they can apply to their lives. Consider a few key takeaways, keeping in mind that your interview is likely to be edited to short sound bites or a few quotes. It may help to jot down notes so that you cover the fundamentals clearly. You could even write and rehearse a script beforehand. If there is something complicated or subtle that you want to convey, you can preface it by saying, “This is confusing but very important …” to let the audience know to give extra consideration to what you are about to say.
Avoid extremes and hyperbole. Sometimes, exaggerated statements make their way into medical discussions. Statements such as “it doesn’t matter how many calories you consume — it’s all about the quality” are common oversimplifications. But you might be upset to see your name next to a comment like this because it is not actually correct. Check the phrasing of your key takeaways to avoid being stuck defending or explaining an inaccurate statement when your patients ask you about it later.
Ask the interviewers what they are looking for. Many medical topics have some controversial element, so it is good to know what you’re getting into. Find out the purpose of the article or interview before you decide whether it is right for you. It could be about another doctor in town who is being sued; if you don’t want to be associated with that story, it might be best to decline the interview.
Explain your goals. You might accept or pursue an interview to raise awareness about an underrecognized condition. You might want the public to identify and get help for early symptoms, or you might want to create empathy for people coping with a disease you treat. Consider why you are participating in an interview, and communicate that to the interviewer to ensure that your objective can be part of the final product.
Know whom you’re dealing with. It is good to learn about the publication/media channel before you agree to participate. It may have a political bias, or perhaps the interview is intended to promote a specific product. If you agree with and support their purposes, then you may be happy to lend your opinion. But learning about the “voice” of the publication in advance allows you to make an informed decision about whether you want to be identified with a particular political ideology or product endorsement.
Ask to see your quotes before publication. It’s good to have the opportunity to make corrections in case you are accidentally misquoted or misunderstood. It is best to ask to see quotes before you agree to the interview. Some reporters may agree to (or even prefer) a written question-and-answer format so that they can directly quote your responses without rephrasing your words. You could suggest this, especially if you are too busy for a call or live meeting.
As a physician, your insights and advice can be highly beneficial to others. You can also use media interviews to propel your career forward. Doing your homework can ensure that you will be pleased with the final product and how your words were used.
Dr. Moawad, Clinical Assistant Professor, Department of Medical Education, Case Western Reserve University School of Medicine, Cleveland, Ohio, has disclosed no relevant financial relationships.
A version of this article appeared on Medscape.com.
Climate Change Linked to Lung Cancer in Never-Smokers
The incidence of lung cancer in never-smokers (LCINS) is increasing, and experts think climate change may be driving the uptick.
LCINS differs histologically and epidemiologically from smoking-related cancers, occurring almost always as adenocarcinomas and mostly affecting women and individuals of Asian ancestry, according to a study published in Nature Reviews Clinical Oncology in January 2024. LCINS is estimated to be the fifth most common cause of cancer-related deaths worldwide.
These potential culprits are varied and sometimes interrelated — and they underscore the need for continued emphasis on environmental hazards, the panelists agreed.
Focusing on climate change — and taking action at the individual level — is a good place to start, said Leticia M. Nogueira, PhD, scientific director of health services research in the Surveillance and Health Equity Science Department of the American Cancer Society.
Long-Term Exposure to Wildfires Linked to Increased Cancer Risk
Climate change is associated with climate-driven disasters such as more intense hurricanes and more frequent wildfires that can expose populations to environmental carcinogens, Nogueira explained.
Such weather events disrupt the care of patients with cancer and lead to poorer outcomes, according to her own research. They also contribute to the rising incidence of LCINS, she said.
In a population-based study published in The Lancet Planetary Health, long-term exposure to wildfires was associated with an increased risk for lung cancer and brain tumors. Individuals exposed to a wildfire within 50 km of their residential location in the prior decade had a 4.9% relatively higher incidence of lung cancer and a 10% relatively higher incidence of brain tumors.
“These findings are relevant on a global scale given the anticipated effects of climate change on wildfire frequency and severity,” the authors concluded, noting the study limitations and the need for further research.
How Clinicians Can Help
Nogueira urged attendees to take action to help improve healthcare outcomes.
“Let’s not forget that the healthcare system is one of the most emission-intensive industries in the world. Emissions from the US healthcare system exceed emissions from the entire UK, and we can be doing much better.
“There is something for each one of us here today to do: We can champion environmentally responsible efforts at our institutions, we can engage with disaster preparedness and response ... and we can document ongoing suffering to increase awareness and incentivize action,” she said.
In a commentary published in CA: A Cancer Journal for Clinicians, Nogueira and her colleagues further addressed the links between climate change and cancer and listed various sources of greenhouse gas emissions and proposed interventions, including those associated with the healthcare industry.
“If you look at this list and say ‘No way — there is no chance my institution will do any of that,’ let me ask you something: Are you allowed to smoke on campus? How do you think that happened? How do you think that started?” she said, invoking Archimedes’ famous quote, “Give me a lever long enough, and I shall move the world.”
“You most certainly have the power to make a difference,” Nogueira said. “So recognize where your points of influence are – move your lever, move the world.”
A version of this article appeared on Medscape.com.
Disc Degeneration in Chronic Low Back Pain: Can Stem Cells Help?
TOPLINE:
Allogeneic bone marrow–derived mesenchymal stromal cells (BM-MSCs) are safe but do not show efficacy in treating intervertebral disc degeneration (IDD) in patients with chronic low back pain.
METHODOLOGY:
- The RESPINE trial assessed the efficacy and safety of a single intradiscal injection of allogeneic BM-MSCs in the treatment of chronic low back pain caused by single-level IDD.
- Overall, 114 patients (mean age, 40.9 years; 35% women) with IDD-associated chronic low back pain that was persistent for 3 months or more despite conventional medical therapy and without previous surgery, were recruited across four European countries from April 2018 to April 2021 and randomly assigned to receive either intradiscal injections of allogeneic BM-MSCs (n = 58) or sham injections (n = 56).
- The first co-primary endpoint was the rate of response to BM-MSC injections at 12 months after treatment, defined as improvement of at least 20% or 20 mm in the Visual Analog Scale for pain or improvement of at least 20% in the Oswestry Disability Index for functional status.
- The second co-primary endpoint was structural efficacy, based on disc fluid content measured by quantitative T2 MRI between baseline and month 12.
TAKEAWAY:
- At 12 months post-intervention, 74% of patients in the BM-MSC group were classified as responders compared with 68.8% in the placebo group. However, the difference between the groups was not statistically significant.
- The probability of being a responder was higher in the BM-MSC group than in the sham group; however, the findings did not reach statistical significance.
- The average change in disc fluid content, indicative of disc regeneration, from baseline to 12 months was 37.9% in the BM-MSC group and 41.7% in the placebo group, with no significant difference between the groups.
- The incidence of adverse events and serious adverse events was not significantly different between the treatment groups.
IN PRACTICE:
“BM-MSC represents a promising opportunity for the biological treatment of IDD, but only high-quality randomized controlled trials, comparing it to standard care, can determine whether it is a truly effective alternative to spine fusion or disc replacement,” the authors wrote.
SOURCE:
The study was led by Yves-Marie Pers, MD, PhD, Clinical Immunology and Osteoarticular Diseases Therapeutic Unit, CHRU Lapeyronie, Montpellier, France. It was published online on October 11, 2024, in Annals of the Rheumatic Diseases.
LIMITATIONS:
MRI results were collected from only 55 patients across both trial arms, which may have affected the statistical power of the findings. Although patients were monitored for up to 24 months, the long-term efficacy and safety of BM-MSC therapy for IDD may not have been fully captured. Selection bias could not be excluded because of the difficulty in accurately identifying patients with chronic low back pain caused by single-level IDD.
DISCLOSURES:
The study was funded by the European Union’s Horizon 2020 Research and Innovation Programme. The authors declared no conflicts of interest.
This article was created using several editorial tools, including AI, as part of the process. Human editors reviewed this content before publication. A version of this article appeared on Medscape.com.